
Monday, November 18, 2024

Atavism

Atavism (pronounced at-uh-viz-uhm)

(1) In biology (most often in zoology & botany), the reappearance in an individual of characteristics of some (typically) remote ancestor which have not manifested in intervening generations.

(2) An individual embodying such a reversion.

(3) Reversion to an earlier or more primitive type (a “throwback” in the vernacular).

(4) In sociology and political science, the recurrence or reversion to a past behavior, method, characteristic or style after a long period of absence, used especially of a reversion to violence.

1825-1830: The construct was the Latin atav(us) (“great-great-great grandfather; remote ancestor, forefather”, the construct being at- (akin to atta (familiar name for a father) and used perhaps to suggest “beyond”) + avus (grandfather, ancestor)) + -ism.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from which English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Atavism & atavist are nouns, atavic, atavistic & atavistical are adjectives and atavistically is an adverb; the noun plural is atavisms.

The primitive Indo-European awo meant “adult male relative other than the father”, the most obvious descendant being the modern “uncle”.  The English form was influenced by the French atavisme (the coining attributed usually to the botanist Antoine Nicolas Duchesne (1747-1827 Paris)) and was first used in biology in the sense of “reversion by influence of heredity to ancestral characteristics, resemblance of a given organism to some remote ancestor, return to an early or original type”.  The adjective atavistic (pertaining to atavism) appeared in 1847, joined three years later by the now rare atavic (pertaining to a remote ancestor, exhibiting atavism).  Atavism (and its related forms) is one of those words which can be used as a neutral descriptor (notably in botany) or to denote something positive or negative.  Although the core meaning is always some “past or ancestral characteristic”, it tends to be pejorative if used of people or human cultures reverting to “primitive characteristics” (especially if they be war or other forms of violence).  In the vernacular, the earthier “throwback” has been more common than the rather formal “atavistic” although the circumlocution “skip a generation” is often used for traits that occur after a generation of absence and “throwback” anyway became a “loaded” term because of its association with race (in the sense of skin-color).

Medicine has constructed its own jargon for the phenomenon in which an inherited condition appears to “skip a generation”: it’s described often as “autosomal recessive inheritance” or “incomplete penetrance”.  While the phrase “skipping a generation” is not uncommon in informal use, the actual mechanisms depend on the genetic inheritance pattern of the condition.  Autosomal recessive inheritance describes a condition caused by mutations in both copies of a specific gene (one inherited from each parent).  An individual inheriting only one mutated copy will be a carrier but will remain asymptomatic; if two carriers have issue, there is (1) a 25% chance the offspring will inherit both mutated copies and express the condition, (2) a 50% chance the offspring will be a carrier and (3) a 25% chance the offspring will inherit no mutations.  Thus, the condition may appear to (and for practical purposes does) skip a generation in those cases where no symptoms exist; the classic examples include sickle cell anemia and cystic fibrosis.  Incomplete penetrance occurs when an individual inherits a gene mutation which creates in them a genetic predisposition to a condition but symptoms do not develop because of environmental factors, other genetic influences or “mere chance” (and in the matter of diseases like those classified as “cancer”, the influence of what might be called “bad luck” is still probably underestimated, and certainly not yet statistically measured).  In such cases, the mutation may be passed to the next generation, where it might manifest, giving the appearance of skipping a generation; the BRCA1 & BRCA2 mutations for (hereditary) breast cancer are well-known examples.
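For the curious, the 25/50/25 split falls out of simply enumerating the four equally likely allele combinations; a minimal sketch in Python (the allele labels "A" and "a" are illustrative, not standard genetic notation):

```python
from itertools import product
from collections import Counter

# Each carrier parent has one normal allele ("A") and one mutated allele ("a")
# and passes one of the two, with equal probability, to the offspring.
carrier = ("A", "a")

# Enumerate the four equally likely combinations an offspring can inherit.
counts = Counter("".join(sorted(pair)) for pair in product(carrier, carrier))
total = sum(counts.values())

print(f"affected (aa):   {counts['aa'] / total:.0%}")  # expresses the condition
print(f"carrier (Aa):    {counts['Aa'] / total:.0%}")  # asymptomatic carrier
print(f"unaffected (AA): {counts['AA'] / total:.0%}")  # inherits no mutation
```

Run, it prints 25%, 50% and 25% respectively, the figures quoted above.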

Lindsay Lohan and her lawyer in court, Los Angeles, December, 2011.

In political science, “atavism” is used to refer to a reversion to older, more “primitive” means of furthering political ends.  Although it’s most associated with a critique of violence, political systems, ideologies, behaviors or economic policies have all been described as “atavistic” and their manifestation is linked often with ideas presented as representing (and implicitly offering a return to) a perceived “golden age”, a past structure which is idealized; it appears often as a reaction to change, notably modernity, globalization, or what is claimed to be a “decline in values”.  Political scientists identify strands of nominally non-violent atavism including: (1) Nostalgic Nationalism.  Nationalist movements are almost always race-based (in the sense of longing for a return to a “pure” ethnicity in which a population is “untainted” by ethnic diversity).  It’s usually a romanticization of a nation's past (historically, “purity” was less common than some like to believe) offering the hope of a return to traditional values, cultural practices, or forms of governance.  (2) Tribalism and Identity Politics.  A call to primordial loyalties (such as ethnic or tribal identities) over modern, pluralistic, or institutional frameworks has been a feature of recent decades and was the trigger for the wars in the Balkans during the 1990s, the conflict which introduced to the language the euphemism “ethnic cleansing”, a very atavistic concept.  Tribalism and identity politics depend on group identities & allegiances overshadowing any broader civic or national unity on the basis of overturning an artificial (and often imposed) structure and returning to a pre-modern arrangement.  (3) Anti-modernism or Anti-globalization.  These are political threads which sound “recent” but both have roots which stretch back at least to the nineteenth century; Pius IX’s (1792–1878; pope 1846-1878) Syllabus Errorum (Syllabus of Errors, 1864) was one famous list of objections to change.
The strategy behind such atavism may be identifiably constant but tactics can vary and there’s often a surprising degree of overlap in the messaging of populists from the notional right & left which is hardly surprising given that in the last ten years both Donald Trump (b 1946; US president 2017-2021; president elect 2024) and Bernie Sanders (b 1941; senior US senator (Independent, Vermont) since 2007) honed their messaging to appeal to the same disgruntled mass.

Elizabeth Boody Schumpeter (1898-1953, left) & Joseph Schumpeter (1883–1950, right).  It was his third marriage.

Austrian political economist Joseph Schumpeter used the word “atavism” in his analysis of the dynamics which contributed to the outbreak of World War I (1914-1918), something he attributed to the old, autocratic regimes of Central and Eastern Europe “dragging the modern, liberal West” back in time.  Schumpeter believed that if commercial ties created interdependence between nations then armed conflict would become unthinkable and US author Thomas Friedman (b 1953) in The Lexus and the Olive Tree: Understanding Globalization (1999) suggested the atavistic tendency of man to go to war could be overcome by modern commerce making connectivity between economies so essential to the well-being of citizens that no longer would they permit war because such a thing would be so dangerous for the economy; it was an attractive argument because we have long since ceased to be citizens and are merely economic units.  Friedman’s theory didn’t actually depend on his earlier phrase which suggested: “…countries with McDonalds outlets don’t go to war with each other” but that was how readers treated it.  Technically, it was a bit of a gray area (Friedman treated the earlier US invasion of Panama (1989) as a police action) but the thesis was anyway soon disproved in the Balkans.  Now, Schumpeter and Friedman seem to be cited most often in pieces disproving their theses and atavism remains alive and kicking.

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp, flexible, not hard, firm, or rigid; flexible; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)), 1964-1979), the use being a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made from a paper stock little heavier and more rigid than that used for the pages; used mostly of comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and it was used to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis to the dynastic rule of centuries of the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople, although in the 1890s it was recorded as meaning “some degree of failure”.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped & flopping are verbs, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and as “floppy-wristed” (synonymous with “limp-wristed”) was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the previous use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of “Shakespearian” or “Churchillian” is not exactly definable but certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  "Floppy cat" is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOPS (floating-point operations per second).  Floating-point (FP) arithmetic is a way of handling very large (or very small) real numbers using an integer of fixed precision (the significand), scaled by an integer exponent of a fixed base; FP doesn’t make possible anything which would not in theory be achievable using real numbers but it does make such work faster and practical and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
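The significand-and-exponent idea can be seen from any language's standard library; a small sketch in Python (math.frexp uses base 2 and a significand in [0.5, 1), which is one of several equivalent conventions):

```python
import math

# Any finite float x can be written as x = m * 2**e with 0.5 <= |m| < 1;
# math.frexp returns that (significand, exponent) pair.
x = 6.022e23
m, e = math.frexp(x)
assert x == m * 2**e  # the decomposition is exact

# The fixed precision of the significand is why some decimal values are
# only approximated, the classic example being:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```

The same trade-off (speed and range in exchange for exactness) is what the FPU hardware implemented.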

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work) and for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion-FLOPS mark and as speeds rose, the appropriate terms were created:

kiloFLOPS (kFLOPS, 10^3)
megaFLOPS (MFLOPS, 10^6)
gigaFLOPS (GFLOPS, 10^9)
teraFLOPS (TFLOPS, 10^12)
petaFLOPS (PFLOPS, 10^15)
exaFLOPS (EFLOPS, 10^18)
zettaFLOPS (ZFLOPS, 10^21)
yottaFLOPS (YFLOPS, 10^24)
ronnaFLOPS (RFLOPS, 10^27)
quettaFLOPS (QFLOPS, 10^30)
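Scaling a raw figure to these prefixes is just a matter of dividing by the appropriate power of 10^3; an illustrative Python helper (the function name and formatting are this sketch's own, not any standard API):

```python
# Named FLOPS prefixes, largest first, each a power of 10**3.
PREFIXES = [
    ("QFLOPS", 10**30), ("RFLOPS", 10**27), ("YFLOPS", 10**24),
    ("ZFLOPS", 10**21), ("EFLOPS", 10**18), ("PFLOPS", 10**15),
    ("TFLOPS", 10**12), ("GFLOPS", 10**9), ("MFLOPS", 10**6),
    ("kFLOPS", 10**3),
]

def human_flops(flops: float) -> str:
    """Scale a raw FLOPS count to the largest applicable named prefix."""
    for name, scale in PREFIXES:
        if flops >= scale:
            return f"{flops / scale:.2f} {name}"
    return f"{flops:.0f} FLOPS"

# A machine just past the 1990s "trillion FLOPS" mark:
print(human_flops(1.2e12))  # 1.20 TFLOPS
```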

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such multi-state parallelism, performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
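Superposition and entanglement can at least be simulated classically for tiny systems; a toy Python sketch (plain list arithmetic, not how real quantum hardware or any quantum SDK is programmed) which builds the entangled Bell state from |00⟩:

```python
import math

# Two qubits as four amplitudes over the basis |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# A Hadamard gate on the first qubit puts it into superposition:
# it mixes the amplitude pairs whose indices differ only in that qubit.
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# A CNOT gate (first qubit controls the second) entangles them:
# it swaps the |10> and |11> amplitudes.
state[2], state[3] = state[3], state[2]

# Result: (|00> + |11>)/sqrt(2) -- measuring either qubit fixes the other.
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```

Note that simulating n qubits this way needs 2^n amplitudes, which is precisely why classical FLOPS comparisons break down.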

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely it was one of the design parameters for the 3½ inch design that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” appeared first in 1971 (the earliest units using the WORM (write once, read many, ie "read only" after being written) principle), soon doubtless clipped to “floppy” although the first known use of this dates from 1974.  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobytes) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus; it was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabytes) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) file system and it was the 5¼ inch floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy. In OS (operating system) GUIs (graphical user interface) however, it does endure as the "save" icon and all the evidence to date does suggest that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT of analogue, rotary dial phones in iconography and the sound of a camera's physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support in the BIOS (basic input output system) so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but to attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) a final nail in the coffin for most.

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable, tales of an installation process failing on the “final diskette” being legion and while some doubtless were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and most quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production of them in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Sunday, September 15, 2024

Cynophagia

Cynophagia (pronounced)

The practice of eating dog meat.

Late 1700s-early 1800s: The construct was cyno- + phagia.  Cyno- was a combining form of the Ancient Greek κύων (kúōn or kýon) (dog) and the suffix –phagia was from the Ancient Greek -φαγία (-phagía) (and related to -φαγος (-phagos) (eater)), corresponding to φαγεῖν (phageîn) (to eat), infinitive of ἔφαγον (éphagon) (I eat), which serves as infinitive aorist for the defective verb ἐσθίω (esthíō) (I eat).  In English, use is now most frequent in mental health to reference the consumption of untypical items.  Being a cynophagist (a person who engages in cynophagia) is not synonymous with being a cynophile (a person who loves canines) although it’s not impossible there may be some overlap in the predilections.  The construct was cyno- +‎ -phile.  The –phile suffix was from the Latin -phila, from the Ancient Greek φίλος (phílos) (dear, beloved) and was used to form nouns & adjectives conveying the meanings “loving”, “friendly”, “admirer” or “friend”.  In the context of mental health, the condition would be described as cynophilia.  The -philia suffix was from the Ancient Greek φιλία (philía) (fraternal love).  It was used to form nouns conveying a liking or love for something and in clinical use was applied often to an abnormal or obsessive interest, especially if it came to interfere with other aspects of life (the general term is paraphilia).  The companion suffix is the antonym -phobia.  The related forms are the prefixes phil- & philo- and the suffixes -philiac, -philic, -phile & -phily.  Cynophagia, cynophagy, cynophagism & cynophagist are nouns and cynophagic is an adjective; the noun plural is cynophagists.

The word cynophagia was coined as part of the movement in European scholarship in the late eighteenth & early nineteenth centuries which used words from classical languages (Ancient Greek & Latin) as elements to create the lexicon of “modern” science & medicine, reflecting the academic & professional reverence for the supposed purity of the Ancient world.  The reason there was a cynophagia but not an “ailourophagia” (which would have meant “the practice of eating cat meat”) is probably that while European explorers & colonial administrators sent from the orient many reports of the eating of dogs, there were likely few accounts of felines as food.  The construct of “ailourophagia” would have been ailouro-, from the Ancient Greek αἴλουρος (aílouros) (cat) + phagia.  The Greek elements of ailouros were aiolos (quick-moving or nimble) & oura (tail), the allusion respectively to the agility of cats and their characteristic tail movements.  There are of course ailurophiles (those especially fond of cats), notably the "childless cat ladies" and disturbingly, there's also the paedophage (child eater).

Historically, east of Suez, consuming dog meat was not uncommon and in some cultures it was a significant contribution to regional protein intake while in other places it was either unlawful or taboo.  Carnivorism (the practice of eating meat) is an almost universal human practice but what is acceptable varies between cultures.  Some foods are proscribed (such as shellfish or pig-meat) and while it’s clear the origin of this was as a kind of “public health” measure (the rules created in hot climates in the pre-refrigeration age), the observance became a pillar of religious practice.  Sometimes, a similar rule seems originally to have had an economic imperative such as the Hindu restriction on the killing of cattle for consumption, thus the phrase “sacred cow”, the original rationale being the calculation that live beasts made an economic contribution which much outweighed their utility as a protein source.  So, what is thought acceptable and not is a cultural construct and that varies from place-to-place, the Western aversion to eating cats & dogs attributable to the sentimental view of them which has evolved because of their role for millennia as domestic pets.  Over history, it’s likely every animal in the world has at some point been used as a food source, some an acquired taste such as the “deep fried tarantula” which, long a tasty snack in parts of Cambodia, became a novelty item in Cambodian restaurants in the West.  There are though probably some creatures which taste so awful they’re never eaten, such as parrots which ate the seeds of tobacco plants, lending their flesh a “distinctive flavor”.  The recipe for their preparation was:

(1) Place plucked parrot and an old boot in vat of salted water and slow-cook for 24 hours.
(2) After 24 hours remove parrot & boot.
(3) Throw away parrot and eat old boot.

Analysts had expected “more of the same” from Donald Trump (b 1946; US president 2017-2021) in his debate with Kamala Harris (b 1964; US vice president since 2021): the southern border, illegal immigrants, inflation et al.  What none predicted was that so much of the post-debate traffic would be about Mr Trump’s assertion Haitian immigrants in Springfield, Ohio (one of literally dozens of localities in the country so named, one factor which influenced it becoming the name of the town in the Fox cartoon series The Simpsons) were eating the pets of the residents (ie their cats & dogs).  As racist tropes go, it followed the script in terms of the “otherness, barbarism, incompatibility” etc of “outsiders in our midst” although there seemed to be nothing to suggest there was any tradition of such consumption in Haiti.  Still, at least it was something novel and it wasn’t the first time pet cats had been mentioned in the 2024 presidential campaign, Mr Trump’s choice of JD Vance (b 1984; US senator (Republican-Ohio) since 2023) as running mate bringing renewed attention to the latter’s 2021 interview with then Fox News host Tucker Carlson (b 1969) in which he observed the US had fallen into the hands of corporate oligarchs, radical Democratic Party politicians and “…a bunch of childless cat ladies who are miserable at their own lives and the choices that they've made and so they want to make the rest of the country miserable, too.”

Eventually, that would be answered by the childless cat ladies, notably the most famous: the singer Taylor Swift who posted an endorsement of Kamala Harris, posing with Benjamin Button, the ragdoll she adopted in 2019.  Benjamin Button was no stranger to fame, the seemingly nonplussed puss appearing on the cover announcing Ms Swift as Time magazine’s 2023 Person of the Year.

Childless cat lady Taylor Swift with Ragdoll Benjamin Button (as stole).  Ragdoll cats make good stoles because, apparently due to a genetic mutation, they tend to "go limp" when picked up.

Ms Swift is of course a song-writer, well accustomed to crafting text to achieve the desired effect, and one word nerd lawyer quickly deconstructed the post, much taken by the first three paragraphs which interlaced the first person (“I” & “me/my”) with the “you” while avoiding starting any sentence with “I” (a technique taught as a way of conveying “objectivity”) until she announces her conclusion:

 Like many of you, I watched the debate tonight. If you haven’t already, now is a great time to do your research on the issues at hand and the stances these candidates take on the topics that matter to you the most. As a voter, I make sure to watch and read everything I can about their proposed policies and plans for this country.

Recently I was made aware that AI of ‘me’ falsely endorsing Donald Trump’s presidential run was posted to his site. It really conjured up my fears around AI, and the dangers of spreading misinformation. It brought me to the conclusion that I need to be very transparent about my actual plans for this election as a voter. The simplest way to combat misinformation is with the truth.

I will be casting my vote for Kamala Harris and Tim Walz in the 2024 Presidential Election. I’m voting for @kamalaharris because she fights for the rights and causes I believe need a warrior to champion them. I think she is a steady-handed, gifted leader and I believe we can accomplish so much more in this country if we are led by calm and not chaos. I was so heartened and impressed by her selection of running mate @timwalz, who has been standing up for LGBTQ+ rights, IVF, and a woman’s right to her own body for decades.

So, a classic example of a technique which might be used by someone disinterested: two premises which lead to a conclusion, the rhythm of the lyric being “I, I, you, you, you.”  Then, after the “you, you, you” of the “discussion” has made it clear where her focus is, every sentence in the third paragraph begins with “I”, emulating a cadence which might appear in a musical track: “I’ve done my research, and I’ve made my choice. Your research is all yours to do, and the choice is yours to make.”  One can see why her songs are said to be so catchy.

The intervention of Ms Swift and Benjamin Button produced reactions. 

Newspapers haven’t always been effective in changing voting intentions or nudging governments in particular public policy directions.  During the inter-war years the Beaverbrook (the Daily & Sunday Express and the less disreputable Evening Standard) press in the UK ran a long and ineffective campaign promoting “empire free trade” and the evidence suggests the editorial position a publication adopted to advocate its readers vote one way or the other was more likely to reflect than shift public opinion.  One reason is that in the West, while politics is very interested in the people, the people tend not to be interested in politics and most thoughtful editorials are barely read.  People are however rabid consumers of popular culture and one opposition leader would later claim an interview a women’s magazine conducted with his (abandoned) ex-wife did him more political damage than anything written by political or economics reporters, however critical.  With 283 million followers on Instagram (Ms Harris has 18 million), Ms Swift’s intervention may prove decisive if she shifts just a few votes in the famous “battleground states”.

Celebrity endorsements are not unusual; some successful, some not.  In 2016, Lindsay Lohan endorsed crooked Hillary Clinton (who did win the popular vote so there was that).

Whether Ms Swift’s endorsement of Kamala Harris will shift many opinions isn’t known (many analysts concluding the electorate long ago coalesced into “Trump” & “anti-Trump” factions) but the indications are she may have been remarkably effective in persuading to vote those who may not otherwise have bothered, the assumption being most of these converts to participation will follow her lead.  It’s long been understood that to win elections in the US, the theory is simple: get those who don’t vote to vote for you.  In practice, that has been difficult to achieve at scale, the best executions in recent years being by the campaign teams of George W Bush (George XLIII, b 1946; US president 2001-2009) in 2004 and Barack Obama (b 1961; US president 2009-2017) in 2008.

However, her inclusion of a custom URL directing people to vote.gov, where they could register to vote, produced a spike in voter registration, the US General Services Administration (GSA) revealing an “unprecedented” 338,000-odd unique visits to their portal in the hours after Ms Swift’s post.  Although the “shape” of the hits isn’t known, most seem to be assuming that (as well as some childless cat ladies), those who may be voting for the first time will tend to be (1) young and (2) female, reflecting the collective profile of Ms Swift’s “Swifties”.  They are the demographic the Democratic Party wants.  The GSA called it the “Swift effect” and added that while in the past there had been events which produced smaller spikes, they were brief in duration unlike the Swifties who for days kept up the traffic, the aggregate numbers dwarfing even the “intensity and enthusiasm” in the wake of the US Supreme Court (SCOTUS) overturning Roe v Wade (1973) prior to the 2022 mid-term congressional elections.

In an interview with JD Vance, Fox News asked what he thought might be the significance of Ms Swift mobilizing the childless cat lady vote and he responded: “We admire Taylor Swift’s music. But I don’t think most Americans, whether they like her music, or are fans of hers or not, are going to be influenced by a billionaire celebrity who I think is fundamentally disconnected from the interests and problems of most people.  When grocery prices go up by 20 per cent, it hurts most Americans. It doesn’t hurt Taylor Swift. When housing prices become unaffordable, it doesn’t affect Taylor Swift, or any other billionaire.”  Fox News chose not to pursue the matter of whether self-described “billionaire celebrity” Donald Trump could be said to be “…fundamentally disconnected from the interests and problems of most people.”

In “damage-limitation” mode, the Trump campaign mobilized generative AI in an attempt to re-capture the childless cat lady vote.  After the debate, Mr Trump had added geese to the alleged diet of Springfield’s Haitian residents.

Mr Trump may have himself to blame for Ms Swift’s annoying endorsement because he’d earlier posted fake, AI-generated images on his social media platform, Truth Social, suggesting she’d urged the Swifties to vote for him.  Such things were of course not foreseen by the visionary AI (artificial intelligence) researchers of the 1950s but the genie is out of the bottle and, given that upholding the “freedom of speech” guaranteed by the First Amendment to the constitution is one of the few things on which the SCOTUS factions agree, the genie is not going back.

The meme-makers have really taken to generative AI.

So while generative AI doesn’t mean the meme-makers can suddenly create images once impossible, it does mean they can be produced by those without artistic skills or specialized resources, and the whole matter of the culinary preferences of Haitians in Ohio is another blow for the state.  It was only in May 2024 that a number of schools in the state issued a ban on Gen Alpha slang terms including:

Ohio: It means “bad” with all that implies (dull, boring, ugly, poor etc).  Because of the way language evolves, it may also come to mean “people who eat pet cats & dogs”.  The implication is it’s embarrassing to be from Ohio.

Skibidi: A reference to a viral meme of a person’s head coming out of a toilet; it implies the subject so described is “weird”.

Sigma: Unrelated to the 18th letter of the Greek alphabet, it’s been re-purposed as a rung on the male social hierarchy somewhat below the “alpha-male”.

Rizz: This one has a respectable pedigree, being the Oxford English Dictionary’s (OED) 2023 word of the year.  It’s said technically to be a “Gen Z word”, short for “charisma”.  It has been banned because Gen Alpha like to use it in the negative (ie “lacking rizz”; “no rizz” etc).

Mewing: A retort or exclamation used to interrupt someone who is complaining about something trivial.  Gen Alpha are using it whenever their teachers say something they prefer not to discuss.

Gyatt: A woman with a big butt, said to be based originally on the expression “goddam your ass thick.”

Bussin’: “Good, delicious, high quality” etc.

Baddie: A tough, bolshie girl who “doesn’t take shit from no one”.  It’s a similar adaptation of meaning to a term like “filth” which means “very attractive”.

Wednesday, June 26, 2024

Mutation

Mutation (pronounced myoo-tey-shuhn)

(1) In biology (also as “break”), a sudden departure from the parent type in one or more heritable characteristics, caused by a change in a gene or a chromosome.

(2) In biology, (also as “sport”), an individual, species, or the like, resulting from such a departure.

(3) The act or process of mutating; change; alteration.

(4) A resultant change or alteration, as in form or nature.

(5) In phonetics (in or of Germanic languages), the umlaut (the assimilatory process whereby a vowel is pronounced more like a following vocoid that is separated by one or more consonants).

(6) In structural linguistics (in or of Celtic languages), syntactically determined morphophonemic phenomena that affect initial sounds of words (the phonetic change in certain initial consonants caused by a preceding word).

(7) An alternative word for “mutant”.

(8) In cellular biology & genetics, a change in the chromosomes or genes of a cell which, if occurring in the gametes, can affect the structure and development of all or some of any resultant off-spring; any heritable change of the base-pair sequence of genetic material.

(9) A physical characteristic of an individual resulting from this type of chromosomal change.

(10) In law, the transfer of title of an asset in a register.

(11) In ornithology, one of the collective nouns for the thrush (the more common forms being “hermitage” & “rash”).

1325–1375: From the Middle English mutacioun & mutacion (action or process of changing), from the thirteenth century Old French mutacion and directly from the Latin mūtātion- (stem of mūtātiō) (a changing, alteration, a turn for the worse), noun of action from the past-participle stem of mutare (to change), from the primitive Indo-European root mei- (to change, go, move).  The construct can thus be understood as mutat(e) + -ion.  Dating from 1818, the verb mutate (to change state or condition, undergo change) was a back-formation from mutation.  It was first used in genetics to mean “undergo mutation” in 1913.  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  The use in genetics in the sense of “process whereby heritable changes in DNA arise” dates from 1894 (although the term “DNA” (deoxyribonucleic acid) wasn’t used until 1938, the existence of the structure (though not its structural detail) was first documented in 1869 after the identification of nuclein).  In linguistics, the term “i-mutation” was first used in 1874, following the earlier German form “i-umlaut”, the equivalent in English being “mutation”.  The noun mutagen (agent that causes mutation) was coined in 1946, the construct being muta(tion) + -gen.  The –gen suffix was from the French -gène, from the Ancient Greek -γενής (-genḗs).  It was appended to create a word meaning “a producer of something, or an agent in the production of something” and is familiar in the names of the chemical elements hydrogen, nitrogen, and oxygen.  From mutagen came the derived forms mutagenic, mutagenesis & mutagenize.
Mutation, mutationist, mutationism & mutability are nouns, mutable & mutant are nouns & adjectives, mutated & mutating are verbs & adjectives, mutational & mutationistic are adjectives and mutationally is an adverb; the noun plural is mutations.  For whatever reason, the adverb mutationistically seems not to exist.

In scientific use the standard abbreviation is mutat and forms such as nonmutation, remutation & unmutational (used both hyphenated and not) are created as required and there is even demutation (used in computer modeling).  In technical use, the number of derived forms is vast, some of which seem to enjoy some functional overlap although in fields like genetics and cellular biology, the need for distinction between fine details of process or consequence presumably is such that the proliferation may continue.  In science and linguistics, the derived forms (used both hyphenated and not) include animutation, antimutation, backmutation, e-mutation, ectomutation, endomutation, epimutation, extramutation, frameshift mutation, hard mutation, heteromutation, homomutation, hypermutation, hypomutation, i-mutation, intermutation, intramutation, intromutation, macromutation, macromutational, megamutation, mesomutation, micromutation, missense mutation, mixed mutation, multimutation, mutationless, mutation pressure, nasal mutation, neomutation, nonsense mutation, oncomutation, paramutation, pentamutation, phosphomutation, point mutation, postmutation, premutation, radiomutation, retromutation, soft mutation, spirant mutation, stem mutation, stereomutation, ultramutation & vowel mutation.

Ginger, copper, auburn & chestnut are variations on the theme of red-headedness: Ranga Lindsay Lohan demonstrates the possibilities.

Red hair is the result of a mutation in the melanocortin 1 receptor (MC1R) gene responsible for producing the MC1R protein which plays a crucial role also in determining skin-tone. When the MC1R gene is functioning normally, it helps produce eumelanin, a type of melanin that gives hair a dark color.  However, a certain mutation in the MC1R gene leads to the production of pheomelanin which results in red hair.  Individuals with two copies of the mutated MC1R gene (one from each parent) typically have red hair, fair skin, and a higher sensitivity to ultraviolet (UV) light, a genetic variation found most often in those of northern & western European descent.
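Because the red-hair variant is recessive (it manifests only when two mutated copies are inherited), it can “skip” generations when carriers with one copy each show no sign of it.  A minimal sketch in Python (the allele labels “R” and “r” are illustrative only), enumerating the four equally likely outcomes for two carrier parents:

```python
from itertools import product

# Each carrier parent has one functional allele ("R") and one
# mutated MC1R allele ("r"); a child inherits one allele from each.
parent_1 = parent_2 = ("R", "r")

# The four equally likely allele combinations a child can inherit
# (sorted so "Rr" and "rR" count as the same genotype).
offspring = ["".join(sorted(pair)) for pair in product(parent_1, parent_2)]

# Only the "rr" genotype (two mutated copies) produces red hair.
red_haired = offspring.count("rr")
print(f"{red_haired} in {len(offspring)} expected red-haired")  # 1 in 4
```

Hence the familiar result that two brown-haired carrier parents can expect, on average, one red-haired child in four.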

A mutation is a change in the structure of the genes or chromosomes of an organism and mutations occurring in the reproductive cells (such as an egg or sperm), can be passed from one generation to the next.  It appears most mutations occur in “junk DNA” and the orthodox view is these generally have no discernible effects on the survivability of an organism.  The term junk DNA was coined to describe those portions of an organism's DNA which do not encode proteins and were thought to have no functional purpose (although they may once have had one).  The large volume of these “non-coding regions” surprised researchers when the numbers emerged because the early theories had predicted they would comprise a much smaller percentage of the genome.  The term junk DNA was intentionally dismissive and reflected the not unreasonable assumption the apparently redundant sequences were mere evolutionary “leftovers” without an extant biological function of any significance.

However, as advances in computing power have enabled the genome further to be explored, it’s been revealed that many of these non-coding regions do fulfil some purpose including: (1) a regulatory function: the regulation of gene expression, influencing when, where, and how genes are turned on or off; (2) superstructure: some regions contribute to the structural integrity of chromosomes (notably telomeres and centromeres); (3) RNA (ribonucleic acid) molecules: some non-coding DNA is transcribed into non-coding RNA molecules (such as microRNAs and long non-coding RNAs), which are involved in various cellular processes; and (4) genomic stability: it’s now clear there are non-coding regions which contribute to the maintenance of genomic stability and the protection of genetic information.  Despite recent advances, the term junk DNA is still in use in mapping but is certainly misleading for those not immersed in the science; other than in slang, in academic use and technical papers, “non-coding DNA” seems now the preferred term and where specific functions have become known, these regions are described thus.

There’s also now some doubt about the early assumptions that, of the remaining mutations, the majority have harmful effects and only a minority operate to increase an organism's ability to survive, something of some significance because a mutation which benefits a species may evolve by means of natural selection into a trait shared by some or all members of the species.  However, there have been suggestions the orthodox view was (at least to some extent) influenced by the slanting of the research effort towards diseases, syndromes and other undesirable conditions and that an “identification bias” may thus have emerged.  So the state of the science now is that there are harmful & harmless mutations but there are also mutations which may appear to have no substantive effect yet may come to be understood as significant, an idea which was explored in an attempt to understand why some people found to be infected with a high viral-load of SARS-CoV-2 (the virus causing Covid-19) remained asymptomatic.

In genetics, a mutation is a change in the DNA sequence of an organism; mutations can occur in any part of the DNA and vary in size and type.  Most are associated with errors during DNA replication but mutations can also be a consequence of viral infection or of exposure to certain chemicals or radiation.  The classification of mutations has in recent years been refined into four categories:

(1) By the Effect on DNA Sequence:  These are listed as Point Mutations which are changes in a single nucleotide and include (1.1) Substitutions in which one base pair is replaced by another, (1.2) Insertions which describe the addition of one or more nucleotide pairs and (1.3) Deletions, the removal of one or more nucleotide pairs.

(2) By the Effect on Protein Sequence: These are listed as: (2.1) Silent Mutations which do not change the amino acid sequence of the protein, (2.2) Missense Mutations which change one amino acid in the protein, potentially affecting its function, (2.3) Nonsense Mutations which create a premature stop codon, leading to a truncated and usually non-functional protein and (2.4) Frameshift Mutations which result from insertions or deletions that change the reading frame of the gene, often leading to a completely different and non-functional protein.

(3) By the Effect on Phenotype: These are listed as (3.1) Beneficial Mutations which provide some advantage to the organism, (3.2) Neutral Mutations which have no apparent significant effect on the organism's fitness and (3.3) Deleterious Mutations which are harmful to the organism and can cause diseases or other problems.

(4) By the Mechanism of Mutation: These are listed as (4.1) Spontaneous Mutations which occur naturally without any external influence, due often to errors in DNA replication and (4.2) Induced Mutations which result from exposure to mutagens (environmental factors such as chemicals or radiation that can cause changes in DNA).
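Category (2) above can be made concrete with a few lines of code: using the standard genetic code, a substitution within a codon is classified as silent, missense or nonsense simply by comparing the amino acids before and after the change.  A sketch in Python (the function name is illustrative only; the table is the conventional compact layout of the standard genetic code):

```python
from itertools import product

# The standard genetic code: codons ordered T, C, A, G at each
# position, paired with the matching one-letter amino acids ("*" = stop).
BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(codon): aa
               for codon, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

def classify_substitution(ref_codon: str, mut_codon: str) -> str:
    """Classify a single-codon substitution by its effect on the protein."""
    ref_aa = CODON_TABLE[ref_codon.upper()]
    mut_aa = CODON_TABLE[mut_codon.upper()]
    if ref_aa == mut_aa:
        return "silent"     # (2.1) same amino acid, protein unchanged
    if mut_aa == "*":
        return "nonsense"   # (2.3) premature stop codon
    return "missense"       # (2.2) one amino acid changed

print(classify_substitution("TAT", "TAC"))  # silent (both code tyrosine)
print(classify_substitution("TGT", "TGA"))  # nonsense (cysteine -> stop)
print(classify_substitution("TTT", "TCT"))  # missense (phenylalanine -> serine)
```

An insertion or deletion whose length is not a multiple of three would instead shift the reading frame of everything downstream, which is the frameshift case (2.4).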

Because of the association with disease, genetic disorders and disruptions to normal biological functions, in the popular imagination mutations are thought undesirable.  They are however a crucial part of the evolutionary process and life on this planet as it now exists would not be possible without the constant process of mutation which has provided the essential genetic diversity within populations and has driven the adaptation and evolution of species.  Although it will probably never be known if life on earth started and died out before beginning the evolutionary chain which endures to this day, as far as is known, everything now alive (and empirically, that means in the entire universe) ultimately has a single common ancestor.  Mutations have played a part in the diversity which followed and of all the species which have ever inhabited earth, only a tiny fraction remain, the rest extinct.

Nuclear-induced mutations

Especially since the first A-Bombs were used in 1945, the idea of “mutant humans” being created by the fallout from nuclear war or power-plants suffering a meltdown has been a staple for writers of science fiction (SF) and producers of horror movies, the special-effects and CGI (computer-generated imagery) crews ever imaginative in their work.  The fictional works are disturbing but, while radiation-induced human mutations are not common, radiation can cause changes in DNA, leading to mutations, and a number of factors determine the likelihood and extent of damage.  The two significant types of radiation are: (1) ionizing radiation which includes X-rays, gamma rays, and particles such as alpha and beta particles.  Ionizing radiation has enough energy to remove tightly bound electrons from atoms, creating ions, and can directly damage DNA or create reactive oxygen species that cause indirect damage.  In high doses, ionizing radiation can increase the risk of cancer and genetic mutations and (2) non-ionizing radiation which includes ultraviolet (UV) light, visible light, microwaves, and radiofrequency radiation.  Because this does not possess sufficient energy to ionize atoms or molecules, while there is a risk of damage to DNA (seen most typically in some types of skin cancer), the risk of deep genetic mutations is much lower than that from ionizing radiation.  The factors influencing the extent of damage include the dose, duration of exposure, the cell type(s) affected, a greater or lesser genetic predisposition and age.

Peter Dutton (b 1970; leader of the opposition and leader of the Australian Liberal Party since May 2022) announces the Liberal Party's new policy advocating the construction of multiple nuclear power-plants in Australia.

The prosthetic used in the digitally-altered image (right) was a discarded proposal for the depiction of Lord Voldemort in the first film version of JK Rowling's (b 1965) series of Harry Potter children's fantasy novels; it used a Janus-like two-faced head.  It's an urban myth Mr Dutton auditioned for the part when the first film was being cast but was rejected as being "too scary".  If ever there's another film, the producers might reconsider and should his career in politics end (God forbid), he could bring to Voldemort the sense of menacing evil the character has never quite achieved.  Interestingly, despite many opportunities, Mr Dutton has never denied being a Freemason.

On paper, while not without challenges, Australia does enjoy certain advantages in making nuclear part of the energy mix: (1) with abundant potential further to develop wind and solar generation, the nuclear plants would need only to provide the baseload power required when renewable sources were either inadequate or unavailable; (2) the country would be self-sufficient in raw uranium ore (although it has no enrichment capacity) and (3) the place is vast and geologically stable so in a rational world it would be nominated as the planet's repository of spent nuclear fuel and other waste.  The debate as it unfolds is likely to focus on other matters and nobody imagines any such plant can in the West be functioning in less than twenty-odd years (the Chinese Communist Party (CCP) gets things done much more quickly) so there's plenty of time to squabble and plenty of people anxious to join in this latest theatre of the culture wars.  Even National Party grandee Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022) has with alacrity become a champion of all things nuclear (electricity, submarines and probably bombs although, publicly, he seems not to have discussed the latter).  The National Party has never approved of solar panels and wind turbines because they associate them with feminism, seed-eating vegans, homosexuals and other symbols of all which is wrong with modern society.  While in his coal-black heart Mr Joyce's world view probably remains as antediluvian as ever, he can sniff the political wind in a country now beset by wildfires, floods and heatwaves and talks less of the beauty of burning fossil fuels.  
Still, in the wake of Mr Dutton's announcement, conspiracy theorists have been trying to make Mr Joyce feel better, suggesting the whole thing is just a piece of subterfuge designed to put a spanner in the works of the transition to renewable energy generation, the idea being to protect the financial positions of those who make much from fossil fuels, these folks being generous donors to party funds and employers of "helpful" retired politicians in lucrative and undemanding roles.