
Friday, April 19, 2024

Rabbit

Rabbit (pronounced rab-it)

(1) Any of several soft-furred, large-eared, rodentlike burrowing mammals of the family Leporidae, allied with the hares and pikas in the order Lagomorpha, having a divided upper lip and long hind legs, usually smaller than the hares and mainly distinguished from them by bearing blind and furless young in nests rather than fully developed young in the open.

(2) Any of various small hares.

(3) The fur of a rabbit or hare, often processed to imitate another fur.

(4) A runner in a distance race whose goal is chiefly to set a fast pace, either to exhaust a particular rival so that a teammate can win or to help another entrant break a record; pacesetter.

(5) In sport, a player of little skill; in cricket specifically, an unskilled batter (also as “batting bunny”, usually clipped to “bunny”).

(6) As Welsh rabbit, an alternative form of Welsh rarebit & Welsh ribbit (a snack made of cheese melted with a little ale and served on toast).  Welsh rabbit was the original form but it was erroneously marked as a corruption in a dictionary published in 1785, although it's not clear whether the editor made the assumption or drew the conclusion from oral evidence.

(7) In nuclear engineering, a pneumatically-controlled tool used to insert small samples of material inside the core of a nuclear reactor.

(8) In computing theory, a large element at the beginning of a list of items to be bubble sorted, and thus tending to be quickly swapped into the correct position (a short illustrative sketch appears below these definitions).

(9) In northern English regional slang, as “rabbit catcher”, a midwife or one who by force of circumstance assists in the delivery of a baby.

(10) As “rabbit ears”, the indoor dipole television antenna which typically sat atop the early analogue sets which received a terrestrial signal.

(11) Incessantly or nonsensically to talk.

(12) To hunt rabbits.

(13) In US slang, to flee.
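
Of the bubble-sort sense above: in a bubble sort, an element which needs to travel toward the end of the list moves a long way in a single pass (a “rabbit”), while one which needs to travel toward the beginning creeps back only one position per pass (the corresponding “turtle”).  A minimal Python sketch (illustrative only; the function and variable names are not from any particular library) makes the asymmetry visible:

def bubble_sort_passes(items):
    # Bubble sort, yielding the list after each full pass so the movement
    # of "rabbits" (large elements near the front) can be watched.
    data = list(items)
    n = len(data)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        yield list(data)
        if not swapped:
            break

# The 9 is a "rabbit": it reaches its final position after the first pass.
# The 1 is a "turtle": it moves back only one place per pass.
for state in bubble_sort_passes([9, 3, 4, 5, 1]):
    print(state)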

1375-1425: From the late Middle English rabet & rabette, from the Anglo-Latin rabettus, from the Middle French rabouillet (baby rabbit), from the dialectal Old North French rabotte, probably a diminutive of the Middle Dutch or West Flemish robbe (rabbit, seal), of uncertain origin but which may be from an imitative verb (perhaps robben or rubben (to rub)) used to allude to a characteristic of the animal.  The related forms include the French rabot (plane), the Middle Dutch robbe (rabbit; seal) (from which Modern Dutch gained rob (seal, also “rabbit”)), the Middle Low German robbe & rubbe (rabbit), the later Low German Rubbe (seal), the West Frisian robbe (seal), the Saterland Frisian Rubbe (seal) and the North Frisian rob (seal), eventually borrowed as the German Robbe (seal).  Early dictionary editors thus described the word as “a Germanic noun with a French suffix”.  Rabbit is a noun & verb, rabbitiness is a noun, rabbited is a verb, rabbitlike & rabbity are adjectives and rabbiting is a noun & verb; the noun plural is rabbits and (especially in the collective) rabbit.

Lindsay Lohan with rabbit.

Until the late nineteenth century, the meaning was exclusively what would now be understood as “a young rabbit” but it came to be used of the whole species, replacing the original coney, owing to the latter's resemblance to and use as a euphemism for cunny (“vulva” and linked obviously with “cunt” although, despite that, the preferred slang with some zoological allusion came to include “beaver”, “camel toe” and (especially) “pussy”, rather than “bunny”).  The noun coney dates from the early thirteenth century and was abstracted from the Anglo-French conis and the Old French coniz (plurals of conil (long-eared rabbit; Lepus cunicula), from the Latin cuniculus, the source also of the Spanish conejo, the Portuguese coelho and the Italian coniglio), the small, Spanish variant of the Italian hare (Latin lepus).  The word may ultimately be from the Iberian Celtic although classical writers said it was Hispanic.  In Middle English the two forms were cony & conny (the derivations including coning, cunin & conyng) while the Old French had conil alongside conin.  The evolution seems to be that the plural form conis (from conil, with the -l- elided) was taken into English and regularly singularized as cony.  The Old French form was borrowed in the Dutch konijn and the German Kaninchen (a diminutive), and is preserved in the surname Cunningham (from a place-name in Ayrshire).  Rabbits not being native to northern Europe, there was no Germanic word for them.  In the fourteenth century “rabbit” came to describe the young of the species and over the centuries came to supplant coney, a process complete by the early nineteenth.  It was another of those exercises in sanitization because in English & Welsh slang, coney had been adopted as a punning synonym for cunny (cunt).  That was complicated by it appearing in the Book of Proverbs in the King James Version of the Bible (KJV, 1611) so the work-around was to change the pronunciation of the original short vowel (rhyming with honey, money) to rhyme with bony, stony.  In the Old Testament, the word translates the Hebrew shaphan (rock-badger).

When Volkswagen in 1974 introduced the Golf in the North American market, it was named the Rabbit, apparently because it was thought the name would suggest qualities such as “agility, speed & playfulness” which were positive attributes in what was then (by US standards) a very small car, much smaller than the more recent versions.  Because of the international success of the Golf, when the revised model was released in 1983, the North American cars switched to that name and it’s been marketed that way since, except between 2003-2008 when the Rabbit badge was revived.  The revival was in retrospect a curious choice given the obvious advantages offered by using the one name globally but at the time VW America had a rationalization: “We think we have some opportunities to do something creative with the Rabbit nameplate and recognize the Golf nameplate has never really caught on with North American consumers as it was overshadowed by the Jetta sedan and wagon.  Volkswagen customers want a relationship with their cars and names like The Thing, Beetle, Fox and Rabbit support this.”  Whatever the opportunities may have been, the linguistic experiment wasn’t continued and since 2009, it’s been Golfs all the way.

US market VW Golfs: 1974 Rabbit L (Generation 1)  (left) and 2007 Rabbit TSI (Generation 5).

There was some linguistic irony in VW’s choice because as the US satirist & critic HL Mencken (1880–1956) pointed out in The American Language: An Inquiry into the Development of English in the United States (1919): “Zoologically speaking, there are no native rabbits in the United States; they are all hares. But the early colonists, for some unknown reason, dropped the word hare out of their vocabulary, and it is rarely heard in American speech to this day. When it appears it is almost always applied to the so-called Belgian hare, which, curiously enough, is not a hare at all, but a true rabbit.”

The White Rabbit was a character in Lewis Carroll’s (1832–1898) Alice’s Adventures in Wonderland (1865) and one which appears often, always in a waistcoat with pocket watch and in a hurry, fearful always of the impending fury the duchess will visit upon him should he be a moment late.  It’s the white rabbit which Alice follows down the rabbit hole, leading to the bizarre adventures recounted.  One of popular culture’s best-known rabbits gave rise to the phrase “bunny boiler”, a reference to the scene in the film Fatal Attraction (1987) in which a scorned woman revenged herself upon her adulterous ex-lover by tossing his daughter’s pet rabbit into a pot of boiling water; he arrives home to discover a boiled bunny.  The Warner Bros. cartoon character Bugs Bunny first appeared on the screen in 1938 and is often described by his shotgun-wielding antagonist, the lisping Elmer Fudd, as "that wascally wabbit".

In idiomatic use there’s “pull a rabbit out of the hat” (to find or obtain a sudden solution to a problem), “rabbit-hearted” (someone timid or inclined to be flighty), “rabbit food” (a disapproving view of vegetables held by some meat-eaters), “the rabbit test” (an early pregnancy test involving the injection of the tested woman's urine into a female rabbit, then examining the rabbit's ovaries a few days later for changes in response to a hormone (“the rabbit died” being the phrase indicating a positive test or an admission of one’s pregnancy)), “breed like rabbits” (slang for an individual, family, or sub-group of a population with a high birth-rate), “down the rabbit hole” (a time-consuming tangent or detour, often one from which it’s psychologically difficult to extricate oneself), “lucky rabbit’s foot” (the carrying of a luckless bunny’s preserved foot as a lucky charm), “like a rabbit warren” (a confusingly labyrinthine environment (used literally & figuratively)), “rabbit in the headlights” (an allusion to the way rabbits (like some other wildlife) sometimes “freeze” when caught in the light of an oncoming vehicle’s headlamps) and the inevitable “rabbit fucker” (a general term of disparagement (although it could be applied literally in the right circumstances)).

The “earless” rabbit with “eared” companions.

In May 2011, some weeks after the meltdown at the Fukushima Dai-ichi nuclear plant, which suffered severe damage in the aftermath of the earthquake and tsunami, a video of an “earless rabbit” began to circulate, purportedly captured just beyond the crippled plant’s exclusion zone.  The immediate speculation was of course that the creature’s unusual state was the result of a radiation-induced genetic mutation.  Geneticists however had a less troubling explanation.  Although there’s no doubt the radiation emanating from Fukushima Dai-ichi (some 225 kilometres (140 miles) north-east of Tokyo) represents a major risk to health and the long-term environmental effects remain unclear, the scientists say not only is it unlikely to be linked with the earless rabbit but such creatures are far from unusual.  According to a statement issued by Colorado State University's Department of Environmental and Radiological Health Sciences: “…radiation can cause mutations that can be occasionally expressed as obvious birth defects, such as shown in the video.  However, to say this is the result of contamination from the Fukushima accident is a stretch, because natural radiation, as well as many other chemical substances in the environment and other factors, can also be mutagenic.  In most cases, the cause of congenital birth defects in humans and other animals cannot be determined and as far as science has shown, there have never been mutations produced by ionizing radiations that do not occur spontaneously as well.”

Rabbits used in nuclear reactors: Polyethylene 1-inch (25 mm) rabbit (left), Polyethylene 2-inch (50 mm) rabbit (centre) and Titanium 2-inch (50 mm) rabbit (right).

The rabbit does though have a place in nuclear engineering.  In the industry, the term “rabbit” is used to describe a range of pneumatically controlled tools which are used remotely to insert or retrieve items from a nuclear reactor or other radioactive environments.  The name is thought to come from the devices being tubular (on the model of the rabbit burrow), which allows samples rapidly to be injected into the periphery of a reactor core, the injectables moving “with the speed of startled rabbits”, although there may also be the implication of rabbits as expendable creatures.  The tool is essential for maintenance, inspection and repair tasks in nuclear facilities, where direct human intervention is either dangerous or impossible because of high radiation levels.

Winston Churchill inspecting the progress of project White Rabbit No. 6, Clumber Park, Nottinghamshire, England, November 1941.

The World War II (1939-1945) era White Rabbit No. 6 was an engineering project by the British Admiralty although as a security measure the official code-name was changed to Cultivator No. 6 to make it sound less mysterious and more like a piece of agricultural equipment.  It was a military trench-digging machine and an example of the adage that “generals are always preparing to fight the last war” and although designed exclusively for army use on (and at least partially under) land, it came under the auspices of the Royal Navy because it was a brainchild (one of many) of Winston Churchill (1874-1965; UK prime-minister 1940-1945 & 1951-1955) who, between the outbreak of war in 1939 and his assumption of the premiership some months later, served as First Lord of the Admiralty (the service’s civilian head).  Trenches and artillery had been the two dominant features of World War I (1914-1918) and Churchill had spent some months (1915-1916) in one of the former while under fire from the latter, commanding a battalion; before the implications of mechanization and the Germans’ Blitzkrieg (lightning war) tactics were apparent, he assumed the new war in France would unfold something like the old, thus the interest in something which would “revolutionize trench warfare”.  Trench warfare however wasn’t repeated, so White Rabbit No. 6 was soon realized to be obsolete and the project was abandoned; although the most fully developed of the prototypes did perform according to the design parameters, whether it would have been effective remains doubtful and, remarkably, work on these things wasn’t wholly halted until 1942.  The “White Rabbit” project codes came from Churchill’s sense of humor, his ideas coming, as he said, “like rabbits I pull from my hat” and he supported many, some of which were of great military value while others, like the “floating runways” (artificial icebergs made with a mixture of shards of timber & frozen water), were quixotic.

White Rabbit © Copperpenny Music, Mole Music Co

Surrealistic Pillow album cover, 1967.

White Rabbit was a song by Grace Slick (b 1939), released in 1967 on the Jefferson Airplane album Surrealistic Pillow.  The lyrics were inspired by Lewis Carroll's Alice's Adventures in Wonderland and the sequel Through the Looking-Glass (1871).  It was the psychedelic era and drug references were common in popular music and in the case of White Rabbit it may have been appropriate, if the speculation that the books had been written while the author was under the influence of laudanum (a then widely-available opiate-infused drug) is true (there's no evidence beyond the circumstantial).  Given the imagery in the text, it’s not difficult to believe he may have been on something and among authors and poets it was a popular way to stimulate the imagination, inspiring at least in part one of the most beloved fragments of English verse, Samuel Taylor Coleridge’s (1772-1834) Kubla Khan (1797), which ends abruptly at 54 lines.  According to Coleridge, he was unable to recall the rest of the 300-odd which had come to him in an opium-laced dream (the original publication was sub-titled “A Vision in a Dream”) because he was interrupted by “a person on business from Porlock” (a nearby Somerset village).  Grace Slick would have sympathized with an artist being intruded on by commerce.

White Rabbit lyrics:

One pill makes you larger
And one pill makes you small
And the ones that mother gives you
Don't do anything at all
Go ask Alice
When she's ten feet tall
 
And if you go chasing rabbits
And you know you're going to fall
Tell 'em a hookah-smoking caterpillar
Has given you the call
Call Alice
When she was just small
 
When the men on the chessboard
Get up and tell you where to go
And you've just had some kind of mushroom
And your mind is moving low
Go ask Alice
I think she'll know
 
When logic and proportion
Have fallen sloppy dead
And the White Knight is talking backwards
And the Red Queen's off with her head
Remember what the dormouse said
Feed your head

Monday, April 15, 2024

MADD

MADD, Madd & MaDD (pronounced mad)

(1) The acronym (as MADD) for Mothers Against Drunk Driving, a non-profit education and lobbying operation founded in California in 1980 with a remit to campaign against driving while drink- or drug-affected.

(2) The acronym (as MADD) for myoadenylate deaminase deficiency (adenosine monophosphate (AMP) deaminase deficiency).

(3) The acronym (as MADD) for multiple acyl-CoA dehydrogenase deficiency (known also as the genetic disorder Glutaric acidemia type 2).

(4) In computing (as MADD), the acronym for Multiple-Antenna Differential Decoding (a technique used in wireless comms using multiple antennas for both transmit & receive which improves performance by exploiting spatial diversity & multipath propagation of the wireless channel).

(5) As the gene MADD, which encodes the MAP kinase-activating death domain protein.

(6) As Madd, the fruit of Saba senegalensis (a fruit-producing plant of the Apocynaceae family, native to the Sahel region of sub-Saharan Africa).

(7) As madd, a clipping of maddah (from the Arabic مَدَّة (madda)), the English form of the Arabic diacritic (a distinguishing mark applied to a letter or character) used in both the Arabic & Persian scripts.

(8) The acronym (as MaDD), Maladaptive Daydreaming Disorder.

(9) The acronym (as MADD), for mutually assured digital destruction: a theory of cyber-warfare whereby each participant demonstrates to the other their capacity to inflict equal or more severe damage in retaliation, thereby deterring a cyber-attack (based on the earlier MAD (mutually assured destruction), a description of nuclear warfare deterrence).

From AD to MAD, 1962-1965

The period between the addition of nuclear weapons to the US arsenal in 1945 and 1949 when the USSR detonated their first atomic bomb was unique, a brief anomaly in the history of great-power conflict.  It's possible to find periods in history when one power has possessed an overwhelming preponderance of military strength that would have enabled them easily to defeat any enemy or possible coalition but never was the imbalance of force so asymmetric as it was between 1945-1949.  Once both the US and USSR possessed strategic nuclear arsenals, the underlying metric of the Cold War became the two sides sitting in their bunkers counting warheads and the centrality of that lasted as long as the bombs were gravity devices delivered by aircraft which needed to get to a point above the target.  At this point, the military’s view was that nuclear war was possible and the only deterrent was to maintain a credible threat of retaliation and, still in the age of the “bomber will always get through” doctrine, both sides literally kept squadrons of nuclear-armed bombers in the air 24/7.  Once ground-based intercontinental ballistic missiles (ICBMs) and (especially) submarine-launched ballistic missiles (SLBMs) were deployed, the calculation of nuclear war changed from damage assessment to an acknowledgement that, in the worst-case scenarios made possible by the preservation of large-scale second-strike retaliatory capacity, although the "total mutual annihilation" of the popular imagination was never likely, the damage inflicted would have been many times worse and more extensive than in any previous conflict and, although the climatic implications weren't at the time well understood, the consequences would have been global and lasted to one degree or another for centuries.

It was thus politically and technologically deterministic that the idea of mutually assured destruction (MAD) would evolve and it was a modification of a deterrence doctrine known as AD (assured destruction) which appeared in Pentagon documents as early as 1962.  AD was intended as a way to deter the USSR from staging a first-strike against the US, the notion being that the engineering and geographical deployment of the US's retaliatory capacity was such that whatever was achieved by a Soviet attack, their territory would suffer something much worse.  To the Pentagon planners in their bunker, the internal logic of AD was compelling and the term was coined as a description of the prevailing situation rather than as a theoretical doctrine.  To the general population, it obviously meant MAD (mutually assured destruction) and while as a doctrine of deterrence the metrics remained the same, after 1966 when the term gained currency, it began to be used as an argument against the mere possession of nuclear arsenals, the paradox being the same acronym was also used to underpin the standard explanation of the structural reason nuclear warfare was avoided.  Just as paradoxically, while serving to prevent their use, MAD also fueled the arms race because the stalemate created its own inertia and it would be almost a decade before the cost and absurdity of maintaining the huge number of useless warheads was addressed.  MAD probably also contributed to both sides indulging in conflict by proxy, supporting wars and political movements which served as surrogates for battles made too dangerous by the implications of MAD to be contested directly between the two big protagonists.

Maladaptive Daydreaming Disorder

There are those who criticize the existence of MADD (Maladaptive Daydreaming Disorder) as an example of the trend to “medicalize” aspects of human behaviour which have for millennia been regarded as “normal”, the implication being the sudden creation of a cohort of customers for psychiatrists and the pharmaceutical industry, the suspicion being MADD is of such interest to the medical-industrial complex because the catchment is of the “worried well”, those with sufficient disposable income to make the condition worthwhile, the poor too busy working to ensure food and shelter for their families for there to be much time to daydream.

Still, the consequences of MADD are known to be real and while daydreaming is a common and untroubling experience for many, in cases where it’s intrusive and frequent, it can cause real problems with everyday activities such as study or employment as well as being genuinely dangerous if associated with tasks such as driving or the use of heavy machinery.  The condition was first defined by Professor Eli Somer (b 1951; a former President of both the International Society for the Study of Trauma and Dissociation (ISSTD) and the European Society for Trauma and Dissociation (ESTD)) who described one manifestation as possibly an “escape or coping mechanism from trauma or abuse”, noting it may “involve long periods of structured fantasy”.  Specific research into MADD has been limited but small-scale studies have found some similarities to behavioral addictions, the commonality being a compulsion to engage in activities despite negative impacts on a person’s mental or physical health or ability to function in various aspects of life.

Despite the suggestion of similarities to diagnosable conditions, the latest edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5-TR, 2022) did not add an entry for MADD and the debate among those in the profession interested in the matter is between those arguing it represents an unidentified clinical syndrome which demands a specific diagnosis and those who think either it fits within the rubric of obsessive compulsive disorder (OCD) or is a dissociative condition.  Accordingly, in the absence of formal recognition of MADD, while a psychiatrist may decline to acknowledge the condition as a specific syndrome, some may assess the described symptoms and choose to prescribe the drugs used to treat anxiety or OCD or refer the patient to sessions of cognitive behavior therapy (CBT) or the mysterious mindfulness meditation.

Mutually Assured Digital Destruction

Authors in 2021 suggested MADD (mutually assured digital destruction) as the term to describe the strategic stalemate achieved by the major powers infecting each other’s critical (civilian & military) digital infrastructure with crippleware, logic-bombs and other latent tools of control or destruction.  The core of the idea was based on the old notion of “the bomber always gets through”, a recognition that it’s neither possible to protect these systems from infiltration nor to clean up what’s likely there and still undiscovered.  So, rather than being entirely covert, MADD instead makes clear to the other side its systems are also infected and there will be retaliation in kind to any cyber attack with consequences perhaps even worse than any suffered in the first strike.  Like the nuclear submarines with their multiple SLBMs which silently cruise the world's oceans, the strategic charm of the latent penetration of digital environments is that detection of all such devices is currently impossible; one knows they (and their SLBMs) are somewhere in firing range but not exactly where.  Oceans are big places but so, analogously, is the digital environment and a threat may be in the hardware, software or the mysterious middleware and sometimes a threat can actually be observed yet not understood as such.

For individuals, groups and corporations, there's also the lure of unilateral destruction, something quite common in the social media age.  For a variety of reasons, an individual may choose to "delete" their history of postings and while it's true this means what once was viewable no longer is, it does not mean one's thoughts and images are "forever gone" in the sense one can use the phrase as one watches one's diary burn.  That was possible (with the right techniques or a power drill) when a PC sat on one's desk and was connected to nothing beyond but as soon as a connection with a network (most obviously the internet) is made and data is transferred, whatever is sent is in some sense "in the wild".  That was always true but in the modern age it's now effectively impossible to know where one's data may exist, such are the number of "pass-through" devices which may exist between sender and receiver.  On the internet, even if the path of the data packets can be traced and each device identified, there is no way to know where things have been copied (backup tapes, replica servers et al) and that's even before one wonders what copies one's followers have taken.  There may often be good reasons to curate one's social media presence to the point of deletion but that shouldn't be thought of as destruction.

Friday, April 12, 2024

TikToker

TikToker (pronounced tik-tok-ah)

(1) One who is a regular or frequent viewer of the content posted on the short-form video (which, with mission-creep, can now be up to ten (10) minutes in duration) sharing site TikTok.com.

(2) One who is a regular or frequent content provider on the TikTok platform.

(3) With a variety of spellings (ticktocker, tictoker, tiktoka etc), a slang term for a clock or watch, derived from the alternating ticking sound, as that made by a clock (archaic).

(4) In computing, with the spelling ticktocker (or tick-tocker), slang for a software element which emulates the sound of a ticking clock, used usually in conjunction with digital depictions of analogue clocks.

2018: The ancestor form (ticktock or tick-tock) seems not to have been used until the mid-nineteenth century and was purely imitative of the sound of mechanical clocks. Tick (in the sense of "a quiet but sharp sound") was from the Middle English tek (light touch, tap) and tock was also onomatopoeic; when used in conjunction with tick it was a reference to the clicking sounds similar to those made by the movements of a mechanical clock.  The use of TikToker (in the sense of relating to users (consumers & content providers) of the short-form video (which, with mission-creep, can be up to ten (10) minutes in duration) sharing site TikTok.com) probably began in 2018 (the first documented reference) although it may earlier have been in oral use.  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created the noun denoting an occupation or describing the person whose occupation is the noun.  TikToker is a noun & adjective; the noun plural is TikTokers (the mixed upper & lower case is correct by commercial convention but not always followed).  The PRC- (People’s Republic of China) based holding company ByteDance is said to have chosen the name “TikTok” because it was something suggestive of the “short, snappy” nature of the platform’s content; they understood the target market and its alleged attention span (which, like the memory famously associated with goldfish, might be misleading).

Billie Eilish, Vogue, June, 2021.

Those who use TikTok (whether as content providers or consumers) are called “tiktokers” and the longer the aggregate duration of one’s engagement with the platform, the more of a tiktoker one is.  The formation followed the earlier, self-explanatory “YouTuber” and the suffix had been used for similar purposes (indicating association) for decades.  So the noun tiktoker is a neutral descriptor but it can also be used as a slur.  In February 2024, at the People’s Choice Awards ceremony held in Los Angeles, singer Billie Eilish (b 2001) was filmed leaning over to Kylie Minogue (b 1968), making the sotto voce remark “There’s some, like, TikTokers here…” with the sort of distaste Marie Antoinette (1755–1793; Queen Consort of France 1774-1792) might have displayed if pointing out to a sympathetic companion the unpleasing presence of peasants.  The clip went viral on X (formerly known as Twitter) before spreading to TikTok.  Clearly there is a feeling of hierarchy in the industry and her comments triggered some discussion about the place of essentially amateur content creators at mainstream Hollywood events.  That may sound strange given that a platform like TikTok would, prima facie, seem the very definition of the “people’s choice” but these events have their own history, associations and implications and what social media sites have done to the distribution models has been quite a disruption and many established players, even some who have to some extent benefited from the platforms, find the intrusion of the “plague of TikTokers” disturbing.

Pop Crave's clip of the moment, Billie Eilish & Kylie Minogue, People's Choice Awards ceremony, Los Angeles, February 2024.

There will be layers to Ms Eilish’s view.  One is explained in terms of mere proximity, the segregation of pop culture celebrities into “A List”, “B List”, “D List” etc being an important component of the creation and maintenance of one’s public image, and an A Lister like her would not appreciate being photographed at an event with those well down the alphabet sitting at the next table; it cheapens her image.  Properly managed, these images can translate into millions (and these days even billions) of dollars so this is not a matter of mere vanity and it is something for awards ceremonies to consider; if the TikTokers come to be seen as devaluing their brand to the extent the A Listers ignore their invitations, the events either have to move to a down-market niche or just be cancelled.  Marshall McLuhan’s (1911-1980) book Understanding Media: The Extensions of Man (1964) pre-dates social media by decades but its best-remembered phrase (“The medium is the message”) could have been designed for the era, the idea being that the medium on which content is distributed should be the first point in understanding its significance, rather than the actual content.  McLuhan’s point was that the initial assessment of the veracity or the value of something relies on its source.  In the case of pop music, this meant a song distributed by a major label possessed an inherent credibility and prestige in a way something sung by a busker in a train station did not.  What the existence of YouTube and TikTok meant was the buskers and the artists signed to the labels suddenly began to appear on the same medium, thus at some level gaining some sort of equivalency.  On TikTok, it’s all the same screen.

Ms Eilish and her label have been adept at using the socials as a tool for this and that so presumably neither objects to the existence or the technology of the sites (although her label (Universal Music) has only recently settled its dispute with TikTok over the revenue sharing) but there will be an understanding that while there’s now no alternative to in a sense sharing the digital space and letting the people choose, that doesn’t mean she’ll be happy about being in the same photo frame when the trophies are handed out.  Clearly, there are stars and there are TikTokers and while the latter can (and have) become the former, there are barriers not all can cross.

1966 Jaguar Mark X 4.2 (left), 1968 Dodge Charger RT 440 (centre) and 1981 Mercedes-Benz 500 SLC (right).  Only the Americans called the shared tachometer/clock a “Tic-Toc Tach”.

Jaguar had long been locating a small clock at the bottom of the tachometer but in 1963 began to move the device to the centre of the dashboard, phasing in the change as models were updated or replaced.  By 1968 the horological shift was almost complete (only the last of the Mark II range (by then known as the 240, 340 & Daimler V8 250) still had the shared dial) and it was then Chrysler adopted the idea although, with a flair the British never showed, they called it the “Tic-Toc-Tachometer”.  Popularly known as the “Tic-Toc Tach”, it was also used by other US manufacturers during the era, the attraction being an economical use of dash space, the clock fitting in a space at the centre of the tachometer dial which would otherwise be unused.  Mercedes-Benz picked up the concept in 1971 when the 350 SL (R107) was introduced and it spread throughout the range, universal after 1981 when production of the 600 (W100) ended.  Mercedes-Benz would for decades use the shared instrument.  A tachometer (often called a “rev counter”) is a device for measuring the revolutions per minute (RPM) of a revolving shaft such as the crankshaft of an internal combustion engine (ICE), thus determining the “engine speed”.  The construct was tacho- (an alternative form of tachy-, from the Ancient Greek ταχύς (takhús) (rapid)) + -meter (the suffix from the Ancient Greek μέτρον (métron) (measure), used to form the names of measuring devices).
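
As a worked illustration of the arithmetic behind what a tachometer displays (a generic sketch rather than the logic of any particular instrument; the function and figures are assumptions for the example): on a conventional four-stroke engine each cylinder fires once per two crankshaft revolutions, so engine speed can be derived from the ignition pulse rate.

def rpm_from_pulses(pulse_count, window_seconds, cylinders):
    # Estimate crankshaft rpm from ignition pulses counted over a time window,
    # assuming a four-stroke engine in which each cylinder fires once per two
    # crankshaft revolutions, ie cylinders / 2 pulses per revolution.
    pulses_per_rev = cylinders / 2
    revolutions = pulse_count / pulses_per_rev
    return revolutions * (60 / window_seconds)

# Example: a V12 producing 75 pulses in one second is turning at 750 rpm.
print(rpm_from_pulses(pulse_count=75, window_seconds=1.0, cylinders=12))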

Conventions in English and Ablaut Reduplication

In 2016, the BBC explained why we always say “tick tock” rather than “tock tick” although, based on the ticking of the clocks at the time the phrase originated, there would seem to be no objective reason why one would prevail over the other but the “rule” can be constructed thus: “If there are three words then the order has to go I, A, O.  If there are two words then the first is I and the second is either A or O”, which is why we enjoy mish-mash, chit-chat, clip-clop, dilly-dally, shilly-shally, tip-top, hip-hop, flip-flop, tic tac, sing song, ding dong, King Kong & ping pong.  Obviously, the “rule” is unwritten so may be better thought of as a convention, such as the one which dictates why the words in “Little Red Riding Hood” appear in the familiar order; there the convention specifies that in English, adjectives run in the textual string: opinion; size; age; shape; colour; origin; material; purpose; noun.  Thus there are “little green men” but no “green little men” and if “big bad wolf” is cited as a violation of the required “opinion (bad); size (big); noun (wolf)” order, that’s because the I-A-O convention prevails, something the BBC explains with a number of examples, concluding “Maybe the I, A, O sequence just sounds more pleasing to the ear.”, a significant factor in the evolution of much that is modern English (although that hardly accounts for the enduring affection some have for proscribing the split infinitive, something which really has no rational basis in English, ancient or modern).  All this is drawn from what is in structural linguistics called “Ablaut Reduplication” (the first vowel is almost always a high vowel and the reduplicated vowel is a low vowel) but, being English, “there are exceptions” so the pragmatic “more pleasing to the ear” may be more helpful in general conversation.
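
For those who like their conventions mechanical, the I-A-O pattern can be expressed as a rough check; the little Python sketch below is illustrative only (the vowel extraction is deliberately naïve and the word list is simply the examples quoted above):

VOWELS = "aeiou"

def first_vowel(word):
    # Return the first vowel letter in a word (naive: letters only).
    return next((ch for ch in word.lower() if ch in VOWELS), None)

def follows_convention(pair):
    # Ablaut reduplication: the first element's vowel should be "i",
    # the second element's vowel "a" or "o".
    first, second = pair.split()
    return first_vowel(first) == "i" and first_vowel(second) in ("a", "o")

for pair in ["mish mash", "chit chat", "clip clop", "tip top", "hip hop",
             "flip flop", "ding dong", "ping pong", "tock tick"]:
    print(pair, follows_convention(pair))  # "tock tick" (the reversed form) fails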

Lindsay Lohan announces she is now a TikToker.

Rolls-Royce, the Ford LTD and NVH

Rolls-Royce Silver Cloud II, 1959.  Interestingly, the superseded Silver Cloud (1955-1958) might have been quieter still because the new, all-aluminium 6¼ litre (380 cubic inch) V8 didn’t match the smoothness & silence of the previous cast iron, 4.9 litre (300 cubic inch) straight-six.

The “tick-tocking” sound of a clock was for some years a feature of the advertising campaigns of the Rolls-Royce Motor Company, the hook being that: “At 60 mph (100 km/h) the loudest noise in a Rolls-Royce comes from the electric clock”.  Under ideal conditions, that was apparently true but given electric clocks can be engineered to function silently, the conclusion was the company fitted time-pieces which made a deliberately loud “tick-tock” sound, just to ensure the claims were true.  They certainly were, by the standards of the time, very quiet vehicles but in the US, Ford decided they could mass-produce something quieter still and at a fraction of the cost.  Thus the 1965 Ford LTD, a blinged-up Ford (intruding into the market segment the corporation had previously allocated to the companion Mercury brand), was advertised as: “Quieter than a Rolls-Royce”.  Just to ensure this wasn’t dismissed as mere puffery, Ford had an independent acoustic engineering company conduct tests and gleefully published the results, confirming what the decibel (dB) meters recorded.  Sure enough, a 1965 Ford LTD was quieter than a 1965 Rolls-Royce Silver Cloud III.  Notably, Rolls-Royce offered only one mechanical configuration while the Ford was tested only when fitted with the mild-mannered 289 cubic inch (4.7 litre) V8; had the procedure included another variation of the full-size line which used the 427 cubic inch (7.0 litre) V8, the results would have been different, the raucous 427 having many charms but they didn’t include unobtrusiveness.

1965 Ford LTD (technically a “Galaxie 500 LTD” because in the first season the LTD was a Galaxie option, not becoming a stand-alone model until the 1966 model year).

Ford did deserve some credit for what was achieved in 1965 because it wasn’t just a matter of added sound insulation.  The previous models had a good reputation for handling and durability but couldn’t match the smoothness of the competitive Chevrolets so within Ford a department dedicated to what came to be called NVH (Noise, Vibration & Harshness) was created and this team cooperated in what would now be understood as a “multi-disciplinary” effort, working with body engineers and suspension designers to ensure all components worked in harmony to minimize NVH.  What emerged was a BoF (Body on Frame) platform, a surprise to some as the industry trend had been towards unitary construction to ensure the stiffest possible structure but the combination of the frame’s rubber body-mounts, robust torque boxes and a new, compliant, coil rear suspension delivered what was acknowledged as the industry’s quietest, smoothest ride.  Ford didn’t mention the tick-tock of the clock.

Saturday, March 30, 2024

Swirl

Swirl (pronounced swurl)

(1) A twist, as of hair around the head or of trimming on a hat; a whorl or curl.

(2) Any curving, twisting line, shape, or form.

(3) A descriptor of a state of confusion or disorder.

(4) A swirling movement; whirl; eddy; to turn or cause to turn in a twisting spinning fashion (used especially of running water).

(5) In fishing, the upward rushing of a fish through the water to take the bait.

(6) To move around or along with a whirling motion; a whirl; an eddy.

(7) To feel dizzy or giddy (the idea of a “spinning head”).

(8) To cause to whirl; twist.

(9) To be arranged in a twist, spiral or whorl.

(10) Figuratively, to circulate, especially in a social situation.

(11) In AAVE (African-American Vernacular English), to in some way mingle interracially (dating, sex, marriage etc) (dated; now rare).

(12) In internal combustion engines (ICE), as “swirl chamber”, a now generic term for a type of combustion chamber design.

1375-1425: From the late (northern) Middle English swirlen (to eddy, swirl) which was probably from the Old Norse svirla (to swirl), a frequentative form of Old Norse sverra (to swing, twirl).  It was cognate with the Scots swirl & sworl (to eddy, swirl), the Norwegian Nynorsk svirla (to whirl around; swirl), the Swedish sorla (to murmur, buzz) and the Dutch zwirrelen (to swirl).  Related forms included the dialectal German schwirrlen (to totter), the West Frisian swiere (to reel, whirl), the Dutch zwieren (to reel, swing around), the German Low German swirren (to whizz, whirl or buzz around), the German schwirren (to whirr, whizz, buzz), the Swedish svirra (to whirr about, buzz, hum), the Danish svirre (to whizz, whirr) and the English swarm.  The construct may be understood as the Germanic root swir- + -l- (the frequentative suffix).  Swirl is a noun & verb, swirled is a verb & adjective, swirling is a noun, verb & adjective, swirly is a noun & adjective, swirler is a noun and swirlingly is an adverb; the noun plural is swirls.

In English, the late (northern) Middle English verb swirlen (to eddy, swirl) seems originally to have come from a Scottish word, the origin of which is undocumented but etymologists seem convinced of the Scandinavian links.  The sense of a “whirling movement” emerged in the early nineteenth century although the meaning “a twist or convolution (in hair, the grain of wood etc)” was in use by 1786.  The verb as a transitive in the sense of “give a swirling or eddying motion to” was in use in the early sixteenth century but it may by then long have been in oral use, one text from the fourteenth century containing an example; the source of that may have been either Germanic (such as the Dutch zwirrelen (to swirl) or the Norwegian Nynorsk svirla (to whirl around; swirl)) or it may have evolved from the English noun.  The intransitive sense (have a whirling motion, form or whirl in eddies) dates from 1755.  The adjective swirly existed by 1785 in the sense of “twisted or knotty” but by the middle of the next century it had come also to describe anything “whirling or eddying”, applied especially to anything aquatic.  By 1912, it was used also to mean “full of contortions or twists” although “swirling” in this sense had by then been in (gradually increasing) use for a century.

Of curls & swirls: Lindsay Lohan with curls (left) and swirls (right).

In hairdressing, although customers sometimes use the words “curl” and “swirl” interchangeably, to professionals the use should be distinct.  A swirl is a movement or pattern in which hair is styled or arranged, typically with a rounded or circular pattern, and swirls can be natural (the pattern at the crown of the head where the hair grows in a circular direction) or stylized (the look deliberately created and most obvious in “up-dos” and the formal styles associated with weddings and such).  The end result is a wide vista and the swirl is more a concept than something which exists within defined parameters.  A curl is (1) a type of hair texture or (2) the act of creating a curl with techniques using tools and/or product.  Some people (and there’s a strong ethnic (ie genetic) association) naturally have curly hair due to the shape of their follicles and within the rubric of what used to be called the ulotrichous, hairdressers classify curls as three types: (1) tight (small, corkscrew-like structures), (2) medium (tighter curls but with a softer appearance) and (3) loose (long spirals with a large diameter).  Some commercial products also list “ringlets” as a type but as tight, well-defined spirals, they’re really a descriptive variation of the tight or medium.  So, the essential difference is that a swirl is a pattern or movement of the hair, while a curl describes texture or shape and while a swirl is a matter of arrangement, a curl demands changing the hair’s natural texture or shape.

In internal combustion engines (ICE), the “swirl chamber” is a now generic term used to describe a widely-used type of combustion chamber in which, upon introduction, the fuel-air mixture “swirls around” prior to detonation.  The design is not new, Buick’s straight-8 “Fireball” cylinder head using a simple implementation as long ago as the 1920s and it would serve the corporation into the 1950s.  The critical aspect of the engineering was the interaction between a recessed exhaust valve and a rise in the top of the piston which “pushed” most of the fuel-air mixture into what was a comparatively small chamber, producing what was then called a “high-swirl” effect, the “Fireball” moniker gained by virtue of the actual combustion “ball of fire” being smaller in volume than was typical at the time.  The benefit of the approach was two-fold: (1) a reduction in fuel consumption because less was required per power-stroke and (2) a more consistent detonation of the poor quality fuel then in use.  As fuel improved in quality and compression ratios rose (two of the dominant trends of the post-war years), the attraction of swirl chambers diminished but the other great trend was the effective reduction in the cost of gasoline (petrol) and as cars became larger & heavier and roads more suited to higher speeds, the quest was for power.

Swirling around: The swirl process in a diesel combustion chamber.

Power in those years usually was gained by increased displacement & combustion chamber designs optimized for flow; significantly too, many popular designs of combustion chamber (most notably those in the so-called “wedge” heads) were cheaper to produce and in those years, few gave much thought to air pollution.  The cars of the 1950s & 1960s had really toxic exhaust emissions.  By the mid 1960s however, the problem of air pollution in US cities was obvious and the health effects were beginning to be publicized, as was the contribution to all of this by motor vehicles.  Regulations began to appear, California in 1961 (because of the high vehicle population and certain geographical & climatic phenomena, Los Angeles & San Francisco were badly affected by air pollution) passing the first statute and the manufacturers quickly agreed to adopt this standard nationally, fearing other states might begin to impose more onerous laws.  Those however arrived by mid-decade and although there was no specific road-map, few had any doubts the rules would become stricter as the years passed.  The industry’s only consolation was that these laws would be federal legislation so they would need to offer only one specification for the whole country (although the time would come when California would decide things should be tougher and by the 1970s there were “Californian cars” and “49 state cars”).  K Street wasn’t the force then it later became and the manufacturers conformed with (relatively) little protest.

Fuel was still cheap and plentiful but interest in swirl chambers was revived by the promise of cleaner-burning engines.  Because it wasn’t new technology, the research attracted little attention outside of the engineering community but in 1970, German-born Swiss engineer Michael May (b 1934) demonstrated a Ford (Cologne) Capri with his take on the swirl chamber in a special cylinder head.  In a nod to the Buick original, May nick-named his head design the “Fireball” (professional courtesy being a thing among engineers).  What Herr May had done was add a small groove (essentially a channel surrounding the intake valve) to the chamber, meaning during the last fraction of a second of piston movement, the already swirling fuel-air mixture got a final nudge in the right direction: instead of there being a randomness to the turbulence of the mix, the shape was controlled and thus able to be lower in volume (a smaller fireball) and precisely positioned at the point at which the spark triggered detonation; May called this a “higher swirl”.  Not only did this reduce exhaust emissions but it also cut fuel consumption for a given state of tune so designers could choose their desired path: more power for the same fuel consumption or the same power for less and within a short time, just about the whole world was taking great interest in fuel consumption.

Detail of the original "flathead" cylinder head of the Jaguar V12 (left) and the later "Fireball" head with swirl chambers (right).

A noted use of May’s design was its adoption in 1981 on Jaguar’s infamously thirsty V12 (1971-1997), an innovation celebrated by the addition of the HE (High Efficiency) label for the revised power-plant.  The notion of “high efficiency” was comparative rather than absolute and the V12 remained by most standards a thirsty beast but the improvement could be in the order of 40% (depending on conditions) and it was little worse than the similar-displacement Mercedes-Benz V8s of the era which could match the Jaguar for power but not the turbine-like smoothness.  Threatened with the axe due to its profligate ways, the V12 was saved by the swirl chambers and it survived another sixteen years which included two severe recessions.  Debuting even before the Watergate scandal, it lasted until the Monica Lewinsky affair.  In the decades since, computer simulations and high-speed photography have further enhanced the understanding of swirl & turbulence, the small fireballs now contained in the centre of the chamber, preventing heat from radiating to the surrounding surfaces and ensuring the energy (heat) is expended on pushing the piston down to begin the next cycle, not wasted by heating metal.  The system is popular also in diesel engines.