Saturday, May 4, 2024

ACL

ACL (pronounced eh-see-elle)

The abbreviation for anterior cruciate ligament, one of a pair of cruciate ligaments in the human knee.

1887 (in Italian); early 20th century (in English): The construct was anterior + cruciate + ligament. Anterior was from the Latin anterior (that is before, foremost).  Cruciate was from the Latin cruciatus, the perfect passive participle of cruciō, from crux (cross).  Ligament was from the Middle English ligament, from the Latin ligāmentum, from ligō (tie, bind).  The vital but unexciting body part sounds much better if spoken in other European languages including Portuguese (ligamento cruzado anterior), Spanish (ligamento cruzado anterior), Catalan (lligament encreuat anterior), French (ligament croisé antérieur) and especially Italian (legamento crociato anteriore).  Anterior cruciate ligament is a noun; the noun plural is anterior cruciate ligaments.

In the world of acronyms and abbreviations, there are literally dozens of other ACLs, including the American Classical League, which promotes the study of Antiquity and the classics; the Association for Computational Linguistics, a professional organization for those working on natural language processing; the Australian Christian Lobby, a right-wing Christian pressure group which disapproves of the last three centuries-odd; the access control list, an element in computer security; ACL2, a software system noted for its theorem prover; acl, the ISO (International Standard) 639-3 code for the Akar-Bale language, an extinct Great Andamanese language; the allowable combat load which, in military aviation, is the inventory of weapons systems for which an airframe is rated; and the wonderful anthroponotic cutaneous leishmaniasis, a form of Old World cutaneous leishmaniasis, usually with a prolonged incubation period and confined to urban areas.

The long and painful history of the anterior cruciate ligament

Ligaments of the right knee.

Descriptions of the anterior cruciate ligament (ACL) appear in some surviving medical texts from Antiquity, the earliest known reference thought to be in the drawings of the physician Galen (Claudius Galenus or Aelius Galenus; 129-216) although he made no mention of injuries associated with this body part, the aspect for which it’s now best known, although there is evidence of corrective surgery being undertaken in Ancient Egypt.  Presumably, during the many centuries when falling from horses was far from uncommon, such injuries were frequent but because neither surgical correction nor sophisticated rehabilitation regimes had evolved, victims had to suffer or perhaps retire from more rigorous pursuits.  The Irish surgeon Robert Adams (1791-1875) in 1837 noted a clinical case of an ACL tear but in an age when treatments rightly were conservative because the risk of death from any surgical intervention was high, Dr Adams’ report was purely observational.  The literature was augmented in 1850 by the Scottish GP (family doctor) James Stark (1811-1890) who published two cases of cruciate tears, describing the different manifestations of knee instability in patients with damaged ACLs, but the first record of ACL repair was an operation performed in 1895 by the English surgeon Sir Arthur Mayo-Robson (1853-1933).  The early approach was the use of primary open sutures but while this produced good initial results, deterioration was rapid.  No substantive improvements in method were reported so the suturing approach was abandoned and the profession turned to reconstruction.

Lindsay Lohan's knees.

The Russian-born surgeon Ivan Grekov (1867-1934) is credited with having in 1914 been the first to adopt the use of autologous tissue (cells or tissues obtained from the same individual) for ACL rupture reconstruction, the technique also documented by the English professor of orthopaedic surgery, Ernest Hey Groves (1872-1944), who performed a number of procedures between 1917-1920.  The Hey Groves approach is strikingly modern and essentially the technique used today but the efficacy clearly wasn’t understood because in the following decades there came what the historians describe as “…a period of startling ingenuity which created an amazing variety of different surgical procedures often based more on surgical fashion and the absence of a satisfactory alternative than any indication that continued refinements were leading to improved results.  It is hence not surprising that real inventors were forgotten, good ideas discarded and untried surgical methods adopted with uncritical enthusiasm only to be set aside without further explanation.”  That may to some extent explain why ACL reconstructions became rare and it wasn’t until the 1970s, when the implications of broadcasting allowed professional sport to become a multi-billion dollar industry and sports medicine became a mainstream medical discipline, that the operation became common; the injury certainly was.  Still, innovation continued and there was experimentation with xenografts (a tissue graft taken from a species different from that of the recipient) & allografts (a tissue graft between genetically different individuals of the same species) before the autologous approach prevailed.  Even synthetic graft materials enjoyed some popularity in the 1980s and 1990s, apparently because in laboratory testing artificial ligaments appeared to be more durable and better able to withstand stresses and strains; real-world experience proved otherwise.

Torn ACL: Exactly what it says.

The increasing participation of women in elite-level (often professional) sports such as the various football codes and basketball has in recent years seen a striking rise in ACL injuries.  While the reported volume of incidents is still less than that suffered in gymnastics, long the most common source, it’s in these team sports where the rate of increase has been greatest.  Although the male & female knee look much the same, physiological differences exist and, given almost every human cell is in some way specifically male or female, that shouldn’t be surprising.  Anatomists note certain structural divergences such as those in the alignment of the leg & pelvis and the muscular protection of the knee joint, added to which the hormone estrogen is known to influence all ligaments, but probably of greater consequence are the variations in neuromuscular behavior which human movement studies have documented.  Essentially, these focus on the different positions of the knee and the upper body (compared to the typical male) and a striking predilection when landing to apportion most weight to one rather than both feet.  Theories have been offered to account for this but the most obvious consequence is that the forces generated by landing are less absorbed by the foot and lower leg muscles (analogous with the “crumple-zones” in modern automobiles), meaning a higher proportion of the stress impacts upon the ACL of the “landing knee”.  Added to this, because the typical female tends to land with the upper body tilted to the side of the “landing knee”, a greater rotational force is imposed on the ACL at the same time a vertical impact is being absorbed.  This crucial aspect of behavior is known as the “ankle-dominant strategy”.

Some novel research however emerged in 2024 and it may be a candidate for one of the ten Ig Nobel Prizes, awarded annually by the Annals of Improbable Research (AIR) to acknowledge “unusual or trivial achievements in scientific research”.  What the study hypothesized was that there might be a link between sports bras and ACL injuries.  Impressionistically, the connection is not immediately obvious and what the researchers found was not, as might be imagined, simply a product of weight distribution and the effective “multiplier effect” of mass in movement (the further it is from the pivot point, the greater its effect), illustrated by the recommendations provided for placing weight in trailers when being towed.  The physics of both are presumably vaguely similar but the interplay of factors relating to women's ACL injuries seems to be more complex.
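
To illustrate that lever-arm intuition behind the trailer analogy, a rough sketch only, with invented masses and distances and nothing drawn from the study itself: the turning moment about a pivot grows in proportion to the distance of the mass from it.

# A rough sketch of the "multiplier effect" of mass carried further from a
# pivot point (the trailer-loading analogy above).  The masses and distances
# below are invented for the example and are not taken from the research.

G = 9.81  # gravitational acceleration in m/s^2

def moment_about_pivot(mass_kg, distance_m):
    # Turning moment (torque) in newton-metres: force multiplied by lever arm.
    return mass_kg * G * distance_m

# The same 5 kg load produces double the moment when carried twice as far
# from the pivot, which is why placement matters when loading a trailer.
print(moment_about_pivot(5.0, 0.5))   # about 24.5 N·m
print(moment_about_pivot(5.0, 1.0))   # about 49.1 N·m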

Lindsay Lohan in "low-impact" sports bra.

It transpires the multiplier effect of the upper-body mass wasn’t the issue.  What the international team of experts in biomechanics and sports medicine did was study 35 female recreational athletes, finding that the more supportive the sports bra (the so-called “high-impact” designs), the greater the decrease in the common risk factors associated with ACL injuries, the “knee flexion angles” reducing, meaning the knee didn't have to bend as much on landing.  Additionally, there was a reduction in “dynamic knee valgus”, the knee moving inwards from the foot, something of mechanical significance because females tend to have more inward-collapsing knees (increased dynamic knee valgus) during landing activities.  Dynamically, what the study revealed was that when there was no or only minimal breast support, there was a greater tendency to adopt the “ankle-dominant strategy”, which had the effect of transferring the stress to the knee and thus the ACL.

By contrast, when wearing a high-impact sports bra, females used a more “hip-dominant” strategy which puts less strain on the ACL.  The mechanics of the “hip-dominant” approach is that the trunk moves less, making pelvic control easier, the “…movement patterns at the trunk, pelvis and lower extremities…” all being connected.  The study was published in the Journal of Applied Biomechanics and the study cohort of 35 included women with bra cup sizes between B & D, the findings suggesting the larger the cup size, the higher the risk of traumatic knee injury although, perhaps counter-intuitively, the researchers weren’t prepared to “…definitively say breast size drives injury risk…” because (1) the number of participants in the study was small and (2) there are “…so many differences in movement patterns from person to person.”  In the spirit of good research, one reviewer noted the study “…scratches the surface…” of an area that needs “…further investigation.”

Human movement studies have a long history but the bulk of the research has been on men; the presence of breasts is the most obvious difference in the bio-mechanics of movement and something which might yet have implications not yet understood.  The physics of it is that breasts move up and down and side-to-side during exercise and in a sixty-minute session of running, they can bounce in a figure-eight pattern some 10,000 times.  Traditionally, for the sports bra manufacturers the focus in advertising has been on comfort but there are also performance effects which at the elite level can be vital because the difference between success and failure can be measured in thousandths of a second and fractions of an inch.  The appropriate bra can actually reduce oxygen consumption when running, which translates into running economy (the distance traveled per volume of oxygen consumed) and the oxygen can be better used by the brain and muscles; also, if breast movement is minimized, the strides become longer, another aspect of economy.  Those matters were known but the apparent explanation of the wrong choice of sports bra being a factor in the higher incidence of ACL injuries in women is something new.
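
Putting rough numbers on those figures is simple arithmetic; the 10 km distance and 500-litre oxygen volume below are invented purely to show the running-economy calculation, not measured data.

bounces = 10_000   # approximate bounces in a sixty-minute run, per the figure above
minutes = 60
print(bounces / minutes)            # about 167 bounces per minute
print(bounces / (minutes * 60))     # about 2.8 bounces per second

distance_km = 10.0      # hypothetical distance for the session
oxygen_litres = 500.0   # hypothetical volume of oxygen consumed over that session
print(distance_km / oxygen_litres)  # running economy as defined above: distance per volume of O2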

Friday, May 3, 2024

Sommelier

Sommelier (pronounced suhm-uhl-yey or saw-muh-lyey (French))

A waiter, in a club or restaurant or something similar, who is in charge of wines; sometimes known as the wine steward.

1889: Details of the etymology are contested at the margins.  All agree it’s a dissimilated form of the Middle French sommerier (a butler), from the thirteenth century sommier (a military officer who had charge of provisions, a position which evolved into an aspect of the modern role of quartermaster (hence the expression used to describe staff in these roles as "on the 'Q' side")).  One version traces this from the twelfth century somme (pack) from the Vulgar Latin salma, a corruption of the Late Latin sagma (a pack-saddle (and later "the pack on the saddle")).  The alternative suggestion was it was from the Old Provençal saumalier (pack-animal driver), again from the Late Latin sagma, the origin of which was the Ancient Greek ságma (covering, pack saddle).  Polish, Portuguese, Spanish & Swedish all use an unadapted borrowing of the French sommelier.  Sommelier is a noun & verb and sommeliering & sommeliered are verbs; the noun plural is sommeliers.

Fifty-odd years of the Court of Master Sommeliers

Although they call themselves cork-dorks, at the most elite level, a sommelier can belong to a most exclusive club.  The Court of Master Sommeliers was established in 1977, formalizing the layers of qualification that began in 1969 in London with the first Master Sommelier examination, conducted now by the various chapters of the court; globally, Master Sommeliers remain a rare few.  While over 600 people have been to space and there are rumored to be some 4000 members of the Secret Society of the Les Clefs d'Or, there are currently only 262 Master Sommeliers in the world.

In training; practice makes perfect.

The Certified Sommelier Examination (CSE) exists as a three-part concept: (1) a focus on a candidate’s ability to demonstrate proficiency in deductive tasting, (2) the technical aspects of wine production & distribution and (3) the practical skills and techniques of salesmanship required for those working as sommeliers in restaurants and other establishments.  It’s thus a vocational qualification for those who wish to pursue a career in hospitality, either in the specialized field of beverage services or as a prelude to moving into management.  Like the Secret Society of the Les Clefs d’Or, upon graduation as a Certified Sommelier, a candidate becomes entitled to a certificate, title and a lapel pin (a single pin worn to the left, rather than the pair of crossed keys issued by the Les Clefs d’Or).

It’s a structured process.  As a prerequisite, candidates must have completed the classes and passed the Introductory Sommelier Examination (ISE) although it’s no longer required of students that the CSE be completed within three years of the ISE, candidates now encouraged to proceed to the next level when “best prepared”.  The court suggests a minimum of three years of industry experience is desirable because the content of both the ISE & CSE is predicated on the assumption those sitting will have this background, and it further advises it’s best to work in the field for at least twelve months between the two.  The CSE is a one-day examination in three parts and the minimum passing grade is 60% in each, all within the one sitting (a simple illustrative sketch of this rule follows the description of the three parts below):

(1) A tasting examination using the court’s Deductive Tasting Method (DTM), during which candidates must, with a high degree of accuracy & clarity, describe and identify four wines (two white and two red).  The format of this is a written four-section essay which must be completed within 45 minutes and the DTM exists in a structured format which candidates must learn prior to the exam.

The Court of Master Sommeliers' Deductive Tasting Method.

(2) A theory examination which is designed to test candidates' knowledge and understanding of wine, beverage, and the sommelier trade.  The test consists of multiple choice, short answer, some non-abstract math (ie the sort of arithmetic relevant to the profession) and matching questions. Candidates must complete the 45-question examination within 38 minutes.

Lindsay Lohan demonstrating early sommelier skills in The Parent Trap (1998).  She decided to focus on acting, pursuing wine-tasting only as a hobby.

(3) A Service Examination: The service examination is a practical experience, conducted in a setting which emulates a “real” restaurant environment.  It’s designed to allow candidates to demonstrate salesmanship, knowledge and appropriate conversational skills, all while performing the tableside tasks associated with the job.  The test typically includes opening still or sparkling wines in the correct manner and is not limited purely to what’s in the bottle, candidates expected to be able to recommend cocktails, spirits or other drinks and discuss the interplay of food and wine; what goes best with what.  As befits a practical exam, candidates must dress and deport themselves exactly as they would if employed as a restaurant sommelier.
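
As flagged above, here is a small illustrative sketch of the pass rule; the scores are invented and the only details taken from the court's published format are the 60% threshold in each of the three parts and the 45-question, 38-minute theory paper.

PASS_MARK = 60.0   # minimum grade required in each of the three parts, in one sitting

def passes_cse(tasting, theory, service):
    # A candidate passes only if every one of the three sections meets the mark.
    return all(score >= PASS_MARK for score in (tasting, theory, service))

print(passes_cse(72.0, 65.0, 61.0))   # True: all three parts at or above 60%
print(passes_cse(90.0, 59.0, 88.0))   # False: one weak section fails the whole exam

# Pacing in the theory paper: 45 questions in 38 minutes
print(38 * 60 / 45)                   # about 50.7 seconds per question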

Thursday, May 2, 2024

Bug

Bug (pronounced buhg)

(1) Any insect of the order Hemiptera, especially any of the suborder Heteroptera (a hemipteran or hemipteron; a hemipterous insect), having piercing and sucking mouthparts specialized as a beak (rostrum) and known loosely as the “true bug”.

(2) Any of various species of marine or freshwater crustaceans.

(3) In casual use, any insect or insect-like invertebrate (ie used often of spiders and such because of their supposed “bug-like” quality).

(4) In casual use, any micro-organism causing disease, applied especially to a virus or bacterium.

(5) An instance of a disease caused by such a micro-organism; a class of such conditions.

(6) In casual (and sometimes structured) use, a defect or imperfection, most associated with computers but applied also to many mechanical devices or processes.

(7) A craze or obsession (usually widespread or of long-standing).

(8) In slang, a person who has a great enthusiasm for such a craze or obsession (often as “one bitten by the bug”).

(9) In casual (and sometimes structured) use, a hidden microphone, camera or other electronic eavesdropping device (a clipping of bugging device) and used analogously of the small and effectively invisible (often single-pixel) image on a web page, installed usually for the purpose of tracking users.

(10) Any of various small mechanical or electrical gadgets, as one to influence a gambling device, give warning of an intruder, or indicate location.

(11) A mark, as an asterisk, that indicates a particular item, level, etc.

(12) In US horse racing, the five-pound (2¼ kg) weight allowance able to be claimed by an apprentice jockey and by extension (1) the asterisk used to denote an apprentice jockey's weight allowance & (2) in slang, US, a young apprentice jockey (sometimes as “bug boy” (apparently used thus also of young female jockeys, “bug girl” seemingly beyond the pale.)).

(13) A telegraph key that automatically transmits a series of dots when moved to one side and one dash when moved to the other.

(14) In the slang of poker, a joker which may be used only as an ace or as a wild card to fill a straight or a flush.

(15) In commercial printing, as “union bug”, a small label printed on certain matter to indicate it was produced by a unionized shop.

(16) In fishing, any of various plugs resembling an insect.

(17) In slang, a clipping of bedbug (mostly UK).

(18) A bogy; hobgoblin (extinct).

(19) In slang, as “bug-eyed”, protruding eyes (the medical condition exophthalmos).

(20) A slang term for the Volkswagen Beetle (Type 1; 1938-2003 & the two retro takes; 1997-2019).

(21) In broadcasting, a small (often transparent or translucent) image placed in a corner of a television program identifying the broadcasting network or channel.

(22) In aviation, a manually positioned marker in flight instruments.

(23) In gay (male) slang in the 1980s & 1990s as “the bug”, HIV/AIDS.

(24) In the slang of paleontology, a trilobite.

(25) In gambling slang, a small piece of metal used in a slot machine to block certain winning combinations.

(26) In gambling slang, a metal clip attached to the underside of a table, etc and used to hold hidden cards (a type of cheating).

(27) As the Bug (or Western Bug), a river in Eastern Europe which flows through Belarus, Poland, and Ukraine with a total length of 481 miles (774 km).  The Southern Bug, in south-west Ukraine, flows into the Dnieper estuary and is some 530 miles (850 km) long.

(28) A past tense and past participle of big (obsolete).

(29) As ISO (international standard) 639-2 & ISO 639-3, the language codes for Buginese.

(30) To install a secret listening device in a room, building etc or on a telephone or other communications device.

(31) To badger, harass, bother, annoy or pester someone.

1615–1625: The original use was to describe insects, apparently as a variant of the earlier bugge (beetle), thought to be an alteration of the Middle English budde, from the Old English -budda (beetle) but etymologists are divided on whether the phrase “bug off” (please leave) is related to the undesired presence of insects or was of a distinct origin.  Bug, bugging & debug are nouns & verbs, bugged is a verb & adjective and buggy is a noun & adjective; the noun plural is bugs.  Although “unbug” makes structural sense (ie remove a bug, as opposed to the sense of “debug”), it doesn’t exist whereas forms such as the adjectives unbugged (not bugged) and unbuggable (not able to be bugged) are regarded as standard.

The array of compound forms meaning “someone obsessed with an idea, hobby etc”, producing things like “shutterbug” (amateur photographer) & “firebug” (arsonist), seems first to have emerged in the mid nineteenth century.  The development of this into “a craze or obsession” is thought rapidly to have accelerated in the years just before World War I (1914-1918), again based on the notion of “bitten by the bug” or “caught the bug”, thus the idea of being infected with an unusual enthusiasm for something.  The use to mean a demon, evil spirit, spectre or hobgoblin was first recorded in the mid-fourteenth century and was a clipping of the Middle English bugge (scarecrow, demon, hobgoblin), of uncertain origin although it may have come from the Middle Welsh bwg (ghost; goblin), linked to the Welsh bwgwl (threat (and earlier “fear”)) and the Middle Irish bocanách (supernatural being).  There’s also speculation it may have come from the scary tales told to children which included the idea of a bugge (beetle) at a gigantic scale.  That would have been a fearsome sight and the idea remains fruitful to this day for artists and film-makers needing something frightening in the horror or SF (science fiction) genre.  The use in this sense is long obsolete although the related forms bugbear and bugaboo survive.  Dating from the 1570s, a bugbear was in folklore a kind of “large goblin”, used to inspire fear in children (both as a literary device & for purposes of parental control) and for adults it soon came to mean “a source of dread, resentment or irritation”; in modern use it's an “ongoing problem”, a recurring obstacle or adversity or one’s pet peeve.  The obsolete form bugg dates from circa the 1620s and was a reference to the troublesome bedbug, the construct a conflation of the Middle English bugge (scarecrow, hobgoblin) and the Middle English budde (beetle).  The colloquial sense of “a microbe or germ” dates from 1919, the emergence linked to the misleadingly-named “Spanish flu” pandemic.

Like the rest of us, even scientists, entomologists and zoologists probably say “bug” in general conversation, whether about the insects or the viruses and such which cause disease, but in writing academic papers they’ll take care to be more precise.  Because to most of us “bugs” can be any of the small, creepy pests which intrude on our lives (some of which are actually helpful in that quietly and unobtrusively they dispose of the really annoying bugs which bite us), the word is casually and interchangeably applied to bees, ants, millipedes, beetles, spiders and anything else resembling an insect.  That use may be reinforced by the idea of such things “bugging” us by their very presence.  To the professionals however, insects are those organisms in the classification Insecta, a very large class of animals, the members of which have a three-part body, six legs and (usually) two pairs of wings whereas a bug is a member of the order Hemiptera (which in the taxonomic system is within the Insecta class) and includes cicadas, aphids and stink bugs; to emphasize the point, scientists often speak of those in the order Hemiptera as “true bugs”.  The true bugs are those insects with mouthparts adapted for piercing and sucking, contained usually in a beak-shaped structure, a vision agonizingly familiar to anyone who has suffered the company of bedbugs.  That’s why lice are bugs and cockroaches are not but the latter will continue to be called bugs, often with some preceding expletive.

9 September 1947: The engineer's note (with physical evidence) of electronic computing's "first bug".

In computing, where the term “bug” came to be used to describe glitches, crashes and such, it has evolved to apply almost exclusively to software issues; even when events are caused by hardware flaws, unless it’s something obvious (small explosions, flame & smoke etc) most users probably assume a fault in some software layer.  The very first documented bug however was an interaction recorded on 9 September 1947 between the natural world and hardware, an engineer’s examination of an early (large) computer revealing an insect had sacrificially landed on one of the circuits, shorting it out and shutting down the machine.  As proof, the unfortunate moth was taped to the report.  On a larger scale (zoologically rather than in hardware), the problem of small rodents such as mice entering the internals of printers, there to die from various causes (impact injuries, starvation, heat et al), remains not uncommon, resulting sometimes in mechanical damage, sometimes just in the implications of decaying flesh.

The idea of a bug as a “defect, flaw, fault or glitch” in a mechanical or electrical device was first recorded in the late 1800s as engineer’s slang, the assumption being they wished to convey the idea of “a small fault” (and thus easily fixed, as opposed to some fundamental mistake which would necessitate a re-design).  Some sources suggest the origin lies with Thomas Edison (1847-1931) who is reported as describing the consequences of an insect “getting into the works”.  Programmers deploy an array of adjectives to "bug" (major, minor, serious, critical & non-critical et al) although between themselves (and certainly when disparaging of the code of others) the most commonly heard phrase is probably “stupid bug”.  The “debugging” (also as de-bugging) process is something with a wide definition but in general it refers to any action or set of actions taken to remove errors.  The name of the debug.exe (originally debug.com) program included with a number of (almost all 16 & 32-bit) operating systems was a little misleading because in addition to fixing things, it could be used for other purposes and is fondly remembered by those who wrote Q&D (quick & dirty) work-arounds which, written in assembler, ran very fast.  The verb debug was first used in 1945 in the sense of “remove the faults from a machine” and by 1964 it appeared in field service manuals documenting the steps to be taken to “remove a concealed microphone”.  Although the origin of the use of “bug” in computing (probably the now most commonly used context) can be traced to 1947, the term wasn’t widely used beyond universities, industry and government sites before the 1960s when the public first began to interact at scale with the implications (including the bugs) of those institutions using computerized processes.  Software (or any machinery) badly afflicted by bugs can be called “buggy”, a re-purposing of the use of an adjective dating from 1714 meaning “a place infested with bugs”.

Some bugs gained notoriety.  In the late 1990s, it wasn’t uncommon for the press to refer to the potential problems of computer code using a two-numeral syntax for years as the “Y2K bug”, an indication of how wide was the vista of the common understanding of “bug” and one quite reasonable because that was how the consequences would be understood.  A massive testing & rectification effort was undertaken by the industry (and corporations, induced by legislation and the fear of litigation) and with the coming of 1 January 2000 almost nothing strange happened; that may also have been the case had nothing been done but, on the basis of the precautionary principle, it was the right approach.  Of course switching protocols to use four-numeral years did nothing about the Y10K bug but a (possible) problem 8000 years hence would have been of little interest to politicians or corporate boards.  Actually, YxK bugs will re-occur (with decreasing frequency) whenever a digit needs to be added.  The obvious solution is leading zeros although if one thinks in terms of infinity, it may be that, in the narrow technical sense, such a solution would just create an additional problem, although perhaps one of no practical significance.  Because of the way programmers exploit the way computers work, there have since the 1950s been other date-related (“time” to a computer) “bugs” and management of these and the minor problems caused has been handled well.  Within the industry the feeling is things like the “year 2029 problem” and “year 2038 problem” will, for most of the planet, be similarly uneventful.
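
As a sketch of why the two-digit convention caused trouble (illustrative only, not a reconstruction of any particular system's code), consider what happens to simple year comparisons at the rollover, the common "pivot window" work-around, and the analogous 32-bit limit behind the year 2038 problem.

# With only two digits stored, "00" (2000) sorts before "99" (1999), so age and
# expiry arithmetic breaks at the rollover.
print(99 > 0)    # True, ie 1999 wrongly appears to be later than 2000

def expand_two_digit_year(yy, pivot=70):
    # A typical windowing fix: treat 00-69 as 2000-2069 and 70-99 as 1970-1999.
    # The pivot of 70 is a common convention rather than a universal standard.
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_two_digit_year(99))   # 1999
print(expand_two_digit_year(3))    # 2003

# The "year 2038 problem" is the same storage-width idea applied to time:
# seconds since 1970-01-01 held in a signed 32-bit integer run out early on
# 19 January 2038.
from datetime import datetime, timezone, timedelta
max_seconds = 2**31 - 1
print(datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=max_seconds))
# 2038-01-19 03:14:07+00:00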

The DOSShell, introduced with PC-DOS 4.0; this was as graphical as DOS got.  The DOSShell was bug-free.

Bugs can also become quirky industry footnotes.  As late as 1987, IBM had intended to release the update of PC-DOS 3.3 as version 3.4, reflecting the corporation’s roadmap of DOS as something of an evolutionary dead-end, doomed ultimately to end up in washing machine controllers and such while the consumer and corporate market would shift to OS/2, the new operating system which offered pre-emptive multi-tasking and access to bigger storage and memory addressing.  However, at that point, both DOS & OS/2 were being co-developed by IBM & Microsoft and agreement was reached to release a version 4 of DOS.  DOS 4 also included a way of accessing larger storage space (through a work-around with a program called share.exe) and more memory (in a way less elegant than the OS/2 approach but it did work, albeit more slowly), both things of great interest to Microsoft because they would increase the appeal of its upcoming Windows 3.0, a graphical shell which ran on top of DOS; unlike OS/2, Windows was exclusive to Microsoft and so was the revenue stream.  Unfortunately, it transpired the memory tricks used by PC-DOS 4.0 were “buggy” when used with some non-IBM hardware and the OS gained a bad reputation from which it would never recover.  By the time the code was fixed, Microsoft was ready to release its own version as MS-DOS 4.0 but, noting all the bad publicity, after minor updates and some cosmetic revisions, the mainstream release was MS-DOS 4.01.  In the earlier bug-afflicted bits of the code, there is apparently no difference between MS-DOS 4.01 and the few existing copies of MS-DOS 4.0.

Lindsay Lohan with Herbie the "Love Bug", Herbie: Fully Loaded (Disney Pictures, 2005). 

In idiomatic and other uses, bug has a long history.  By the early twentieth century “bugs” meant “mad; crazy” and by then “bug juice” had been in use for some thirty years, meaning both “propensity of the use of alcoholic drink to induce bad behaviour” and “bad whiskey” (in the sense of a product being of such dubious quality it was effectively a poison).  A slang dictionary from 1811 listed “bug-hunter” as “an upholsterer”, an allusion to the fondness bugs and other small creatures show for sheltering in the dark, concealed parts of furniture.  As early as the 1560s, a “bug-word” was a word or phrase which “irritated or vexed”.  The idea of “bug-eyed” was in use by the early 1870s and that’s thought either to be a humorous mispronunciation of bulge or (more likely) an allusion to the prominent, protruding eyes of creatures like frogs, the idea being they sat on the body like “a pair of bugs”.  The look became so common in the movies featuring aliens from space that by the early 1950s the acronym BEM (bug-eyed monster) had become part of industry slang.  The correct term for the medical condition of “bulging eyes” is exophthalmos.

To “bug someone” in the sense of “to annoy or irritate” seems not to have been recorded until 1949 and while some suggest the origin of that was in swing music slang, it remains obscure.  The now rare use of “bug off” to mean “to scram, to skedaddle” is documented since 1956 and is of uncertain origin but may be linked to the Korean War (1950-1953) era US Army slang meaning “stage a precipitous retreat”, first used during a military reversal.  The ultimate source was likely the UK, Australian & New Zealand slang “bugger off” (please leave).  The “doodle-bug” was first described in 1865 and was Southern US dialect for a type of beetle.  In 1944, the popular slang for the German Vergeltungswaffe 1 (the V-1 (reprisal weapon 1), the first cruise missile) was “flying bomb” or “buzz bomb” but the Royal Air Force (RAF) pilots preferred “doodle-bug”.

The ultimate door-stop for aircraft hangars: Bond Bug 700.

The popularity of three-wheeler cars in the UK during the post-war years was a product of cost.  They were taxed at a much lower rate than conventional four-wheel vehicles, were small and thus economical and could be operated by anyone with only a motorcycle licence.  Most were actually genuine four-seaters and thus an attractive alternative for families and, being purely utilitarian, there were few attempts to introduce elements of style.  The Bond Bug (1970-1974) was an exception in that it was designed to appeal to the youth market with a sporty-looking two-seater using the then popular “wedge-styling” and in its most powerful form it could touch 80 mph (130 km/h), faster than any other three-wheeler available.  However, the UK in 1973 introduced a value-added tax (VAT) and this removed many of the financial advantages of the three-wheelers.  In an era of rising prosperity, the appeal of the compromise waned and, coupled with some problems in the early production runs, in 1974, after some 2¼ thousand Bugs had been built, the zany little machine was dropped; not even the oil crisis of the time (which had doomed a good number of bigger, thirstier cars) could save it.  Even in its best years it was never all that successful, essentially because it was really a novelty and there were “real” cars available for less money.  Still, the survivors have a following in their niche at the lower end of the collector market.

The business of spying is said to be the “second oldest profession” and even if not literally true, few doubt the synergistic callings of espionage and war are among man’s earliest and most enduring endeavors.  Although the use of “bug” to mean “equip with a concealed microphone” seems not to have been in use until 1946, bugging devices probably go back thousands of years (in a low-tech sort of way) and those known to have been used in Tudor-era England (1485-1603) are representative of the way available stuff was adapted, the most popular being tubular structures which, if pressed against a thin wall (or preferably a door’s keyhole) enabled one to listen to what was being discussed in a closed room.  Bugging began to assume its modern form when messages began to be transmitted over copper wires which could stretch for thousands of miles and the early term for a “phone bug” was “phone tap”, based upon the idea of “tapping into” the line as one might a water pipe.  Bugs (the name picked up because many of the early devices were small, black and “bug-like”), whether as concealed microphones or phone taps, swiftly became part of the espionage inventory in diplomacy, commerce and crime and as technology evolved, so did the bugging techniques.

Henry Cabot Lodge Jr (1902–1985), US Ambassador to the UN (United Nations), at a May 1960 session of the Security Council, using the Great Seal bug to illustrate the extent of Soviet bugging.  The context was a tu quoque squabble between the Cold War protagonists, following Soviet revelations about the flight-paths of the Americans' U-2 spy planes.  Lodge would be Richard Nixon’s (1913-1994; US president 1969-1974) running mate in that year's presidential election.

A classic bug of the High Cold War was the Great Seal bug (known to the security services as “the Thing”), a Soviet-designed and built concealed listening device which was so effective because it used passive transmission of its audio signal, thereby rendering it invisible to conventional “bug-detection” techniques.  The bug was concealed inside a large, carved wooden rendition of the US Great Seal which, in 1945, the Kremlin presented as a “gift of friendship” to the US Ambassador to the USSR, Averell Harriman (1891-1986); in a nice touch, it was a group of Russian school children who handed over the carving.  Sitting in the ambassador’s Moscow office for some seven years, it was a masterpiece of its time because (1) it was activated only when exposed to a low-energy radio signal which Soviet spies would transmit from outside, so when subjected to a US “bug detection” sweep it would appear to be no more than a piece of wood and (2) as it needed no battery or other power supply (and indeed, no maintenance at all), its lifespan was indefinite.  Had it not by chance been discovered by a communications officer at the nearby British embassy who happened to be tuned to the same frequency while the Soviets were sending their signal, it might well have remained in place for decades.  Essentially, the principles of the Great Seal bug were those used in modern radio-frequency identification (RFID) systems.
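
A conceptual sketch of that shared passive principle follows; it is purely illustrative, the resonant frequency, the tolerance and the modulation rule all being invented, and it models the idea rather than the actual hardware of either the Great Seal bug or any RFID standard.

# The device emits nothing of its own and has no power source; it only
# modulates energy supplied from outside, so a sweep at the wrong frequency
# sees an inert object.
RESONANT_FREQ_MHZ = 330.0   # invented value, not the historical figure

def passive_response(interrogator_freq_mhz, room_audio):
    # Return a modulated reflection only while illuminated near the resonant
    # frequency; otherwise the device stays silent and undetectable.
    if abs(interrogator_freq_mhz - RESONANT_FREQ_MHZ) > 5.0:
        return None
    return [1.0 + 0.3 * sample for sample in room_audio]   # carrier modulated by room sound

print(passive_response(100.0, [0.1, -0.2, 0.05]))   # None: no carrier, nothing to detect
print(passive_response(330.0, [0.1, -0.2, 0.05]))   # modulated signal while illuminated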

Wednesday, May 1, 2024

Privity

Privity (pronounced priv-i-tee)

(1) Private or secret knowledge.

(2) Participation in the knowledge of something private or secret, especially as implying concurrence or consent.

(3) Privacy or secrecy (obsolete).

(4) In medieval theology, a divine mystery; something known only to God, or revealed only in the Holy Scriptures (obsolete).

(5) The genitals (archaic, and only in the plural).

(6) In law, a relationship between parties seen as being a result of their mutual interest or participation in a given transaction, usually in contract.

(7) The fact of being privy to something; knowledge, compliance (now rare).

1175–1225: From the Anglo-Norman priveté & privitee and the Middle English privete & private, from the Old French priveté & privité (privacy; a secret, private matter), the construct being privé (from the Late Latin privus (set apart, belonging to oneself)) + -té (from the Middle French -té, from the Old French -té, from the Latin -itātem or -tātem, accusative singular of -tās, ultimately from the primitive Indo-European -tehts; the suffix was used to form nouns, often denoting a quality or a property).  The ultimate source was the Classical Latin privātus (perfect passive participle of prīvō (I bereave, deprive; I free, release)).  Privity is a noun; the noun plural is privities.

Between the twelfth & sixteenth centuries a privity was “a divine mystery; something known only to God, or revealed only in the Holy Scriptures” and by the late 1200s this meaning had leaked into a general sense of “privacy; secrecy”, used between the fourteenth & seventeenth centuries to refer to “a private matter, a secret”.  The use to describe the genitals (presumably influenced in some way by “private parts” or “the private”) as “the privities” is attested from the late fourteenth century and didn’t wholly fade from use until the early nineteenth although use had by then long declined to a northern English, Irish & Scottish regionalism.  The word was used from the 1520s as a technical term in the laws regulating feudal land tenure and other fields of law picked it up in the general sense of “a relationship between parties seen as being a result of their mutual interest or participation in a given transaction”; it was in contract law this would assume its important meaning as “privity of contract”, describing the special status of the parties to a contract (as legally defined), something which would for centuries be of critical importance and which is still in use today.  Less precise was the sixteenth century sense of “the fact of being privy to something; knowledge, compliance” and while there are better ways of saying it, such use is not yet extinct.

Privity of contract, Donoghue v Stevenson and the snail.

The classic case (drummed into law students for almost a century) in the demolition of the sense of the absolute in privity of contract was Donoghue v Stevenson ([1932] A.C. 562, [1932] UKHL 100, 1932 S.C. (H.L.) 31, 1932 S.L.T. 317, [1932] W.N. 139), finally decided before the House of Lords.  It was the case which more than any other established the foundation of the doctrine of product liability, refined the concept of negligence (transforming tort law) and remains a core part of the framework for the principles of “duty of care”, which it substantially expanded.

The extraordinary case began with events which transpired in the modest settings of the Wellmeadow Café in Paisley, Scotland, Mrs Donoghue’s friend on 26 August 1928 buying her a ginger-beer, served in a bottle made from a dark, opaque glass.  After she’d consumed about half, the remainder was poured into a tumbler at which point the partially decomposed remains of a snail floated out, inducing an alleged shock and severe gastro-enteritis.  Because Mrs Donoghue was not a party to the contractual purchase of the ginger beer, she was unable to claim through breach of warranty of a contract: she was not party to any contract because, at law, she received the drink as a gift.  Accordingly, she issued proceedings against Stevenson (the manufacturer) and, after some four years in the lower courts, the matter ended up before the House of Lords, then the UK’s highest appellate court.

All were aware it was an important case.  The lower courts, bound by precedent, had been compelled to find the absence of privity of contract doomed the suit, but product liability, in a modern era in which consumers usually deal not directly with the producer of goods but with their agents or retailers, had for some time been discussed as an area of law in which reform was required.  What the Law Lords had to decide was whether the manufacturer owed Mrs Donoghue a duty of care in the absence of contractual relations, contrary to established case law.  The important point was not whether she was owed compensation for damages suffered but whether a cause of action existed.

Previously, as a general principle, manufacturers owed no duty of care to consumers except if (1) the product was inherently dangerous and no warning of this state was provided or (2) the manufacturer was aware that the product was dangerous because of a defect and this had been concealed from the consumer.  The Lords found for Mrs Donoghue although in a cautious judgement which could be read as offering little scope for others except in the specific matter of ginger beer in opaque bottles containing the decomposed remains of a dead snail when sold to a Scottish widow.  However, the mood for reform was in the legal air and the judgment established (1) negligence is distinct and separate in tort, (2) there need not be privity of contract for a duty of care to be established and (3) manufacturers owe a duty of care to the consumers they intend to use their products.

In the leading judgment, Lord Atkin (James Richard Atkin, 1867–1944; lord of appeal in ordinary 1928-1944) wrote, inter alia, what was at that time the widest definition of the “neighbour principle”: “The rule that you are to love your neighbour becomes in law, you must not injure your neighbour; and the lawyer’s question, Who is my neighbour? receives a restricted reply.  You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour.  Who, then, in law is my neighbour? The answer seems to be – persons who are so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question.”  On this basis, if no other, the Lords held Mrs Donoghue’s action had succeeded and she had a cause of action in law, the culmination of a growing appreciation by the courts that the law needed to evolve to reflect the patterns of modern commerce.  Some years before Donoghue v Stevenson had been decided, another judge had observed “it would appear to be reasonable and equitable to hold that, in the circumstances and apart altogether from contract, there exists a relationship of duty as between the maker and the consumer”.

Once, if someone bought two bottles of ginger beer and gave one to a friend, were both to be injured by decomposing snails within, only the consumer who handed over the cash could have recovered damages because they alone enjoyed a privity of contract.  Since Donoghue v Stevenson, both can in court seek remedy in tort on the basis of “product liability”, a manufacturer’s duty of care held to extend to all consumers of their products.

Being common law, what was effectively a new doctrine (and one, as the term “neighbour principle” suggests, rooted in Christian morality) was also a general principle and thus a foundation on which the building blocks of subsequent judgments would sit; it could not be treated, in the words of Lord Reid (James Scott Cumberland Reid, 1890–1975, lord of appeal in ordinary 1948-1975): “as if it were a statutory definition. It will require qualification in new circumstances.”  The courts in the years after 1932 had ample opportunity to refine things and this included the development of the modern tests in tort for the “foreseeability of damage” and “proximity”, to which was later appended the surprisingly recent “fairness”, something which came to be regarded as within the rubric of public policy, all able to work in conjunction; as one judge noted, the distinctions between them were “somewhat porous but they are probably none the worse for that”.  From Donoghue v Stevenson has evolved the modern notion of product liability and it would now to many seem strange there was in living memory a time when a manufacturer could escape liability for selling defective goods simply on the basis the injured party wasn’t the purchaser.  One curious quirk of Donoghue v Stevenson remains that the facts were never tested, so it will never be known if the most important character in the case (the decomposing snail) ever existed.

Tuesday, April 30, 2024

Cybernetic

Cybernetic (pronounced sahy-ber-net-ik)

(1) Of or relating to cybernetics (the theoretical study of communication and control processes in biological, mechanical, and electronic systems, especially the comparison of these processes in biological and artificial systems).

(2) Of or relating to computers and the Internet (largely archaic (ie "so 1990s")).

1948 (in English): From the Ancient Greek κυβερνητικός (kubernētikós) (good at steering, a good pilot (of a vessel)), from κυβερνητική τέχνη (kubernētikḗ tékhnē) (the pilot’s art), from κυβερνισμός (kubernismós) or κυβέρνησις (kubérnēsis) (steering, pilotage, guiding), from κυβερνάω (kubernáō) (to steer, to drive, to guide, to act as a pilot (and the ultimate source of the Modern English "govern")).  Cybernetic & cybernetical are adjectives, cybernetics, cyberneticist & cybernetician are nouns and cybernetically is an adverb; the noun cybernetics is sometimes used as a plural but functions usually as a singular (used with a singular verb).

Although it's undocumented, etymologists suspect the first known instance of use in English in 1948 may have been based on the 1830s French cybernétique (the art of governing); that was in a paper by US mathematician and philosopher Norbert Wiener (1894-1964) who was influenced by the cognate term "governor" (the name of an early control device, the subject of an 1868 analysis by Scottish physicist James Clerk Maxwell (1831–1879)), familiar in mechanical devices as a means of limiting (ie "governing") a machine's speed (either to a preferred rate or a determined maximum).  That was obviously somewhat different from the source in the original Greek kubernētēs (steersman) from kubernan (to steer, control) but the idea in both was linked by the notion of "control".  The French word cybernétique had been suggested by French physicist and mathematician André-Marie Ampère (1775-1836) (one of the founders of the science of electromagnetism and after whom is named the SI (International System of Units) unit of measurement of electric current, the ampere (amp)) to describe the then non-existent study of the control of governments; it never caught on.  From cybernetics came the now ubiquitous back-formation cyber which has coined, and continues to coin, words, sometimes with some intellectual connection to the original, sometimes not: cybercafé, cybercurrency, cybergirlfriend, cybermania, cybertopia, cyberculture, cyberhack, cybermob, cybernate, cybernation, cyberpet, cyberphobia, cyberpunk, cybersecurity, cybersex, cyberspace, cyberfashion, cybergoth, cyberemo, cyberdelic et al.

Feedback

MIT Professor Norbert Wiener was an American mathematician and philosopher and one of the early thinkers developing the theory that the behaviour of all intelligent species was the result of feedback mechanisms that perhaps could be simulated by machines.  Now best remembered for the word cybernetics, his work remains among the foundations of artificial intelligence (AI).

The feedback loop at its most simple.

Cybernetics was an outgrowth of control theory, at the time something of a backwater in applied mathematics relevant to the control of physical processes and systems.  Although control theory had connections with classical studies in mathematics such as the calculus of variations and differential equations, it became a recognised field only in the late 1950s when the newly available power of big machine computers and databases was applied to problems in economics and engineering.  The results indicated the matters being studied manifested as variants of problems in differential equations and in the calculus of variations.  As the computer models improved, it was recognised the theoretical and industrial problems all had the same mathematical structure and control theory emerged.  The technological determinism induced by computing wasn’t new; the embryonic field had greatly been advanced by the machines of the eighteenth and nineteenth centuries.

Cybernetics can be represented as a simple model which is of most use when applied to complex systems.  Essentially, it’s a model in which a monitor compares what is happening with what should be happening, this feedback passed to a controller which accordingly adjusts the system’s behavior.  Wiener defined cybernetics as “the science of control and communications in the animal and machine”, something quite audacious at the time, aligning as it did the working of machines with animal and human physiology, particularly the intricacies of the nervous system and the implication the controller was the human brain and the monitor, vision from the eyes.  While the inherently mechanistic nature of the theory attracted critics, the utility was demonstrated by some success in the work of constructing artificial limbs that could be connected to signals from the brain.  The early theories underpinned much of the early work in artificial intelligence (AI).
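
A minimal sketch of that loop follows; the thermostat-style numbers and the simple proportional rule are assumptions chosen for illustration, not anything prescribed by the theory.  A monitor measures what is happening, compares it with what should be happening and a controller adjusts the system accordingly.

def run_feedback_loop(setpoint, temperature, steps=15):
    # Monitor: measure the temperature and compare it with the setpoint.
    # Controller: apply a correction proportional to the error.
    # System: the room warms with the heater and constantly loses some heat.
    gain = 0.4        # how aggressively the controller corrects the error
    heat_loss = 0.5   # disturbance acting on the system each step
    for step in range(steps):
        error = setpoint - temperature              # feedback signal
        heater_output = max(0.0, gain * error)      # controller's adjustment
        temperature += heater_output - heat_loss    # system responds
        print(f"step {step:2d}  temp={temperature:5.2f}  error={error:+.2f}")

run_feedback_loop(setpoint=21.0, temperature=15.0)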

Of cyberpunks and cybergoths

A cyberpunk Lindsay Lohan sipping martinis with Johnny Depp and a silver alien by AiJunkie.

The youth subcultures “cyberpunk” and “cybergoth” have common threads in the visual imagery of science fiction (SF) but differ in matters of fashion and political linkages.  Academic studies have suggested elements of cyberpunk can be traced to the dystopian Central & Eastern European fiction of the 1920s which arose in reaction to the industrial and mechanized nature of World War I (1914-1918) but in its recognizably modern form it emerged as a literary genre in the 1980s, characterized by darkness, the effect heightened by the use of stark colors in futuristic, dystopian settings, the cultural theme being the mix of low-life with high-tech.  Although often there was much representation of violence and flashy weaponry, the consistent motifs were advanced technology, artificial intelligence and hacking, the message being the evil of corporations and corrupt politicians exploiting technology to control society for their own purposes of profit and power.  Aesthetically, cyberpunk emphasized dark, gritty, urban environments where the dominant visual elements tended to be beyond the human scale, neon colors, strobe lighting and skyscrapers all tending to overwhelm people who often existed in an atmosphere of atonal, repetitive sound.

Cybergoth girls: The lasting legacy of the cybergoth's contribution to the goth aesthetic was designer colors, quite a change to the black & purple uniform.  Debate continues about whether they can be blamed for fluffy leg-warmers.

The cybergoth thing, the term dating apparently from 1988, was less political, focusing mostly on the look although a lifestyle (real and imagined) somewhat removed from mainstream society was implied.  It emerged in the late 1990s as a subculture within the goth scene, and was much influenced by the fashions popularized by cyberpunk and the video content associated with industrial music although, unlike cyberpunk, there was never the overt connection with cybernetic themes.  Very much in a symbiotic relationship with Japanese youth culture, the cybergoth aesthetic built on the black & purple base of the classic goths with bright neon colors, industrial materials and a mix of the futuristic and the industrial in the array of accessories, which included props such as LED lights, goggles, gas masks and synthetic hair extensions.  Unlike the cyberpunks who insisted usually on leather, the cybergoths embraced latex and plastics such as PVC (polyvinyl chloride), not to imitate the natural product but as materials in their own right, while the hairstyles and makeup could be extravagantly elaborate.  Platform boots and clothing often adorned with spikes, studs and chains were common but tattoos, piercings and other body modifications were not an integral component although many who adopted those things also opted to include cybergoth elements.

Although there was much visual overlap between the two, cyberpunk should be thought of as a dystopian literary and cinematic genre with an emphasis on high-tech while cybergoth was a goth subculture tied to certain variations in look and consumption of pop culture, notably the idea of the “industrial dance” which was an out-growth of the “gravers” (gothic ravers) movement, named as goths became a critical mass in the clubs built on industrial music.  While interest in cyberpunk remains strong, strengthened by the adaptability of generative AI to the creation of work in the area, the historic moment of cyberpunk as a force in pop culture has passed, the fate of many subcultures which have suffered the curse of popularity, although history does suggest periodic revivals will happen and elements of the look will anyway endure.