Thursday, May 2, 2024

Bug

Bug (pronounced buhg)

(1) Any insect of the order Hemiptera, especially any of the suborder Heteroptera (a hemipteran or hemipteron; a hemipterous insect), having piercing and sucking mouthparts specialized as a beak (rostrum) and known loosely as the “true bug”.

(2) Any of various species of marine or freshwater crustaceans.

(3) In casual use, any insect or insect-like invertebrate (ie used often of spiders and such because of their supposed “bug-like” quality).

(4) In casual use, any micro-organism causing disease, applied especially to a virus or bacterium.

(5) An instance of a disease caused by such a micro-organism; a class of such conditions.

(6) In casual (and sometimes structured) use, a defect or imperfection, most associated with computers but applied also to many mechanical devices or processes.

(7) A craze or obsession (usually widespread or of long-standing).

(8) In slang, a person who has a great enthusiasm for such a craze or obsession (often as “one bitten by the bug”).

(9) In casual (and sometimes structured) use, a hidden microphone, camera or other electronic eavesdropping device (a clipping of “bugging device”); used analogously of the small and effectively invisible image (often a single-pixel image) on a web page, installed usually for the purpose of tracking users.

(10) Any of various small mechanical or electrical gadgets, as one to influence a gambling device, give warning of an intruder, or indicate location.

(11) A mark, as an asterisk, that indicates a particular item, level, etc.

(12) In US horse racing, the five-pound (2¼ kg) weight allowance able to be claimed by an apprentice jockey and by extension (1) the asterisk used to denote an apprentice jockey's weight allowance & (2) in US slang, a young apprentice jockey (sometimes as “bug boy”, apparently used thus also of young female jockeys, “bug girl” seemingly beyond the pale).

(13) A telegraph key that automatically transmits a series of dots when moved to one side and one dash when moved to the other.

(14) In the slang of poker, a joker which may be used only as an ace or as a wild card to fill a straight or a flush.

(15) In commercial printing, as “union bug”, a small label printed on certain matter to indicate it was produced by a unionized shop.

(16) In fishing, any of various plugs resembling an insect.

(17) In slang, a clipping of bedbug (mostly UK).

(18) A bogy; hobgoblin (extinct).

(19) In slang, as “bug-eyed”, having protruding eyes (the medical condition exophthalmos).

(20) A slang term for the Volkswagen Beetle (Type 1; 1938-2003 & the two retro takes; 1997-2019).

(21) In broadcasting, a small (often transparent or translucent) image placed in a corner of a television program identifying the broadcasting network or channel.

(22) In aviation, a manually positioned marker in flight instruments.

(23) In gay (male) slang in the 1980s & 1990s as “the bug”, HIV/AIDS.

(24) In the slang of paleontology, a trilobite.

(25) In gambling slang, a small piece of metal used in a slot machine to block certain winning combinations.

(26) In gambling slang, a metal clip attached to the underside of a table, etc and used to hold hidden cards (a type of cheating).

(27) As the Bug (or Western Bug), a river in Eastern Europe which flows through Belarus, Poland, and Ukraine with a total length of 481 miles (774 km).  The Southern Bug, in south west Ukraine, flows into the Dnieper estuary and is some 530 miles (850 km) long.

(28) A past tense and past participle of big (obsolete).

(29) As ISO (international standard) 639-2 & ISO 639-3, the language codes for Buginese.

(30) To install a secret listening device in a room, building etc or on a telephone or other communications device.

(31) To badger, harass, bother, annoy or pester someone.

1615–1625: The original use was to describe insects, apparently as a variant of the earlier bugge (beetle), thought to be an alteration of the Middle English budde, from the Old English -budda (beetle) but etymologists are divided on whether the phrase “bug off” (please leave) is related to the undesired presence of insects or was of a distinct origin.  Bug, bugging & debug are nouns & verbs, bugged is a verb & adjective and buggy is a noun & adjective; the noun plural is bugs.  Although “unbug” makes structural sense (ie remove a bug, as opposed to the sense of “debug”), it doesn’t exist whereas forms such as the adjectives unbugged (not bugged) and unbuggable (not able to be bugged) are regarded as standard.

Nerd humor.

The array of compound forms meaning “someone obsessed with an idea, hobby etc” (producing things like “shutterbug” (amateur photographer) & “firebug” (arsonist)) seems first to have emerged in the mid nineteenth century.  The development of this into “a craze or obsession” is thought rapidly to have accelerated in the years just before World War I (1914-1918), again based on the notion of “bitten by the bug” or “caught the bug”, thus the idea of being infected with an unusual enthusiasm for something.  The use to mean a demon, evil spirit, spectre or hobgoblin was first recorded in the mid-fourteenth century and was a clipping of the Middle English bugge (scarecrow, demon, hobgoblin), of uncertain origin although it may have come from the Middle Welsh bwg (ghost; goblin), linked to the Welsh bwgwl (threat (and earlier “fear”)) and the Middle Irish bocanách (supernatural being).  There’s also speculation it may have come from the scary tales told to children which included the idea of a bugge (beetle) at a gigantic scale.  That would have been a fearsome sight and the idea remains fruitful to this day for artists and film-makers needing something frightening in the horror or SF (science fiction) genre.  The use in this sense is long obsolete although the related forms bugbear and bugaboo survive.  Dating from the 1570s, a bugbear was in folklore a kind of “large goblin”, used to inspire fear in children (both as a literary device & for purposes of parental control) and for adults it soon came to mean “a source of dread, resentment or irritation”; in modern use it's an “ongoing problem”, a recurring obstacle or adversity or one’s pet peeve.  The obsolete form bugg dates from circa 1620 and was a reference to the troublesome bedbug, the construct a conflation of the Middle English bugge (scarecrow, hobgoblin) and the Middle English budde (beetle).
The colloquial sense of “a microbe or germ” dates from 1919, the emergence linked to the misleadingly-named “Spanish flu” pandemic.

Bugs: A ground beetle (left), a first generation der Käfer (the Volkswagen Beetle, 1938-2003) (centre) and a "New Beetle" (1997-2011) (right).  Despite the appearance, the "New Beetle" was of front engine & front-wheel-drive configuration, essentially a re-bodied Volkswagen Golf.  The new car was sold purely as a retro, the price paid for the style being certain packaging inefficiencies.  Few have ever questioned why the original VW Beetle picked up the nickname “bug”.

Like the rest of us, even scientists, entomologists and zoologists probably say “bug” in general conversation, whether about the insects or the viruses and such which cause disease, but in writing academic papers they’ll take care to be more precise.  Because to most of us “bugs” can be any of the small, creepy pests which intrude on our lives (some of which are actually helpful in that quietly and unobtrusively they dispose of the really annoying bugs which bite us), the word is casually and interchangeably applied to bees, ants, millipedes, beetles, spiders and anything else resembling an insect.  That use may be reinforced by the idea of the things “bugging” us by their very presence.  To the professionals however, insects are those organisms in the classification Insecta, a very large class of animals, the members of which have a three-part body, six legs and (usually) two pairs of wings whereas a bug is a member of the order Hemiptera (which in the taxonomic system is within the Insecta class) and includes cicadas, aphids and stink bugs; to emphasize the point, scientists often speak of those in the order Hemiptera as “true bugs”.  The true bugs are those insects with mouthparts adapted for piercing and sucking, contained usually in a beak-shaped structure, a vision agonizingly familiar to anyone who has suffered the company of bedbugs.  That’s why lice are bugs and cockroaches are not but the latter will continue to be called bugs, often with some preceding expletive.

9 September 1947: The engineer's note (with physical evidence) of electronic computing's "first bug".

In computing, where the term “bug” came to be used to describe “glitches, crashes” and such, it has evolved to apply almost exclusively to software issues and even if events are caused by hardware flaws, unless it’s something obvious (small explosions, flame & smoke etc) most users probably assume a fault in some software layer.  The very first documented bug however was an interaction recorded on 9 September 1947 between the natural world and hardware, an engineer’s examination of an early (large) computer revealing an insect had sacrificially landed on one of the circuits, shorting it out and shutting down the machine.  As proof, the unfortunate moth was taped to the report.  On a larger scale (zoologically rather than in hardware), the problem of small rodents such as mice entering the internals of printers, there to die from various causes (impact injuries, starvation, heat etc) remains not uncommon, resulting sometimes in mechanical damage, sometimes just the implications of decaying flesh.

Revelle's Bug Bomb, 1970.

The idea of a bug as a “defect, flaw, fault or glitch” in a mechanical or electrical device was first recorded in the late 1800s as engineers’ slang, the assumption being they wished to convey the idea of “a small fault” (and thus easily fixed, as opposed to some fundamental mistake which would necessitate a re-design).  Some sources suggest the origin lies with Thomas Edison (1847-1931) who is reported as describing the consequences of an insect “getting into the works”.  Programmers apply an array of adjectives to “bug” (major, minor, serious, critical & non-critical etc) although between themselves (and certainly when disparaging of the code of others) the most commonly heard phrase is probably “stupid bug”.  The “debugging” (also as de-bugging) process is something with a wide definition but in general it refers to any action or set of actions taken to remove errors.  The name of the debug.exe (originally debug.com) program included with a number of (almost all 16 & 32-bit) operating systems was a little misleading because in addition to fixing things, it could be used for other purposes and is fondly remembered by those who wrote Q&D (quick & dirty) work-arounds which, written in assembler, ran very fast.  The verb debug was first used in 1945 in the sense of “remove the faults from a machine” and by 1964 it appeared in field service manuals documenting the steps to be taken to “remove a concealed microphone”.  Although the origin of the use of “bug” in computing (probably now the most commonly used context) can be traced to 1947, the term wasn’t widely used beyond universities, industry and government sites before the 1960s when the public first began to interact at scale with the implications (including the bugs) of those institutions using computerized processes.  Software (or any machinery) badly afflicted by bugs can be called “buggy”, a re-purposing of an adjective dating from 1714 meaning “a place infested with bugs”.

Some bugs gained notoriety.  In the late 1990s, it wasn’t uncommon for the press to refer to the potential problems of computer code using a two-numeral syntax for years as the “Y2K bug”, an indication of how wide the common understanding of “bug” had become, and a reasonable use because that was how the consequences would be understood.  A massive testing & rectification effort was undertaken by the industry (and corporations, induced by legislation and the fear of litigation) and with the coming of 1 January 2000 almost nothing strange happened; that may also have been the case had nothing been done but, on the basis of the precautionary principle, it was the right approach.  Of course switching protocols to use four-numeral years did nothing about the Y10K bug but a (possible) problem 8000 years hence would have been of little interest to politicians or corporate boards.  Actually, Ynnn~K bugs will re-occur (theoretically with decreasing frequency) whenever a digit needs to be added.  The obvious solution is padding with leading zeros although if one thinks in terms of infinity, it may be that, in the narrow technical sense, such a solution would just create an additional problem although perhaps one of no practical significance.  Because of the way programmers exploit the internal representation of dates (“time” to a computer), there have since the 1950s been other date-related “bugs” and management of these and the minor problems caused has been handled well.  Within the industry the feeling is things like the “Y2038 problem” will, for most of the planet, be similarly uneventful.
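The arithmetic behind such rollover bugs is simple enough to sketch.  The snippet below (Python, an illustrative sketch rather than any particular system's code) shows why a signed 32-bit seconds-since-1970 counter, the representation at the heart of the “Y2038 problem”, runs out shortly after 03:14:07 UTC on 19 January 2038 and, once wrapped, reads as a date in 1901; the same class of error as a two-digit year making 2000 indistinguishable from 1900.

```python
from datetime import datetime, timedelta, timezone

# Unix time counts seconds from the epoch, 1 January 1970 (UTC).
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit counter can hold

# The last moment a 32-bit time_t can represent:
print(EPOCH + timedelta(seconds=INT32_MAX))   # 2038-01-19 03:14:07+00:00

# One second later the counter overflows and wraps to -2**31, which a
# naive conversion reads as a date deep in the past:
wrapped = (INT32_MAX + 1) - 2**32             # simulate the wrap-around
print(EPOCH + timedelta(seconds=wrapped))     # 1901-12-13 20:45:52+00:00
```

Systems which have moved to a 64-bit counter push the equivalent overflow billions of years into the future, which is why the industry expects 2038 to pass as quietly as 2000 did.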

The DOSShell, introduced with PC-DOS 4.0; this was as graphical as DOS got.  The text-based DOSShell was bug-free and a reasonable advance over what came before but the power users had already adopted XTree as their preferred file handler.

Bugs can also become quirky industry footnotes.  As late as 1987, IBM had intended to release the update of PC-DOS 3.3 as version 3.4, reflecting the corporation’s roadmap of DOS as something of an evolutionary dead-end, doomed ultimately to end up in washing machine controllers and such while the consumer and corporate market would shift to OS/2, the new operating system which offered pre-emptive multi-tasking and access to bigger storage and memory addressing.  However, at that point, both DOS & OS/2 were being co-developed by IBM & Microsoft and agreement was reached to release a version 4 of DOS.  DOS 4 also included a way of accessing larger storage space (through a work-around with a program called share.exe) and more memory (in a way less elegant than the OS/2 approach but it did work, albeit more slowly), both things of great interest to Microsoft because they would increase the appeal of its upcoming Windows 3.0, a graphical shell which ran on top of DOS; unlike OS/2, Windows was exclusive to Microsoft and so was the revenue stream.  Unfortunately, it transpired the PC-DOS 4.0 memory tricks were “buggy” when used with some non-IBM hardware and the OS gained a bad reputation from which it would never recover.  By the time the code was fixed, Microsoft was ready to release its own version as MS-DOS 4.0 but, noting all the bad publicity, after some cosmetic revisions, the mainstream release was MS-DOS 4.01.  Apart from the earlier, bug-afflicted bits, there seems no substantive difference between MS-DOS 4.01 and the few extant copies of MS-DOS 4.0.

Herbie, the love bug

Lindsay Lohan (left) among the bugs (centre) on the red carpet for the Los Angeles premiere of Herbie: Fully Loaded (a 2005 remake of The Love Bug (1968)), El Capitan Theater, Hollywood, Los Angeles, 19 June 2005.  The Beetle (right) was one of the many replica “Herbies” in attendance and, on the day, Ms Lohan (using the celebrity-endorsed black Sharpie) autographed the glove-box lid, removed for the purpose.

In idiomatic and other uses, bug has a long history.  By the early twentieth century “bugs” meant “mad; crazy” and by then “bug juice” had been in use for some thirty years, meaning both “the propensity of alcoholic drink to induce bad behaviour” and “bad whiskey” (in the sense of a product being of such dubious quality it was effectively a poison).  A slang dictionary from 1811 listed “bug-hunter” as “an upholsterer”, an allusion to the fondness bugs and other small creatures show for sheltering in the dark, concealed parts of furniture.  As early as the 1560s, a “bug-word” was a word or phrase which “irritated or vexed”.  The idea of “bug-eyed” was in use by the early 1870s and that’s thought either to be a humorous mispronunciation of bulge or (as is thought more likely) an allusion to the prominent, protruding eyes of creatures like frogs, the idea being they sat on the body like “a pair of bugs”.  The look became so common in the movies featuring aliens from space that by the early 1950s the acronym BEM (bug-eyed monster) had become part of industry slang.  The correct term for the medical condition of “bulging eyes” is exophthalmos.

Lindsay Lohan in promotional poster for Herbie: Fully Loaded (2005).

To “bug someone” in the sense of “to annoy or irritate” seems not to have been recorded until 1949 and while some suggest the origin of that was in swing music slang, it remains obscure.  The now rare use of “bug off” to mean “to scram, to skedaddle” is documented since 1956 and is of uncertain origin but may be linked to the Korean War (1950-1953) era US Army slang meaning “stage a precipitous retreat”, first used during a military reversal.  The ultimate source was likely the UK, Australian & New Zealand slang “bugger off” (please leave).  The “doodle-bug” was first described in 1865 and was Southern US dialect for a type of beetle.  In 1944, the popular slang for the German Vergeltungswaffen eins (the V-1 (reprisal weapon 1), the first cruise missile) was “flying bomb” or “buzz bomb” but the Royal Air Force (RAF) pilots preferred “doodle-bug”.

The ultimate door-stop for aircraft hangars: Bond Bug 700.

The popularity of three wheeler cars in the UK during the post-war years was a product of cost advantages.  They were taxed at a much lower rate than conventional four-wheel vehicles, were small and thus economical and could be operated by anyone with only a motorcycle licence.  Most were genuine (if not generous) four-seaters and thus an attractive alternative for families and, being purely utilitarian, there were few attempts to introduce elements of style.  The Bond Bug (1970-1974) was an exception in that it was designed to appeal to the youth market with a sporty-looking two-seater using the then popular “wedge-styling” and in its most powerful form it could touch 80 mph (130 km/h), faster than any other three wheeler available.  The Bug was designed by Vienna-born British designer Tom Karen (1926–2022) who intended it as a “Ferrari for 16-year-olds” which may hint he knew more about cars than young males but in the 1970s such comparisons often were made, a tester in one magazine describing the diminutive Fiat 127 (1971-1983) as “the 0.9 litre Ferrari”, which was journalistic licence writ large but people knew what he meant.

An infestation of Bugs.

However, the UK in 1973 introduced VAT (value-added tax, a consumption tax) and this removed many of the financial advantages three-wheelers offered (it also doomed much of the “kit-car” business in which customers could buy the parts and assemble them with their own labor).  In an era of rising prosperity, the appeal of the compromise waned and, coupled with some problems in the early production runs, in 1974, after some 2¼ thousand Bugs had been built, the zany little machine was dropped; not even the oil crisis of the time (which had doomed a good number of bigger, thirstier cars) and the nasty recession which followed could save it.  Even in its best years it was never all that successful, essentially because it was really a novelty and there were “real” cars available for less money.  Still, the survivors have a following in their niche at the lower end of the collector market and it's a machine truly like no other.

The business of spying is said to be the “second oldest profession” and even if not literally true, few doubt the synergistic callings of espionage and war are among man’s earliest and most enduring endeavors.  Although the use of “bug” to mean “equip with a concealed microphone” seems not to have been in use until 1946, bugging devices probably go back thousands of years (in a low-tech sort of way) and those known to have been used in Tudor-era England (1485-1603) are representative of the way available stuff was adapted, the most popular being tubular structures which, if pressed against a thin wall (or preferably a door’s keyhole) enabled one to listen to what was being discussed in a closed room.  Bugging began to assume its modern form when messages began to be transmitted over copper wires which could stretch for thousands of miles and the early term for a “phone bug” was “phone tap”, based upon the idea of “tapping into” the line as one might a water pipe.  Bugs (the name picked-up because many of the early devices were small, black and “bug-like”), whether as concealed microphones or phone taps, swiftly became part of the espionage inventory in diplomacy, commerce and crime and as technology evolved, so did the bugging techniques.

Henry Cabot Lodge Jr (1902–1985), US Ambassador to the UN (United Nations) at a May 1960 session of the Security Council, using the Great Seal bug to illustrate the extent of Soviet bugging.  The context was a tu quoque squabble between the Cold War protagonists, following Soviet revelations about the flight-paths of the Americans' U-2 spy planes.  Lodge would be Richard Nixon’s (1913-1994; US president 1969-1974) running mate in that year's presidential election.

A classic bug of the High Cold War was the Great Seal bug (known to the security services as “the Thing”), a Soviet designed and built concealed listening device which was so effective because it used passive transmission protocols for its audio signal, thereby rendering it invisible to conventional “bug-detection” techniques.  The bug was concealed inside a large, carved wooden rendition of the US Great Seal which, in 1945, the Kremlin presented as a “gift of friendship” to the US Ambassador to the USSR Averell Harriman (1891-1986); in a nice touch, it was a group of Russian school children who handed over the carving.  Sitting in the ambassador’s Moscow office for some seven years, it was a masterpiece of its time because (1) being activated only when exposed to a low-energy radio signal which Soviet spies would transmit from outside, when subjected to a US “bug detection” sweep it would appear to be a piece of wood and (2) as it needed no form of battery or other power supply (and indeed, no maintenance at all), its lifespan was indefinite.  Had it not by chance been discovered by a communications officer at the nearby British embassy who happened to be tuned to the same frequency while the Soviets were sending their signal, it may well have remained in place for decades.  Essentially, the principles of the Great Seal bug were those used in modern radio-frequency identification (RFID) systems.

Wednesday, May 1, 2024

Privity

Privity (pronounced priv-i-tee)

(1) Private or secret knowledge.

(2) Participation in the knowledge of something private or secret, especially as implying concurrence or consent.

(3) Privacy or secrecy (obsolete).

(4) In medieval theology, a divine mystery; something known only to God, or revealed only in the Holy Scriptures (obsolete).

(5) The genitals (archaic, and only in the plural).

(6) In law, a relationship between parties seen as being a result of their mutual interest or participation in a given transaction, usually in contract.

(7) The fact of being privy to something; knowledge, compliance (now rare).

1175–1225: From the Anglo-Norman priveté & privitee and the Middle English privete & private, from the Old French priveté, privité & priveté (privacy; a secret, private matter), the construct being privé (from the Late Latin privus (set apart, belonging to oneself)) + -té (from the Middle French -té, from the Old French -té, from the Latin -itātem or -tātem, accusative singular of -tās, ultimately from the primitive Indo-European -tehts; the suffix was used to form nouns, often denoting a quality or a property).  The ultimate source was the Classical Latin privātus (perfect passive participle of prīvō (I bereave, deprive; I free, release)).  Privity is a noun; the noun plural is privities.

Between the twelfth & sixteenth centuries a privity was “a divine mystery; something known only to God, or revealed only in the Holy Scriptures” and by the late 1200s this meaning had leaked into a general sense of “privacy; secrecy”, used between the fourteenth & seventeenth centuries to refer to “a private matter, a secret”.  The use to describe the genitals (presumably influenced in some way by “private parts” or “the private”) as “the privities” is attested from the late fourteenth century and didn’t wholly fade from use until the early nineteenth although use had by then long declined to a northern English, Irish & Scottish regionalism.  The word was used from the 1520s as a technical term in the laws regulating feudal land tenure and other fields of law picked it up in the general sense of “a relationship between parties seen as being a result of their mutual interest or participation in a given transaction”; it was in contract law this would assume its important meaning as “privity of contract” (describing the special status of the parties to a contract (as legally defined)), something which would for centuries be of critical importance and which remains in use today.  Less precise was the sixteenth century sense of “the fact of being privy to something; knowledge, compliance” and while there are better ways of saying it, such use is not yet extinct.

Privity of contract, Donoghue v Stevenson and the snail.

The classic case (drummed for almost a century into law students) in the demolition of the sense of the absolute in privity of contract was Donoghue v Stevenson ([1932] A.C. 562, [1932] UKHL 100, 1932 S.C. (H.L.) 31, 1932 S.L.T. 317, [1932] W.N. 139), finally decided before the House of Lords.  It was the case which more than any other established the foundation of the doctrine of product liability, refined the concept of negligence (transforming tort law) and remains a core part of the framework for the principles of “duty of care” which substantially it expanded.

The extraordinary case began with events which transpired in the modest settings of the Wellmeadow Café in Paisley, Scotland, Mrs Donoghue’s friend on 26 August 1928 buying her a ginger-beer, served in a bottle made from a dark, opaque glass.  After she’d consumed about half, the remainder was poured into a tumbler at which point the partially decomposed remains of a snail floated out, inducing an alleged shock and severe gastro-enteritis.  Because Mrs Donoghue was not a party to the contractual purchase of the ginger beer, she was unable to claim through breach of warranty of a contract: she was not party to any contract because, at law, she received the drink as a gift.  Accordingly, she issued proceedings against Stevenson (the manufacturer) and, after some four years in the lower courts, the matter ended up before the House of Lords, then the UK’s highest appellate court.

All were aware it was an important case.  The lower courts, bound by precedent, had been compelled to find the absence of privity of contract doomed the suit but the issue of product liability in the modern era of consumers interacting usually not directly with the producer of goods but their agents or retailers had for some time been discussed as an area of law in which reform was required.  What the Law Lords had to decide was whether the manufacturer owed Mrs Donoghue a duty of care in the absence of contractual relations contrary to established case law.  The important point was not if she was owed compensation for damages suffered but if a cause of action existed.

Previously, as a general principle, manufacturers owed no duty of care to consumers except if (1) the product was inherently dangerous and no warning of this state was provided or (2) the manufacturer was aware that the product was dangerous because of a defect and this had been concealed from the consumer.  The Lords found for Mrs Donoghue although in a cautious judgement which could be read as offering little scope for others except the specific matter of ginger beer in opaque bottles containing the decomposed remains of a dead snail when sold to a Scottish widow.  However, the mood for reform was in the legal air and the judgment established (1) negligence is distinct and separate in tort, (2) there need not be privity of contract for a duty of care to be established and (3) manufacturers owe a duty to the consumers whom they intend to use their products.

In the leading judgment, Lord Atkin (James Richard Atkin, 1867–1944; lord of appeal in ordinary 1928-1944) wrote, inter alia, what was at that time the widest definition of the “neighbour principle”: “The rule that you are to love your neighbour becomes in law, you must not injure your neighbour; and the lawyer’s question, Who is my neighbour? receives a restricted reply.  You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour.  Who, then, in law is my neighbour?  The answer seems to be – persons who are so closely and directly affected by my act that I ought reasonably to have them in contemplation as being so affected when I am directing my mind to the acts or omissions which are called in question.”  On this basis, if no other, the Lords held Mrs Donoghue’s action had succeeded and she had a cause of action in law, the culmination of a growing appreciation by the courts that the law needed to evolve to reflect the patterns of modern commerce.  Some years before Donoghue v Stevenson had been decided, another judge had observed “it would appear to be reasonable and equitable to hold that, in the circumstances and apart altogether from contract, there exists a relationship of duty as between the maker and the consumer”.

Once, if someone bought two bottles of ginger beer and gave one to a friend, were both to be injured by decomposing snails within, only the consumer who handed over the cash could have recovered damages because they alone enjoyed a privity of contract.  Since Donoghue v Stevenson, both can in court seek remedy in tort on the basis of “product liability”, a manufacturer’s duty of care held to extend to all consumers of their products.

Being the common law, what was effectively a new doctrine (and one, as the term “neighbour principle” suggests, rooted in Christian morality) was also a general principle and thus a foundation on which the building blocks of subsequent judgments would sit; it could not be treated, in the words of Lord Reid (James Scott Cumberland Reid, 1890–1975, lord of appeal in ordinary 1948-1975): “as if it were a statutory definition.  It will require qualification in new circumstances.”  The courts in the years after 1932 had ample opportunity to refine things and this included the development of the modern tests in tort for the “foreseeability of damage” and “proximity” to which was later appended the surprisingly recent “fairness”, something which came to be regarded as within the rubric of public policy, all able to work in conjunction and, as one judge noted, the distinctions between them were “somewhat porous but they are probably none the worse for that”.  From Donoghue v Stevenson has evolved the modern notion of product liability and it would now to many seem strange there was in living memory a time when a manufacturer could escape liability for selling defective goods simply on the basis the injured party wasn’t the purchaser.  One curious quirk of Donoghue v Stevenson remains that the facts were never tested, so it will never be known if the most important character in the case (the decomposing snail) ever existed.

Tuesday, April 30, 2024

Cybernetic

Cybernetic (pronounced sahy-ber-net-ik)

(1) Of or relating to cybernetics (the theoretical study of communication and control processes in biological, mechanical, and electronic systems, especially the comparison of these processes in biological and artificial systems).

(2) Of or relating to computers and the Internet (largely archaic (ie “so 1990s”)).

1948 (in English): From the Ancient Greek κυβερνητικός (kubernētikós) (good at steering, a good pilot (of a vessel)), from κυβερνητική τέχνη (kubernētikḗ tékhnē) (the pilot’s art), from κυβερνισμός (kubernismós) or κυβέρνησις (kubérnēsis) (steering, pilotage, guiding), from κυβερνάω (kubernáō) (to steer, to drive, to guide, to act as a pilot (and the ultimate source of the Modern English "govern")).  Cybernetic & cybernetical are adjectives, cybernetics, cyberneticist & cybernetician are nouns and cybernetically is an adverb; the noun cybernetics is sometimes used as a plural but functions usually as a singular (used with a singular verb).

Although it's undocumented, etymologists suspect the first known instance of use in English in 1948 may have been based on the 1830s French cybernétique (the art of governing); that was in a paper by US mathematician and philosopher Norbert Wiener (1894-1964) who was influenced by the cognate term "governor" (the name of an early control device proposed by Scottish physicist James Clerk Maxwell (1831–1879)), familiar in mechanical devices as a means of limiting (ie "governing") a machine's speed (either to a preferred rate or a determined maximum).  That was obviously somewhat different from the source in the original Greek kubernētēs (steersman) from kubernan (to steer, control) but the idea in both was linked by the notion of "control".  The French word cybernétique had been suggested by French physicist and mathematician André-Marie Ampère (1775-1836) (one of the founders of the science of electromagnetism and after whom the SI (International System of Units) unit of measurement of electric current, the ampere (amp), is named) to describe the then non-existent study of the control of governments; it never caught on.  From cybernetics came the now ubiquitous back-formation cyber which has been used, and continues to be used, to coin words, sometimes with some intellectual connection to the original, sometimes not: cybercafé, cybercurrency, cybergirlfriend, cybermania, cybertopia, cyberculture, cyberhack, cybermob, cybernate, cybernation, cyberpet, cyberphobia, cyberpunk, cybersecurity, cybersex, cyberspace, cyberfashion, cybergoth, cyberemo, cyberdelic etc.

Feedback

MIT Professor Norbert Wiener was an American mathematician and philosopher and one of the early thinkers developing the theory that the behaviour of all intelligent species was the result of feedback mechanisms that perhaps could be simulated by machines.  Now best remembered for the word cybernetics, his work remains among the foundations of artificial intelligence (AI).

The feedback loop at its most simple.

Cybernetics was an outgrowth of control theory, at the time something of a backwater in applied mathematics relevant to the control of physical processes and systems.  Although control theory had connections with classical studies in mathematics such as the calculus of variations and differential equations, it became a recognised field only in the late 1950s when the newly available power of big machine computers and databases was applied to problems in economics and engineering.  The results indicated the matters being studied manifested as variants of problems in differential equations and in the calculus of variations.  As the computer models improved, it was recognised the theoretical and industrial problems all had the same mathematical structure and control theory emerged.  The technological determinism induced by computing wasn’t new; the embryonic field had been greatly advanced by the machines of the eighteenth and nineteenth centuries.

Cybernetics can be represented as a simple model which is of most use when applied to complex systems.  Essentially, it’s a model in which a monitor compares what is happening with what should be happening, this feedback passed to a controller which accordingly adjusts the system’s behavior.  Wiener defined cybernetics as “the science of control and communications in the animal and machine”, something quite audacious at the time, aligning as it did the working of machines with animal and human physiology, particularly the intricacies of the nervous system and the implication the controller was the human brain and the monitor, vision from the eyes.  While the inherently mechanistic nature of the theory attracted critics, the utility was demonstrated by some success in the work of constructing artificial limbs that could be connected to signals from the brain.  The early theories underpinned much of the early work in artificial intelligence (AI).
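The monitor-controller loop described above can be illustrated with a short sketch: a proportional feedback loop in which the monitor measures the error between what should be happening (a setpoint) and what is happening (the state), and the controller applies a correction scaled by a gain.  The names and values here (setpoint, gain, the temperature example) are illustrative assumptions, not drawn from Wiener's own work.

```python
# Minimal sketch of a negative feedback loop: the monitor compares the
# actual state with the desired setpoint and the controller adjusts the
# system in proportion to the error.

def feedback_loop(setpoint, state, gain=0.5, steps=50):
    """Drive `state` toward `setpoint` by repeated proportional correction."""
    history = [state]
    for _ in range(steps):
        error = setpoint - state   # monitor: what should be vs what is
        state += gain * error      # controller: adjust the system's behaviour
        history.append(state)
    return history

# Example: a heater nudging room temperature from 15 degrees toward 20.
trace = feedback_loop(setpoint=20.0, state=15.0)
print(round(trace[-1], 3))  # settles on the setpoint
```

With a gain between 0 and 2 the error shrinks geometrically each iteration; a gain above 2 makes the loop diverge, the classic instability of over-correction.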

Of cyberpunks and cybergoths

A cyberpunk Lindsay Lohan sipping martinis with Johnny Depp and a silver alien by AiJunkie.

The youth subcultures “cyberpunk” and “cybergoth” had common threads in the visual imagery of science fiction (SF) but differed in matters of fashion and political linkages.  Academic studies have suggested elements of cyberpunk can be traced to the dystopian Central & Eastern European fiction of the 1920s which arose in reaction to the industrial and mechanized nature of World War I (1914-1918) but in its recognizably modern form it emerged as a literary genre in the 1980s, characterized by darkness, the effect heightened by the use of stark colors in futuristic, dystopian settings, the cultural theme being the mix of low-life with high-tech.  Although often there was much representation of violence and flashy weaponry, the consistent motifs were advanced technology, artificial intelligence and hacking, the message the evil of corporations and corrupt politicians exploiting technology to control society for their own purposes of profit and power.  Aesthetically, cyberpunk emphasized dark, gritty, urban environments where the dominant visual elements tended to be beyond the human scale, neon colors, strobe lighting and skyscrapers all tending to overwhelm people who often existed in an atmosphere of atonal, repetitive sound.

Cybergoth girls: The lasting legacy of the cybergoth's contribution to the goth aesthetic was designer colors, quite a change to the black & purple uniform.  Debate continues about whether they can be blamed for fluffy leg-warmers.

The cybergoth thing, dating apparently from 1988, was less political, focusing mostly on the look although a lifestyle (real and imagined) somewhat removed from mainstream society was implied.  It emerged in the late 1990s as a subculture within the goth scene, and was much influenced by the fashions popularized by cyberpunk and the video content associated with industrial music although unlike cyberpunk, there was never the overt connection with cybernetic themes.  Very much in a symbiotic relationship with Japanese youth culture, the cybergoth aesthetic built on the black & purple base of the classic goths with bright neon colors, industrial materials and a mix of the futuristic and the industrial in an array of accessories which included props such as LED lights, goggles, gas masks and synthetic hair extensions.  Unlike the cyberpunks who insisted usually on leather, the cybergoths embraced latex and plastics such as PVC (polyvinyl chloride), used not to imitate the natural product but as materials in their own right, while the hairstyles and makeup could be extravagantly elaborate.  Platform boots and clothing often adorned with spikes, studs and chains were common but tattoos, piercings and other body modifications were not an integral component although many who adopted those things also opted to include cybergoth elements.

Although there was much visual overlap between the two, cyberpunk should be thought of as a dystopian literary and cinematic genre with an emphasis on high-tech while cybergoth was a goth subculture tied to certain variations in look and consumption of pop culture, notably the idea of the “industrial dance” which was an out-growth of the “gravers” (gothic ravers) movement, named as goths became a critical mass in the clubs built on industrial music.  While interest in cyberpunk remains strong, strengthened by the adaptability of generative AI to the creation of work in the area, the historic moment of cybergoth as a force in pop culture has passed, the fate of many subcultures which have suffered the curse of popularity although history does suggest periodic revivals will happen and elements of the look will anyway endure.

Monday, April 29, 2024

Palliate

Palliate (pronounced pal-ee-yet)

(1) To relieve or lessen (pain, disease etc) without curing or removing; to mitigate; to alleviate.

(2) To attempt to mitigate or conceal the gravity of (conduct (especially as of offenses)) by excuses, reasons, apologies etc; to extenuate.

(3) To cause an offence to seem less serious; an act of concealment.

1490s: From the Late Latin palliātus (cloaked, covered), past participle of palliāre (to cover up; to cover with a cloak), from the Latin pallium (cloak).  Palliate is a verb & adjective, palliation, palliator & pallium are nouns, palliative is a noun & adjective, unpalliated is an adjective, palliated & palliating are verbs and palliatively is an adverb; the common noun plural is palliatives.

Palliate is one of those words in English which has become mostly overwhelmed by the associative meaning of a derived form. Palliative medicine (or palliative care) is a branch of medicine which focuses on those terminally ill (usually with months, at the most, to live) by providing pain relief and attempting to allow the dying to enjoy the best possible quality of life.  The alternative industry is that of voluntary euthanasia (the so-called right-to-die movement) which is now permitted and regulated by legislation in many jurisdictions.  Palliative medicine gained the name from the idea of the use of “palliatives”, drugs which provide pain relief for those for whom there is no possibility of a cure.  In that sense, the treatment regime “cloaks rather than cures” and expectations are limited to concealment of the consequences of the condition.  Although such practices (along with euthanasia, voluntary and not) had been part of medical practice for centuries, it was in the 1960s it came to be recognized as a discipline and a structural part of (or adjunct to, depending on the jurisdiction) the hospital industry, and there are both academic courses in the subject and peer-reviewed journals such as the European Association for Palliative Care’s (EAPC) Palliative Medicine, published since 1987.  Although On Death and Dying (1969) by Swiss-American psychiatrist Elisabeth Kübler-Ross (1926–2004) is sometimes cited as the intellectual impetus for its emergence, it happened really because of the mid-century advances in hygiene, nutrition, pharmaceuticals & surgical techniques and the extension of medical services in the welfare states which extended life-spans but not necessarily wellness, thus the increasing population of those terminally ill and in need of care.
The ability to prolong life (sometimes for decades) of someone in a debilitated condition, combined with the social changes which had seen the decline in numbers of extended family living arrangements, meant a substantially public-funded industry needed to evolve.

Cloaked for the occasion: Lindsay Lohan in appropriate Grim Reaper mode, fulfilling a court-mandated community service order at LA County Morgue, October 2011.

That has meant the word has faded from some of its historic uses.  In law, it used to be part of the language of courtrooms, defense counsel attempting to palliate the conduct of their client in the hope the judge or jury would view the alleged act less harshly and deliver a verdict less severe.  That sense came into use in seventeenth century England and in courtrooms it described attempts to cover or disguise the seriousness of an offence by reasons (fanciful & not), excuses (plausible & not) or apologies (sincere & not).  In legal use, palliate has been replaced by mitigation (a plea assembling reasons why conduct should be regarded more favourably than it may appear and be thus rewarded with a lesser sentence), from the Middle French mitigation, from the Latin mītigātiō, from mītigātus (softened, pacified).  The companion term is exculpation which etymologically and legally is unrelated both to palliate & mitigate.  Exculpate was from the Medieval Latin exculpātus, the perfect passive participle of exculpō, from the Latin ex culpa, the construct being ex- (out, from) + culpa (fault; blame (and familiar in Modern English as “culpability”)).  Whereas a plea of palliation or in mitigation was entered in the context of asking certain matters be considered so a guilty party might receive a lesser punishment, a successful exculpation exonerates the accused.  The lawyers in the 1630s picked-up and adapted palliate’s earlier meaning.  In the fifteenth century, true to the Latin origin derived from “a cloak”, it was used to mean “to relieve the symptoms of; to ameliorate”, the sense (concealing the symptoms) to which palliative medicine would in the 1960s return.  This use was extended by the mid-1500s to become a general way to “conceal, hide or disguise” and was used widely in fields such as tailoring, architecture, landscaping, interior decorating and anywhere else where techniques of illusion were valued.

Many of the artistic depictions of scenes from Antiquity are probably at least misleading (no epoch has ever been so idealized) but one aspect of the fashions seems usually faithfully to have reflected what really was: the garb of the physicians, philosophers and teachers, which was a woollen cloak, draped over the left shoulder and wrapped around the body; the Romans called it a pallium and it was the stage garment also of the hetaerae (plural of hetaera (in Ancient Greece, a high-priced escort of some beauty & culture who entertained upper-class men with company, conversation and other services; they're sometimes referred to as courtesans but this can be misleading and a more accurate modern comparison is probably with the business model of the “sugar-babe”)).

Appreciative audience: Phryne revealed before the Areopagus (1861), oil on canvas by Jean-Léon Gérôme (1824-1904), Hamburger Kunsthalle, Hamburg.

The painting depicts Phryne (circa 371-320 BC), a legendarily beautiful hetaera of Ancient Greece, on trial before the Areopagus (from the Ancient Greek Ἄρειος Πάγος (Áreios Págos (literally “Rock of Ares”))) which during some periods in classical times functioned as the final appellate court (both civil & criminal matters) in Athens.  As a deliberative body, the Areopagus (it picked up the name from the location where the sittings were conducted) may also at times have been a legislative (or at least an advisory) assembly, something like a senate.  The comparison with the UK's House of Lords in its historic dual role as both the (upper) house of review and a final court of appeal is sometimes made, but the history of the role of the Areopagus in law-making is sketchy and as a judicial organ it seems also to have sat as a whole, never restricting (as the Lords eventually did) the judicial hearings to committees of those with appropriate legal experience.

Defended (and by dubious legend not very well) by the speech-writer Hypereides (circa 390–322 BC), she was arraigned before the Areopagus on a charge of Asebeia (a criminal indictment alleging impiety, something like blasphemy towards the divine objects and perhaps an occupational risk in her profession; the charge appears to have been brought by a jilted and vengeful ex) and the most told tale of the trial is that acquittal was secured when she bared her breasts to those assembled to judge.  Depending on which imaginative medieval scribe was writing, either her counsel pulled the pallium from her body or she disrobed herself although all agree the unusual legal tactic was resorted to because the defence was not going well.  In the famous legal critique of the Roman writer Marcus Fabius Quintilianus (circa 35-circa 100), the verdict was secured “non Hyperidis actione... sed conspectus corporis” (not by Hypereides' pleading, but by the sight of her body) and as a gesture it wasn’t unknown in Athenian culture.  Although the trial and acquittal (by a majority vote) are uncontested history, whether the “boobs offered in mitigation” ever happened is at least suspect but if true, it’s not surprising the venerable gentlemen judging her were impressed because she also modelled her nude form for the sculptor Praxiteles who based his Aphrodite of Knidos on those sessions.  In the late eighteenth century, something of a Phryne cult formed among European artists although what is history and what myth in the stories of her illustrious career is mostly uncertain; there’s no doubt she’d often have worn a pallium.

Containing bilberry, witch hazel, mangosteen, sage, rosemary, calendula, rose flower, sea buckthorn, lemon grass, grapefruit, nettle & Iceland moss, Life Roots' Palliate Cream is advertised as an agent to (1) moisturize, (2) reduce inflammation & (3) protect against dryness.  This would suggest the product is thought something which genuinely improves the state of the skin, rather than just “papering over the cracks” (as some skin-care products unashamedly are).  The phrase “to paper over the cracks” is a particular sense of palliation meaning “to use a temporary expedient; to create the semblance of order or agreement; temporarily to conceal problems”.  The phrase (in English translation) is attributed to the Prussian statesman Otto von Bismarck (1815-1898; Chancellor of the German Empire 1871-1890) who used the equivalent German expression in a letter dated 14 August 1865 during the negotiations of the Convention of Gastein (1865), a treaty between Austria and Prussia which would temporarily postpone the onset of the Austro-Prussian War (1866) and can thus be thought a prelude to the wars and the subsequent system of intricately interlocked treaties which would be the framework of the Bismarckian form of Reichism: “We are working eagerly to preserve the peace and to cover the cracks in the building.”  Under Bismarck, the stresses inherent in the structure were contained but in the hands of his less able successors the forces were unleashed and consumed the continent, ending the rule of four dynastic empires.  Still, “papering over the cracks” remains often the way politics is done, usually the way coalitions are formed and of late, a new flavor of the technique has emerged: Benjamin Netanyahu (b 1949; Israeli prime minister 1996-1999, 2009-2021 and since 2022) doesn’t care if people see the cracks through the paper.