
Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp; flexible; not hard, firm, or rigid; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga”, from the Shona chimurenga (revolution), 1964-1979), the use a reference to the way insurgents were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made from a paper stock a little heavier and more rigid than that used for the pages; used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and, although in the 1890s it was recorded as meaning “some degree of failure”, it came to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis also to the centuries-old dynastic rule of the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flop is a noun & verb, flopped & flopping are verbs, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and, as “floppy-wristed” (synonymous with “limp-wristed”), was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the earlier use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication that what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the qualities “Shakespearian” or “Churchillian” are not exactly definable but are certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  “Floppy cat” is the breeders’ informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOPS (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling very large (and very small) real numbers using a significand of fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t make possible anything which would not in theory be achievable using real numbers but it does make such computation faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
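
As a minimal sketch of the underlying idea (using Python’s standard math module; the value chosen is arbitrary and not from the post), any float can be decomposed into a significand and a base-2 exponent:

import math

# Floating point stores a number as significand * base**exponent (base 2 here).
x = 123.456
m, e = math.frexp(x)             # x == m * 2**e, with 0.5 <= abs(m) < 1
print(f"{x} == {m} * 2**{e}")
assert math.isclose(m * 2.0**e, x)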

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work); for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion-FLOPS mark and, as speeds rose, the appropriate terms were created (a rough benchmark sketch appears after the list):

kiloFLOPS (kFLOPS, 10³)
megaFLOPS (MFLOPS, 10⁶)
gigaFLOPS (GFLOPS, 10⁹)
teraFLOPS (TFLOPS, 10¹²)
petaFLOPS (PFLOPS, 10¹⁵)
exaFLOPS (EFLOPS, 10¹⁸)
zettaFLOPS (ZFLOPS, 10²¹)
yottaFLOPS (YFLOPS, 10²⁴)
ronnaFLOPS (RFLOPS, 10²⁷)
quettaFLOPS (QFLOPS, 10³⁰)
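
How such a figure is arrived at can be illustrated with a toy estimate (a sketch assuming Python with the NumPy library; a single dense matrix multiply, not a rigorous benchmark like LINPACK):

import time
import numpy as np

n = 1024                                  # matrix dimension (illustrative)
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                                 # dense matrix multiplication
elapsed = time.perf_counter() - start

flops = 2 * n**3                          # roughly 2n^3 floating point operations
print(f"{flops / elapsed / 1e9:.2f} GFLOPS")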

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), performance under such parallelism cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy over the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely, one of the design parameters for the 3½ inch format was that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” (the earliest examples of which worked on the WORM (write once, read many, ie "read only" after being written) principle) first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of this dates from 1974).  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobytes) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus; it was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabytes) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of the HD (high density) format, and it was the 5¼ inch floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet” (the construct being sneaker + net(work)), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy.  In OS (operating system) GUIs (graphical user interfaces) however, it does endure as the "save" icon and all the evidence to date suggests that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT iconography of analogue, rotary-dial phones and, in smart phones, the sound of a camera's physical shutter.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support at the BIOS (basic input output system) level so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because they wanted more space but as an attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read-Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) the final nail in the coffin for most.
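
The quoted “1.44 MB” was itself a small fudge; worked through from the standard PC geometry of the HD 3½ inch format (a sketch using figures from outside the post, the arithmetic mixing decimal and binary prefixes exactly as the label did):

# Standard PC formatting of the HD 3.5 inch diskette (assumed geometry).
sides, tracks, sectors_per_track, bytes_per_sector = 2, 80, 18, 512
capacity = sides * tracks * sectors_per_track * bytes_per_sector
print(capacity)                  # 1,474,560 bytes
print(capacity / (1000 * 1024))  # 1.44 -- "MB" here mixes kilo (1000) and K (1024)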

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable, tales of an installation process failing on the “final diskette” being legion and while some doubtless were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of the art in twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production of them in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Sunday, November 12, 2023

Efficacy

Efficacy (pronounced ef-i-kuh-see)

(1) A capacity for producing a desired result or effect; effectiveness; an ability to produce a desired effect under ideal testing conditions.

(2) The quality of being successful in producing an intended result; effectiveness; a measure of the degree of ability to produce a desired effect.

1520-1530: From the Old French efficace (quality of being effectual, producing the desired effect), from the Late Latin efficācia (efficacy), from efficāx (powerful, effectual, efficient), genitive efficacis (powerful, effective), from the stem of efficere (work out, accomplish).  In eleventh century English, in much the same sense, was efficace from the Old French eficace, from the same Latin root efficācia; there was also the early fifteenth century efficacite from the Latin efficacitatem.  The sixteenth century adjective efficacious (certain to have the desired effect) was often used of medicines (presumably a favorite of apothecaries), the construct being the Latin efficaci-, stem of efficax from the stem of efficere (work out, accomplish) + -ous.  The -ous suffix was from the Middle English -ous, from the Old French -ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃).  The noun inefficacy (want of force or virtue to produce the desired effect) dates from the 1610s, from the Late Latin inefficacia, from inefficacem (nominative inefficax), the construct being in- (not, opposite of) + efficax.

The most familiar related form in modern use is efficacious but in general use this is often used in a more nuanced way than the pass/fail dichotomy of "efficacy" familiar in medical trials.  In general use, efficacious is a "spectrum word" which describes degrees of the ameliorative effects of treatments although while the comparative is "more efficacious", a more common form is "quite efficacious"; the superlative "most efficacious" appears to be popular among the small subset of the population who use efficacious at all.  Efficacy, efficacity & efficaciousness are nouns, effectuate is a verb, effectual & efficacious are adjectives and efficaciously is an adverb; the noun plural is efficacies.

Clinical trials in the pharmaceutical industry

In the development of vaccines (and medicinal drugs in general), efficacy trials (sometimes called phase III or explanatory trials) determine the percentage reduction of disease in a vaccinated group of people compared to an unvaccinated group, under the most favorable conditions, which means with the subjects housed in a hospital equipped to handle intensive care patients.  Conducted on human subjects if tests on animals have proved satisfactory, it’s a purely clinical exercise, practiced since 1915, and can be done as a double-blind, randomized study if no safety concerns exist.  One potentially distorting aspect of both efficacy and (particularly) safety trials is a historic bias towards healthy young males as the subjects.  The antonym of the adjective efficacious is inefficacious but the word is rarely used when drug trials produce unsatisfactory results: the punchier "failed" is almost always used.  Under normal circumstances, the testing process can take many years, the industry usually referring to trials as phases:

Phase I: Safety Trial

Phase I trials are done to test a new biomedical intervention for the first time in a small group of people (typically 20-100) to evaluate safety.  Essentially, this determines the safe dosage range and identifies side effects.

Phase II: Efficacy Trial

Phase II trials are done to study an intervention in a larger group of people (several hundred or more depending on the product) to determine efficacy (ie whether it works as intended) and further to evaluate safety.

Phase III: Clinical Study

Phase III studies are done to study the efficacy of an intervention in large groups of trial participants (often thousands) by comparing the intervention to other standard or experimental interventions (or to non-interventional standard care).  Phase III studies are also used to monitor adverse effects and to collect information that will allow the intervention to be used safely.

Phase IV: Efficiency Study

Phase IV studies are done after the drug has been released and is being prescribed.  These studies are designed to monitor the effectiveness of the approved intervention in the general population and to collect information about any adverse effects associated with widespread use over longer periods of time.  They may also be used to investigate the potential use of the intervention in a different condition, or in combination with other therapies.

Proven efficacy: Adderall.

Adderall and Mydayis are trade names for a combination drug called mixed amphetamine salts (a mix of four salts of amphetamine).  In all Western jurisdictions Adderall is a prescription medication; belonging to a class of drugs known as stimulants, it contains two active ingredients: amphetamine and dextroamphetamine.  As a prescribed medicine, primarily Adderall is used in the treatment of attention deficit hyperactivity disorder (ADHD), a neurobehavioral disorder characterized by symptoms such as inattention, hyperactivity, and impulsivity.  Adderall works by increasing the levels of certain neurotransmitters (most critically dopamine and norepinephrine) in the brain, both of which play some role in regulating attention, focus, and impulse control.  Beyond ADHD, Adderall is sometimes prescribed off-label for the treatment of narcolepsy, a sleep disorder characterized by excessive daytime sleepiness and sudden, unpredictable episodes of sleep.

Adderall also has something of a cult following among those who seek to experience some of its more desirable side-effects.  Like many of the earlier amphetamines (most famously Tenuate Dospan (diethylpropion or amfepramone), an appetite suppressant of legendary efficacy), Adderall can assist in weight-loss and can safely be used for this by most people but because of its potential for dependence, it should be taken (for whatever purpose) only under clinical supervision.  For those prescribed Adderall, in some circumstances, it may continue to be taken (at the prescribed level) even if one is in a substance rehabilitation facility, as was the case in 2013 when Lindsay Lohan completed a 48-hour drug detox at the Betty Ford Clinic in Indio, California.  Ms Lohan was prescribed Adderall after being diagnosed with ADHD but the standard protocol used by rehab clinics is that doctors routinely re-evaluate (1) the ADHD diagnosis and (2) the efficacy of the treatment regime.  Depending on their findings, doctors can prescribe alternative drugs or cease drug intervention entirely.  Ms Lohan was quoted as saying she’d been using Adderall "for years", that she cannot function without it and that her choice of rehab facility was one which would permit both smoking (tobacco) and the use of Adderall.

As she earlier explained it: “I have severe ADD. I can’t stand still.  So, I take Adderall for that; it calms me.”  Ms Lohan further noted she was not unaware there were those who took Adderall for its side-effects, notably weight loss or the ability to function effectively for extended durations without needing to sleep, but that she wasn’t someone who needed to regulate her weight and that her sleeping patterns were normal.  However, the cult if anything is growing and in the US shortages of Adderall were reported (not for the first time) in late 2022.  The US Food and Drug Administration (FDA) responded by issuing a statement noting that while there was nothing unusual about episodic shortages of generic drugs at any given time because the profit margins are low and production is sometimes restricted to avoid sacrificing the opportunity cost vis-à-vis higher-margin products, Adderall was “a special case because it is a controlled substance and the amount available for prescription is controlled by the Drug Enforcement Administration (DEA).”  The FDA added that because there had been “a tremendous increase in prescribing” because of virtual medicine (e-consultations) and a general trend towards over-prescribing and over-diagnosing, the periodic shortages were likely to continue.  The FDA’s conclusion was that “if only those who needed these drugs got them, there probably wouldn't be a [stimulant medication] shortage” but the diagnosis of ADHD continues to grow and the desire for rapid weight-loss solutions remains strong.

Friday, July 7, 2023

Cruise

Cruise (pronounced krooz)

(1) To sail about on a pleasure trip (often as cruising).

(2) To sail about, as a warship patrolling a body of water.

(3) To travel about without a particular purpose or destination.

(4) To fly, drive, or sail at a constant speed that permits maximum operating efficiency for sustained travel.

(5) In aeronautics, the portion of aircraft travel at a constant airspeed and altitude between ascent and descent phases.

(6) To travel at a moderately fast, easily controllable speed.

(7) To travel about slowly, looking for customers or for something demanding attention.

(8) As cruise missile, an intermediate-range weapon.

(9) Among male homosexuals, actively to seek a casual sexual partner by moving about a particular area known to be frequented by those there for such purposes, a productive area being known as “cruisy” (“to troll” & “trolling” were once used as synonyms but those terms have now been claimed by their use on the internet).

(10) In informal use in the US military, a period spent in the Marine Corps.

(11) In casual use in sporting competition, easily to win.

1645-1655:  From the Dutch kruisen (to cross, sail to and fro), from kruis or cruis (cross), from the Middle Dutch cruce, from the Latin crux.  Root was the primitive Indo-European sker (to turn, to bend); etymologists suggest it may be cognate with the Latin circus (circle) and curvus (curve).  In English, it began to be used as a noun in 1706 in the sense of “a voyage taken in courses” and by 1906 as “a voyage taken by tourists on a ship".  It was related to the French croiser (to cross, cruise), the Spanish cruzar and the German kreuzen.  The alternative spelling cruize is obsolete.  Cruise & cruising are nouns & verbs, cruised is a verb, cruiser is a noun and cruisy is an adjective; the noun plural is cruises.

Cruiser in the sense of "one who or that which cruises" (agent noun from the verb cruise) is from the 1670s, probably borrowed from similar words in continental languages (such as the Dutch cruiser & French croiseur).  In older use, a cruiser was a warship built to patrol and protect the commerce of the state to which it belonged and to chase hostile ships; cruisers were the classic gunboats used by the European colonial powers for patrolling their empires.  In this use they were often compared to the frigates of old in that they possessed good speed and were employed to protect the trade-routes, to glean intelligence and to act as the “eyes of the fleet”; in casual use during the eighteenth century, the term was often applied to the ships of privateers (pirates).  Cruiser was used to describe homosexuals “cruising for sex partners" (ie frequenting and lingering in places well-known for such things) from 1903 and as a boxing weight class (cruiserweight) from 1920.  The meaning "police patrol car" is a 1929 adoption of American English.

Royal Navy battlecruiser HMS Hood entering Valletta harbor, Malta 1937.

In admiralty use, cruisers are now the largest of the conventional warships still in service.  Navies used to use the term “cruiser” more as a description of the tasks for which the ships were used rather than the specific nature of their construction, the early cruisers being those ships used for long-range missions such as coastal raiding or scouting, and it was only in the late nineteenth century, as the fleets grew and became more specialized, that the classic model of the corvette / frigate / destroyer / cruiser / battleship evolved.  Even then there were distinctions such as light & heavy cruisers but the most interesting development in warship architecture was the battlecruiser, built essentially because the Dreadnought had created “a gap in the market”.  Battlecruisers were battleships with less armor, thereby gaining speed at the cost of greater vulnerability.  The theory was they would have the firepower to out-gun all but the battleships and those they could out-run with their greater speed.  The concept seemed sound and in December 1914, at the Battle of the Falkland Islands, two Royal Navy battlecruisers vindicated the theory when they chased and destroyed the German East Asia Squadron.  However, in 1916, the performance of the battlecruisers in the Jutland engagement forced the Admiralty to re-consider.  Jutland was the closest thing to the great battle of the fleets which had been anticipated for decades but proved anti-climactic, both sides ultimately choosing to avoid the decisive encounter which offered the chance of victory or defeat.  What it did prove was that the naval theorists had been right; the battlecruiser could not fight the battleship and if their paths threatened to cross, the less-armored vessel should retreat and rely on greater speed to make good her escape.  There were technical deficiencies in the British ships, without which perhaps three of their battlecruisers wouldn’t have been lost, but what happened at Jutland made it clear to the admirals that uneven contests between the big capital ships were to be avoided.  The consequence was that the battlecruiser became unfashionable and after the round of disarmament in the 1920s, none were built until, unexpectedly, the Soviet Navy commissioned four in the 1980s.  They proved the last of the breed.

Origin of cruise missiles

US Pershing II missiles at the Neu-Ulm military base, Swabia, Bavaria in the then Federal Republic of Germany (the FRG, the old West Germany), 1984.

Carrying large warheads long distances, cruise missiles are guided weapons used against ground targets; they fly at both subsonic and supersonic speeds, remain in the atmosphere and, self-propelled for most of their flight, travel mostly at a constant speed.  In this they differ from ballistic missiles which fly in an arc, often reaching suborbital flight, with a final trajectory much like a bullet because, once the fuel is expended, the path from that point is determined by the speed and direction of launch and the force of gravity pulling towards Earth.  Both cruise and ballistic missiles can carry nuclear warheads but cruise missiles are most often equipped with conventional warheads.  Theorists and researchers were exploring the possibility of military missiles as early as 1908, described then as the aerial torpedo, envisaged as a remote-controlled weapon with which to shoot down airships bombing London, perceived then as the most credible airborne delivery system.  Between the first and second world wars, the major powers all devoted resources to research but few projects reached even the prototype stage.

Annotated schematic of the V-1 (left) and a British Military Intelligence drawing dated 16 June 1944, three days after the first V-1 attacks on London (right).

First deployed in 1944, the German Vergeltungswaffe eins (“retaliatory weapon 1” or "reprisal weapon 1”, eventually known as the V-1) was the world’s first cruise missile.  One of the rare machines to use a pulse-jet, it emitted such a distinctive sound that those at whom it was aimed nicknamed it the “buzz-bomb” although it attracted other names including “flying bomb” and “doodlebug”.  In Germany, before Dr Joseph Goebbels (1897–1945; Reich Minister of Propaganda 1933-1945) decided it was the V-1, the official military code name was Fi 103 (the Fi stood for Fieseler, the original builder of the airframe and most famous for their classic Storch (Stork) short take-off & landing (STOL) aircraft) but there were also the code-names Maikäfer (maybug) & Kirschkern (cherry stone).  While the Allied defenses against the V-1 did improve over time, it was only the destruction of the launch sites and the occupation of territory within launch range that ceased the attacks.  Until then, the V-1 remained a highly effective terror weapon but, like the V-2 and so much of the German armaments effort, bureaucratic empire-building and political intrigue compromised the efficiency of the project.

Lindsay Lohan on a cruise in the Maldives, January 2019.

The V-1 used a gyroscope guidance system and was fitted with an unusual triple-layer fuse system, the primary device and a backup augmented by a fail-safe designed to ensure destruction of “duds” (weapons which fail to detonate) so they couldn’t be examined.  The accuracy of the thing was sufficient only for use against very large targets (such as the general area of a city, which made sprawling London ideal) while the range of 250 km (155 miles) was significantly less than that of a medium bomber carrying the same payload.  The main advantages were speed (although not sufficient to outrun the fastest of the low-altitude propeller-driven interceptors), expendability and economy of operation.  Indeed, it was probably the war’s outstanding delivery system in terms of cost per ton of explosive, able to carry a warhead of 850 kg (1,870 lb) to London at a tiny fraction of the cost of using manned aircraft for the same task, with the priceless additional benefit of not risking the loss of aircrew.  The production cost of a V-1 was also only a small fraction of that of the supersonic V-2 ballistic missile which carried a warhead only of similar size although, once launched, the V-2 was effectively invulnerable.  Unlike the V-2, the initial deployments of the V-1 required large, fixed launch ramps which were relatively easy to detect and susceptible to bombardment.  Later experiments produced much smaller launch facilities which provided for a greater rate of sustained fire.  Bomber-launched variants of the V-1 saw limited operational service near the end of the war, with the pioneering V-1's design reverse-engineered by the Americans as the Republic-Ford JB-2 cruise missile.

Luftwaffe Mistel aircraft (Focke-Wulf Fw 190 (upper) & Junkers Ju 88 (lower)), Merseburg, Germany, 1945.

The "cruise missile" project which was the best example of the improvisation which characterized much of the ad-hoc weapon development of war time was the Mistel (mistletoe) or Beethoven-Gerät (Beethoven Device) composite aircraft program which the Germans developed in 1943.  It was a rudimentary air-launched cruise missile, made by a piloted fighter aircraft being mounted atop an unpiloted bomber-sized aircraft, packed with explosives and the larger aircraft would be released to glide towards the target.  Calling it the mistletoe reveals a sense of humor mot usually associated with the Luftwaffe but it was known rather more evocatively as the Vati und Sohn (Daddy and Son) or the Huckepack (Piggyback).  Although built in the hundreds, by the time it was available for deployment, the scope for attacking large targets with manned aircraft had reduced and the need was for precision delivery, something for which the Mistel was ill-suited and success was limited.

Saturday, June 24, 2023

Deadman

Deadman (pronounced ded-man or ded-muhn)

(1) In architecture and civil engineering a heavy plate, log, wall, or block buried in the ground that acts as an anchor for a retaining wall, sheet pile etc, usually by a tie connecting the two.

(2) A crutch-like prop, used temporarily to support a pole or mast during the erection process.

(3) In nautical use, an object fixed on shore temporarily to hold a mooring line.

(4) In nautical use, a rope for hauling the boom of a derrick inboard after discharge of a load of cargo.

(5) In mountaineering a metal plate with a wire loop attached for thrusting into firm snow to serve as a belay point, a smaller version being known as a deadboy.

(6) In slang, a bottle of alcoholic drink that has been consumed (ie is empty).

(7) In the operation of potentially dangerous machinery, a control or switch on a powered machine or vehicle that disengages a blade or clutch, applies the brake, shuts off the engine etc, when the driver or operator ceases to press a pedal, squeeze a throttle, etc; known also as the deadman throttle or the deadman control.  The hyphenated form dead-man is often used, both as noun and adjective.  Deadman is a noun and the noun plural is deadmans which seems ugly and a resulting formation such as "seven deadmans" is surely clumsy but most authoritative reference sources insist only "deadmans" will do.  Deadmen or dead-men is tolerated (by some liberal types) on the same basis as computer "mice" although "mouses" doesn't jar in the way "deadmans" seems to.

Circa 1895: A compound word, the construct being dead + man.  Dead was from the Middle English ded & deed, from Old English dēad, from the Proto-West Germanic daud, from daudaz.  The Old English dēad (a dead person; the dead collectively, those who have died) was the noun use of the adjective dead, the adverb ("in a dead or dull manner, as if dead", also "entirely") attested from the late fourteenth century, again derived from the adjective.  The Proto-Germanic daudaz was the source also of the Old Saxon dod, the Danish død, the Swedish död, the Old Frisian dad, the Middle Dutch doot, the Dutch dood, the Old High German tot, the German tot, the Old Norse dauðr & the Gothic dauþs.  It's speculated the ultimate root was the primitive Indo-European dheu (to die).

Man was from the Middle English man, from the Old English mann (human being, person, man), from the Proto-West Germanic mann, from the Proto-Germanic mann (human being, man).  Doublet of Manu.  The specific sense of "adult male of the human race" (distinguished from a woman or boy) was known in the Old English by circa 1000.   Old English used wer and wif to distinguish the sexes, but wer began to disappear late in the thirteenth century, replaced by mann and increasingly man.  Man also was in Old English as an indefinite pronoun (one, people, they) and used generically for "the human race, mankind" by circa 1200.  Although often thought a modern adoption, use as a word of familiar address, originally often implying impatience is attested as early as circa 1400, hence probably its use as an interjection of surprise or emphasis since Middle English.  It became especially popular from the early twentieth century.

Calameo Dual-purpose MIL-SIM-FX mechanical dead-man and detonator switch (part-number MIL-12G-DMS).

The source of the name is the idea that if something is likely in some way to be dangerous if uncontrolled, operation should be possible only while some device is held in a state which can be maintained only by a person who is neither dead nor otherwise incapacitated.  The classic example is the train driver; if the driver does not maintain the switch in the closed position, the train slows to a halt.  Some manufacturers describe the whole assembly as a "deadman's brake" and the part which is subject to human pressure as the "deadman's switch" (or "deadman's handle").  The phrase "dead man's fingers" is unrelated and is used variously in zoology, botany and in cooking and "dead man's rope" is a kind of seaweed (a synonym of sea-laces).  The legend of the "dead man's hand" (various combinations of aces and eights in poker) is based on the cards in the hand held by the unfortunate "Wild Bill" Hickok (1837–1876) when shot dead at the poker table.  A "dead man's arm" was a traditional English pudding, steamed and served in the cut-off sleeve of a man's shirt.  The phrase "dead man walking" began as US prison slang to refer to those on death row awaiting execution and it's since been adopted to describe figures like politicians, coaches, CEOs and the like who are thought about to be sacked.  Reflecting progress in other areas, dictionaries now list both "dead woman walking" and "dead person walking" but there is scant evidence of use.
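
As a minimal sketch of the principle (a hypothetical Python watchdog; the names, timings and brake action are illustrative assumptions, not any real train-control system), operation continues only while the operator keeps pressing within a timeout:

import threading
import time

class DeadmanSwitch:
    """Stops the machine unless the operator keeps signalling presence."""

    def __init__(self, timeout_s, on_release):
        self.timeout_s = timeout_s
        self.on_release = on_release          # e.g. apply brakes, cut the engine
        self._last_press = time.monotonic()
        threading.Thread(target=self._watch, daemon=True).start()

    def press(self):
        """Called each time the operator presses the pedal or squeezes the throttle."""
        self._last_press = time.monotonic()

    def _watch(self):
        while True:
            if time.monotonic() - self._last_press > self.timeout_s:
                self.on_release()             # fail-safe: stop when input ceases
                return
            time.sleep(0.1)

# Illustrative use: the brake applies if the pedal goes unpressed for 2 seconds.
switch = DeadmanSwitch(timeout_s=2.0, on_release=lambda: print("brakes applied"))
switch.press()
time.sleep(3)                                 # the operator stops pressing ...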

May have come across the odd dead man: Lindsay Lohan in hoodie arriving at the Los Angeles County Morgue to perform court-ordered community service, October 2011.

Deadman and the maintenance of MAD

The concept of nuclear deterrence depends on the idea of mutually assured destruction (MAD): that retaliation is certain, because even if a nuclear first-strike destroyed the usual command and control structures of an adversary, that would not guarantee there wouldn’t be a nuclear counter-strike.  All front-line nuclear-weapon states employ systems to ensure a residual capacity to retaliate, even after suffering a catastrophic first strike, the best known of which are the Russian Мертвая рука (Dead Hand) and the US AN/DRC-8 (Emergency Rocket Communications System), both of which are often referred to as doomsday devices.  Both exist to close the strategic nuclear strike control loop and were inventions of the high Cold War, the USSR’s system later taken over by the successor Russian state.  The deadman metaphor is accurate to the extent that a loop must be kept closed; the difference lies in the consequences.

Test launch of ground-based Russian RS-24 Yars ICBM from the Plesetsk facility in northwestern Russia, 9 December 2020.

The most extreme scenario is one in which there is left not a living soul with access to the loop.  In this case, the system switches from one designed to instigate a launch of ballistic missiles to one where some act is required to prevent the attack and is thus dubbed fail-deadly, the reverse of the fail-safe systems designed to prevent inadvertent launches.  The doomsday systems use a variety of mechanical and electronic monitoring protocols designed to (1) detect that a strike has occurred, (2) determine the extent of the damage and (3) attempt to maintain or restore the usual communication channels of the military chain of command.  If the systems determine worst-case circumstances exist, a retaliatory launch of intercontinental ballistic missiles (ICBMs) will be triggered.  Neither the Kremlin nor the Pentagon tends to comment on such things but, over the years, there have been (what are assumed to be managed) leaks suggesting the systems are usually inactive and activated only during times of crisis, but the veracity of this is unknown.
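
The distinction between the two philosophies reduces to a toy contrast (purely illustrative Python; not a description of any real system): fail-safe acts only on a positive command, fail-deadly acts unless the "all clear" keeps arriving.

def fail_safe(launch_order_received: bool) -> str:
    # Silence or damage means "do nothing"; only a positive order acts.
    return "launch" if launch_order_received else "stand down"

def fail_deadly(all_clear_received: bool) -> str:
    # Silence is read as loss of the chain of command and triggers the
    # pre-authorised response; only the periodic "all clear" prevents it.
    return "stand down" if all_clear_received else "launch"

print(fail_safe(False))     # stand down
print(fail_deadly(False))   # launch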

Royal Navy test launch of UGM-133 Trident II nuclear submarine-launched ballistic missile (SLBM) from Vanguard class submarine HMS Vigilant, 28 October 2012.

One obvious theoretical vulnerability in the USSR’s and US systems is that at points they are electronic and therefore reliant on hardware, software and an energy source.  The UK government has an entirely analogue system which uses only pen and paper.  Known as the letters of last resort, each incoming prime minister writes, in their own hand, four identical letters which are placed in sealed envelopes, given to the captain of each of the navy’s ballistic missile submarines who keeps it in his on-board safe.  The letters are only to be opened if an enemy (presumably nuclear) strike has damaged the chain of command to the extent it is no longer possible for the civilian government to instruct the military on what retaliatory action to take.  As soon as a prime-minister leaves office, the letters are, unopened, destroyed and replaced with ones from the new premier.  Those circumstances requiring a letter to be opened have never transpired and no prime-minister has ever commented publicly on what they wrote so the contents remain a genuine secret, known only to the writer and whomever they told.  So, although the only people who know the contents have never spoken, the consensus has long been the captains are likely to be given one of four options:

(1) Retaliate.  Each of the four submarines is armed with up to sixteen Trident II SLBMs (submarine-launched ballistic missiles), each missile equipped with up to twelve independently targeted warheads with a range of 7,000 miles (11,000 km).  There is always at least one boat at sea and the Admiralty never comments on its location although, in times of heightened political tension, additional boats may be activated.

(2) Not retaliate.

(3) The captains should use their judgment.  This, known as “the man on the ground” doctrine has a long tradition in the military although it was in some circumstances rendered redundant by advances in real-time communications.  In this case, it’s “the man under the water”.  An interesting question which touches on constitutional, international and military law, is the question of the point at which a state ceases to exist and the orders of a regime can be no longer said legally to be valid.

(4) Proceed to a place under an allied country's command or control.

Friedrich Wilhelm Nietzsche (1844-1900).

There is also a probably unexplored fifth option: a prime-minister could leave in the envelope a blank page.  This presumably would be substantively the same as option (3) but would convey a different political message to be mulled over in whatever remained of civilization.  No prime-minister has ever commented publicly on the thoughts which crossed their minds when writing these notes but perhaps some might have recalled Nietzsche’s words in Beyond Good and Evil: Prelude to a Philosophy of the Future (1886): "He who fights with monsters might take care lest he thereby become a monster.  And if you gaze for long into an abyss, the abyss gazes also into you."  Although troubled when he wrote that, he wasn't yet quite mad.

Wednesday, July 27, 2022

Bosnywash

Bosnywash (pronounced baws-nee-wosh, bos-nee-wash or boz-nee-wawsh (varies in the US by locality))

An informal noun describing the densely populated conurbation extending from Boston to Washington, encompassing New York City, Philadelphia, and Baltimore.

1971 (1967 for Boswash):  The construct was Bos(ton) + n(ew) y(ork) + wash(ington) and the form was always either Bosnywash or bosnywash, boswash following the same convention.  The constructs come from the age of the typewriter and not only would BosNYWash have been harder to type and read, the use of initial capitals in the elements of portmanteaus, blends or contractions was not a practice which came into wide use in English until the 1980s, under the influence of the IT industry which was searching for points of differentiation.

It’s debatable whether Bosnywash is a portmanteau or a contraction.  A portmanteau word is a blend of two or more words or parts of words, combined to create a new word.  A contraction is a word created by joining two or more words which tend in normal use to appear in sequence.  The stems of the words which comprise a contraction are not truncated, so the side on which one sits in this doubtless pointless debate hangs on whether one regards the accepted short forms Bos, NY & Wash as “words” for the technical purpose of construction.  The rules of structural linguistics complicate things further because if a portmanteau is created by using already shortened compounds, the result can also be defined as a clipped compound.  Quite what interpretation would apply to Bosnywash, built as it was on the earlier Boswash, would thus presumably be less certain still but most regard both as portmanteaus.

BosWash and Bosnywash mean exactly the same thing: a densely populated conurbation extending from Boston in the north to Washington in the south, encompassing New York City, Philadelphia, and Baltimore.  The concept of cities expanding to envelop an entire surrounding land mass to exist as one vast megalopolis was the vision of US systems theorist Herman Kahn (1922–1983) who in 1967 coined the word Boswash for one of his essays speculating about the future.  While the word Boswash was novel, the idea that part of the north-eastern US might develop into one contiguous populated area had been discussed by urban geographers for almost a decade and it was just one of several places where such a coalescence of urban areas had been predicted.  The idea of vast and expanding cities had been noted as a demographic phenomenon for centuries but the sudden acceleration of the global population, beginning in the mid-nineteenth century (the causes: (1) the wide deployment of modern Western medical techniques which simultaneously lowered the infant mortality rate & lengthened the typical human lifespan, (2) the installation of sanitation systems which reduced disease, (3) vaccinations against disease and (4) increases in agricultural output (the so-called “green revolution”)), focused the attention of economists and urban geographers who, extrapolating historic and contemporary trends, developed the concept of the modern mega-city.  Bosnywash (phonetically, probably a more attractive word) appeared half-a-decade after Boswash in The Bosnywash Megalopolis: A Region of Great Cities (1971) by Leonard Arthur Swatridge (b 1931) and was likely an exercise in legitimization, folk in NYC never likely to take much notice of anything which doesn’t include their city.  There, if it didn't happen in New York, it didn’t happen.

South-east Queensland (Australia) and the trend towards the Gold Coast-Brisbane-Sunshine Coast megalopolis.

The idea has been applied to many areas of high population growth and increasing urbanization (globally, the dominant trend of the last seventy-five years) where cities and towns grow towards each other.  The south-east corner of the Australian state of Queensland is noted as an instance of what was once a transport corridor tending to develop into a megalopolis, stretching from the southern border of the Gold Coast to the northern extremes of the Sunshine Coast.  The word megalopolis was from 1832, the construct being the Ancient Greek megalo- (great), from megas (genitive megalou) + -polis (city).  It was used to describe a big, densely populated urban complex and during Antiquity was an epithet of the great cities (Athens, Syracuse, Alexandria); it was also the name of a former city in Arcadia.  The rarely used descriptor of an inhabitant was megalopolitan.

Herman Kahn is remembered as a futurist but he built his early career as a systems theorist and, while at the RAND Corporation, was prominent in constructing the theoretical framework on which the US political-military establishment built the strategies which dictated the scope and form of the nuclear arsenal and the plans for its use.  Perhaps the highest-stakes version ever undertaken of what came to be known as scenario planning under the application of game theory, Kahn’s models were among those which, in a reductionist process, led to some awful yet elegant expressions such as “mutually assured destruction (MAD)” which had a generation of specialists in the Pentagon and the Kremlin counting missiles as the basis of high Cold War politics.  Kahn was reputedly one of the figures who served as an inspiration for the title character in Stanley Kubrick's (1928-1999) dark satire Dr Strangelove (1964) and, unsurprisingly, for the character of Professor Groeteschele in Sidney Lumet's (1924-2011) more somber film of nuclear war, Fail Safe (1964).

Bosnywash personified: Lindsay Lohan (with former special friend Samantha Ronson), Estate Nightclub, Boston, January 2009 (left), shopping in New York City, September 2013 (centre) & at the White House Correspondents' Dinner, Washington DC, April 2012 (right).