
Monday, January 10, 2022

Failsafe

Failsafe (pronounced feyl-seyf)

(1) In electronics, pertaining to or noting a mechanism built into a system, as in an early warning system or a nuclear reactor, for ensuring safety should the system fail to operate properly.

(2) Anything equipped with a secondary system that ensures continued operation even if the primary system fails; something designed to work or function automatically to prevent breakdown of a mechanism, system, or the like.

(3) In manned nuclear weapon delivery systems (airplanes), of, relating to, or designating a system of coded military controls in which bombers dispatched to a prearranged point as part of a standard operating procedure cannot advance farther without direct orders from a designated authority and cannot have the nuclear warheads they carry armed until they have passed their prearranged point (known as the failsafe point (sometimes initial capital letter)).

1945: A compound word, the construct being fail + safe, apparently a back-formation from the verb phrase "to fail safely" (which would, for those poor souls who worry about the split infinitive, be "safely to fail").  Fail was from the Middle English failen, from the Anglo-Norman faillir, from the Vulgar Latin fallire (an alteration of the Latin fallere (to deceive, disappoint)), from either the primitive Indo-European bhāl- (to lie, deceive) or the primitive Indo-European (s)gʷʰh₂el- (to stumble).  It was related to the Dutch feilen & falen (to fail, miss), the German fehlen (to fail, miss, lack), the Danish fejle (to fail, err), the Swedish fela (to fail, be wanting, do wrong), the Icelandic feila (to fail) and the Spanish fallar (to fail, miss).  Safe was from the Middle English sauf, safe, saf & saaf, from the Old French sauf, saulf & salf (safe), from the Latin salvus (whole, safe), from the primitive Indo-European solh₂- (whole, every).

The meaning "unscathed, unhurt, uninjured; free from danger or molestation, in safety, secure; saved spiritually, redeemed, not damned" emerged circa 1300 from the Old French sauf (protected, watched-over; assured of salvation), from the Latin salvus (uninjured, in good health, safe) and related to salus (good health) & saluber (healthful), all from the primitive Indo-European solwos from the root sol- (whole, well-kept).  The quasi-preposition (the sense surviving in “save”) from circa 1300 was on the model of the French and Latin cognates.  From the late fourteenth century, the sense "rescued, delivered; protected; left alive, unkilled" had formed, along with the meaning "not exposed to danger" (of places), the same sense as applied to actions attested from the 1580s and "sure, reliable, not a danger" from about two decades later.  The sense of "conservative; cautious" dates from 1823.  The noun term safe-conduct was from the late thirteenth century language of diplomacy, from the Old French sauf-conduit; it was used to describe the protected status of diplomats who would, for example, be afforded safe-passage from their mission in situations such as the outbreak of war between two states.  Although most associated with nuclear-weapons delivery systems (the novel Fail-Safe (1962) by Eugene Burdick (1918-1965) and Harvey Wheeler (1918-2004) was about a nuclear attack caused by mechanical error), the term failsafe was used originally by engineers in reference to aircraft construction.  The spellings failsafe and fail-safe are used interchangeably.  Failsafe is a noun & adjective and fail-safed & fail-safing are (seemingly only occasional) verbs; the noun plural is failsafes.  The adjective failsafeish is engineer's humor.

In fiction: Failsafe and nuclear weapons

Two films from 1964, Sidney Lumet's (1924-2011) Fail-Safe and Stanley Kubrick's (1928-1999) Dr Strangelove or: How I Learned to Stop Worrying and Love the Bomb, were both about the fear of a nuclear holocaust.  Kubrick had his project in pre-production in early 1963 when he learned another studio had purchased the rights to Fail-Safe, planning a cinema release before Dr Strangelove.  Not happy, Kubrick alleged plagiarism and threatened a lawsuit, asserting the novel Fail-Safe was "copied largely" from the book on which Dr Strangelove was based, Peter George's (1924-1966) Red Alert.  Rather than pursuing the matter through the courts, Columbia Pictures, committed to Dr Strangelove, chose the M&A route and took over distribution of Fail-Safe, which it scheduled for release after Dr Strangelove.  Kubrick probably needn't have worried: Dr Strangelove, a masterpiece of dark humour, was a critical and commercial success while Fail-Safe, although praised by many scholars and military analysts, wasn't well received by reviewers who thought it melodramatic and found the plot implausible, dooming it at the box-office.

US war-room film set for Dr Strangelove.  Upon becoming president in 1981, Ronald Reagan (1911-2004; US president 1981-1989) was reportedly disappointed no Situation Room quite so dramatic actually existed, the actual room in the White House resembling something an insurance company might use to conduct sales training seminars.  The story is thought likely apocryphal but there is documentary evidence Mr Reagan did sometimes confuse historic fact with depictions he'd seen in movies.

Pleading in the Alternative

In law, the courtroom tactic of “alternative pleading” is sometimes called a "legal failsafe" but, in the sense of the etymology, that's true only if the tactic works; in some cases it would more correctly be classified as "a last resort".  In US law, “alternative pleading” is the legal strategy in which multiple claims or defenses (which may be mutually exclusive, inconsistent or contradictory) may be filed.  Under the Federal Rules of Civil Procedure, at the point of filing the right to plead in the alternative is absolute and the pleadings go untested; a party may thus file a claim or defense which defies the laws of physics or is in some other way technically impossible.  The four key aspects of alternative pleading are:

(1) Cover All Bases: Whatever possible basis might be available in a statement of claim or defense should be invoked to ensure that if reliance on one legal precept or theory fails, others remain available.  Just because a particular claim or defense has been filed, there is no obligation on counsel to pursue each one.

(2) Multiple Legal Fields: A party can plead that different areas of law are at play, even if they would be contradictory if considered together.  A plaintiff might allege a defendant is liable under both breach of contract and, alternatively, unjust enrichment if no contract is found to exist.

(3) Flexibility: Alternative pleading interacts with the “discovery process” (ie going through each other's filing cabinets and digital storage) in that it provides maximum flexibility in litigation, parties able to take advantage of previously unknown information.  Thus, pleadings should be structured not only on the basis of “known knowns” but also “unknown unknowns”, “known unknowns” and even the mysterious “unknown knowns”.  He may have been evil but, for some things, we should be grateful to Donald Rumsfeld (1932–2021; US defense secretary 1975-1977 & 2001-2006).

(4) No Admission of Facts: By pleading in the alternative, a party does not admit that any of the factual allegations are true but is, in effect, asserting that if one set of facts is found to be true, then one legal theory applies while if another set is found to be true, another applies.  This is another aspect of flexibility which permits counsel fully to present a case without, at the initial stages of litigation, being forced to commit to a single version of the facts or a single legal theory.

In the US, alternative pleading (lawyers, typically wordy (there was a time when in some places they charged “per word” for documents), prefer “pleading in the alternative”) generally is permitted in criminal cases, where it can manifest as a defendant simultaneously claiming (1) that they did not commit the alleged act, (2) that at the time they committed the act they were afflicted by insanity and are thus, as a matter of law, not criminally responsible, (3) that at the time they committed the act they were intoxicated and thus the extent of their guilt is diminished or (4) that the act committed was justified by some reason such as provocation or self defense.  Lawyers however are careful in the way the tactic is used because judges and juries can be suspicious of defendants claiming the benefits of both an alibi and self defense.  When elements in an alternative pleading include a logical inconsistency, it's an example of "kettle logic".

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011. 

Kettle logic

The term “kettle logic” (originally in the French: la logique du chaudron) was coined by French philosopher Jacques Derrida (1930-2004), one of the major figures in the history of post-modernist thought, remembered especially for his work on deconstruction.  Kettle logic is a category of rhetoric in which multiple arguments are deployed to defend a point, all with some element of internal inconsistency, some actually contradictory.  Derrida drew the title from the “kettle-story” which appeared in two works by the founder of psychoanalysis, Sigmund Freud (1856-1939): The Interpretation of Dreams (1900) & Jokes and Their Relation to the Unconscious (1905).  In his analysis of “Irma's dream”, Freud recounted the three arguments offered by a man who had returned in damaged condition a kettle he'd borrowed.

(1) That the kettle had been returned undamaged.

(2) That the kettle was already damaged when borrowed.

(3) That the kettle had never been borrowed.

The three arguments are inconsistent or contradictory but only one need be found true for the man not to be guilty of causing the damage.  Kettle logic was used by Freud to illustrate the way it's not unusual for contradictory opposites simultaneously to appear in dreams and be experienced as “natural” in a way which obviously wouldn't happen in a conscious state.
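The logical structure underneath is disjunction elimination; with hypothetical labels (R for "returned undamaged", D for "already damaged when borrowed", B for "never borrowed" and L for "liable for the damage"), each plea alone defeats liability even though the pleas cannot jointly be true:

$$R \to \neg L, \qquad D \to \neg L, \qquad B \to \neg L \;\;\vdash\;\; (R \lor D \lor B) \to \neg L$$

The rhetorical defect is thus not logical invalidity but that asserting R, D and B together undermines the credibility of each.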

Wednesday, March 23, 2022

Ouija

Ouija (pronounced wee-juh (sometimes wee-jee (US)))

(1) An instrument in the shape of a board on which is written the alphabet, the numbers 0-9 and the words "Yes", "No" & "Goodbye" (with occasional additions).  The board is used during a séance to contact spirits of the dead, the participants collectively placing their hands on a small, heart-shaped piece called a planchette which is then guided by the spirit(s) to the appropriate letters or numbers.

(2) As Ouija board, a small-scale replica of an aircraft carrier's flight and hangar decks, installed in the flight control room and manually updated with scale models as a communications fail-safe.  Used in every US carrier since WWII (although now in the throes of being replaced by electronic versions).

1891: A trademark name granted to the Kennard Novelty Company (US), a compound of the French oui (yes) and the German ja (yes).  Oui is from the Old French oïl, a compound of o (the affirmative particle) and il (he), akin to o-je (I), o-tu (thou), o-nos (we) and o-vos (you), all forms of ‘yes’ constructed with pronouns.  O and òc are both from the Latin hoc (this) and may correspond to the Vulgar Latin construction hoc ille.  Ja is from the Middle High German ja, from the Old High German ja (yes), from the Proto-Germanic ja, from a primitive Indo-European form meaning “already”.  It was cognate with the Dutch ja, the English yea (yes) and the Latin iam (already).

Although Ouija, as a proprietary brand-name, dates only from 1891, similar boards existed in China from circa 1100 BC and have long been part of occult and spiritual practice in the West, attaining great popularity in the mid-nineteenth century and again during WWI and its aftermath.

Analog Ouija Board on USS Ronald Reagan aircraft carrier.

Available for niche markets.

Saturday, June 24, 2023

Deadman

Deadman (pronounced ded-man or ded-muhn)

(1) In architecture and civil engineering, a heavy plate, log, wall, or block buried in the ground that acts as an anchor for a retaining wall, sheet pile etc, usually by a tie connecting the two.

(2) A crutch-like prop, used temporarily to support a pole or mast during the erection process.

(3) In nautical use, an object fixed on shore temporarily to hold a mooring line.

(4) In nautical use, a rope for hauling the boom of a derrick inboard after discharge of a load of cargo.

(5) In mountaineering a metal plate with a wire loop attached for thrusting into firm snow to serve as a belay point, a smaller version being known as a deadboy.

(6) In slang, a bottle of alcoholic drink that has been consumed (ie is empty).

(7) In the operation of potentially dangerous machinery, a control or switch on a powered machine or vehicle that disengages a blade or clutch, applies the brake, shuts off the engine etc when the driver or operator ceases to press a pedal, squeeze a throttle, etc; known also as the deadman throttle or the deadman control.  The hyphenated form dead-man is often used, both as noun and adjective.  Deadman is a noun and the noun plural is deadmans, which seems ugly, and a resulting formation such as "seven deadmans" is surely clumsy, but most authoritative reference sources insist only "deadmans" will do.  Deadmen or dead-men is tolerated (by some liberal types) on the same basis as computer "mice", although "mouses" doesn't jar in the way "deadmans" seems to.

Circa 1895: A compound word, the construct being dead + man.  Dead was from the Middle English ded & deed, from the Old English dēad, from the Proto-West Germanic daud, from the Proto-Germanic daudaz.  The Old English dēad (a dead person; the dead collectively, those who have died) was the noun use of the adjective dead, the adverb (in a dead or dull manner, as if dead; also entirely) attested from the late fourteenth century, again derived from the adjective.  The Proto-Germanic daudaz was the source also of the Old Saxon dod, the Danish død, the Swedish död, the Old Frisian dad, the Middle Dutch doot, the Dutch dood, the Old High German tot, the German tot, the Old Norse dauðr & the Gothic dauþs.  It's speculated the ultimate root was the primitive Indo-European dheu (to die).

Man was from the Middle English man, from the Old English mann (human being, person, man), from the Proto-West Germanic mann, from the Proto-Germanic mann (human being, man).  Doublet of Manu.  The specific sense of "adult male of the human race" (distinguished from a woman or boy) was known in the Old English by circa 1000.  Old English used wer and wif to distinguish the sexes, but wer began to disappear late in the thirteenth century, replaced by mann and increasingly man.  Man was also used in Old English as an indefinite pronoun (one, people, they) and used generically for "the human race, mankind" by circa 1200.  Although often thought a modern adoption, use as a word of familiar address, originally often implying impatience, is attested as early as circa 1400, hence probably its use as an interjection of surprise or emphasis since Middle English.  It became especially popular from the early twentieth century.

Calameo Dual-purpose MIL-SIM-FX mechanical dead-man and detonator switch (part-number MIL-12G-DMS).

The source of the name is the idea that if something is likely in some way to be dangerous if uncontrolled, operation is possible only while some device is maintained in a state which can be sustained only by a person who is neither dead nor in some debilitated condition.  The classic example is the train driver; if the driver does not maintain the switch in the closed position, the train slows to a halt.  Some manufacturers describe the whole assembly as a "deadman's brake" and the part which is subject to human pressure as the "deadman's switch" (or "deadman's handle").  The phrase "dead man's fingers" is unrelated and is used variously in zoology, botany and in cooking and "dead man's rope" is a kind of seaweed (a synonym of sea-laces).  The legend of the "dead man's hand" (various combinations of aces and eights in poker) is based on the cards in the hand held by the unfortunate "Wild Bill" Hickok (1837–1876) when shot dead at the poker table.  A "dead man's arm" was a traditional English pudding, steamed and served in the cut-off sleeve of a man's shirt.  The phrase "dead man walking" began as US prison slang to refer to those on death row awaiting execution and it's since been adopted to describe figures like politicians, coaches, CEOs and the like who are thought about to be sacked.  Reflecting progress in other areas, dictionaries now list both "dead woman walking" and "dead person walking" but there is scant evidence of use.
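The mechanism is simple enough to sketch in a few lines of code; what follows is a minimal illustration (hypothetical names, a toy timer standing in for real control hardware, not anything resembling production safety engineering): the operator must keep resetting the switch and, if the resets stop, the fail-safe action fires.

```python
import time
import threading

TIMEOUT = 2.0  # seconds the operator may go without "squeezing"

class DeadmanSwitch:
    """Toy deadman switch: fires on_release if not squeezed in time."""
    def __init__(self, on_release):
        self.on_release = on_release  # the fail-safe action
        self.timer = None

    def squeeze(self):
        # Operator input: cancel the pending action and re-arm the timer.
        if self.timer:
            self.timer.cancel()
        self.timer = threading.Timer(TIMEOUT, self.on_release)
        self.timer.daemon = True
        self.timer.start()

def apply_brake():
    print("No operator input: brake applied, engine shut off")

switch = DeadmanSwitch(apply_brake)
switch.squeeze()   # operator grips the handle
time.sleep(1.0)
switch.squeeze()   # still gripping: timer re-armed
time.sleep(3.0)    # operator lets go: after 2 s the brake fires
```

The essential property is that the dangerous state (motion) requires continuous positive input; absence of input, whatever the cause, produces the safe state.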

May have come across the odd dead man: Lindsay Lohan in hoodie arriving at the Los Angeles County Morgue to perform court-ordered community service, October 2011.

Deadman and the maintenance of MAD

The concept of nuclear deterrence depends on the idea of mutually assured destruction (MAD): that retaliation would be certain; even if a nuclear first-strike destroyed the usual command and control structures of an adversary, that would not guarantee there wouldn't be a nuclear counter-strike.  All front-line nuclear-weapon states employ systems to ensure a residual capacity to retaliate, even after suffering a catastrophic first strike, the best known of which are the Russian Мертвая рука (Dead Hand) and the US AN/DRC-8 (Emergency Rocket Communications System), both of which are often referred to as doomsday devices.  Both exist to close the strategic nuclear strike control loop and were inventions of the high Cold War, the USSR's system later taken over by the successor Russian state.  The metaphor of the deadman is accurate to the extent that both depend on keeping a loop closed; the difference lies in the consequences of the loop opening.

Test launch of ground-based Russian RS-24 Yars ICBM from the Plesetsk facility in northwestern Russia, 9 December 2020.

The most extreme scenario is one in which there is left not a living soul with access to the loop.  In this case, the system switches from one designed to instigate a launch of ballistic missiles to one where some act is required to prevent the attack and is thus dubbed fail-deadly, the reverse of the fail-safe systems designed to prevent inadvertent launches.  The doomsday systems use a variety of mechanical and electronic monitoring protocols designed to (1) detect that a strike has occurred, (2) determine the extent of damage and (3) attempt to maintain or restore the usual communication channels of the military chain of command.  If the systems determine worst-case circumstances exist, a retaliatory launch of intercontinental ballistic missiles (ICBMs) will be triggered.  Neither the Kremlin nor the Pentagon tend to comment on such things but, over the years, there have been (what are assumed to be managed) leaks suggesting the systems are usually inactive and activated only during times of crisis, but the veracity of this is unknown.
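The inversion is easy to state in code.  In this deliberately simplified sketch (hypothetical names; real systems are vastly more elaborate), both designs watch the same control loop and differ only in the default selected when it opens:

```python
# Illustrative only: fail-safe and fail-deadly differ in their defaults.

def fail_safe(loop_closed: bool) -> str:
    # e.g. a train's deadman brake: loss of input -> stop safely
    return "operate" if loop_closed else "shut down safely"

def fail_deadly(loop_closed: bool) -> str:
    # e.g. a doomsday device: loss of input -> retaliate
    return "stand by" if loop_closed else "launch retaliation"

for closed in (True, False):
    print(f"loop closed={closed}: "
          f"fail-safe -> {fail_safe(closed)}, "
          f"fail-deadly -> {fail_deadly(closed)}")
```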

Royal Navy test launch of UGM-133 Trident II nuclear submarine-launched ballistic missile (SLBM) from Vanguard class submarine HMS Vigilant, 28 October 2012.

One obvious theoretical vulnerability in the USSR's and US systems is that at points each is electronic and therefore reliant on hardware, software and an energy source.  The UK government has an entirely analogue system which uses only pen and paper.  Known as the letters of last resort, each incoming prime minister writes, in their own hand, four identical letters which are placed in sealed envelopes, given to the captains of each of the navy's ballistic missile submarines, who keep them in their on-board safes.  The letters are only to be opened if an enemy (presumably nuclear) strike has damaged the chain of command to the extent it is no longer possible for the civilian government to instruct the military on what retaliatory action to take.  As soon as a prime minister leaves office, the letters are, unopened, destroyed and replaced with ones from the new premier.  Those circumstances requiring a letter to be opened have never transpired and no prime minister has ever commented publicly on what they wrote, so the contents remain a genuine secret, known only to the writer and whomever they told.  The consensus has long been that the captains are likely to be given one of four options:

(1) Retaliate.  Each of the four submarines is armed with up to sixteen Trident II SLBMs (submarine-launched ballistic missiles), each missile equipped with up to twelve independently targeted warheads with a range of 7,000 miles (11,000 km).  There is always at least one at sea and the Admiralty never comments on its location although, in times of heightened political tension, additional boats may be activated.

(2) Not retaliate.

(3) The captains should use their judgment.  This, known as the “man on the ground” doctrine, has a long tradition in the military although in some circumstances it was rendered redundant by advances in real-time communications.  In this case, it's “the man under the water”.  An interesting question, touching on constitutional, international and military law, is the point at which a state ceases to exist and the orders of a regime can no longer legally be said to be valid.

(4) Proceed to a place under an allied country's command or control.

Friedrich Wilhelm Nietzsche (1844-1900).

There is also a probably unexplored fifth option: a prime minister could leave in the envelope a blank page.  This presumably would be substantively the same as option (3) but would convey a different political message to be mulled over in whatever remained of civilization.  No prime minister has ever commented publicly on the thoughts which crossed their mind when writing these notes but perhaps some might have recalled Nietzsche's words in Beyond Good and Evil: Prelude to a Philosophy of the Future (1886): "He who fights with monsters might take care lest he thereby become a monster.  And if you gaze for long into an abyss, the abyss gazes also into you."  Although troubled when he wrote that, he wasn't yet quite mad.

Friday, July 7, 2023

Cruise

Cruise (pronounced krooz)

(1) To sail about on a pleasure trip (often as cruising).

(2) To sail about, as a warship patrolling a body of water.

(3) To travel about without a particular purpose or destination.

(4) To fly, drive, or sail at a constant speed that permits maximum operating efficiency for sustained travel.

(5) In aeronautics, the portion of a flight flown at constant airspeed and altitude, between the ascent and descent phases.

(6) To travel at a moderately fast, easily controllable speed.

(7) To travel about slowly, looking for customers or for something demanding attention.

(8) As cruise missile, an intermediate-range weapon.

(9) Among male homosexuals, actively to seek a casual sexual partner by moving about a particular area known to be frequented by those there for such purposes, a productive area being known as “cruisy” (“to troll” & “trolling” were once used as synonyms but those terms have since been claimed by their use on the internet).

(10) In informal use in the US military, a period spent in the Marine Corps.

(11) In casual use in sporting competition, easily to win.

1645-1655:  From the Dutch kruisen (to cross, sail to and fro), from kruis or cruis (cross), from the Middle Dutch cruce, from the Latin crux.  Root was the primitive Indo-European sker (to turn, to bend); etymologists suggest it may be cognate with the Latin circus (circle) and curvus (curve).  In English, it began to be used as a noun in 1706 in the sense of “a voyage taken in courses” and by 1906 as “a voyage taken by tourists on a ship".  It was related to the French croiser (to cross, cruise), the Spanish cruzar and the German kreuzen.  The alternative spelling cruize is obsolete.  Cruise & cruising are nouns & verbs, cruised is a verb, cruiser is a noun and cruisy is an adjective; the noun plural is cruises.

Cruiser in the sense of "one who or that which cruises" (agent noun from the verb cruise) is from the 1670s, probably borrowed from similar words in continental languages (such as the Dutch kruiser & the French croiseur).  In older use, a cruiser was a warship built to patrol and protect the commerce of the state to which it belonged and to chase hostile ships; cruisers were the classic gunboats used by the European colonial powers for patrolling their empires.  In this use they were often compared to the frigates of old in that they possessed good speed and were employed to protect the trade-routes, to glean intelligence and to act as the “eyes of the fleet”; in casual use during the eighteenth century, the term was often applied to the ships of privateers (pirates).  Cruiser was used to describe homosexuals “cruising for sex partners” (ie frequenting and lingering in places well-known for such things) from 1903 and as a boxing weight class (cruiserweight) from 1920.  The meaning "police patrol car" is a 1929 adoption of American English.

Royal Navy battlecruiser HMS Hood entering Valletta harbor, Malta 1937.

In admiralty use, cruisers are now the largest of the conventional warships still in service.  Navies used to use the term “cruiser” more as a description of the tasks for which the ships were used than of the specific nature of their construction, the early cruisers being those ships used for long-range missions such as coastal raiding or scouting, and it was only in the late nineteenth century, as the fleets grew and became more specialized, that the classic model of the corvette / frigate / destroyer / cruiser / battleship evolved.  Even then there were distinctions such as light & heavy cruisers but the most interesting development in warship architecture was the battlecruiser, built essentially because the Dreadnought had created “a gap in the market”.  Battlecruisers were battleships with less armor, therefore gaining speed at the cost of greater vulnerability.  The theory was they would have the firepower to out-gun all but the battleships and those they could out-run with their greater speed.  The concept seemed sound and in December 1914, at the Battle of the Falkland Islands, two Royal Navy battlecruisers vindicated the theory when they chased and destroyed the German East Asia Squadron.  However, in 1916, the performance of the battlecruisers in the Jutland engagement forced the Admiralty to re-consider.  Jutland was the closest thing to the great battle of the fleets which had been anticipated for decades but proved anti-climactic, both sides ultimately choosing to avoid the decisive encounter which offered the chance of victory or defeat.  What it did prove was that the naval theorists had been right; the battlecruiser could not fight the battleship and if their paths threatened to cross, the less-armored vessel should retreat and rely on greater speed to make good her escape.  There were technical deficiencies in the British ships, without which perhaps three of their battlecruisers wouldn't have been lost, but what happened at Jutland made it clear to the admirals that uneven contests between the big capital ships were to be avoided.  The consequence was that the battlecruiser became unfashionable and after the round of disarmament in the 1920s, none were built until, unexpectedly, the Soviet Navy commissioned four in the 1980s.  They proved the last of the breed.

Origin of cruise missiles

US Pershing II missiles in Neu-Ulm military base, Swabia, Bavaria in the then Federal Republic of Germany (the FRG, the old West Germany), 1984.

Carrying large warheads long distances, cruise missiles are guided weapons used against ground targets; they fly at both subsonic and supersonic speeds, remain in the atmosphere and, self-propelled for most of their flight, travel mostly at a constant speed.  In this they differ from ballistic missiles which fly in an arc, often reaching suborbital flight, with a final trajectory much like a bullet's because, once the fuel is expended, the path from that point is determined by the speed and direction of launch and the force of gravity pulling towards Earth.  Both cruise and ballistic missiles can carry nuclear warheads but cruise missiles are most often equipped with conventional warheads.  Theorists and researchers were exploring the possibility of military missiles as early as 1908, described then as the aerial torpedo, envisaged as a remote-controlled weapon with which to shoot down airships bombing London, perceived then as the most credible airborne delivery system.  Between the first and second world wars, the major powers all devoted resources to research but few projects reached even the prototype stage.
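The distinction can be made concrete with the textbook simplification (a sketch which ignores drag and the Earth's curvature and rotation): once the fuel is expended, a ballistic path is fixed entirely by the burnout speed $v$, the launch angle $\theta$ and gravity $g$:

$$x(t) = vt\cos\theta, \qquad y(t) = vt\sin\theta - \tfrac{1}{2}gt^{2}, \qquad R = \frac{v^{2}\sin 2\theta}{g}$$

A cruise missile, powered and steered throughout its flight, follows no such closed-form arc.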

Annotated schematic of the V-1 (left) and a British Military Intelligence drawing (dated 16 June 1944, 3 days after the first V-1 attacks on London (right). 

First deployed in 1944, the German Vergeltungswaffe eins (“retaliatory weapon 1” or “reprisal weapon 1”, eventually known as the V-1) was the world's first cruise missile.  One of the rare machines to use a pulse-jet, it emitted such a distinctive sound that those at whom it was aimed nicknamed it the “buzz-bomb” although it attracted other names including “flying bomb” and “doodlebug”.  In Germany, before Dr Joseph Goebbels (1897–1945; Reich Minister of Propaganda 1933-1945) decided it was the V-1, the official military code name was Fi 103 (the Fi stood for Fieseler, the original builder of the airframe and most famous for their classic Storch (Stork) short take-off & landing (STOL) aircraft) but there were also the code-names Maikäfer (maybug) & Kirschkern (cherry stone).  While the Allied defenses against the V-1 did improve over time, it was only the destruction of the launch sites and the occupation of territory within launch range which brought the attacks to an end.  Until then, the V-1 remained a highly effective terror weapon but, like the V-2 and so much of the German armaments effort, bureaucratic empire-building and political intrigue compromised the efficiency of the project.

Lindsay Lohan on a cruise in the Maldives, January 2019.

The V-1 used a gyroscope guidance system and was fitted with an unusual triple-layer fuse system, the primary device and a backup augmented by a fail-safe designed to ensure destruction of “duds” (weapons which fail to detonate) so they couldn't be examined.  The accuracy of the thing was sufficient only for use against very large targets (such as the general area of a city, which made sprawling London ideal) while the range of 250 km (155 miles) was significantly less than that of a medium bomber carrying the same payload.  The main advantages were speed (although not sufficient to outrun the fastest of the low-altitude propeller-driven interceptors), expendability and economy of operation.  Indeed, it was probably the war's outstanding delivery system in terms of cost per ton of explosive, able to carry a warhead of 850 kg (1,870 lb) to London at a tiny fraction of the cost of using manned aircraft for the same task, with the priceless additional benefit of not risking the loss of aircrew.  The production cost of a V-1 was also only a small fraction of that of the supersonic V-2 ballistic missile which carried a warhead of similar size although, once launched, the V-2 was effectively invulnerable.  Unlike the V-2, the initial deployments of the V-1 required large, fixed launch ramps which were relatively easy to detect and susceptible to bombardment.  Later experiments produced much smaller launch facilities which provided for a greater rate of sustained fire.  Bomber-launched variants of the V-1 saw limited operational service near the end of the war, with the pioneering V-1's design reverse-engineered by the Americans as the Republic-Ford JB-2 cruise missile.

Luftwaffe Mistel aircraft (Focke-Wulf Fw 190 (upper) & Junkers Ju 88 (lower)), Merseburg, Germany, 1945.

The "cruise missile" project which best exemplified the improvisation characterizing much of the ad-hoc weapon development of wartime was the Mistel (mistletoe) or Beethoven-Gerät (Beethoven Device) composite aircraft program which the Germans developed in 1943.  It was a rudimentary air-launched cruise missile: a piloted fighter aircraft was mounted atop an unpiloted bomber-sized aircraft packed with explosives and, on approach, the larger aircraft would be released to glide towards the target.  Calling it the mistletoe reveals a sense of humor not usually associated with the Luftwaffe but it was known rather more evocatively as the Vati und Sohn (Daddy and Son) or the Huckepack (Piggyback).  Although built in the hundreds, by the time it was available for deployment the scope for attacking large targets with manned aircraft had reduced and the need was for precision delivery, something for which the Mistel was ill-suited and success was limited.

Wednesday, July 27, 2022

Bosnywash

Bosnywash (pronounced baws-nee-wosh, bos-nee-wash or boz-nee-wawsh (varies in the US by locality))

An informal noun describing the densely populated conurbation extending from Boston to Washington, encompassing New York City, Philadelphia, and Baltimore.

1971 (1967 for Boswash):  The construct was Bos(ton) + n(ew) y(ork) + wash(ington) and the form was always either Bosnywash or bosnywash, boswash following the same convention.  The constructs come from the age of the typewriter and not only would BosNYWash have been harder to type and read but the use of initial capitals in the elements of portmanteaus, blends or contractions was not a practice which came into wide use in English until the 1980s, under the influence of the IT industry which was searching for points of differentiation.

It's debatable whether Bosnywash is a portmanteau or a contraction.  A portmanteau word is a blend of two or more words or parts of words, combined to create a new word.  A contraction is a word created by joining two or more words which tend in normal use to appear in sequence.  The stems of words which comprise a contraction are not truncated, so the side on which one sits in this doubtlessly pointless debate hangs on whether one regards the accepted short forms Bos, NY & Wash as “words” for the technical purpose of construction.  The rules of structural linguistics complicate things further because if a portmanteau is created by using already shortened compounds, the result can also be defined as a clipped compound.  Quite what interpretation would apply had Bosnywash been derived from Boswash would thus presumably be less certain still but most regard both as portmanteaus.

BosWash and Bosnywash mean exactly the same thing: a densely populated conurbation extending from Boston in the north to Washington in the south, encompassing New York City, Philadelphia, and Baltimore.  The concept of cities expanding to envelop an entire surrounding land mass to exist as one vast megalopolis was the vision of US systems theorist Herman Kahn (1922–1983) who in 1967 coined the word Boswash for one of his essays speculating about the future.  While the word Boswash was novel, the idea that part of the north-eastern US might develop into one contiguous populated area had been discussed by urban geographers for almost a decade and it was just one of several places where such growth was predicted.  The idea of vast and expanding cities had been noted as a demographic phenomenon for centuries but the sudden acceleration of the global population, beginning in the mid-nineteenth century (the causes: (1) the wide deployment of modern Western medical techniques which simultaneously lowered the infant mortality rate & lengthened the typical human lifespan, (2) the installation of sanitation systems which reduced disease, (3) vaccinations against disease and (4) increases in agricultural output (the so-called “green revolution”)), focused the attention of economists and urban geographers who, extrapolating historic and contemporary trends, developed the concept of the modern mega-city.  Bosnywash (phonetically, probably a more attractive word) appeared half-a-decade after Boswash in The Bosnywash Megalopolis: A Region of Great Cities (1971) by Leonard Arthur Swatridge (b 1931) and was likely an exercise in legitimization, folk in NYC never likely to take much notice of anything which doesn't include their city.  There, if it didn't happen in New York, it didn't happen.

South-east Queensland (Australia) and the trend towards the Gold Coast-Brisbane-Sunshine Coast megalopolis.

The idea has been applied to many areas of high population growth and increasing urbanization (globally, the dominant trend of the last seventy-five years) where cities and towns grow towards each other.  The south-east corner of the Australian state of Queensland is noted as an instance of the way what was once a transport corridor tends to develop into a megalopolis, stretching from the southern border of the Gold Coast to the northern extremes of the Sunshine Coast.  The word megalopolis was from 1832, the construct being the Ancient Greek megalo- (great), from megas (genitive megalou) + -polis (city).  It was used to describe a big, densely populated urban complex and during Antiquity was an epithet of the great cities (Athens, Syracuse, Alexandria); it was also the name of a former city in Arcadia.  The rarely used descriptor of an inhabitant was megalopolitan.

Herman Kahn is remembered as a futurist but he built his early career as a systems theorist and, while at the RAND Corporation, was prominent in constructing the theoretical framework on which the US political-military establishment built the strategies which dictated the scope and form of the nuclear arsenal and the plans for its use.  Perhaps the highest-stakes version ever undertaken of what came to be known as scenario planning under the application of game theory, Kahn's models were among those which, in a reductionist process, led to some awful yet elegant expressions such as “mutually assured destruction (MAD)”, setting a generation of specialists in the Pentagon and the Kremlin to counting missiles as the basis of high Cold War politics.  Kahn was reputedly one of the figures who served as an inspiration for the title character in Stanley Kubrick's (1928-1999) dark satire Dr Strangelove (1964) and for the character of Professor Groeteschele in Sidney Lumet's (1924-2011) more somber film of nuclear war, Fail Safe (1964).

Bosnywash personified: Lindsay Lohan (with former special friend Samantha Ronson), Estate Nightclub, Boston, January 2009 (left), shopping in New York City, September 2013 (centre) & at the White House Correspondents' Dinner, Washington DC, April 2012 (right).

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) Tending to flop.

(2) Limp, flexible; not hard, firm, or rigid; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)), 1964-1979), the use being a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made with a paper stock little heavier and more rigid than that used for the pages; used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and, although in the 1890s it was recorded as meaning “some degree of failure”, it came to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis to centuries of dynastic rule by the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped is a verb, flopping is a verb, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn't exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and, as “floppy-wristed” (synonymous with “limp-wristed”), was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the earlier use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication what one says can't be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians' usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That's often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of being “Shakespearian” or “Churchillian” is not exactly definable but certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  "Floppy cat" is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOPS (floating-point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn't really make possible what would not in theory be achievable using real numbers but does make such work faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
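The representation is easy to inspect.  Below is a small sketch using Python's standard library (math.frexp exposes the significand-and-exponent decomposition; the base is 2 in IEEE 754 hardware):

```python
import math

# A float is stored as significand * 2**exponent.
# math.frexp returns (m, e) with x == m * 2**e and 0.5 <= |m| < 1.
x = 6.5
m, e = math.frexp(x)
print(m, e)              # 0.8125 3  ->  0.8125 * 2**3 == 6.5
assert m * 2**e == x

# The fixed precision of the significand is why not every decimal
# fraction is exact: 0.1 has no finite base-2 representation.
print(0.1 + 0.2 == 0.3)  # False
```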

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work); for purposes of “benchmarking” or determining “real-world” performance under those conditions, it's a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion-FLOPS mark and, as speeds rose, the appropriate terms were created:

kiloFLOPS (kFLOPS, 10³)
megaFLOPS (MFLOPS, 10⁶)
gigaFLOPS (GFLOPS, 10⁹)
teraFLOPS (TFLOPS, 10¹²)
petaFLOPS (PFLOPS, 10¹⁵)
exaFLOPS (EFLOPS, 10¹⁸)
zettaFLOPS (ZFLOPS, 10²¹)
yottaFLOPS (YFLOPS, 10²⁴)
ronnaFLOPS (RFLOPS, 10²⁷)
quettaFLOPS (QFLOPS, 10³⁰)
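As a sketch of how such figures are derived (assuming NumPy is installed; the 2n³ operation count is the standard estimate for a dense n×n matrix multiply), achieved FLOPS can be approximated by timing a known workload:

```python
import time
import numpy as np

# A dense n x n matrix multiply performs about 2*n**3 floating-point
# operations (n**3 multiplications plus n**3 additions).
n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOPS achieved")
```

Numbers from such micro-benchmarks flatter the hardware; sustained real-world throughput is invariably lower.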

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and is explained by the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such parallelism performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
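At the level of the linear algebra, superposition can be sketched classically (a toy state-vector simulation with hypothetical variable names, emphatically not a quantum computer):

```python
import numpy as np

# A qubit is a 2-component complex vector; |0> = (1, 0).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2   # Born rule: measurement probabilities
print(probs)                 # [0.5 0.5] -> "0" or "1" with equal odds
```

Simulating n entangled qubits this way needs a 2ⁿ-component state vector, which is precisely why qubit counts rather than FLOPS are the quoted measure.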

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely it was one of the design parameters for the 3½ inch design that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of this dates from 1974), the earliest units operating on what was in effect the WORM (write once, read many, ie "read only" after being written) principle.  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobytes) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus; it was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabytes) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) format, and it was the 5¼ inch floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy. In OS (operating system) GUIs (graphical user interface) however, it does endure as the "save" icon and all the evidence to date does suggest that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT of analogue, rotary dial phones in iconography and the sound of a camera's physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support at the BIOS (basic input output system) level so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but as an attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap on DMF diskettes, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read-Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) a final nail in the coffin for most.

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers, who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM's OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable; tales of an installation process failing on the “final diskette” were legion and while some doubtlessly were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette; users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were written and read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which, since the 1970s, had been the standard method for maintaining and distributing the data related to the nation's nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn't leak oil (the British thought they'd proved that couldn't be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he'd “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he'd earlier been forced to admit defeat in his attempt to defenestrate the country's facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production in 2011 so while many remain extant, the world's supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it's believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.