
Monday, June 15, 2020

Failsafe

Failsafe (pronounced feyl-seyf)

(1) In electronics, pertaining to or noting a mechanism built into a system, as in an early warning system or a nuclear reactor, for insuring safety should the system fail to operate properly.

(2) Anything equipped with a secondary system that insures continued operation even if the primary system fails; something designed to work or function automatically to prevent breakdown of a mechanism, system, or the like.

(3) In manned nuclear weapon delivery systems (airplanes), of, relating to, or designating a system of coded military controls in which bombers dispatched to a prearranged point as part of a standard operating procedure cannot advance farther without direct orders from a designated authority and cannot have the nuclear warheads they carry armed until they have passed their prearranged point (known as the failsafe point, sometimes with an initial capital letter).

1945: A compound word, the construct being fail + safe, apparently a back-formation from the verb phrase "to fail safely" (which would, for those poor souls who worry about the split infinitive, be "safely to fail").  Fail was from the Middle English failen, from the Anglo-Norman faillir, from the Vulgar Latin fallire (an alteration of the Latin fallere (to deceive, disappoint)), from either the primitive Indo-European bhāl- (to lie, deceive) or the primitive Indo-European (s)gʷʰh₂el- (to stumble).  It was related to the Dutch feilen & falen (to fail, miss), the German fehlen (to fail, miss, lack), the Danish fejle (to fail, err), the Swedish fela (to fail, be wanting, do wrong), the Icelandic feila (to fail) and the Spanish fallar (to fail, miss).  Safe was from the Middle English sauf, safe, saf & saaf, from the Old French sauf, saulf & salf (safe), from the Latin salvus (whole, safe), from the primitive Indo-European solh₂- (whole, every).

The meaning "unscathed, unhurt, uninjured; free from danger or molestation, in safety, secure; saved spiritually, redeemed, not damned" emerged circa 1300 from the Old French sauf (protected, watched-over; assured of salvation), from the Latin salvus (uninjured, in good health, safe) and related to salus (good health) & saluber (healthful), all from the primitive Indo-European solwos from the root sol- (whole, well-kept).  The quasi-preposition from circa 1300 was on the model of the French and Latin cognates.  From the late fourteenth century, the sense "rescued, delivered; protected; left alive, unkilled" had formed, along with the meaning "not exposed to danger" (of places) whereas the same thing as applied to actions was attested from the 1580s and "sure, reliable, not a danger" from about two decades later.  The sense of "conservative; cautious" dates from 1823.  The noun term safe-conduct was from the late thirteenth century language of diplomacy, from the Old French sauf-conduit; it was used to describe the protected status of diplomats who would for example be afforded safe-passage from their mission in situations such as the outbreak of war between the two states.  Although most associated with nuclear-weapons delivery systems (The novel Fail-Safe (1962) by Eugene Burdick (1918-1965) and Harvey Wheeler (1918-2004) was about a nuclear attack caused by mechanical error), the term failsafe was used originally by engineers in reference to aircraft construction.  The spellings failsafe and fail-safe are used interchangeably.  Failsafe is a noun & adjective and fail-safed & fail-safeing are verbs (seemingly usually; the noun plural is failsafes.  The adjective failsafeish is engineer's humor.

In fiction: Failsafe and nuclear weapons

Two films from 1964, Sidney Lumet's (1924-2011) Fail-Safe and Stanley Kubrick's (1928-1999) Dr Strangelove or: How I Learned to Stop Worrying and Love the Bomb, were both about the fear of a nuclear holocaust.  Kubrick had his project in pre-production in early 1963 when he learned another studio had purchased the rights to Fail-Safe, planning a cinema release before Dr Strangelove.  Not happy, Kubrick alleged plagiarism and threatened a lawsuit, asserting the novel Fail-Safe was "copied largely" from the book on which Dr Strangelove was based, Peter George's (1924-1966) Red Alert.  Rather than pursuing the matter through the courts, Columbia Pictures, committed to Dr Strangelove, chose the M&A route and took over distribution of Fail-Safe, which it scheduled for release after Dr Strangelove.  Kubrick probably needn’t have worried: Dr Strangelove, a masterpiece of dark humour, was a critical and commercial success while Fail-Safe, although praised by many scholars and military analysts, wasn't well received by reviewers who thought it melodramatic and found the plot implausible, dooming it at the box-office.

US war-room film set for Dr Strangelove.  Upon becoming president in 1981, Ronald Reagan (1911-2004, US president 1981-1989) was reportedly disappointed no Situation Room quite so dramatic actually existed, the actual room in the White House resembling something an insurance company might use for sales-training seminars.  The story is likely apocryphal but there is documentary evidence Mr Reagan did sometimes confuse historic fact with depictions he'd seen in movies.

Pleading in the Alternative

In law, the courtroom tactic of “alternative pleading” is sometimes called a "legal failsafe" but, in the sense of the etymology, that's true only if the tactic works; in some cases it should more correctly be classified as "a last resort".  In US law, “alternative pleading” is the legal strategy in which multiple claims or defenses (which may be mutually exclusive, inconsistent or contradictory) may be filed.  Under the Federal Rules of Civil Procedure, at the point of filing claims are not tested for mutual consistency; a party may thus file a claim or defense which defies the laws of physics or is in some other way technically impossible.  The four key aspects of alternative pleading are:

(1) Cover All Bases: Whatever possible basis might be available in a statement of claim or defense should be invoked to ensure that if reliance on one legal precept or theory fails, others remain available.  Just because a particular claim or defense has been filed, there is no obligation on counsel to pursue each one.

(2) Multiple Legal Fields: A party can plead that different areas of law are at play, even if they would be contradictory if considered together.  A plaintiff might allege a defendant is liable under both breach of contract and, alternatively, unjust enrichment if no contract is found to exist.

(3) Flexibility: Alternative pleading interacts with the “discovery process” (ie going through each other’s filing cabinets and digital storage) in that it allows maximum flexibility in litigation, parties being able to take advantage of previously unknown information.  Thus, pleadings should be structured not only on the basis of “known knowns” but also “unknown unknowns”, “known unknowns” and even the mysterious “unknown knowns”.  He may have been evil but for some things, we should be grateful to Donald Rumsfeld (1932–2021; US defense secretary 1975-1977 & 2001-2006).

(4) No Admission of Facts: By pleading in the alternative, a party does not admit that any of the factual allegations are true but is, in effect, asserting that if one set of facts is found to be true, then one legal theory applies while if another set is found to be true, another applies.  This is another aspect of flexibility which permits counsel fully to present a case without, at the initial stages of litigation, being forced to commit to a single version of the facts or a single legal theory.

In the US, alternative pleading (typically wordy (there was a time when in some places lawyers charged “per word” for documents), lawyers prefer “pleading in the alternative”) generally is permitted in criminal cases, where it can manifest as a defendant simultaneously claiming (1) they did not commit the alleged act, (2) that at the time they committed the act they were afflicted by insanity and are thus, as a matter of law, not criminally responsible, (3) that at the time they committed the act they were intoxicated and thus the extent of their guilt is diminished or (4) that the act committed was justified by some reason such as provocation or self-defense.  Lawyers however are careful in the way the tactic is used because judges and juries can be suspicious of defendants claiming the benefits of both an alibi and self-defense.  When elements in an alternative pleading include a logical inconsistency, it's an example of "kettle logic".

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011. 

Kettle logic

The term “kettle logic” (originally in the French: la logique du chaudron) was coined by French philosopher Jacques Derrida (1930-2004), one of the major figures in the history of post-modernist thought, remembered especially for his work on deconstructionism.  Kettle logic is a category of rhetoric in which multiple arguments are deployed to defend a point, all with some element of internal inconsistency, some actually contradictory.  Derrida drew the title from the “kettle-story” which appeared in two works by the founder of psychoanalysis, Sigmund Freud (1856-1939): The Interpretation of Dreams (1900) & Jokes and Their Relation to the Unconscious (1905).  In his analysis of “Irma's dream”, Freud recounted the three arguments offered by a man who had returned a borrowed kettle in damaged condition.

(1) That the kettle had been returned undamaged.

(2) That the kettle was already damaged when borrowed.

(3) That the kettle had never been borrowed.

The three arguments are inconsistent or contradictory but only one need be found true for the man not to be guilty of causing the damage.  Kettle logic was used by Freud to illustrate the way it’s not unusual for contradictory opposites simultaneously to appear in dreams and be experienced as “natural” in a way which obviously wouldn’t happen in a conscious state.
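For those who like their rhetoric mechanized, the structure can be made explicit: the borrower escapes liability if any single defense is accepted, even though the set as a whole cannot be true of the same facts.  A minimal sketch in Python; the encoding of each defense as a predicate over a made-up fact pattern is purely illustrative, not anything from Freud or Derrida:

```python
from itertools import combinations, product

# Each "kettle logic" defense as a predicate over a hypothetical fact
# pattern (a dict of booleans); the encoding is illustrative only.
defenses = {
    "returned undamaged": lambda f: f["borrowed"] and not f["damaged_on_return"],
    "already damaged":    lambda f: f["borrowed"] and f["damaged_before_loan"],
    "never borrowed":     lambda f: not f["borrowed"],
}

def liable(facts, accepted):
    """The borrower is liable only if every accepted defense fails."""
    return not any(defenses[name](facts) for name in accepted)

# Every conceivable fact pattern (eight combinations of three booleans).
keys = ["borrowed", "damaged_on_return", "damaged_before_loan"]
patterns = [dict(zip(keys, values)) for values in product([True, False], repeat=3)]

# Check which pairs of defenses could ever be true together:
for a, b in combinations(defenses, 2):
    possible = any(defenses[a](f) and defenses[b](f) for f in patterns)
    print(f"'{a}' and '{b}' can both be true: {possible}")
# "never borrowed" can never hold alongside either of the others, yet
# pleading all three still defeats liability if any one is accepted -
# the signature of kettle logic.
```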

Wednesday, March 23, 2022

Ouija

Ouija (pronounced wee-juh (sometimes wee-jee (US)))

(1) An instrument in the shape of a board on which is written the alphabet, the numbers 0-9 and the words "Yes", "No" & "Goodbye" (with occasional additions), the characters selected by a small, heart-shaped piece called a planchette.  The board is used during a séance to contact spirits of the dead, the participants collectively placing their hands on the planchette which is then guided by the spirit(s) to the appropriate letter or number.

(2) As Ouija board, a small-scale replica of an aircraft carrier's flight and hangar decks, installed in the flight control room and manually updated with scale models as a communications fail-safe.  Used on every US carrier since WWII (although now in the throes of being replaced by electronic versions).

1891: A trademark name granted to the Kennard Novelty Company (US), a compound of the French oui (yes) and the German ja (yes).  Oui is from the Old French oïl, a compound of o (the affirmative particle) and il (he), akin to o-je (I), o-tu (thou), o-nos (we) and o-vos (you), all ‘yes’ constructions built with pronouns.  O and òc are both from the Latin hoc (this) and may correspond to the Vulgar Latin construction hoc ille.  Ja is from the Middle High German ja, from the Old High German ja (yes), from the Proto-Germanic ja, from a primitive Indo-European root meaning ‘already’.  It was cognate with the Dutch ja, the English yea (yes) and the Latin iam (already).

Although Ouija, as a proprietary brand-name, dates only from 1891, similar boards existed in China from circa 1100 BC and have long been part of occult and spiritual practice in the West, attaining great popularity in the mid-nineteenth century and again during WWI and its aftermath.

Analog Ouija Board on USS Ronald Reagan aircraft carrier.

Available for niche markets.

Friday, July 7, 2023

Cruise

Cruise (pronounced krooz)

(1) To sail about on a pleasure trip (often as cruising).

(2) To sail about, as a warship patrolling a body of water.

(3) To travel about without a particular purpose or destination.

(4) To fly, drive, or sail at a constant speed that permits maximum operating efficiency for sustained travel.

(5) In aeronautics, the portion of aircraft travel at a constant airspeed and altitude between ascent and descent phases.

(6) To travel at a moderately fast, easily controllable speed.

(7) To travel about slowly, looking for customers or for something demanding attention.

(8) As cruise missile, an intermediate-range weapon.

(9) Among male homosexuals, actively to seek a casual sexual partner by moving about a particular area known to be frequented by those there for such purposes, a productive area being known as “cruisy” (“to troll” & “trolling” were once used as synonyms but those terms have now been claimed by their use on the internet).

(10) In informal use in the US military, a period spent in the Marine Corps.

(11) In casual use in sporting competition, easily to win.

1645-1655:  From the Dutch kruisen (to cross, sail to and fro), from kruis or cruis (cross), from the Middle Dutch cruce, from the Latin crux.  Root was the primitive Indo-European sker (to turn, to bend); etymologists suggest it may be cognate with the Latin circus (circle) and curvus (curve).  In English, it began to be used as a noun in 1706 in the sense of “a voyage taken in courses” and by 1906 as “a voyage taken by tourists on a ship”.  It was related to the French croiser (to cross, cruise), the Spanish cruzar and the German kreuzen.  The alternative spelling cruize is obsolete.  Cruise & cruising are nouns & verbs, cruised is a verb, cruiser is a noun and cruisy is an adjective; the noun plural is cruises.

Cruiser in the sense of "one who or that which cruises" (agent noun from the verb cruise) is from the 1670s, probably borrowed from similar words in continental languages (such as the Dutch cruiser & the French croiseur).  In older use, a cruiser was a warship built to patrol and protect the commerce of the state to which it belonged and to chase hostile ships; cruisers were the classic gun boats used by the European colonial powers for patrolling their empires.  In this use they were often compared to the frigates of old in that they possessed good speed and were employed to protect the trade-routes, to glean intelligence and to act as the “eyes of the fleet”; in casual use during the eighteenth century, the term was often applied to the ships of privateers (pirates).  Cruiser was used to describe homosexuals “cruising for sex partners” (ie frequenting and lingering in places well-known for such things) from 1903 and as a boxing weight class (cruiserweight) from 1920.  The meaning "police patrol car" is a 1929 adoption in American English.

Royal Navy battlecruiser HMS Hood entering Valletta harbor, Malta 1937.

In admiralty use, cruisers are now the largest of the conventional warships still in service.  Navies used to use the term “cruiser” more as a description of the tasks for which the ships were used rather than of the specific nature of the construction, the early cruisers being those ships used for long-range missions such as coastal raiding or scouting and it was only in the late nineteenth century, as the fleets grew and became more specialized, that the classic model of the corvette / frigate / destroyer / cruiser / battleship evolved.  Even then there were distinctions such as light & heavy cruisers but the most interesting development in warship architecture was the battlecruiser, built essentially because the Dreadnought had created “a gap in the market”.  Battlecruisers were battleships with less armor, thereby gaining speed at the cost of greater vulnerability.  The theory was they would have the firepower to out-gun all but the battleships and those they could out-run with their greater speed.  The concept seemed sound and in December 1914, at the Battle of the Falkland Islands, two Royal Navy battlecruisers vindicated the theory when they chased and destroyed the German East Asia Squadron.  However, in 1916, the performance of the battlecruisers in the Jutland engagement forced the Admiralty to re-consider.  Jutland was the closest thing to the great battle of the fleets which had been anticipated for decades but proved anti-climactic, both sides ultimately choosing to avoid the decisive encounter which offered the chance of victory or defeat.  What it did prove was that the naval theorists had been right; the battlecruiser could not fight the battleship and if their paths threatened to cross, the less-armored vessel should retreat and rely on greater speed to make good her escape.  There were technical deficiencies in the British ships, without which perhaps three of their battlecruisers wouldn’t have been lost, but what happened at Jutland made it clear to the admirals that uneven contests between the big capital ships were to be avoided.  The consequence was that the battlecruiser became unfashionable and after the round of disarmament in the 1920s, none were built until, unexpectedly, the Soviet Navy commissioned four in the 1980s.  They proved the last of the breed.

Origin of cruise missiles

US Pershing II ballistic missiles (deployed alongside the BGM-109G ground-launched cruise missiles) at Neu-Ulm military base, Swabia, Bavaria in the then Federal Republic of Germany (the FRG, the old West Germany), 1984.

Carrying large warheads long distances, cruise missiles are guided weapons used against ground targets; they fly at both subsonic and supersonic speeds, remain in the atmosphere and, self-propelled for most of their flight, travel mostly at a constant speed.  In this they differ from ballistic missiles which fly in an arc, often reaching suborbital flight, with a final trajectory much like a bullet because, once the fuel is expended, the path from that point is determined by the speed and direction of launch and the force of gravity pulling towards Earth.  Both cruise and ballistic missiles can carry nuclear warheads but cruise missiles are most often equipped with conventional warheads.  Theorists and researchers were exploring the possibility of military missiles as early as 1908, described then as the aerial torpedo, envisaged as a remote-controlled weapon with which to shoot down airships bombing London, perceived then as the most credible airborne delivery system.  Between the first and second world wars, the major powers all devoted resources to research but few projects reached even the prototype stage.

Annotated schematic of the V-1 (left) and a British Military Intelligence drawing dated 16 June 1944, three days after the first V-1 attacks on London (right).

First deployed in 1944, the German Vergeltungswaffe eins (“retaliatory weapon 1” or “reprisal weapon 1” and eventually known as the V-1) was the world’s first cruise missile.  One of the rare machines to use a pulse-jet, it emitted such a distinctive sound that those at whom it was aimed nicknamed it the “buzz-bomb” although it attracted other names including “flying bomb” and “doodlebug”.  In Germany, before Dr Joseph Goebbels (1897–1945; Reich Minister of Propaganda 1933-1945) decided it was the V-1, the official military code name was Fi 103 (the Fi stood for Fieseler, the original builder of the airframe and most famous for their classic Storch (Stork) short take-off & landing (STOL) aircraft) but there were also the code-names Maikäfer (maybug) & Kirschkern (cherry stone).  While the Allied defenses against the V-1 did improve over time, it was only the destruction of the launch sites and the occupation of territory within launch range that ended the attacks.  Until then, the V-1 remained a highly effective terror weapon but, like the V-2 and so much of the German armaments effort, bureaucratic empire-building and political intrigue compromised the efficiency of the project.

Lindsay Lohan on a cruise in the Maldives, January 2019.

The V-1 used a gyroscope guidance system and was fitted with an unusual triple-layer fuse system, the primary device and a backup augmented by a fail-safe designed to ensure destruction of “duds” (weapons which fail to detonate) so they couldn’t be examined.  The accuracy of the thing was sufficient only for use against very large targets (such as the general area of a city, which made sprawling London ideal) while the range of 250 km (155 miles) was significantly less than that of a medium bomber carrying the same payload.  The main advantages were speed (although not sufficient to outrun the fastest of the low-altitude propeller-driven interceptors), expendability and economy of operation.  Indeed, it was probably the war’s outstanding delivery system in terms of cost per ton of explosive, able to carry a warhead of 850 kg (1,870 lb) to London at a tiny fraction of the cost of using manned aircraft for the same task with the priceless additional benefit of not risking the loss of aircrew.  The production cost of a V-1 was also only a small fraction of that of the supersonic V-2 ballistic missile which carried a warhead of only similar size although, once launched, the V-2 was effectively invulnerable.  Unlike the V-2, the initial deployments of the V-1 required large, fixed launch ramps which were relatively easy to detect and susceptible to bombardment.  Later experiments produced much smaller launch facilities which provided for a greater rate of sustained fire.  Bomber-launched variants of the V-1 saw limited operational service near the end of the war, the pioneering V-1's design later reverse-engineered by the Americans as the Republic-Ford JB-2 cruise missile.

Luftwaffe Mistel aircraft (Focke-Wulf Fw 190 (upper) & Junkers Ju 88 (lower)), Merseburg, Germany, 1945.

The "cruise missile" project which was the best example of the improvisation which characterized much of the ad-hoc weapon development of war time was the Mistel (mistletoe) or Beethoven-Gerät (Beethoven Device) composite aircraft program which the Germans developed in 1943.  It was a rudimentary air-launched cruise missile, made by a piloted fighter aircraft being mounted atop an unpiloted bomber-sized aircraft, packed with explosives and the larger aircraft would be released to glide towards the target.  Calling it the mistletoe reveals a sense of humor mot usually associated with the Luftwaffe but it was known rather more evocatively as the Vati und Sohn (Daddy and Son) or the Huckepack (Piggyback).  Although built in the hundreds, by the time it was available for deployment, the scope for attacking large targets with manned aircraft had reduced and the need was for precision delivery, something for which the Mistel was ill-suited and success was limited.

Thursday, June 24, 2021

Deadman

Deadman (pronounced ded-man or ded-muhn)

(1) In architecture and civil engineering a heavy plate, log, wall, or block buried in the ground that acts as an anchor for a retaining wall, sheet pile etc, usually by a tie connecting the two.

(2) A crutch-like prop, used temporarily to support a pole or mast during the erection process.

(3) In nautical use, an object fixed on shore temporarily to hold a mooring line.

(4) In nautical use, a rope for hauling the boom of a derrick inboard after discharge of a load of cargo.

(5) In mountaineering a metal plate with a wire loop attached for thrusting into firm snow to serve as a belay point, a smaller version being known as a deadboy.

(6) In slang, a bottle of alcoholic drink that has been consumed (ie is empty).

(7) In the operation of potentially dangerous machinery, a control or switch on a powered machine or vehicle that disengages a blade or clutch, applies the brake, shuts off the engine etc, when the driver or operator ceases to press a pedal, squeeze a throttle, etc; known also as the deadman throttle or the deadman control.  The hyphenated form dead-man is often used, both as noun and adjective.  Deadman is a noun and the noun plural is deadmans, which seems ugly, and a resulting formation such as "seven deadmans" is surely clumsy but most authoritative reference sources insist only "deadmans" will do.  Deadmen or dead-men is tolerated (by some liberal types) on the same basis as computer "mice" although "mouses" doesn't jar in the way "deadmans" seems to.

Circa 1895: A compound word, the construct being dead + man.  Dead was from the Middle English ded & deed, from the Old English dēad, from the Proto-West Germanic daud, from the Proto-Germanic daudaz.  The Old English dēad (a dead person; the dead collectively, those who have died) was the noun use of the adjective dead, the adverb ("in a dead or dull manner, as if dead", also "entirely") attested from the late fourteenth century, again derived from the adjective.  The Proto-Germanic daudaz was the source also of the Old Saxon dod, the Danish død, the Swedish död, the Old Frisian dad, the Middle Dutch doot, the Dutch dood, the Old High German tot, the German tot, the Old Norse dauðr & the Gothic dauþs.  It's speculated the ultimate root was the primitive Indo-European dheu (to die).  Man was from the Middle English man, from the Old English mann (human being, person, man), from the Proto-West Germanic mann, from the Proto-Germanic mann (human being, man), probably from the primitive Indo-European mon- (man) (the related men- having the meaning “mind”); a doublet of manu.  The specific sense of “adult male of the human race” (distinguished from a woman or boy) was known in the Old English by circa 1000.  Old English used wer and wif to distinguish the sexes, but wer began to disappear late in the thirteenth century, replaced by mann and increasingly man.  Man was also used in Old English as an indefinite pronoun (one, people, they) and used generically for "the human race, mankind" by circa 1200.  It was cognate with the West Frisian man, the Dutch man, the German Mann (man), the Norwegian mann (man), the Old Swedish maþer (man), the Swedish man, the Russian муж (muž) (husband, male person), the Avestan manš, the Sanskrit मनु (manu) (human being), the Urdu مانس‎ and the Hindi मानस (mānas).  Although often thought a modern adoption, use as a word of familiar address, originally often implying impatience, is attested as early as circa 1400, hence probably its use as an interjection of surprise or emphasis since Middle English.  It became especially popular from the early twentieth century.

Calameo Dual-purpose MIL-SIM-FX mechanical dead-man and detonator switch (part-number MIL-12G-DMS).

The source of the name is the idea that if something is likely in some way to be dangerous if uncontrolled, operation should be possible only while some device is held in a state which demands a person neither dead nor in some debilitated condition.  The classic example is the train driver; if the driver does not maintain the switch in the closed position, the train slows to a halt.  Some manufacturers describe the whole assembly as a "deadman's brake" and the part which is subject to human pressure as the "deadman's switch" (or "deadman's handle").  The phrase "dead man's fingers" is unrelated and is used variously in zoology, botany and in cooking and "dead man's rope" is a kind of seaweed (a synonym of sea-laces).  The legend of the "dead man's hand" (various combinations of aces and eights in poker) is based on the cards in the hand held by the unfortunate "Wild Bill" Hickok (1837–1876) when shot dead at the poker table.  A "dead man's arm" was a traditional English pudding, steamed and served in the cut-off sleeve of a man's shirt.  The phrase "dead man walking" began as US prison slang to refer to those on death row awaiting execution and it's since been adopted to describe figures like politicians, coaches, CEOs and the like who are thought about to be sacked.  Reflecting progress in other areas, dictionaries now list both "dead woman walking" and "dead person walking" but there is scant evidence of use.
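The train driver's switch reduces to a simple watchdog: a timer which must continually be reset by the operator, the fail-safe action firing if it ever expires.  Below is a minimal sketch in Python of that pattern; the class name, timings and brake action are all hypothetical, chosen only to illustrate the closed-loop idea:

```python
import threading
import time

class DeadmanSwitch:
    """A watchdog which must be 'pressed' at least every `timeout`
    seconds, otherwise the fail-safe action fires (names hypothetical)."""

    def __init__(self, timeout, on_release):
        self.timeout = timeout        # seconds of inactivity tolerated
        self.on_release = on_release  # fail-safe action, e.g. apply brakes
        self._last_press = time.monotonic()
        threading.Thread(target=self._watch, daemon=True).start()

    def press(self):
        """Called by the operator; proof they remain alive and attentive."""
        self._last_press = time.monotonic()

    def _watch(self):
        while True:
            if time.monotonic() - self._last_press > self.timeout:
                self.on_release()     # operator silent too long: fail safe
                return
            time.sleep(0.1)

if __name__ == "__main__":
    # The 'train' brakes unless the driver presses the switch every 2 seconds.
    switch = DeadmanSwitch(timeout=2.0, on_release=lambda: print("Brakes applied"))
    for _ in range(3):
        time.sleep(1.0)
        switch.press()                # attentive driver: nothing happens
    time.sleep(3.0)                   # driver 'incapacitated': watchdog fires
```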

May have come across the odd dead man: Lindsay Lohan in hoodie arriving at the Los Angeles County Morgue to perform court-ordered community service, October 2011.

Deadman and the maintenance of MAD

The concept of nuclear deterrence depends on the idea of mutually assured destruction (MAD): that retaliation would be certain because, even if a nuclear first-strike destroyed the usual command and control structures of an adversary, that would not guarantee there wouldn’t be a nuclear counter-strike.  All front-line nuclear-weapon states employ systems to ensure a residual capacity to retaliate, even after suffering a catastrophic first strike, the best known of which are the Russian Мертвая рука (Dead Hand) and the US AN/DRC-8 (Emergency Rocket Communications System), both of which are often referred to as doomsday devices.  Both exist to close the strategic nuclear strike control loop and were inventions of the high Cold War, the USSR’s system later taken over by the successor Russian state.  The metaphor of the deadman is accurate to the extent both depend on a loop being kept closed, the difference lying in the consequences of it opening.

Test launch of ground-based Russian RS-24 Yars ICBM from the Plesetsk facility in northwestern Russia, 9 December 2020.

The most extreme scenario is one in which there is left not a living soul with access to the loop.  In this case, the system switches from one in which a positive act is required to instigate a launch of ballistic missiles to one in which a positive act is required to prevent the attack; it is thus dubbed fail-deadly, the reverse of the fail-safe systems designed to prevent inadvertent launches.  The doomsday systems use a variety of mechanical and electronic monitoring protocols designed to (1) detect that a strike has occurred, (2) determine the extent of damage and (3) attempt to maintain or restore the usual communication channels of the military chain of command.  If the systems determine worst-case circumstances exist, a retaliatory launch of intercontinental ballistic missiles (ICBMs) will be triggered.  Neither the Kremlin nor the Pentagon tend to comment on such things but, over the years, there have been (what are assumed to be managed) leaks that the systems are usually inactive and activated only during times of crisis but the veracity of this is unknown.
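The inversion is easiest to see as a pair of default rules.  The sketch below (Python, with an entirely hypothetical and much-simplified state snapshot) contrasts the two: the fail-safe rule launches nothing without a positive order, while the fail-deadly rule, once the chain of command is unreachable and the worst-case conditions of steps (1) and (2) are met, launches unless positively stopped:

```python
from dataclasses import dataclass

@dataclass
class Picture:
    """Hypothetical snapshot of what the monitoring system believes."""
    strike_detected: bool    # (1) sensors report a nuclear strike
    damage_extensive: bool   # (2) assessed damage is worst-case
    command_link_up: bool    # (3) normal chain of command still reachable
    launch_order: bool       # explicit order to retaliate
    stand_down: bool         # explicit order NOT to retaliate

def fail_safe(p: Picture) -> bool:
    """Fail-safe default: nothing launches without a positive order;
    any failure (lost link, no order) leaves the missiles in their silos."""
    return p.command_link_up and p.launch_order

def fail_deadly(p: Picture) -> bool:
    """Fail-deadly default: while the chain of command is reachable it
    stays in charge; once it is lost and the worst case is confirmed,
    only a positive act (a stand-down) prevents launch."""
    if p.command_link_up:
        return p.launch_order
    return p.strike_detected and p.damage_extensive and not p.stand_down

# A decapitation strike: link severed, so no order of any kind can arrive.
worst_case = Picture(strike_detected=True, damage_extensive=True,
                     command_link_up=False, launch_order=False,
                     stand_down=False)
print(fail_safe(worst_case))    # False - the fail-safe system stays silent
print(fail_deadly(worst_case))  # True  - the fail-deadly system retaliates
```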

Royal Navy test launch of UGM-133 Trident II nuclear submarine-launched ballistic missile (SLBM) from Vanguard class submarine HMS Vigilant, 28 October 2012.

One obvious theoretical vulnerability in the USSR’s and US systems is that at points it is electronic and therefore reliant on hardware, software and an energy source.  The UK government has an entirely analogue system which uses only pen and paper.  Known as letters of last resort, each incoming prime minister writes, in their own hand, four identical letters which are placed in a sealed envelope, given to the captain of each of the navy’s ballistic missile submarines who keeps it in his on-board safe.  The letters are only to be opened if an enemy (presumably nuclear) strike has damaged the chain of command to the extent it is no longer possible for the civilian government to instruct the military on what retaliatory action to take.  As soon as a prime-minister leaves office, the letters are, unopened, destroyed and replaced with ones from the new premier.  Those circumstances requiring a letter to be opened have never transpired and no prime-minister has ever commented publicly on what they wrote so the contents remain a genuine secret, known only to the writer and whomever they told.  So, although the only people who know the contents have never spoken, the consensus has long been the captains are likely to be given one of four options: 

(1) Retaliate.  Each of the four submarines is armed with up to sixteen Trident II SLBMs (submarine-launched ballistic missiles), each missile equipped with up to twelve independently targeted warheads with a range of 7,000 miles (11,000 km).  There is always at least one at sea and the Admiralty never comments on its location although, in times of heightened political tension, additional boats may be activated.

(2) Not retaliate.

(3) The captains should use their judgment.  This, known as the “man on the ground” doctrine, has a long tradition in the military although it has in some circumstances been rendered redundant by advances in real-time communications.  In this case, it’s “the man under the water”.  An interesting question which touches on constitutional, international and military law is the point at which a state ceases to exist and the orders of a regime can no longer legally be said to be valid.

(4) Proceed to a place under an allied country's command or control.

Friedrich Wilhelm Nietzsche (1844-1900).

There is also a probably unexplored fifth option: a prime-minister could leave in the envelope a blank page.  This presumably would be substantively the same as option (3) but would send a different political message to be mulled over in whatever remained of civilization.  No prime-minister has ever commented publicly on the thoughts which crossed their minds when writing these notes but perhaps some might have recalled Nietzsche’s words in Beyond Good and Evil: Prelude to a Philosophy of the Future (1886): "He who fights with monsters might take care lest he thereby become a monster.  And if you gaze for long into an abyss, the abyss gazes also into you."  Although troubled when he wrote that, he wasn't yet quite mad.

Wednesday, June 17, 2020

Bosnywash

Bosnywash (pronounced baws-nee-wosh, bos-nee-wash or boz-nee-wawsh (varies in the US by locality))

An informal noun describing the densely populated conurbation extending from Boston to Washington, encompassing New York City, Philadelphia, and Baltimore.

1971 (1967 for Boswash):  The construct was Bos(ton) + n(ew) y(ork) + wash(ington) and the form was always either Bosnywash or bosnywash, boswash following the same convention.  The constructs come from the age of the typewriter and not only would BosNYWash have been harder to type and read, the use of initial capitals in the elements of portmanteaus, blends or contractions was not a practice which came into wide use in English until the 1980s, under the influence of the IT industry which was searching for points of differentiation.

It’s debatable whether Bosnywash is a portmanteau or a contraction.  A portmanteau word is a blend of two or more words or parts of words, combined to create a new word.  A contraction is a word created by joining two or more words which tend in normal use to appear in sequence.  The stems of words which comprise a contraction are not truncated, so which side one takes in this doubtlessly pointless debate hangs on whether one regards the accepted short forms Bos, NY & Wash as “words” for the technical purpose of construction.  The rules of structural linguistics complicate things further because if a portmanteau is created by using already shortened compounds, the result can also be defined as a clipped compound.  Quite what interpretation would apply to Bosnywash, derived as it was from Boswash, would presumably be less certain still but most regard both as portmanteaus.

Boswash and Bosnywash mean exactly the same thing: a densely populated conurbation extending from Boston in the north to Washington in the south, encompassing New York City, Philadelphia, and Baltimore.  The concept of cities expanding to envelop an entire surrounding land mass to exist as one vast megalopolis was the vision of US systems theorist Herman Kahn (1922–1983) who in 1967 coined the word Boswash for one of his essays speculating about the future.  While the word Boswash was novel, the idea that part of the north-eastern US might develop into one contiguous populated area had been discussed by urban geographers for almost a decade and it was just one of several places where such coalescence had been predicted.  The idea of vast and expanding cities had been noted as a demographic phenomenon for centuries but the sudden acceleration of the global population, beginning in the mid-nineteenth century (the causes: (1) the wide deployment of modern Western medical techniques which simultaneously lowered the infant mortality rate & lengthened the typical human lifespan, (2) the installation of sanitation systems which reduced disease, (3) vaccinations against disease and (4) increases in agricultural output (the so-called “green revolution”)) focused the attention of economists and urban geographers who, extrapolating historic and contemporary trends, developed the concept of the modern mega-city.  Bosnywash (phonetically, probably a more attractive word) appeared half-a-decade after Boswash in The Bosnywash Megalopolis: A Region of Great Cities (1971) by Leonard Arthur Swatridge (b 1931) and was likely an exercise in legitimization, folk in NYC never likely to take much notice of anything which doesn’t include their city.  There, if it didn't happen in New York, it didn’t happen.

South-east Queensland (Australia) and the trend towards the Gold Coast-Brisbane-Sunshine Coast megalopolis.

The idea has been applied to many areas of high population growth and increasing urbanization (globally, the dominant trend of the last seventy-five years) where cities and towns grow towards each other.  The south-east corner of the Australian state of Queensland is noted as an instance of the way what was once a transport corridor tends to develop into a megalopolis, stretching from the southern border of the Gold Coast to the northern extremes of the Sunshine Coast.  The word megalopolis was from 1832, the construct being the Ancient Greek megalo- (great), from megas (genitive megalou) + -polis (city).  It was used to describe a big, densely populated urban complex and during Antiquity was an epithet of the great cities (Athens, Syracuse, Alexandria); it was also the name of a former city in Arcadia.  The rarely used descriptor of an inhabitant was megalopolitan.

Herman Kahn is remembered as a futurist but he built his early career as a systems theorist and, while at the RAND Corporation, was prominent in constructing the theoretical framework on which the US political-military establishment built the strategies dictating the scope and form of the nuclear arsenal and the plans for its use.  Perhaps the highest-stakes version ever undertaken of what came to be known as scenario planning under the application of game theory, Kahn’s models were among those which, in a reductionist process, led to some awful yet elegant expressions such as “mutually assured destruction (MAD)”, triggering a generation of specialists in the Pentagon and the Kremlin counting missiles as the basis of high Cold War politics.  Kahn was reputedly one of the figures who served as an inspiration for the title character in Stanley Kubrick's (1928-1999) dark satire Dr Strangelove (1964) and, unsurprisingly, for the character of Professor Groeteschele in Sidney Lumet's (1924-2011) more somber film of nuclear war, Fail Safe (1964).

Bosnywash personified: Lindsay Lohan (with former special friend Samantha Ronson), Estate Nightclub, Boston, January 2009 (left), shopping in New York City, September 2013 (centre) & at the White House Correspondents' Dinner, Washington DC, April 2012 (right).

Wednesday, December 4, 2024

Snoot

Snoot (pronounced snoot)

(1) In slang, the nose (of humans, animals, geological formations, distant galaxies and anything else with a feature even vaguely “nose-like”).

(2) In slang, an alcoholic drink.

(3) In slang, a police officer (especially a plain-clothed detective, the use explained by the notion of police “sticking their noses into” things).

(4) In clothing, the peak of a cap.

(5) In photography and film production, a cylindrical or conical fitment on a studio light to control the scene area illuminated by restricting spill light.

(6) In informal use, a snob; an elitist individual; one who looks down upon those “not of the better classes”.

(7) In linguistics, a language pedant or snob; one who practices linguistic elitism (and distinct from a “grammar Nazi”).

(8) In engineering, as “droop snoot”, a design in which the nose of a machine is lowered (temporarily or permanently) for reasons of visibility or to optimize aerodynamics.

(9) To behave disdainfully toward; to condescend to (usually as “snooty”).

(10) To apply a snoot attachment to a light.

1861: From the Scots snoot (a variation of snout (nose or projecting feature of an animal)), from the Middle English snowte, from the Middle Dutch snute, ultimately from the Proto-West Germanic snūt, from the Proto-Germanic snūtaz, source also of the German Schnauze (the basis of schnauzer, a name for a type of dog); the slang schnoz (a nose, especially if large) is presumed related.  Snoot is a noun & verb, snootiness, snooter & snootful are nouns, snooting & snooted are verbs, snooty, snootier & snootiest are adjectives and snootily is an adverb; the noun plural is snoots.

Lindsay Lohan's snoot.

The noun snootful dates from 1885 and was a synonym of skinful (to have imbibed as much liquor as one could manage).  It was based on the use of snout to mean “an alcoholic drink” whereas skinful was an allusion to the time when wine was transported in containers made from animal skin (ie in original use skinful meant “the container is full”).  The adjective snooty (proud, arrogant) was first noted as university student slang in 1918 and presumably was in some way related to the earlier snouty (insolent, overbearing) which was in use by at least 1857, doubtlessly on the basis of “looking down one's nose at someone or something”.  In dialectal or slang use a snout (in the sense of “nose”) is not of necessity derogatory and in fields like engineering, cosmology, geography, geology or zoology, it is merely descriptive.  However, when used as a slang term for a snob (a snooty person), the sense is almost always negative although there are some elitists who are proud of their snootiness.  Those who don’t approve of barbarisms such as country & western music sometimes make sure their snootiness is obvious but as a general principle it’s usually better just to ignore such things.  The adjective snooty is in much more common use than the noun snoot and it appears often with a modifier such as “a bit snooty”.  That may seem strange because one is either snooty about someone or something or one isn’t but there are degrees of severity with which one can allow one's snootiness to manifest (the comparative “snootier”, the superlative “snootiest”).

In engineering, “droop snoot” is used to describe a design in which the nose of a machine is lowered (temporarily or permanently) for reasons of visibility or to optimize aerodynamics.  The term was apparently first used among engineers in the late 1950s while working on the first conceptual plans for the Anglo-French supersonic airliner which became the Concorde although the first known use in print dates from 1963 (“droop nose” appearing in the same era).  The idea wasn’t developed for use on the Concorde.  An experimental British supersonic test-bed with a droop-nose had flown as early as 1954 and proved the utility of the concept by being the first jet aircraft to exceed 1000 mph (1600 km/h) in level flight, later raising the world speed record to 1132 mph (1822 km/h), exceeding the previous mark by an impressive 310 mph (500 km/h).  In aviation, the basic idea of a sloping nose had been around for decades and one of the reasons some World War II (1939-1945) Allied fighter pilots found targeting easier in the Hawker Hurricane than the Supermarine Spitfire was that the nose of the former noticeably tapered towards the front, greatly enhancing forward visibility.

How the Concorde's droop snoot was used.

On the Concorde, the droop snoot wasn’t a mere convenience.  The combination of the engineers' slide-rules and wind tunnel testing had proved what the shape had to be to achieve the combination of speed and fuel economy (the latter an under-estimated aspect of the development process) but that shape also meant the pilots’ view was so obstructed during take-offs, landings and taxiing that safety was compromised.  The solution was the “droop nose” mechanism which included a moving transparent visor which retracted into the nose prior to it being lowered.  At supersonic speeds, the temperatures are high and so are the stresses, so much attention was devoted to “fail-safe” systems including the droop snoot because a structural failure at Mach 2 would potentially be catastrophic for the entire airframe (and obviously every soul on board).  Thus, the hydraulic systems controlling the droop snoot’s movement were duplicated and, as a last resort, the pilots had access to a simple mechanical lever which would disengage the pins holding the structure in place, the apparatus afterwards gracefully (hopefully) descending into its lowered position by the simple operation of gravity.  Droop snoots appeared also on Soviet supersonic aircraft including the short-lived Tupolev Tu-144 (visually close to a Concorde clone) and the Sukhoi T-4 strategic bomber which never entered production.  Interestingly, the USAF’s (US Air Force) North American XB-70 Valkyrie (a Mach 3 experimental bomber) didn’t use a droop snoot because it was developed exclusively for high-altitude, high-speed strategic bombing missions and, being a military airplane, would only ever operate from large, controlled airbases where additional ground support systems (monitoring and guidance) negated the need for the mechanism.
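The lowering logic is a textbook failover chain: powered paths first, then a passive path needing nothing but gravity.  A minimal sketch in Python of that chain follows; the function and state names are hypothetical illustrations, not Concorde's actual control logic:

```python
def lower_droop_snoot(primary_hydraulics_ok: bool, standby_hydraulics_ok: bool) -> str:
    """Try each actuation path in order of preference; the last resort is
    purely mechanical and so cannot be disabled by a power failure."""
    if primary_hydraulics_ok:
        return "lowered by primary hydraulic circuit"
    if standby_hydraulics_ok:
        return "lowered by standby hydraulic circuit"
    # Final fallback: pull the lever, release the locking pins and let
    # gravity swing the nose down - no hydraulics or electrics required.
    return "locking pins released; nose lowered by gravity"

# Degraded states, from fully functional to total hydraulic failure.
for state in [(True, True), (False, True), (False, False)]:
    print(lower_droop_snoot(*state))
```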

1955 Ford Customline (left) and the 1967 “droop snoot” “Custaxie” (right), the construct being Cust(omline) + (Gal)axie, the unusual hybrid created by merging (some of) a 1955 Customline with a 427 cubic inch (7.0 litre) Ford Galaxie V8.  The bizarre machine won the 1967 New Zealand Allcomers (a wonderful concept) saloon car championship, the modifications to the nose reckoned to be the equivalent of an additional 40-50 horsepower.

Throughout the 1960s, at sub-supersonic speeds, race-cars proved the virtue of the droop snoot (though as a fixed rather than a moveable structure).  While sometimes weight-reduction was also attained, overwhelmingly the advantage was in aerodynamics and the idea began to spread to road cars although it would be decades before the concept would no longer be visually too radical for general market acceptance.

1972 Vauxhall Firenza coupé promotional material for the Canadian launch, a market in which the car was a disaster (left) and 1975 High Performance (HP) Firenza "droopsnoot" (right).  GM in South Africa actually made a good car out of the Firenza coupé, building 100 (for homologation purposes) with the 302 cubic inch (4.9 litre) V8 used in the original Z/28 Chevrolet Camaro.  In South Africa, they were sold as the "Chevrolet Firenza".

In 1973, officially, Vauxhall called their new version of the Firenza coupé the “High Performance (HP) Firenza” but quickly the press, noting the Concorde (then still three years from entering commercial service), dubbed it the “droopsnoot”, the reference obviously to the distinctive nosecone designed for aerodynamic advantage.  The advantages were real in terms of performance and fuel consumption but Vauxhall had the misfortune to introduce the model just as the first oil crisis began which stunted demand for high-performance cars (BMW’s 2002 Turbo another victim) and triggered a sharp recession which was a prelude to that decade’s stagflation.  Vauxhall had planned a build of some 10,000 a year but in the difficult environment, a paltry 204 were built.

A Ford Escort Mark 2 in the 1977 Rally of Finland (left) and a 1976 Escort RS2000 with the droop snoot (right).

In 1976, Ford launched their own take on the droop snoot, the Mark 2 Escort RS2000 featuring a similar mechanical specification to that of the Mark 1 but with a distinctive nosecone.  Ford claimed there was an aerodynamic benefit in the new nose but it was really a styling exercise designed to stimulate interest because the Escort was the corporation’s platform for rallying rather than something used on high-speed circuits and it certainly achieved the desired results, the model proving popular.  Ford Australia even offered it with four doors as well as two although emission regulations meant the additional horsepower on offer in Europe was denied to those down under.  Interestingly, although the range’s high-performance flagship, the factory rally team didn’t use the droop snoot version, those in competition using the standard, square-fronted body.

Godox Pro Snoot S-Type Mount SN-05