
Wednesday, April 26, 2023

Materiel

Materiel (pronounced muh-teer-ee-el)

(1) In military use, arms, ammunition, and military equipment in general.

(2) The aggregate of things used or needed in any business, undertaking, or operation as distinguished from personnel (rare).

1814: A borrowing from the French matériel (equipment; hardware), from the Old French, from the Late Latin māteriālis (material, made of matter), from the Classical Latin māteria (wood, material, substance), from māter (mother).  Ultimate source was the primitive Indo-European méh₂tēr (mother).  Technically, materiel refers to supplies, equipment, and weapons in military supply chain management, and typically supplies and equipment only in a commercial context, but it tends most to be used to describe military hardware and then to items specific to military use (ie not the office supplies etc used by armed forces personnel).  Materiel is a noun; the noun plural is materiels.

Illustrating military materiel: Lindsay Lohan does Top Gun by BlueWolfRanger95 on Deviant Art.  An aircraft is materiel as is a pilot's flight kit.  Just about every piece of equipment in this photo would be classed as materiel except perhaps the aviator sunglasses (may be a gray area).  Even non-combat, formal attire like dress shirts and ties are regarded by most military supply systems as materiel so materiel can be made from material.

Materiel is sometimes notoriously, scandalously and even fraudulently expensive, tales of the Pentagon's purchase of US$1000 screwdrivers, toilet seats and such being legion.  Of late though, there have been some well-publicized economies, the US Navy's latest Virginia-class submarine using an Xbox controller for the operation of its periscope rather than the traditional photonic mast system and imaging control panel.  The cost saving is approximately US$38,000 and there are the advantages that (1) replacements are available over-the-counter at video game stores world-wide, (2) the young sailors operating the controller are almost all familiar with its feel and behavior and (3) users report it's much better to use than the heavy, clunky and less responsive standard device.  In the military context, materiel refers either to the specific needs (excluding manpower) of a force to complete a specific mission, or in the general sense to the needs (excluding personnel) of a functioning force.  Materiel management is an all-encompassing term covering planning, organizing, directing, coordinating, controlling, and evaluating the application of resources to ensure the effective and economical support of military forces.  It includes provisioning, storing, requirements determination, acquisition, distribution, maintenance, and disposal.  In the military, the terms "materiel management", "materiel control", "inventory control", "inventory management" and "supply management" are synonymous.

DPRK personnel: DPRK female soldiers stepping out, seventieth anniversary military parade, Pyongyang, September 2018.  Note the sensible shoes, an indication of the Supreme Leader’s thoughtfulness.

The French origins of materiel and personnel are usefully illustrative.  The French matériel (the totality of things used in the carrying out of any complex art or technique (as distinguished from the people involved in the process(es))) is a noun use of the adjectival matériel and a later borrowing of the same word that became the more familiar noun material. By 1819, the specific sense of "articles, supplies, machinery etc. used in the military" had become established.  The 1837 personnel (body of persons engaged in any service) is from the French personnel and was originally specific to the military, a contrastive term to materiel and a noun use of the adjectival personnel (personal), from the Old French personel.

DPRK materiel: Mock-ups of the Pukguksong-5 SLBM displayed at a military parade marking the conclusion of North Korea’s Workers’ Party congress (the first since 2016), Pyongyang, January 2021.

In January 2021, the DPRK (North Korea) included in a military parade what appeared to be mock-ups of what was described as the Supreme Leader’s latest submarine-launched ballistic missile (SLBM), the supposedly new Pukguksong-5.  Apparently, and predictably, it was an evolution of the Pukguksong-4 paraded a few months earlier; although retaining a similar 6 foot (1.8m) diameter, the payload shroud appeared about 28 inches (700mm) longer, suggesting the new SLBM’s estimated length is circa 35 feet (10.6m).  Given the constraints of submarine launch systems, the dimensions are broadly in line with expectations but do hint the DPRK has yet to finalize a design for its next-generation SLBM.  Nor have there been recent reports of the regime testing any big solid-rocket motors, which is thought to confirm the view of Western analysts that development is in the early stages.

Pukguksong-4, October 2020.

As a brute force device, with performance measured merely by explosive force, it’s possible, based on the dimensions, the DPRK could match similarly sized Western SLBMs.  However, the US Navy’s Poseidon multiple-warhead SLBM, which uses two solid-fuel stages and has a range of over 2800 miles (4800 km), uses very high-energy propellants and a light-weight structure, directed by sophisticated navigation, guidance and control systems.  It also features some very expensive engineering tricks, such as rocket exhaust nozzles submerged within the rocket stages, reducing the length and thereby allowing it to be deployed in the confined launch tube.  Lacking the US’s technological and industrial capacity, the Pukguksong-5 is expected to be more rudimentary in design, construction and propellant technology, its range therefore likely not to exceed 1900 miles (3000 km), and almost certainly it won’t be capable of achieving the same precision in accuracy.

Wednesday, November 29, 2023

Accouterment

Accouterment (pronounced uh-koo-ter-muhnt or uh-koo-truh-muhnt)

(1) A clothing accessory or a piece of equipment regarded as an accessory (sometimes essential, sometimes not, depending on context).

(2) In military jargon, a piece of equipment carried by a soldier, excluding weapons and items of uniform.

(3) By extension, an identifying yet superficial characteristic; a characteristic feature, object, or sign associated with a particular niche, role, situation etc.

(4) The act of accoutering; furnishing (archaic since Middle English).

1540-1550: From the Middle French accoutrement & accoustrement, from accoustrer, from the Old French acostrer (arrange, sew up).  As in English, in French the noun accoutrement was used usually in the plural (accoutrements) in the sense of “personal clothing and equipment”, from accoustrement, from accoustrer, from the Old French acostrer (arrange, dispose, put on (clothing); sew up).  In French, the word was used in a derogatory way to refer to “over-elaborate clothing” but was used neutrally in the kitchen, chefs using the word of additions to food which enhanced the flavor.  The verb accouter (also accoutre) (to dress or equip (especially in military uniforms and other gear)) was from the French acoutrer, from the thirteenth century acostrer (arrange, dispose, put on (clothing)), from the Vulgar Latin accosturare (to sew together, sew up), the construct being ad- (to) + consutura (a sewing together), from consutus, past participle of consuere (to sew together), the construct being con- + suere (to sew), from the primitive Indo-European root syu- (to bind, sew).  The Latin prefix con- was from the preposition cum (with), from the Old Latin com, from the Proto-Italic kom, from the primitive Indo-European ḱóm (next to, at, with, along).  It was cognate with the Proto-Germanic ga- (co-), the Proto-Slavic sъ(n) (with) and the Proto-Germanic hansō.  It was used with certain words to add a notion similar to those conveyed by with, together, or joint, or with certain words to intensify their meaning.  The synonyms include equipment, gear, trappings & accessory.  The spelling accoutrement (accoutrements the plural) remains common in the UK and much of the English-speaking world which emerged from the old British Empire; the spelling in North America universally is accouterment.  The English spelling reflects the French pronunciation used in the sixteenth century.  Accouterment is a noun; the noun plural (by far the most commonly used form) is accouterments.

In the military, the equipment supplied to (and at different times variously worn or carried by) personnel tends to be divided into "materiel" and "accouterments".  Between countries, at the margins, there are differences in classification but as a general principle: Materiel is the core equipment, supplies, vehicles, platforms etc used by a military force to conduct its operations.  This definition casts a wide vista and covers everything from a bayonet to an inter-continental ballistic missile (ICBM), from motorcycles to tanks and from radio equipment to medical supplies.  Essentially, in the military, “materiel” is used broadly to describe tangible assets and resources used in the core business of war.  Accouterments are the items or accessories associated with a specific activity or role.  In some cases, an item classified as an accouterment could with some justification be called materiel and there is often a tradition associated with the classification.  In the context of clothing for example, the basic uniform is materiel whereas things like belts, holsters, webbing and pouches are accouterments, even though the existence of these pieces is essential to the efficient operation of weapons which are certainly materiel.

The My Scene Goes Hollywood Lindsay Lohan Doll was supplied with a range of accessories and accouterments.  Items like sunglasses, handbags, shoes & boots, earrings, necklaces, bracelets and the faux fur "mullet" frock-coat were probably accessories.  The director's chair, laptop, popcorn, magazines, DVD, makeup case, stanchions (with faux velvet rope) and such were probably accouterments.

In the fashion business, one perhaps might be able to create the criteria by which it could be decided whether a certain item should be classified as “an accessory” or “an accouterment” but it seems a largely pointless exercise and, were one to reverse the index, a list of accessories would likely be as convincing as a list of accouterments.  Perhaps the most plausible distinction would be to suggest accessories are items added to an outfit to enhance or complete the look (jewelry, handbags, scarves, hats, sunglasses, belts et al) while accouterments are something thematically related but in some way separate; while one might choose the same accessories for an outfit regardless of the event to be attended, the choice of accouterments might be event-specific.  So, the same scarf might be worn because it works so well with the dress but the binoculars would be added only if going to the races, the former an accessory to the outfit, the latter an accouterment for a day at the track.  That seems as close as possible to a working definition but many will continue to use the terms interchangeably.

Wednesday, May 3, 2023

Deadline

Deadline (pronounced ded-lahyn)

(1) The time by which something must be finished or submitted; the latest time for finishing something.

(2) A line or limit that must not be passed; a time limit for any activity.

(3) Historically, a boundary around a military prison beyond which a prisoner could not venture without the risk of being shot by the guards.

(4) A guideline marked on a plate for a printing press, indicating the point beyond which text would not be printed (archaic).

(5) Historically, a fishing line that has not for some time moved (indicating it might not be a productive place to go fishing).

(6) In military use, to render an item non-mission-capable; to remove materiel from the active list (available to be tasked); to ground an aircraft etc.

1864: The construct was dead + line.  Dead was from the Middle English ded, from the Old English dead (having ceased to live (also “torpid, dull” and of water “still, standing, not flowing”)), from the Proto-Germanic daudaz (source also of the Old Saxon dod, the Danish død, the Swedish död, the Old Frisian dad, the Middle Dutch doot, the Dutch dood, the Old High German & German tot, the Old Norse dauðr and the Gothic dauþs), a past-participle adjective based on dau-, which (though this is contested by etymologists) may be from the primitive Indo-European dheu- (to die).  Line (in this context) was from the Middle English line & lyne, from the Old English līne (line, cable, rope, hawser, series, row, rule, direction), from the Proto-West Germanic līnā, from the Proto-Germanic līnǭ (line, rope, flaxen cord, thread), from the Proto-Germanic līną (flax, linen), from the primitive Indo-European līno- (flax).  The development in Middle English was influenced by the Middle French ligne (line), from the Latin linea (linen thread, string, plumb-line (also “a mark, bound, limit, goal; line of descent”)).  The earliest sense in Middle English was “a cord used by builders for taking measurements” which by the late fourteenth century extended to “a thread-like mark”, which led to the notion of “a track, course, direction; a straight line”.  The sense of a “limit, boundary” dates from the 1590s and was applied to the geographical lines drawn to divide counties.  The mathematical sense of “length without breadth” (ie describing the line drawn between points (dimensionless places in space)) was formalized in the 1550s and in the 1580s the “equatorial line” was used to describe the Earth’s equator.  Other languages including Dutch, Finnish, Italian & Polish picked up deadline from English in unaltered form while the word also entered use in many countries for use in specific industries (journalism, publishing, television, printing etc).
Deadline & deadliner are nouns; deadlining & deadlined are verbs and postdeadline is an adjective; the noun plural is deadlines.

In the oral tradition, a deadline (which probably should be recorded as “dead line”) was a fishing line which for some time after being cast hadn’t moved, indicating it might not be a productive place to go fishing.  The source of the first formalized meaning (a line which must not be crossed) was a physical line, the defined perimeter boundary line of prisoner of war (PoW) camps during the US Civil War (1861-1865): any prisoner going beyond the “deadline” was liable to be shot (and thus perhaps recorded as “SWATE”: “shot while attempting to escape”).  Despite the name, the Civil War records indicate the deadline was rarely marked-out as a physical, continuous line but was instead defined by markers such as trees, signposts or features of the physical environment.  However, the word appears not to have caught on in any sense until 1917 when it was used to describe the guideline on the bed of printing presses which delineated the point past which text would not print.  It seems the word migrated from the print room to the news room because by 1920 it was used in journalism in the familiar modern sense of a time limit: copy provided after a specified time would not appear in the printed edition because it had “missed the deadline”.  From this use emerged “postdeadline” (after the deadline has passed), which sometimes existed as a red stamp an editor would use when returning copy to a tardy journalist, “deadliner” (a journalist notorious for submitting copy only seconds before a deadline) and the “deadline fighter” (a journalist who habitually offered reasons why their postdeadline copy should be accepted for publication).  Writers often dread deadlines but there are those who become sufficiently successful not to be intimidated.  The English author Douglas Adams (1952–2001), famous for The Hitchhiker’s Guide to the Galaxy (1979), wrote in the posthumously published collection The Salmon of Doubt: Hitchhiking the Galaxy One Last Time (2002): “I love deadlines.  I love the whooshing noise they make as they go by”.  Few working journalists enjoy that luxury.  Other similar expressions include “zero hour”, “cutoff date” and the unimaginative “time limit”.  Deadline was unusual in that it was one of the few examples of the word “dead” being used as a word-forming element in its literal sense, another being “deadman”, a device used mostly in railways to ensure a train is safely brought to a controlled stop in the event of the driver’s death or incapacitation.

The meaning shift in deadline was an example of an element of a word used originally in its literal sense (dead men SWATE) changing into something figurative.  Other examples of the figurative use of the element include “dead leg”, “deadlock”, “dead loss”, “dead load”, “dead lift”, “dead ringer”, “dead heat” and “dead light”.  The interesting term “dead letter” has several meanings.  In the New Testament it was used by the Apostle Paul in his second letter to the Corinthians (2 Corinthians 3:6) to contrast written, secular law with the new covenant of the spirit.  Paul’s argument was that legal statutes, without the Spirit, were powerless to bring about salvation and were therefore “dead letter” whereas the new covenant, based on the Spirit, brings life: “He has made us competent as ministers of a new covenant--not of the letter but of the Spirit; for the letter kills, but the Spirit gives life.” (2 Corinthians 3:6).  So, law devoid of the power of the Holy Spirit to interpret and apply it is a “dead letter” that can never be transformative, unlike the new covenant which is based on a living relationship with God through the Holy Spirit.

In a post office, a “dead letter” (which can be a “dead parcel”) is an item of mail which can neither be delivered to its intended recipient nor returned to the sender, usually because the addresses are incorrect or the recipient has moved without leaving a forwarding address.  Within postal systems there is usually a “dead letter office”, a special department dedicated to identifying and locating the sender or recipient.  If neither can be found and the item is unclaimed after a certain time (and in many systems there are deadlines), it may be opened and examined for any identifying information that could be used to identify either and, if this proves unsuccessful, depending on its nature, the item may be destroyed or sent to public auction.  Beyond the Pauline and the postman, “dead letter” is a phrase which refers to (1) a law or regulation which nominally still exists but is no longer observed or enforced and (2) anything obsolescent or actually obsolete (floppy diskettes, fax machines et al).  In law, some examples are quite famous, such as jurisdictions which retain the death penalty but never perform executions.  There have also been cases of attempting to use “dead letter” law as an expression of public policy: in Australia, as late as 1997, the preferred position of the Tasmanian state government was that acts of homosexuality committed by men should remain unlawful and in the Criminal Code but that none would be prosecuted, the argument being it was important to maintain the expression of public disapproval of such things even if it was acknowledged criminal sanction was no longer appropriate.  There may have been a time when such an approach made political sense but even before 1997 that time had passed.  As is often the case, law reform was induced by generational change.

Founded in 2009 (an earlier incarnation Deadline Hollywood Daily had operated as a blog since 2006), deadline.com is a US film and entertainment news & gossip site now owned by Penske Media Corporation.

Thursday, October 12, 2023

Gap

Gap (pronounced gap)

(1) A break or opening, as in a fence, wall, or military line; breach; an opening that implies a breach or defect (vacancy, deficit, absence, or lack).

(2) An empty space or interval; interruption in continuity; hiatus.

(3) A wide divergence or difference; disparity

(4) A difference or disparity in attitudes, perceptions, character, or development, or a lack of confidence or understanding, perceived as creating a problem.

(5) A deep, sloping ravine or cleft through a mountain ridge.

(6) In regional use (in most of the English-speaking world and especially prominent in the US), a mountain pass, gorge, ravine, valley or similar geographical feature (also in some places used of a sheltered area of coast between two cliffs and often applied in locality names).

(7) In aeronautics, the distance between one supporting surface of an airplane and another above or below it.

(8) In electronics, a break in a magnetic circuit that increases the inductance and saturation point of the circuit.

(9) In various field sports (baseball, cricket, the football codes etc), those spaces between players which afford some opportunity to the opposition.

(10) In genetics, an un-sequenced region in a sequence alignment.

(11) In slang (New Zealand), suddenly to depart.

(12) To make a gap, opening, or breach in.

(13) To come open or apart; form or show a gap.

1350–1400: From the Middle English gap & gappe (an opening in a wall or hedge; a break, a breach), from the Old Norse gap (gap, empty space, chasm), akin to the Old Norse gapa (to open the mouth wide; to gape; to scream), from the Proto-Germanic gapōną, from the primitive Indo-European root ghieh (to open wide; to yawn, gape, be wide open) and related to the Middle Dutch & Dutch gapen, the German gaffen (to gape, stare), the Danish gab (an expanse, space, gap; open mouth, opening), the Swedish gap & gapa and the Old English ġeap (open space, expanse).  Synonyms for gap can include pause, interstice, break, interlude, lull but probably not lacuna (which is associated specifically with holes).  Gap is a noun & verb, gapped & gapping are verbs, gapless & gappy are adjectives; the noun plural is gaps.

Lindsay Lohan demonstrates a startled gape, MTV Movie Awards, Gibson Amphitheatre, Universal City, California, June 2010.

The use to describe natural geographical formations (“a break or opening between mountains”, which later extended to “an unfilled space or interval, any hiatus or interruption”) emerged in the late fifteenth century and became prevalent in the US, used of deep breaks or passes in a long mountain chain (especially one through which a waterway flows) and often used in locality names.  The use as a transitive verb (to make gaps; to gap) evolved from the noun and became common in the early nineteenth century as the phrases became part of the jargon of mechanical engineering and metalworking (although in oral use the forms may long have existed).  The intransitive verb (to have gaps) is documented only since 1948.  The verb gape dates from the early thirteenth century and may be from the Old English ġeap (open space, expanse) but most etymologists seem to prefer a link with the Old Norse gapa (to open the mouth wide; to gape; to scream); it was long a favorite way of alluding to the expressions thought stereotypical of the “idle curiosity, listlessness, or ignorant wonder of bumpkins and other rustics” and is synonymous with “slack-jawed yokels”.  The adjective gappy (full of gaps; inclined to be susceptible to gaps opening) dates from 1846.  The adjectival use gap-toothed (having teeth set wide apart) has been in use since at least the 1570s but earlier, Geoffrey Chaucer (circa 1344-1400) had used “gat-toothed” for the same purpose, gat from the Middle English noun gat (opening, passage), from the Old Norse gat and cognate with gate.

Lindsay Lohan demonstrates her admirable thigh gap, November 2013.

The “thigh gap” seems first to have been documented in 2012 but gained critical mass on the internet in 2014 when it became one of those short-lived social phenomena which produce a minor moral panic.  “Thigh gap” described the empty space between the inner thighs of a woman standing upright with feet touching; a gap was said to be good and the lack of a gap bad.  Feminist criticism noted it was not an attribute enjoyed by a majority of mature human females and it thus constituted just another of the “beauty standards” imposed on women which were an unrealizable goal for the majority.  The pro-ana community ignored this critique and thinspiration (thinspo) bloggers quickly added annotated images and made the thigh gap an essential aspect of female physical attractiveness.

A walking, talking credibility gap: crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).

In English, gap has been prolific in the creation of phrases & expressions.  The “generation gap” sounds modern and as a phrase it came into wide use only in the 1960s in reaction to the twin constructs of “teenagers” and the “counter-culture” but the concept has been documented since antiquity and refers to a disconnect between youth and those older, based on different standards of behavior, dress, artistic taste and social mores.  The term “technology gap” was created in the early 1960s and was from economics, describing the various implications of a nation’s economy gaining a competitive advantage over others by the creation or adoption of certain technologies.  However, the concept was familiar to militaries which had long sought to quantify and rectify any specific disadvantage in personnel, planning or materiel they might suffer compared to their adversaries; these instances are described in terms like “missile gap”, “air gap”, “bomber gap”, “megaton gap” et al (and when used of materiel the general term “technology deficit” is also used).  Rearmament is the usual approach but there can also be “stop gap” solutions which are temporary (often called “quick & dirty” (Q&D)) fixes which address an immediate crisis without curing the structural problem.  For a permanent (something often illusory in military matters) remedy for a deficiency, one is said to “bridge the gap”, “gap-fill” or “close the gap”.  The phrase “stop gap” in the sense of “that which fills a hiatus, an expedient in an emergency” appears to date from the 1680s and may first have been a military term referring to a need urgently to “plug a gap” in a defensive line, “gap” used by armies in this sense since the 1540s.  The use as an adjective dates from the same time in the sense of “filling a gap or pause”.
A “credibility gap” is a discrepancy between what’s presented as reality and a perception of what reality actually is; it’s applied especially to the statements of those in authority (politicians like crooked Hillary Clinton being the classic but not the only examples).  “Pay gap” & “gender gap” are companion terms used most often in labor-market economics to describe the differences in aggregate or sectoral participation and income levels between a baseline group (usually white men) and others who appear disadvantaged.

“Gap theorists” (known also as “gap creationists”) are those who claim the account of the Earth and all who inhabit the place being created in six 24-hour days (as described in the Book of Genesis in the Bible’s Old Testament) literally is true but that there was a gap of time between the two distinct creations in the first and the second verses of Genesis.  What this allows is a rationalization of modern scientific observation and analysis of physical materials which have determined the age of the planet.  This hypothesis can also be used to illustrate the use of the phrase “credibility gap”.  In Australia, gap is often used to refer to the (increasingly large) shortfall between the amount health insurance funds will pay compared with what the health industry actually charges; the difference, paid by the consumer (doctors still insist on calling them patients), is the gap (also called the “gap fee”).  In Australia, the term “the gap” has become embedded in the political lexicon to refer to the disparity in outcomes between the indigenous and non-indigenous communities in fields such as life expectancy, education, health, employment, incarceration rates etc.  By convention, it can be used only to refer to the metrics which show institutional disadvantage but not other measures where the differences are also striking (smoking rates, crime rates, prevalence of domestic violence, drug & alcohol abuse etc) and it’s thus inherently political.  Programmes have been designed and implemented with the object of “closing the gap”; the results have been mixed.

Opinion remains divided on the use of platinum-tipped spark plugs in the Mercedes-Benz M100 (6.3 & 6.9) V8.

A “spark gap” is the space between two conducting electrodes, filled usually with air (or in specialized applications some other gas) and designed to allow an electric spark to pass between the two.  One of the best known spark gaps is that in the spark (or sparking) plug which provides the point of ignition for the fuel-air mixture in internal combustion engines (ICE).  Advances in technology mean fewer today are familiar with the intricacies of spark plugs, once a familiar (and often an unwelcome) sight to many.  The gap in a spark plug is the distance between the center and ground electrode (at the tip) and the size of the gap is crucial in the efficient operation of an ICE.  The gap size, although the differences would be imperceptible to most, is not arbitrary and is determined by the interplay of the specifications of the engine and the ignition system including (1) the compression ratio (low compression units often need a larger gap to ensure a larger spark is generated), (2) the ignition system, high-energy systems usually working better with a larger gap, (3) the materials used in the plug’s construction (the most critical variable being their heat tolerance); because copper, platinum, and iridium are used variously, different gaps are specified to reflect the variations in thermal conductivity and the temperature range able to be endured and (4) application, high performance engines or those used in competition involving sustained high-speed operation often using larger gaps to ensure a stronger and larger spark.
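The interplay of those four factors lends itself to a tabular treatment.  Below is a minimal Python sketch of the idea; the gap ranges are common rule-of-thumb figures for each electrode material and the function is invented for illustration, not any manufacturer's specification (the engine maker's stated gap always governs):

```python
# Illustrative only: typical spark-plug gap ranges (mm) by electrode material.
# These are rule-of-thumb values, not a manufacturer specification.
TYPICAL_GAP_MM = {
    "copper": (0.6, 0.9),    # good thermal conductivity, shorter tip life
    "platinum": (0.8, 1.1),  # higher melting point, suits high-energy ignition
    "iridium": (0.8, 1.1),   # hardest tip, allows a very fine center electrode
}

def suggest_gap_mm(material: str, high_energy_ignition: bool) -> float:
    """Pick from the typical range: a larger gap when the ignition system can
    reliably jump it (a stronger, larger spark), a smaller gap otherwise
    (e.g. low-compression engines with weaker ignition systems)."""
    low, high = TYPICAL_GAP_MM[material]
    return high if high_energy_ignition else low

print(suggest_gap_mm("copper", False))   # smaller gap, low-energy system
print(suggest_gap_mm("iridium", True))   # larger gap, high-energy system
```

The point of the sketch is simply that gap selection is a lookup constrained by material and ignition energy, not a free choice.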

Kennedy, Khrushchev and the missile gap

The “missile gap” was one of the most discussed threads in the campaign run by the Democratic Party’s John Kennedy (JFK, 1917–1963; US president 1961-1963) in the 1960 US presidential election in which his opponent was the Republican Richard Nixon (1913-1994; US president 1969-1974).  The idea there was a “missile gap” was based on a combination of Soviet misinformation, a precautionary attitude by military analysts in which the statistical technique of extrapolation was applied on the basis of a “worst case scenario” and blatant empire building by the US military, notably the air force (USAF), anxious not to surrender to the navy its pre-eminence in the hierarchy of nuclear weapons delivery systems.  It’s true there was at the time a missile gap but it was massively in favor of the US, which possessed several dozen inter-continental ballistic missiles (ICBMs) while the USSR had either four or six, depending on the definition used.  President Dwight Eisenhower (1890-1969; US president 1953-1961), a five-star general well acquainted with the intrigues of the military top brass, was always sceptical about the claims and had arranged the spy flights which confirmed the real count but was constrained from making the information public because of the need to conceal his source of intelligence.  Kennedy may actually have known his claim was incorrect but, finding it resonated with the electorate, continued to include it in his campaigning, knowing the plausibility was enhanced in a country where people were still shocked by the USSR having in 1957 launched Sputnik I, the first ever Earth-orbiting satellite.  Sputnik had appeared to expose a vast gap between the scientific capabilities of the two countries, especially in the matter of big missiles.

President Kennedy & comrade Khrushchev at their unproductive summit meeting, Vienna, June 1961.

Fake gaps in such matters were actually nothing new.  Some years earlier, before there were ICBMs (so in any nuclear war the two sides would have had to use aircraft to drop bombs on each other, à la Hiroshima & Nagasaki in 1945), there’d been a political furore about the claim the US suffered a “bomber gap” and would thus be unable adequately to respond to any attack.  In truth, by a simple sleight of hand little different to that used by Nazi Germany in 1935 to convince worried British politicians that the Luftwaffe (the German air force) was already as strong as the Royal Air Force (RAF), Moscow had greatly inflated the numbers and stated capability of its strategic bombers, a claim concerned US politicians were only too ready to believe.  The USAF would of course be the recipient of the funds needed to build the hundreds (the US would end up building thousands) of bombers needed to equip all those squadrons and its projections of Soviet strength were higher still.  Had all this building of stuff to plug non-existent gaps happened in isolation it would have been wasteful of money and natural resources, which was bad enough, but this hardware made up the building blocks of nuclear strategy; the Cold War was not an abstract exercise in which, on both sides, technicians with clipboards walked from silo to silo counting warheads.

Instead, the variety of weapons, their different modes of delivery (from land, sea, undersea and air), their degrees of accuracy and their vulnerability to counter-measures was constantly calculated to assess their utility as (1) deterrents to an attack, (2) counter-offensive weapons to respond to an attack or (3) first-strike weapons with which to stage a pre-emptive or preventative attack.  In the Pentagon, the various high commands and the burgeoning world of the think tanks, this analysis was quite an industry and it also had to factor in the impossible: working out how the Kremlin would react.  In other words, what the planners needed to do was create a nuclear force which was strong enough to deter an attack yet not seem to be such a threat that it would encourage an attack, and that only scratched the surface of the possibilities; each review (and there were many) would produce detailed study documents several inches thick.

US Navy low-level spy photograph of San Cristobal medium-range ballistic missile (MRBM) site #1, Cuba, 23 October 1962.

In October 1962, during the Cuban Missile Crisis, the somewhat slimmer nuclear war manuals synthesized from those studies were being read with more interest than usual.  It was a tense situation and had Kennedy and comrade Nikita Khrushchev (1894–1971; Soviet leader 1953-1964) not agreed to a back-channel deal, the US would probably have attacked Cuba in some manner, not knowing three divisions of the Red Army were stationed there to protect the Soviet missiles, and that would have been a state of armed conflict which could have turned into some sort of war.  As it was, under the deal, Khrushchev withdrew the missiles from Cuba in exchange for Kennedy’s commitment not to invade Cuba and to withdraw 15 obsolescent nuclear missiles from Turkey, the stipulation being the Turkish component must be kept secret.  That secrecy colored for years the understanding of the Cuban Missile Crisis and the role the US nuclear arsenal played in influencing the Kremlin.  The story was that the US stayed resolute, rattled the nuclear sabre and that was enough to force the Soviet withdrawal.  Among those not told the truth was Lyndon Johnson (LBJ, 1908–1973; US president 1963-1969) who became president after Kennedy was assassinated in 1963, and historians have attributed his attitude to negotiation during the Vietnam War to not wishing to be unfavorably compared with his predecessor who, as Dean Rusk (1909–1994; US secretary of state 1961-1969) put it, stood “eyeball to eyeball” with Khrushchev and “made him blink first”.  The existence of all those doomsday weapons would distort Soviet and US foreign policy for years to come.

Monday, September 5, 2022

Sabotage

Sabotage (pronounced sab-uh-tahzh (U) or sab-oh-tahzh (non-U))

(1) Any underhand interference with production, work etc, in a plant, factory etc, as by enemy agents during wartime or by employees during a trade dispute; any similar action or behavior.

(2) In military use, an act or acts with intent to injure, interfere with, or obstruct the national defense of a country by willfully injuring or destroying, or attempting to injure or destroy, any national defense or war materiel, premises, or utilities, to include human and natural resources.

(3) Any undermining of a cause.

(4) To injure or attack by sabotage.

1907: From the French sabotage, from saboter (to botch; to spoil through clumsiness (originally, to strike, shake up, harry and literally “to clatter in sabots” (clog-like wooden-soled shoes))).

The noun sabotage is said to have been absorbed by English in 1907, having been used as a French borrowing since at least 1903.  The sense of the French usage was “malicious damaging or destruction of an employer's property by workmen”, a development from the original idea of mere deliberate bungling and inefficiency as a form of ad-hoc industrial action.  Contemporary commentators in England noted "malicious mischief" was likely the “nearest explicit definition” of sabotage before pointing out “this new force in industry and morals” was definitely something associated with the continent.  As the meaning quickly shifted from mere lethargy to the physical damaging of the tools of production, the story began to circulate that the word owed its origin to the tactic of disgruntled strikers (industrial militancy being something the English were apt to regard as habitual among French labour) throwing their sabots (clog-like wooden-soled shoes) into machinery.  There is no evidence this ever happened but the image was so vivid the tale spread widely and even enjoyed some currency as actual etymology; it was fake news.  Instead, the word was in the tradition of the French use in a variety of "bungling" senses including the poor delivery of a speech or a poorly played piece of music, the idea of a job botched or a discordant sound, like the clatter of many sabots as a group walked across a hardwood floor.  The noun savate (a French method of fighting with the feet), from the French savate (literally "a kind of shoe"), is attested from 1862 and, although linked to footwear, is unrelated to sabotage.

Prepared for sabotage: Lindsay Lohan in Gucci Black Patent Leather Hysteria Platform Clogs with wooden soles, Los Angeles, 2009.  The car is a 2009 (fifth generation) Maserati Quattroporte leased by her father.

What sabotage was depended also on where it was viewed.  In industry it was thought a substitute for striking: the worker stayed in his place but proceeded to do his work slowly and badly, the aim being ultimately to displease his employer's customers and cause loss to his employer.  To the still embryonic unions seeking to organize labour, it was a reciprocal act of industrial democracy, going slow about the means of production and distribution in response to organized capital going slow in the matter of wages.  The military extension of the word, describing damage inflicted (especially clandestinely) on military or civilian infrastructure to disrupt an enemy's economy, emerged during World War I (1914-1918).  The verb sabotage (to ruin or disable deliberately and maliciously) dates from 1912 and the noun saboteur (one who commits sabotage) was also first noted in the same year (although it had been used in English since 1909 as a French word); it was from the French agent noun from saboter and the feminine form was saboteuse.

The word exists in many European languages including Catalan (sabotatge), Czech (sabotáž), Danish (sabotage), Dutch (sabotage), Galician (sabotaxe), German (Sabotage), Hungarian (szabotázs), Italian (sabotaggio), Polish (sabotaż), Portuguese (sabotagem), Russian (сабота́ж, sabotáž), Spanish (sabotaje), Swedish (sabotage) & Turkish (sabotaj).  Sabotage is so specific that it has no direct single-word synonym although, depending on context, related words include destruction, disruption, subversion, treachery, treason, vandalism, cripple, destroy, disrupt, hamper, hinder, obstruct, subvert, torpedo, undermine, vandalize, wreck, demolition, impairment, injury & disable.  Sabotage is a noun & verb, sabotaged is a verb & adjective, saboteur is a noun, sabotaging is a verb and sabotagable is an adjectival conjecture; some sources maintain there is no plural of sabotage and the correct form is “acts of sabotage” while others list the third-person singular simple present indicative form as sabotages.

Franz von Papen.

Although his activities as German Military Attaché for Washington DC during 1914-1915 would be overshadowed by his later adventures, Franz von Papen’s (1879–1969) inept attempts at sabotaging the Allied war effort would help introduce the word to the military vocabulary.  He attempted to disrupt the supply of arms to the British, even setting up a munitions factory with the intention of buying up scarce commodities to deny their use to the Allies, only to find the enemy had contracted ample quantities so his expensive activities had no appreciable effect on the shipments.  Then his closest aide, after falling asleep on a train, left behind a briefcase full of letters compromising Papen for his activities on behalf of the Central Powers.  Within days, a New York newspaper published details of Papen’s amateurish cloak & dagger operations including his attempt to induce workers of Austrian & German descent employed in plants engaged in war production for the Allies to slow down their output or damage the goods.  Also in the briefcase were copies of letters he sent revealing shipping movements.

Even this wasn’t enough for the US to expel him so he expanded his operations, setting up a spy network to conduct a sabotage and bombing campaign against businesses in New York owned by citizens of the Allied nations.  That absorbed much money for little benefit but, undeterred, he became involved with Indian nationalists living in the US, arranging with them for arms to be shipped to India where he hoped a revolt against the Raj might be fomented, a strategy he pursued also with the Irish nationalists.  Thinking big, he planned an invasion of Canada and tried to enlist Mexico as an ally of the Central Powers in the event of the US entering the war, with the promise California and Arizona would be returned.  More practically, early in 1915 he hired agents to blow up the Vanceboro international rail bridge which linked the US and Canada between New Brunswick and Maine.  That wasn’t a success but of greater impact was that Papen had departed from the usual practices of espionage by paying the bombers by cheque.  It was only his diplomatic immunity which protected him from arrest but British intelligence had been monitoring his activities and provided a file to the US State Department which in December 1915 declared him persona non grata and expelled him.  Upon his arrival in Berlin, he was awarded the Iron Cross.

Hopelessly ineffective though his efforts had proved, by the time Papen left the US, the words sabotage and saboteur had come into common use including in warning posters and other propaganda.  Papen went on to greater things, serving briefly as chancellor and later as Hitler’s vice-chancellor, quite an illustrious career for one described as “uniquely, taken seriously by neither his opponents nor his supporters”.  When one of the Weimar Republic's many scheming king-makers suggested Papen as chancellor, others thought the notion absurd, pointing out: "Papen has no head for politics."  The response was: "He doesn't need a head, his job is to be a hat".  Despite his known limitations, he proved one of the Third Reich’s great survivors, escaping purges and assassination and, despite being held in contempt by Hitler, served the regime to the end.  Even its coda he survived, being one of the few defendants at the main Nuremberg trial (1945-1946) to be acquitted (to be fair he was one of the few Nazis with the odd redeeming feature and his sins were those of cynical opportunism rather than evil intent) although the German courts did briefly imprison him, albeit under rather pleasant conditions.

The Simple Sabotage Field Manual (SSFM) was published in 1944 by the US Office of Strategic Services (OSS), the predecessor of the Central Intelligence Agency (CIA).  Its original purpose was as a resource for OSS field agents to use in motivating or recruiting potential foreign saboteurs, and agents were granted permission to print and disseminate portions of the document as needed.  The idea was to provide tools and instructions so just about any member of society could inflict some degree of damage on a society and its economy, the rationale being that of a “death of a thousand cuts”.  In contrast, the more dramatic and violent acts of sabotage (high-risk activities like killings or blowing stuff up) were only ever practiced by a handful of citizens.  The SSFM was aimed at sympathizers keen to disrupt the Axis war effort against the Allies during World War II (1939-1945) in ways that were barely detectable but, in cumulative effect, measurable, and thus contains instructions for destabilizing or reducing progress and productivity by non-violent means.  The booklet is separated into headings that correspond to specific audiences, including: Managers and Supervisors, Employees, Organizations and Conferences, Communications, Transportation (Railways, Automotive, and Water), General Devices for Lowering Morale and Creating Confusion & Electric Power.  The simplicity of approach was later adopted by the CIA when it distributed its Book of Dirty Tricks.

Of great amusement to students (amateur and professional) of corporate organizational behavior was that a number of the tactics the SSFM lists as being disruptive and tending to reduce efficiency are exactly those familiar to anyone working in a modern Western corporation.

Middle Management

(1) Insist on doing everything through “channels.” Never permit short-cuts to be taken in order to expedite decisions.

(2) Make “speeches.” Talk as frequently as possible and at great length. Illustrate your “points” by long anecdotes and accounts of personal experiences.

(3) When possible, refer all matters to committees, for “further study and consideration.” Attempt to make the committee as large as possible — never less than five.

(4) Bring up irrelevant issues as frequently as possible.

(5) Haggle over precise wordings of communications, minutes, resolutions.

(6) Refer back to matters decided upon at the last meeting and attempt to re-open the question of the advisability of that decision.

(7) Advocate “caution.” Be “reasonable” and urge your fellow-conferees to be “reasonable” and avoid haste which might result in embarrassments or difficulties later on.

Senior Management

(8) In making work assignments, always sign out the unimportant jobs first. See that important jobs are assigned to inefficient workers.

(9) Insist on perfect work in relatively unimportant products; send back for refinishing those which have the least flaw.

(10) To lower morale and with it, production, be pleasant to inefficient workers; give them undeserved promotions.

(11) Hold conferences when there is more critical work to be done.

(12) Multiply the procedures and clearances involved in issuing instructions, pay checks, and so on. See that three people have to approve everything where one would do.

Employees

(13) Work slowly.

(14) Contrive as many interruptions to your work as you can.

(15) Do your work poorly and blame it on bad tools, machinery, or equipment. Complain that these things are preventing you from doing your job right.

(16) Never pass on your skill and experience to a new or less skillful worker.

Monday, August 14, 2023

Obsolete & Obsolescent

Obsolete (pronounced ob-suh-leet)

(1) No longer in general use; fallen into disuse; that is no longer practiced or used, out of date, gone out of use, of a discarded type; outmoded.

(2) Of a linguistic form, no longer in use, especially if out of use for at least the past century.

(3) Effaced by wearing down or away (rare).

(4) In biology, imperfectly developed or rudimentary in comparison with the corresponding character in other individuals, as of a different sex or of a related species; of parts or organs, vestigial; rudimentary.

(5) To make obsolete by replacing with something newer or better; to antiquate (rare).

1570–1580: From the Latin obsolētus (grown old; worn out), past participle of obsolēscere (to fall into disuse, be forgotten about, become tarnished), the construct assumed to be ob- (opposite to) (from the Latin ob- (facing), a combining prefix found in verbs of Latin origin) + sol(ēre) (to be used to; to be accustomed to) + -ēscere (–esce) (the inchoative suffix, a form of -ēscō (I become)).  It was used to form verbs from nouns, following the pattern of verbs derived from Latin verbs ending in –ēscō.  Obsoletely is an adverb, obsoleteness is a noun and the verbs (used with object) are obsoleted & obsoleting.  Although it does exist, except when it’s essential to convey a technical distinction, the noun obsoleteness is hardly ever used, obsolescence standing as the noun form for both obsolete and obsolescent.  The verb obsolesce (fall into disuse, grow obsolete) dates from 1801 and is as rare now as it was then.

Although not always exactly synonymous, in general use, archaic and obsolete are often used interchangeably.  However, dictionaries maintain a distinction: words (and meanings) not in widespread use since English began to assume its recognizably modern form in the mid-1700s, are labeled “obsolete”.  Words and meanings which, while from Modern English, have long fallen from use are labeled “archaic” and those now seen only very infrequently (and then in often in specialized, technical applications), are labeled “rare”.

Obsolescent (pronounced ob-suh-les-uhnt)

(1) Becoming obsolete; passing out of use (as a word or meaning).

(2) Becoming outdated or outmoded, as applied to machinery, weapons systems, electronics, legislation etc.

(3) In biology, gradually disappearing or imperfectly developed, as vestigial organs.

1745–1755: From the Latin obsolēscentum, from obsolēscēns, present participle of obsolēscere (to fall into disuse); the third-person plural future active indicative of obsolēscō (degrade, soil, sully, stain, defile).  Obsolescently is an adverb and obsolescence a noun.  Because things that are obsolescent are becoming obsolete, the sometimes heard phrase “becoming obsolescent” is redundant.  The sense "state or process of gradually falling into disuse; becoming obsolete" entered general use in 1809 and although most associated with critiques by certain economists in the 1950s, the phrase “planned obsolescence” was coined in 1932, the 1950s use a revival.

Things that are obsolete are those no longer in general use because (1) they have been replaced or (2) the activity for which they were designed is no longer undertaken.  Things that are considered obsolescent are those still to some extent in use but which, for whatever combination of reasons, are fading from general use and tending towards becoming obsolete.  For example, the Windows XP operating system (released in 2001) is not obsolete because some still use it, but it is obsolescent because, presumably, in the years ahead it will fall from use.

Ex-Royal Air Force (RAF) Hawker Hunter in Air Force of Zimbabwe (AFZ) livery; between 1963 and 2002, twenty-six Hunters were at different times operated by the AFZ.  Declared obsolete as an interceptor by the RAF in 1963, some Hunters were re-deployed to tactical reconnaissance, ground-attack and close air support roles before being retired from front-line service in 1970.  Some were retained as trainers while many were sold to foreign air forces including India, Pakistan and Rhodesia (Zimbabwe since 1980).

Despite the apparent simplicity of the definition, in use, obsolescent is highly nuanced and much influenced by context.  It’s long been a favorite word in senior military circles; although notorious hoarders, generals and admirals are usually anxious to label equipment as obsolescent if there’s a whiff of hope the money might be forthcoming to replace it with something new.  One often unexplored aspect of the international arms trade is that of used equipment, often declared obsolescent by the military of one state and purchased by that of another, a transaction often useful to both parties.  The threat profile against which a military prepares varies between nations and equipment which genuinely has been rendered obsolescent for one country may be a valuable addition to the matériel of others and go on to enjoy an operational life of decades.  Well into the twenty-first century, WWII & Cold War-era aircraft, warships, tanks and other weapon-systems declared obsolescent and on-sold (and in some cases given as foreign aid or specific military support) by big-budget militaries remain a prominent part of the inventories of many smaller nations.  That’s one context; another hinges on the specific tasking of materiel: an aircraft declared obsolescent as a bomber could long fulfil a valuable role as a transport or a tug.

In software, obsolescence is so vague a concept the conventional definition really isn’t helpful.  Many software users suffer severe cases of versionitis (a syndrome in which they have a sometimes visceral reaction to using anything but the latest version of something) so obsolescence to them seems an almost constant curse.  The condition tends gradually to diminish in severity and in many cases the symptoms actually invert: after sufficient ghastly experiences with new versions, versionitis begins instead to manifest as a morbid fear of ever upgrading anything.  Around the planet, obsolescent and obsolete software has for decades proliferated and there’s little doubt this will continue, the Y2K bug which prompted much rectification work on the ancient code riddling the world of the mainframes and other places unlikely to be the last great panic (one is said to be next due in 2029).  The manufacturers too have layers to their declarations of the obsolete.  In 2001, Microsoft advised all legacy versions of MS-DOS (the brutish and now forty-year-old file-loader) were obsolete but, with a change of release number, continued to offer what's functionally the same MS-DOS for anyone needing a small operating system with minimal demands on memory size & CPU specification, mostly those who use embedded controllers, a real attraction being the ability easily to address just about any compatible hardware, a convenience more modern OSs have long restricted.  DOS does still have attractions for many, the long-ago derided 640 kb actually a generous memory space for many of the internal processes of machines, and it's an operating system with no known bugs.
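The kind of deferred deadline alluded to above can be illustrated with a sketch of “windowing”, the common Y2K-era remediation: rather than widening stored two-digit years to four digits, a pivot value decides which century a two-digit year belongs to, which merely postpones the problem until real dates reach the window’s edge.  The pivot below is an illustrative choice, not a claim about any particular system:

```python
# Illustrative two-digit-year "windowing", a common cheap Y2K fix.
# With this (hypothetical) pivot, two-digit years 00-29 are read as
# 2000-2029 and 30-99 as 1930-1999, so dates from 2030 onwards would
# again be mis-interpreted -- the problem is deferred, not solved.
PIVOT = 30

def expand_year(yy: int) -> int:
    """Expand a two-digit year to four digits using the pivot window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_year(25))  # -> 2025
print(expand_year(31))  # -> 1931 (a record dated 2031 would be mis-read)
```

Different systems chose different pivots, which is why predicted follow-up panics cluster around different years depending on which window a given code base used.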

XTree’s original default color scheme; things were different in the 1980s.

Also, obsolescent, obsolete or not, sometimes the old ways are the best.  In 1985, Underware Systems (later the now defunct Executive Systems (EIS)) released a product called XTree, the first commercially available software which provided users with a visual depiction of the file system, arranged using a root-branch tree metaphor.  Within that display, it was possible to do most file-handling such as copying, moving, re-naming, deleting and so on.  Version 1.0 was issued as a single 35 kb executable file, supplied usually on a 5.25" floppy diskette and although it didn’t do anything which couldn’t (eventually) be achieved using just DOS, XTree made it easy and fast; reviewers, never the most easily impressed bunch, were effusive in their praise.  Millions agreed and bought the product which went through a number of upgrades until by 1993, XTreeGold 3.0 had grown to a feature-packed three megabytes but, and it was a crucial part of the charm, the user interface didn’t change and anyone migrating from v1 to v3 could carry on as before, using or ignoring the new functions as they chose.

However, with the release in 1990 of Microsoft’s Windows 3.0, the universe shifted and while it was still an unstable environment, it was obvious things would improve and EIS, now called the XTree Company, devoted huge resources to producing a Windows version of their eponymous product, making the crucial decision that in adopting the Windows-style graphical user interface (GUI), the XTree keyboard shortcuts would be abandoned.  This meant the user interface looked not greatly different to the Windows in-built file manager and bore no resemblance to the even then quirky but marvelously lucid one which had served so well.  XTree for Windows was a critical and financial disaster and in 1993 the company was sold to rival Central Point Software, themselves soon to have their own problems, swallowed a year later by Symantec which, in a series of strategic acquisitions, soon assumed an almost hegemonic control of the market for Windows utilities.  Elements of XTree were interpolated into other Symantec products but as a separate line, it was allowed to die.  In 1998, Symantec officially deleted the product but the announcement was barely noted by the millions of users who continued to use the text-based XTree which ran happily under newer versions of Windows although, being a real-mode program and thus living in a small memory space, as disks grew and file counts rose, walls were sometimes hit, some work-arounds possible but kludgy.  The attraction of the unique XTree was however undiminished and an independent developer built ZTree, using the classic interface but coded to run on both IBM’s OS/2 and the later flavors of Windows.  Without the constraints of the old real-mode memory architecture, ZTree could handle long file and directory names, megalomaniacs now able to log an unlimited number of disks and files, all while using the same, lightning-fast interface.  The idea spread to UNIX where ytree, XTC, linuXtree and (most notably) UnixTree were made available.

ZTree, for those who can remember how things used to be done.

ZTree remains a brute-force favorite for many techs.  Most don’t often need to do those tasks at which it excels but, when those big-scale needs arise, as a file handler, ZTree still can do what nothing else can.  It’ll also do what’s now small-scale stuff; anyone still running XTree 1.0 under MS-DOS 2.11 on their 8088 could walk to some multi-core 64-bit monster with 64 GB RAM running Windows 11 and happily use ZTree.  ZTree is one of the industry’s longest-running user interfaces.

The Centennial Light, Livermore-Pleasanton Fire Department, Livermore, California.  Illuminated almost continuously since 1901, it’s said to be the world's longest-lasting light bulb.  The light bulb business became associated with the idea of planned obsolescence after the revelation of the existence of a cartel of manufacturers which had conspired to more than halve the service life of bulbs in order to stimulate sales.

As early as 1924, executives in US industry had been discussing the idea of integrating planned obsolescence into their systems of production and distribution although it was then referred to with other phrases.  The idea essentially was that in the industrial age, modern mercantile capitalism was so efficient in its ability to produce goods that it would tend to over-produce, beyond the ability to stimulate demand.  The result would be a glut, a collapse in prices and a recession or depression which affected the whole society, a contributing factor to what even then was known as the boom & bust economy.  One approach was that of the planned economy whereby government would regulate production and maintain employment and wages at the levels required to maintain some degree of equilibrium between supply and demand but such socialistic notions were anathema to industrialists.  Their preference was to reduce the lifespan of goods to the point which matched the productive capacity and product-cycles of industry, thereby ensuring a constant churn.  Then, as now, there were those for and against, the salesmen delighted, the engineers appalled.

The actual phrase seems first to have been used in the pamphlet Ending the Depression Through Planned Obsolescence, published in 1932 by US real estate broker (and confessed Freemason) Bernard London (b circa 1873) but it wasn’t popularized until the 1950s.  Then, it began as a casual description of the techniques used in advertising to stimulate demand and thus without the negative connotations which would attach when it became part of the critique of materialism, consumerism and the consequential environmental destruction.  There had been earlier ideas about the need for a hyper-consumptive culture to service a system designed inherently to increase production and thus create endless economic growth: one post-war industrialist noted the way to “avoid gluts was to create a nation of gluttons” and exporting this model underlies the early US enthusiasm for globalism.  As some of the implications of that became apparent, globalization clearly not the Americanization promised, enthusiasm became more restrained.

Betamax and VHS: from dominant to obsolescent to obsolete; the DVD may follow.

Although the trend began in the United States in the late 1950s, it was in the 1970s that the churn rate in consumer electronics began to accelerate, something accounted for partly by falling costs as mass-production in the Far East ramped up but also by the increasing rapidity with which technologies came and went.  The classic example of the era was the so-called videotape format war which began in the mid 1970s after the Betamax (usually clipped to Beta) and Video Home System (VHS) formats were introduced within a year of each other.  Both were formats by which analog recordings of video and audio content could be distributed on magnetic tape housed in a cassette which loaded into a player (the players, regardless of format, soon known universally as video cassette recorders (VCRs)).  The nerds soon pronounced Betamax the superior format because of its superior playback quality and commercial operators agreed, quickly adopting it as the default standard in television studios.  Consumers however came to prefer VHS because, on most of the screens on which most played their tapes, the difference between the two was marginal while the VHS format permitted longer recording times (an important thing in the era) and the hardware was soon available at sometimes half the cost of Betamax units.

It was essentially the same story which unfolded a generation later in the bus and operating system wars; the early advantages of OS/2 over Windows and Micro Channel Architecture (MCA) over ISA/EISA were both real and understood but few were prepared to pay the steep additional cost for advantages which seemed so slight and which at the same time brought problems of their own.  Quite when Betamax became obsolescent varied between markets but, except for a handful of specialists, by the late 1980s it was obsolete and the flow of new content had almost evaporated.  VHS prevailed but its dominance was short-lived, the Digital Versatile Disc (DVD), released in 1997, within half a decade becoming the preferred format throughout the Western world although in some other markets, the thriving secondary trade suggests even today the use of VCRs is not uncommon.  DVD sales though peaked in 2006 and have since dropped by some 80%, their market-share cannibalized not by the newer Blu-ray format (which never achieved critical mass) but by the various methods (downloads & streaming) which meant many users were able wholly to abandon removable media.  Despite that, the industry seems still to think the DVD has a niche and it may for some time resist obsolescence because demand for content on a physical object remains at a level profitable to service.  Opinions differ about the long-term.  History suggests that as the “DVD generation” dies off, the format will fade away because those used to entirely weightless content, available at any time and in any place, won’t want the hassle but, as the unexpected revival of vinyl records as a lucrative niche proved, obsolete technology can have its own charm, which is why a small industry now exists to retro-fit manual gearboxes into modern Ferraris, replacing technically superior automatic transmissions.