Showing posts sorted by date for query Floppy.

Friday, September 26, 2025

Beret

Beret (pronounced buh-rey)

A soft, visor-less cap, made usually of a soft wool material or felt, styled with a close-fitting headband and a wide, round top, often with a tab at the center.

1827: From the French béret (round, flat, woolen cap), from the dialect of Béarn, from the Occitan (Gascon) & Old Provençal berret (cap), from the Medieval Latin birrettum (a flat woollen cap that was worn by peasants), a diminutive of the Late Latin birrus (a large hooded cloak), a word perhaps of Gaulish origin but the ultimate root probably was the Proto-Celtic birros (short) and related to the Welsh byr and the Middle Irish berr.  The similar clerical variation is called a biretta and in Spanish, the cap is the boina.  Some military units are associated with the color of their berets (green berets; blue berets etc).  Beret is a noun; the noun plural is berets.

A rendering of the famous photograph of Che Guevara (1928–1967) at the La Coubre memorial service by Alberto Korda (1928-2001), 5 March 1960.

Long culturally associated with France, the beret's popularity as a fashion-piece may bounce around but it has never gone away: examples have been found by archaeologists in Bronze Age tombs and the headwear has appeared in art since Antiquity, notably in European sculpture from the twelfth century.  The floppiness certainly varied, apparently in direct relationship to size, suggesting all were made, as they appear, from felt or some material with similar properties.  One of the oldest forms of processed cloth, felt was a serendipitous creation by shepherds who, for warmth and comfort, filled their shoes with tufts of wool; as they walked and worked, they sweated and felt was made by the pressure of the foot pressing the wool against the leather.  Berets were adopted first by Basque peasants, then royalty, then the military and then artists but in the twentieth century they gained an anti-establishment association, influenced by the French existentialists and the famous photograph of Che Guevara.

Lindsay Lohan in beret, promotional image for Saturday Night Live, episode 37-16, March 2012.

The military, the counterculture and the fashionistas have shared the once humble cap since.  One aspect of it however proved as vulnerable as any object of mass-manufacture to the arithmetic of unit-labour costs and world trade.  In France, early in the post-war years, there were fifteen beret factories in the district of Oloron-Sainte-Marie in the Pyrénées, where most French berets were made, yet by the late 1990s there was but one and it catered only for the upper reaches of the market, supplying those few who simply didn’t wear clothes made east of Suez; Laulhère is the last remaining historic beret-maker still operating in France.  Dating from 1840 when the Laulhère family opened its first factory, such was the struggle to survive the national textile industry crisis, as well as the erosion of its market by low-priced products of dubious quality, that in 2013 the decision was taken by Laulhère finally to end production.  However, just as Charles de Gaulle (1890-1970; President of France 1959-1969) had une certaine idée de la France, upon hearing the news of the closure the industry decided there was une certaine idée de la mode française and a rescue package was organized by the Gascon-based Cargo Group and its sister company, Blancq-Olibet.  In a press release issued almost immediately after the news broke, Cargo Group confirmed they had acted because the beret was “such an important part of our history and patrimoine (cultural heritage)”.  Clearly, the beret is as important to the French as the baguette and croissant, both items being traditional products which have survived to co-exist with cheaper, industrially manufactured "imitations".  Cargo’s business model was simultaneously to use Laulhère’s expertise and skilled workforce to introduce new, more modern lines while maintaining the availability of the traditional styles, and it appears to have been successful, the classic berets still on sale.
It’s one of those dependable industry staples which just about every year can be promoted by a label, publication or stylist as one of the trends to watch in the next season.  Unlike something like the polka-dot, which tends to be cyclical with sometimes a decade between spikes, the classic, timeless beret is always there, running the gamut from revolutionary chic to French-girl accessory; it can be worn in all four seasons and is the ultimate mix & match fall-back, working with spots or stripes and vivid or subdued solids.  It's rare that a catwalk doesn't include at least one beret.

Canadian-American singer-songwriter Joni Mitchell (b 1943) in beret on the cover of Hejira (1976), the image a collage of photographs by Dr Norman Seeff (b 1939) and Joel Bernstein (b 1952).

Hejira is one of the less frequently used transliterations (the others being Hijrah, Hidjra & Hegira) of the Arabic هِجْرَة (hijra) (departure, exodus), a word used to refer to the Prophet Muhammad’s (circa 570-632) flight with his companions in 622 from Mecca to Medina.  It was from the verb هَجَرَ (hajara) (emigrate, to abandon).  When interviewed, Ms Mitchell said she was attracted to the word because of the “attractiveness” of the “dangling j”; presumably, musicians “hear” in words musical properties in the way a synesthete might “see” colors.

Brigitte Bardot (1934-2025) in beret.

The beret certainly has a long history, floppy head coverings appearing in the archaeological record of the Early Bronze Age (circa 3300-2000 BC) and they have remained a feature of European clothing ever since.  At least partially, this was technological determinism in action: felt was the material constantly used and, being non-woven, it is one of the easiest materials to produce without complex machinery or skills.  Felt is made by matting and pressing wet natural fibres (classically wool) and it is famously versatile and durable, peasants favouring it for the linings of jackets, footwear and of course hats, as valued for its warmth as its capacity to resist moisture.  In the severe Russian winter of 1941-1942, among the most prized possessions to be taken by the Germans from dead or captured Red Army soldiers were their felt-lined boots.  Because of the hubristic assumption the Russians would be defeated before the onset of winter, the Wehrmacht (the German armed forces) had made little attempt to build up stocks of suitable clothing but, even had they been more cautious, there was nothing in their inventory to match the effectiveness of the Soviet felt-lined boot.  By the seventeenth century, black felt hats (less a fashion choice than black simply being the easiest colour to produce) were virtually an item of uniform among the French working class, farmers and artisans although it wasn’t until 1827 the industry coined béret, from the Medieval Latin birretum (a flat woollen cap worn by peasants).

The hauntingly lovely Brigitte Bardot (at 17) with Andre Bourvil (1917-1970) in Le Trou Normand (Crazy for Love, 1952).  She had a minor part portraying the protagonist's cousin but it was her first feature film and it convinced directors "the camera loves her".

This being pre-EEC (European Economic Community, 1957) Europe (the EEC the Zollverein which would evolve into the EU (European Union, 1993)), the beret of course became a political statement and as tensions grew in the mid-nineteenth century between France & Spain, the fashion lines were drawn: French berets were blue and Spanish red although, in a gesture which might have pleased the Marxists, the working class everywhere continued to wear black, drawn by the price rather than international solidarity (although that too vindicates Marxist economic theory).  However, it’s from the early twentieth century that historians of fashion trace the ascent of the black beret as an essentially classless chic accessory which could be worn by men & women alike although such are the memories of Brigitte Bardot and Catherine Deneuve (b 1943) that about the only men remembered for their berets are revolutionaries: Che Guevara, the Black Panthers and such.  One political aspect of the beret definitely is a myth: it’s not true the Nazis banned the hat during the occupation of France (1940-1944).  The origin of that tale seems to lie in the publication in the 1970s of a number of propaganda leaflets created by the SOE (Special Operations Executive, the UK government's “department of dirty tricks” with a mandate to “set Europe ablaze”), one of which was to spread in France the story that the Germans were going to “ban the beret”, something with some basis in fact because there was, briefly, a local campaign to deprive Alsatians of the hat, on the basis it was a (Basque) manifestation of Frenchness.  Strange as it sounds, such things had been done before, the UK parliament in 1746 responding to the failure of the Jacobite rising of 1745 by passing the Dress Act which restricted the wearing of tartan in Scotland.
Quickly, the adults in Berlin put a stop to the wacky scheme and there's no evidence the SOE's plan was ever used although the organization remained active in what would now be called the business of misinformation & disinformation.  One curiosity of the aborted crackdown on berets was it didn't extend to onion sellers although that wasn't enough to save Robert Wagner (1895–1946; civil administrator of Alsace during the Nazi occupation) who, an unrepentant Nazi to the end, was sentenced to death by a French court and executed.  

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) Tending to flop.

(2) Limp; not hard, firm, or rigid; flexible; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)), 1964-1979), the use a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made from a paper stock little heavier and more rigid than that used for the pages; used mostly of comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and, although in the 1890s it was recorded as meaning “some degree of failure”, it came to mean “totally to fail” in 1919, in the wake of the end of World War I (1914-1918), the conflict which wrote finis to the dynastic rule of centuries of the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped is a verb, flopping is a verb, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and, as “floppy-wristed” (synonymous with “limp-wristed”), was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the earlier use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication that what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is the “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) the “go to” politician, both credited with things they never said but might have said, the quality of the “Shakespearian” or “Churchillian” in phraseology not exactly definable but certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  "Floppy cat" is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOPS (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t really make possible what would not in theory be achievable using real numbers but it does make such work faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields, including management and public relations.
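The “integer scaled by an exponent of a fixed base” idea can be seen directly in Python, whose standard math.frexp function splits a float into a significand and a base-2 exponent (a minimal sketch of the representation in general, not of any particular FPU):

```python
import math

# A float is stored as a fixed-precision significand scaled by an integer
# exponent of a fixed base (base 2 on virtually all modern hardware).
# math.frexp exposes that decomposition: value == mantissa * 2**exponent,
# with the mantissa normalized into [0.5, 1.0).
value = 6.25
mantissa, exponent = math.frexp(value)
print(mantissa, exponent)          # 0.78125 3
print(mantissa * 2 ** exponent)    # 6.25
```

Here 6.25 is exactly representable, so the round trip is lossless; values such as 0.1 are not, which is the source of the famous rounding quirks of FP arithmetic.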

FLOPs are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work) and for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion-FLOPS mark and, as speeds rose, the appropriate terms were created:

kiloFLOPS: (kFLOPS, 10³)
megaFLOPS: (MFLOPS, 10⁶)
gigaFLOPS: (GFLOPS, 10⁹)
teraFLOPS: (TFLOPS, 10¹²)
petaFLOPS: (PFLOPS, 10¹⁵)
exaFLOPS: (EFLOPS, 10¹⁸)
zettaFLOPS: (ZFLOPS, 10²¹)
yottaFLOPS: (YFLOPS, 10²⁴)
ronnaFLOPS: (RFLOPS, 10²⁷)
quettaFLOPS: (QFLOPS, 10³⁰)
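Applying those prefixes is simple arithmetic; the helper below (a hypothetical sketch, not any standard library function) picks the largest applicable unit for a raw figure:

```python
# Hypothetical helper: pretty-print a raw FLOPS figure using the
# prefixes listed above (kilo through quetta).
PREFIXES = [
    (1e30, "QFLOPS"), (1e27, "RFLOPS"), (1e24, "YFLOPS"), (1e21, "ZFLOPS"),
    (1e18, "EFLOPS"), (1e15, "PFLOPS"), (1e12, "TFLOPS"), (1e9, "GFLOPS"),
    (1e6, "MFLOPS"), (1e3, "kFLOPS"),
]

def format_flops(flops: float) -> str:
    # Walk from the largest prefix down and use the first one that fits.
    for scale, unit in PREFIXES:
        if flops >= scale:
            return f"{flops / scale:.2f} {unit}"
    return f"{flops:.0f} FLOPS"

print(format_flops(1.2e12))   # 1.20 TFLOPS - the 1990s supercomputer milestone
```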

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such multi-state parallelism performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
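A back-of-envelope calculation suggests why FLOPS comparisons break down: simulating n entangled qubits on a classical machine means tracking a state vector of 2ⁿ complex amplitudes, a cost which explodes long before n reaches interesting sizes.

```python
# State-vector size for a classical simulation of n qubits: 2**n complex
# amplitudes.  At 16 bytes per complex number, memory alone becomes
# hopeless well before 100 qubits.
for n in (10, 20, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits -> {amplitudes:,} amplitudes "
          f"({amplitudes * 16:,} bytes)")
```

At 50 qubits the simulation already needs about 18 petabytes of memory, which is the intuition behind counting qubits and gates rather than floating-point throughput.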

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy over the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely, it was one of the design parameters of the 3½ inch format that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of this dates from 1974).  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobytes) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus; it was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabytes) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) format and it was the 5¼ inch floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy.  In OS (operating system) GUIs (graphical user interfaces), however, it endures as the "save" icon and all the evidence to date suggests symbolic objects like icons tend to outlive their source, thus the ongoing use in iconography of the analogue rotary-dial phone and, in smart phones, the sound of a camera's physical shutter.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double-density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support in the BIOS (basic input output system) so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but to attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) a final nail in the coffin for most.
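The famous “1.44 MB” figure is itself a small curiosity of arithmetic, being a mix of binary and decimal prefixes; a quick sketch from the HD diskette's geometry (80 tracks per side, 2 sides, 18 sectors per track, 512 bytes per sector) shows how the number was arrived at:

```python
# Classic HD 3.5 inch diskette geometry.
tracks_per_side = 80
sides = 2
sectors_per_track = 18
bytes_per_sector = 512

capacity = tracks_per_side * sides * sectors_per_track * bytes_per_sector
print(capacity)                  # 1474560 bytes
print(capacity / 1024)           # 1440.0 (KiB)
print(capacity / 1024 / 1000)    # 1.44 - the marketing "MB", neither
                                 # a true megabyte nor a true mebibyte
```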

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers, who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable, tales of an installation process failing on the “final diskette” being legion and while some doubtless were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and all quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang etc) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  After much development, the floppies were replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!”, which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as at the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Friday, August 16, 2024

Obliterate

Obliterate (pronounced uh-blit-uh-reyt (U) or oh-blit-uh-reyt (non-U))

(1) To remove or destroy all traces of something; do away with; destroy completely.

(2) In printing or graphic design, to blot out or render undecipherable (writing, marks, etc.); fully to efface.

(3) In medicine, to remove an organ or another body part completely, as by surgery, disease, or radiation.

1590–1600: From the Latin oblitterātus, perfect passive participle of oblitterō (blot out), from oblinō (smear over) and past participle of oblitterāre (to efface; cause to disappear, blot out (a writing) & (figuratively) cause to be forgotten, blot out a remembrance), the construct being ob- (a prefixation of the preposition ob (in the sense of “towards; against”)) + litter(a) (also litera) (letter; script) + -ātus (-ate).  The suffix -ate was a word-forming element used in forming nouns from Latin words ending in -ātus, -āta, & -ātum (such as estate, primate & senate).  Those that came to English via French often began with -at, but an -e was added in the fifteenth century or later to indicate the long vowel.  It can also mark adjectives formed from Latin perfect passive participle suffixes of first conjugation verbs -ātus, -āta, & -ātum (such as desolate, moderate & separate).  Again, often they were adopted in Middle English with an –at suffix, the -e appended after circa 1400; a doublet of –ee.  True synonyms include black out, eliminate, exterminate, annihilate, eradicate, delete, erase & expunge because to obliterate something is to remove all traces.  Other words often used as synonyms don’t of necessity exactly convey that sense; they include obscure, ravage, smash, wash out, wipe out, ax, cancel and cut.  Obliterate & obliterated are verbs & adjectives, obliteration & obliterator are nouns, obliterature is a noun, obliterating is a noun, verb & adjective, obliterable & obliterative are adjectives and obliteratingly is an adverb; the noun plural is obliterations.

Social anxiety can be "obliterated".  Who knew?

The verb obliterate was abstracted from the phrase literas scribere (write across letters, strike out letters).  The noun obliteration (act of obliterating or effacing, a blotting out or wearing out, fact of being obliterated, extinction) dates from the 1650s, from the Late Latin obliterationem (nominative obliteratio), the noun of action from the past-participle stem of oblitterāre (to efface; cause to disappear, blot out (a writing) & (figuratively) cause to be forgotten, blot out a remembrance).  The related late fourteenth century noun oblivion (state or fact of forgetting, forgetfulness, loss of memory) was from the thirteenth century Old French oblivion and directly from the Latin oblivionem (nominative oblivio) (forgetfulness; a being forgotten) from oblivisci, the past participle of oblitus (forget), of uncertain origin.  Oblivion is of interest to etymologists because of speculation about a semantic shift from “to be smooth” to “to forget”, the theory based on the construct being ob- (using ob in the sense of “over”) + the root of lēvis (smooth).  For this there apparently exists no documentary evidence either to prove or disprove the notion.  The Latin lēvis (rubbed smooth, ground down) was from the primitive Indo-European lehiu-, from the root (s)lei- (slime, slimy, sticky).

Obliterature

The noun obliterature is a special derived form used in literary criticism, the construct being oblit(erate) + (lit)erature.  It describes works of literature in some way "obliterated or made void", the most celebrated (or notorious, according to many) being those which "interpreted" things in a manner not intended by the original author but the word is applied also to texts deliberately destroyed, erased or rendered unreadable, either as an artistic statement or as a result of censorship, neglect, or decay.  "La biblioteca de Babel" (The Library of Babel (1941)) by Jorge Luis Borges (1899-1986) was a short story which imagined a universe consisting of an infinite library containing every possible book but all volumes are in some way corrupted or comprise only random strings of characters; all works wholly unintelligible and thus useless.  The chaotic library was symbolic of the most extreme example of obliterature in that all works had been rendered unreadable and devoid of internal meaning.

Nazis burning books, Berlin, 1933.

Probably for as long as writing has existed, there has been censorship (and its companion: self-censorship).  Some censorship is official government policy while countless other instances exist at institutional level, sometimes as a political imperative, sometimes because of base commercial motives.  The most infamous examples are literary works banned or destroyed as political or religious repression, including occasions when the process was one of public spectacle such as the burning of books in Nazi Germany, aimed at Jewish, communist and other “degenerate or undesirable” authors.  The critique “They burn the books they cannot write” is often attributed to the German-Jewish poet, writer and literary critic Heinrich Heine (1797–1856), whose work was among the thousands of volumes placed on a bonfire in Berlin in 1933, but it’s a paraphrase of a passage from his play Almansor (1821-1822), spoken by a Muslim after Christians had burned piles of the holy Quran: “Das war ein Vorspiel nur, dort wo man Bücher verbrennt, verbrennt man auch am Ende Menschen.”  (“That was but a prelude; where they burn books, they will ultimately burn people as well.”)

The Address Book (1983) by French conceptual artist Sophie Calle (b 1953) was based on an address book the author found in the street which (after photocopying the contents) she returned to the owner.  She then contacted those in the book and used the information they provided to create a narrative about the owner, a man she had never met.  This she had published in a newspaper and the man promptly threatened to sue on the grounds of a breach of his right to privacy, demanding all examples of the work in its published form be destroyed.  Duly, the obliterature was performed.  Thomas Phillips' (1937–2022) A Humument: A treated Victorian novel (in various editions 1970-2016) is regarded by most critics as an “altered” book, a class of literature in which novel media forms (often graphical artwork) are interpolated to change the appearance and sometimes elements of meaning.  Phillips used as his base a Victorian-era novel (William (WH) Mallock's (1849–1923) A Human Document (1892)) and painted over its pages, leaving only select words visible to create new narratives, many of which were surreal.  This was obliterature as artistic device and it’s of historic interest because it anticipated many of the techniques of post-modernism, multi-media productions and even meme-making.

Erasure poetry takes an existing text and either erases or blacks-out (the modern redaction technique) words or passages to create a new poem from the remaining words; in the most extreme examples almost all the original is obliterated, with only fragments left to form a new work.  Ronald Johnson (1935–1998) was a US poet who published the book-length RADI OS (1977), based on John Milton's (1608–1674) Paradise Lost (1667-1674), and used the redactive mechanism as an artistic device, the space once occupied by the obliterated text left deliberately blank, surrounding the surviving words.

Some critics and literary theorists include unfinished and fragmentary work under the rubric of obliterature and while that may seem a bit of a definitional stretch, the point may be that such texts in many ways can resemble what post modern (and post-post modern) obliterature practitioners publish as completed work.  There are many unfinished works by the famous which have been “brought to conclusion” by contracted authors, the critical response tending to vary from the polite to the dismissive although, in fairness, it may be that some things were left unfinished for good reasons.  The Portuguese author Fernando Pessoa (1888-1935) was extraordinarily prolific and apparently never discarded a single page, leaving a vast archive of unfinished, fragmented, and often unreadable manuscripts, the volume so vast many have never been deciphered.  It’s interesting to speculate whether, had Pessoa had access to word processors and the cloud, he would have saved as much; if he’d lived in the age of the floppy diskette, maybe he’d have culled a bit.

The obliteration of animal carcasses with explosives

Strictly speaking, “to obliterate something” means “to remove or destroy all traces” which usually isn’t the case when explosives are used, the result more a wide dispersal of whatever isn’t actually vaporized but there’s something about the word which attracts those who blow-up stuff and they seem often to prefer obliteration to terms which might be more accurate.  As long as the explosion is sufficiently destructive, one can see their point and obliteration does memorably convey the implications of blowing-up stuff.  The word clearly enchanted the US Forest Service which in 1995 issued their classic document Obliterating Animal Carcasses with Explosives, helpfully including a step-by-step guide to the process.  Given it’s probably not a matter about which many have given much thought, the service explained obliterating large animal carcasses was an important safety measure in wilderness recreation areas where the remains might attract bears, or near picnic areas where people obviously wouldn’t want rotting flesh nearby.  A practical aspect also is that in many cases there is no way conveniently to move or otherwise dispose of a large carcass (such as a horse or moose, which can weigh in excess of 500 kg (1100 lb)) which might be found below a steep cut slope or somewhere remote.  So, where physical transportation is not practical, the chemistry and physics of explosives are the obvious alternative, the guide recommending fireline devices (specially developed coils containing explosive powder), used also to clear combustible materials in the path of a wildfire.

Interestingly, the guide notes there will be cases in which the goal might not be obliteration.  In some ecosystems, what is most desirable is to disperse the carcass locally into the small chunks suited to the eating habits of predators in the area and when properly dispersed, smaller scavenging animals will break down the left-overs, usually within a week.  To effect a satisfactory dispersal, the guide recommends placing 20 lb (9 kg) of explosives on the carcass in key locations, then using a detonator cord to tie the charges together, the idea being to locate them on the major bones, along the spine.  However, in areas where there’s much human traffic, obliteration is required and the guide recommends placing 20 lb (9 kg) of explosives on top and a similar load underneath although it’s noted this may be impossible if the carcass is too heavy, frozen into the ground, floating in water or simply smells too ghastly for anyone to linger long enough to do the job.  In that case, 55 lb (25 kg) of fireline should be draped over the remains although the actual amount used will depend on the size of the carcass, the general principle being the more explosives used, the greater the chance obliteration will be achieved.  Dispersal and obliteration are obviously violent business but it’s really just an acceleration of nature’s decomposition process.  Whereas a big beast like a horse can sit for months without entirely degrading, if explosives are used, in most cases after little more than a week it’d not be obvious an animal was ever there.  With regard to horses however, the guide does include the warning that prior to detonation, “horseshoes should be removed to minimize dangerous flying debris.”  Who knew?

It’s important that enough explosives are used to achieve the desired result but in carcass disposal it's important also not to use too much.  In November 1970, the Oregon Highway Division was tasked with blowing up a 45-foot (14 m), eight-ton (8100 kg) decaying whale which lay on the shores near the town of Florence and they calculated it would need a half-ton (510 kg) of dynamite, the presumption being any small pieces would be left for seagulls and other scavengers.  Unfortunately, things didn’t go according to plan.  The viewing crowds had been kept a quarter-mile (400 m) from the blast-site but they were forced to run for cover as large chunks of whale blubber started falling on them and the roof of a car parked even further away was crushed.  Fortunately there were no injuries although most in the area were splattered with small pieces of dead whale.  Fifty years on, Florence residents voted to name a new recreation ground Exploding Whale Memorial Park in honor of the event.


Wednesday, May 15, 2024

Dongle

Dongle (pronounced dong-guhl)

(1) Historically, a hardware device attached to a computer without which a particular software program will not run; used to prevent unauthorized use.

(2) Latterly, a device (almost always plugged into a USB port) to enable connectivity with other devices, peripherals or networks.

(3) In (very) casual use, any small USB device. 

1980: First recorded in 1980, shortly before the debut of the original IBM PC (1981) and although often cited as an arbitrary coinage, the word possibly derives from dangle, given that the first dongles dangled from a short cable.  Unfortunately, the advertisement in a computer magazine in the early 1990s which claimed the dongle was invented by an engineer named Don Gall was just an attempt to make nerds laugh and maybe create an urban legend.  Donglegate was a 2013 incident at an IT conference when a double entendre on the word "dongle" led to a complaint.  Dongle is a noun; the noun plural is dongles.  Because of the nature of the devices and the use to which they're put, it's surprising it seems not to be used as a verb; nor have dongling & dongled emerged.

Parallel-port dongle.

Historically, a dongle was a small (about the size of a box of matches) pass-through device, a piece of hardware which connected to a PC’s 25-pin parallel port (then used most often for a printer).  It provided usually a form of copy protection and license enforcement in that the software with which it was supplied wouldn’t run unless the dongle was plugged-in.  Most PCs had only one printer port (although from the original IBM PC-1 (1981), there was always support for three physical parallel ports and in the early days of the industry it wasn't unknown to see three Epson 9-pin impact (dot-matrix) printers clattering away, hanging off the one PC) but it was in some cases possible to daisy-chain multiple dongles on the one port and still connect the actual printer (as the last device in the chain).  The parallel dongles were famously robust and reliable but less successful were the early implementations of dongles as USB devices; the problems were not solved until the early 2000s.
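The classic scheme can be sketched in outline: the protected software sends challenge bytes to the port and refuses to run unless the replies match the dongle's secret transform.  This is a minimal illustrative sketch only; real vendors used proprietary drivers and port protocols, and the port address, transform and challenge logic here are all invented for the example.

```python
import random

PARALLEL_PORT_DATA = 0x378  # conventional LPT1 data-register address on PCs

def read_dongle_response(challenge: int) -> int:
    """Stand-in for the driver call that wrote a challenge byte to the
    parallel port and read back the dongle's transformed reply.  Here we
    simulate the hardware's secret transform (a simple XOR, for the sketch)."""
    return (challenge ^ 0x5A) & 0xFF

def dongle_present() -> bool:
    """The protected program would issue several random challenges and
    run only if every reply matched the expected transform."""
    for _ in range(4):
        c = random.randrange(256)
        if read_dongle_response(c) != (c ^ 0x5A) & 0xFF:
            return False
    return True

if __name__ == "__main__":
    print("licensed" if dongle_present() else "dongle not found")
```

Because the check ran against physical hardware rather than anything on disk, copying the software alone was useless, which is why the scheme survived into the USB era.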

Raspberry Pi Nano WiFi dongle.

Dongle has also come to mean just about anything small which connects to a USB port and does something.  Although the definitions were never codified, the convention of use with USB devices seems to be that a dongle is something which in some way is an enabling device for something else (such as connecting to a WAN, LAN, Wi-Fi or a streaming feed) whereas “stick”, “pen-drive” or “key” is used for something self-contained such as a flash drive.

Personalized dongles.

The most familiar “dongles” are still the USB flash drives, used for data storage and file transfer, the latter a large-scale version of the original “sneaker-net” which described a kind of low-tech “networking” of devices, accomplished by a programmer (wearing sneakers of course), floppy diskettes in hand, walking between machines to copy and write files.  Memories fade but it’s not forgotten just how transformative was the arrival of the USB flash drive.  Although there had been advances in removable storage which meant at least some users were no longer subject to the tyranny of the 1.44 MB floppy diskette, the proprietary devices were expensive and thus relatively rare.  Additionally, the removable media depended on compatible devices existing at each end, something not always possible conveniently to arrange.  On PCs, the ubiquitous 3½-inch floppy diskette had shrunk from its 5¼-inch origins but grown from a capacity of 180 KB to 1.44 MB and some versions in the 1990s supported 1.8 MB & 2.88 MB but take-up was slow, the BIOS (Basic Input-Output System) support needed by the latter attracting little support from manufacturers.  Thus the enthusiasm which greeted the early 16 MB USB flash drives early in the century; capacities have since grown to the point that in 2024 2 TB sticks are available and 4 TB prototypes have been demonstrated.  That process also attracted the interest of economists who noted that even by the standards of IT hardware, the unit cost reductions have been dramatic: the early 1 GB sticks were advertised at US$300 yet by 2024 1 TB (1024 or 1000 GB depending on manufacturer) sticks were advertised at under US$150.  Curiously (and unusually for the industry), just who “invented” the USB flash drive remains contested.
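The scale of that unit-cost reduction is worth working through.  Using the figures in the post (an early 1 GB stick at US$300 against a 1 TB stick at US$150 in 2024, taking the decimal 1 TB = 1000 GB convention):

```python
# Cost-per-gigabyte comparison, using the advertised prices cited above.
early_price, early_gb = 300.0, 1.0     # early 1 GB stick at US$300
late_price, late_gb = 150.0, 1000.0    # 2024: 1 TB (decimal) at US$150

early_per_gb = early_price / early_gb  # US$300.00 per GB
late_per_gb = late_price / late_gb     # US$0.15 per GB

print(f"early: ${early_per_gb:.2f}/GB; 2024: ${late_per_gb:.2f}/GB")
print(f"unit-cost reduction: {early_per_gb / late_per_gb:.0f}x")  # 2000x
```

A two-thousand-fold fall in cost per gigabyte in a couple of decades is dramatic even by the standards of storage hardware.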

Rainbow Technologies' "Don Gall" dongle advertisement (1992).  While there was no actual Mr Gall, the story related is exactly the reason the software protection hardware was created.

Friday, February 2, 2024

Irrefragable

Irrefragable (pronounced ih-ref-ruh-guh-buhl)

(1) Not to be disputed or contested (as assertion).

(2) Not able to be denied or refuted; indisputable (as fact).

(3) That which cannot or should not be broken; indestructible (archaic and probably extinct).

(4) Of a person, someone obstinate; stubborn (obsolete except as a literary device).

1525–1535: A learned borrowing from Late Latin irrefrāgābilis (irrefragable) with the English suffix –able appended.  The suffix -able was from the Middle English -able, from the Old French -able, from the Latin -ābilis (capable or worthy of being acted upon), from the primitive Indo-European i-stem forms -dahli- or -dahlom (instrumental suffix); it was used to create adjectives with the sense of “able or fit to be done”.  The construct of irrefrāgābilis was the Latin ir- (a variant of in- (used as a prefix meaning “not”)) + refrāgā(rī) (the present infinitive of refrāgor (to oppose, resist; to gainsay, thwart)) + -bilis (the suffix used to form adjectives indicating a capacity or worth of being acted upon).  Because of the paucity of documentary evidence, the ultimate source of the Latin refrāgor remains uncertain, but the construct may have been re- (the prefix used in the sense of “again”) + fragor (a breaking, shattering; a crash; din, uproar (from frangō (to break, shatter), ultimately from the primitive Indo-European bhreg- (to break)), formed as an antonym of suffrāgor, the first-person singular present indicative of suffrāgor (to support; to vote for).  The sixteenth century French form was irréfragable, also from the Late Latin.  The meanings related to “indestructible objects” fell from use as early as the mid-seventeenth century while the figurative sense of “someone stubborn or obstinate” endured into the twentieth and, as a literary device, probably still tempts some and for those so tempted, the better style guides help by telling us to stress the second syllable.  The spelling irrefragible is obsolete.  Irrefragable is an adjective, irrefragability & irrefragableness are nouns and irrefragably is an adverb; the noun plural is irrefragabilities.

In English, irrefragable didn’t survive in common use for no better reason than people for whatever reason preferred the alternatives (literal & figurative) including (depending on the context): undeniable, indubitable, unassailable, indisputable, unambiguous, unquestionable, irrefutable, incontestable, immutable and unanswerable.  All those synonyms convey much the same thing for most so usually, the only thing the use of “irrefragable” is likely to engender is bafflement; few people will know what it means.  That can be fun between consenting word-nerds but it otherwise tends just to annoy.  There are structuralists who claim “irrefragable” is (or at least can be) different from a word like “unquestionable” because the former should specifically be associated with logical or argumentative strength while the latter can be used in any context without necessarily emphasizing the same rigorous logical support.  So, because the underpinning of the scientific method is the disproving of stuff, to say a scientific theory is irrefragable does not mean it cannot be argued against or disproven or that it’s beyond doubt or uncertainty; it means only that it cannot be refuted based on the current evidence.  By contrast, in some schools of theology, many things are unquestionable, not because they can be proved or disproven but because they must be accepted as matters of faith.  In the Roman Catholic Church, this is formalized: if a pope (invoking his infallibility in matters of dogma) declares something to be thus, it is, as a matter of canon law, both irrefragable & unquestionable.
The ancient idea of papal infallibility has been invoked only once since it was codified in the proceedings of the First Vatican Council (Vatican I; 1869-1870) but since the early post-war years, pontiffs have found ways to achieve the same effect, John Paul II (1920–2005; pope 1978-2005) & Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022) both adept at using what was in effect a personal decree, a power available to one who sits at the apex of what is in constitutional terms an absolute theocracy.  Critics have called this phenomenon "creeping infallibility" and its intellectual underpinnings owe much to the tireless efforts of Benedict XVI while he was head of the Inquisition (by then called the Congregation for the Doctrine of the Faith) during the late twentieth century.

Defragable: Defragmentation in action under MS-DOS 6.22.  On a nearly full big drive (say 320 MB) on which defragmentation had been neglected for a while, the process could take literally hours.  True obsessives would add the relevant command to their autoexec.bat to start every day with a defrag, the sequence being: (1) switch on, (2) go and get coffee and (3) hope it was done upon return.

Before installable file systems (IFS) began to gain critical mass in the 1990s, disk defragmenters were something of a fetish among nerds because, at the software level, there were few quicker (a relative term) and cheaper ways to make things run faster.  Fragment was from the late Middle English fragment, from the Latin fragmentum (a fragment, a remnant), the construct being frangō (I break) + -mentum, from the suffix -menta (familiar in collective nouns like armenta (herd, flock)), from the primitive Indo-European -mn̥the.  The tendency of the early file systems toward increasing sluggishness was because the File Allocation Table (FAT) was an up-scaled variant of that used on floppy diskettes where the cluster sizes (the segments into which the media was divided) were small and thus less prone to fragmentation.  However, because of the arcane math which dictated how many clusters there could be under the various implementations of FAT, the only way to accommodate the increasing size of hard disk drives (HDD) was to make the clusters larger, the consequence of which was a file of 1 KB or less absorbed all of a 32 KB cluster, something both an inefficient use of space and inherently prone to fragmentation.  What defragmenters did was re-allocate files to make data both as contiguous and as un-fragmented as possible.  Modern file systems (HPFS, NTFS etc) still have limits but the numbers are very big and contemporary operating systems now handle defragmentation dynamically.  Although it remains a useful system on USB pen drives and such because of the wide system compatibility and ease of use, it’s doubtful even the more nostalgic nerds have fond memories of FAT on HDDs; a corrupted FAT could be a nightmare.
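The arithmetic of cluster “slack” is simple enough to sketch: a file always occupies whole clusters, so its last cluster is mostly wasted when clusters are big.  A minimal illustration (the 512-byte and 32 KB figures are typical floppy and large-FAT16-volume cluster sizes; this is not a model of the FAT itself):

```python
import math

def clusters_needed(file_bytes: int, cluster_bytes: int) -> int:
    """A file always occupies whole clusters (at least one)."""
    return max(1, math.ceil(file_bytes / cluster_bytes))

def slack(file_bytes: int, cluster_bytes: int) -> int:
    """Bytes allocated to the file but left unused in its final cluster."""
    return clusters_needed(file_bytes, cluster_bytes) * cluster_bytes - file_bytes

KB = 1024
# A 1 KB file on a volume with 32 KB clusters wastes 31 KB...
print(slack(1 * KB, 32 * KB) // KB)   # 31
# ...but on floppy-sized 512-byte clusters it wastes nothing.
print(slack(1 * KB, 512))             # 0
```

The same whole-cluster granularity is what made fragmentation worse on big volumes: with fewer, larger clusters, free space was chopped into scattered holes a growing file had to be spread across, which is what the defragmenters undid.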