
Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest, also “seat” in the sense of “bench”), of obscure origin, although etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in the construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower level which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of In den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it was highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” transmitted through the water rather than the blast itself which breached a hull, the same principle exploited by the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
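Those “capacity” percentages are simple charge-to-weight arithmetic and the figures quoted above can be checked in a few lines; the sketch below is illustrative only, using just the weights given in the text (the function name is the author's own invention):

```python
# Minimal sketch: charge-to-weight ("capacity") ratios using the figures quoted above.
def charge_ratio(filler_lb: float, gross_lb: float) -> float:
    """Return explosive filler as a percentage of gross bomb weight."""
    return 100 * filler_lb / gross_lb

bombs = {
    "Tallboy (MC)": (5_200, 12_000),
    "Grand Slam (MC)": (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (filler, gross) in bombs.items():
    print(f"{name}: {charge_ratio(filler, gross):.0f}% of gross weight is explosive")
# Tallboy ~43%, Grand Slam ~41%, Blockbuster 75%, matching the text.
```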

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The tactic is best remembered from its use on the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have allowed for those with the resources is that so-called “panacea targets” (ie critical infrastructure and such once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area is rendered geologically too unstable ever again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia.  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of a government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) being “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” et al was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, although it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp; not hard, firm, or rigid; flexible; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)) 1964-1979), the use a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made with a paper stock little heavier and more rigid than that used for the pages; used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and it was used to mean “totally to fail” in 1919 in the wake of the end of World War I (1914-1918), the conflict which wrote finis also to centuries of dynastic rule by the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople, although in the 1890s it was recorded as meaning “some degree of failure”.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped is a verb, flopping is a verb, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and as “floppy-wristed” (synonymous with “limp-wristed”) was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the earlier use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in Economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of “Shakespearian” or “Churchillian” is not exactly definable but certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  "Floppy cat" is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOP (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t really make possible what would not in theory be achievable using real numbers but it does make such work faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
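As a rough illustration of that “significand scaled by an integer exponent of a fixed base” idea, the sketch below uses only Python's standard library (nothing here is specific to the x87 FPUs mentioned above) to decompose a double-precision float and to show why the fixed precision makes FP fast but approximate:

```python
import math

# A Python float is a 64-bit IEEE 754 "double": a sign, a fixed-width significand
# (mantissa) and an integer exponent of base 2, ie value = mantissa * 2**exponent.
x = 123456.789
mantissa, exponent = math.frexp(x)    # mantissa in [0.5, 1), exponent an integer

print(mantissa, exponent)             # ~0.9418 and 17
print(mantissa * 2**exponent == x)    # True: the decomposition is exact

# The fixed-width significand is why FP is fast but only approximate:
print(0.1 + 0.2 == 0.3)               # False; none of these has an exact base-2 form
```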

FLOPs are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work) and for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPs became something of a cult in the 1990s when the supercomputers of the era first breached the trillion FLOP mark and as speeds rose, the appropriate terms were created (a rough benchmarking sketch follows the list below):

kiloFLOPS (kFLOPS, 10³)
megaFLOPS (MFLOPS, 10⁶)
gigaFLOPS (GFLOPS, 10⁹)
teraFLOPS (TFLOPS, 10¹²)
petaFLOPS (PFLOPS, 10¹⁵)
exaFLOPS (EFLOPS, 10¹⁸)
zettaFLOPS (ZFLOPS, 10²¹)
yottaFLOPS (YFLOPS, 10²⁴)
ronnaFLOPS (RFLOPS, 10²⁷)
quettaFLOPS (QFLOPS, 10³⁰)
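For a very rough sense of how such a figure is arrived at, the sketch below times a large matrix multiplication and divides an operation count by the elapsed time; it assumes NumPy is installed and, being a toy benchmark, will understate what tuned HPC code (or a GPU) can achieve:

```python
import time
import numpy as np

# Toy FLOPS estimate: an n x n matrix multiplication costs roughly 2 * n**3
# floating-point operations (about n multiplies and n adds per output element).
n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOPS over {elapsed:.3f} s")
```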

In the mysterious world of quantum computing, FLOPs are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPs, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such parallelism, performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
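Superposition and entanglement are easier to see as a little linear algebra; the sketch below (plain NumPy rather than any quantum SDK, and purely illustrative) builds the equal superposition a Hadamard gate produces from |0⟩ and then the entangled “Bell state” a CNOT gate creates from it:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)                        # the state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

plus = H @ zero                       # (|0> + |1>) / sqrt(2): superposition
print(np.abs(plus) ** 2)              # [0.5 0.5] -> equal chance of reading 0 or 1

# Two qubits: Hadamard on the first, then CNOT (control = qubit 0, target = qubit 1)
state = np.kron(plus, zero)           # the two-qubit state |+0>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ state                   # (|00> + |11>) / sqrt(2): entangled
print(np.abs(bell) ** 2)              # [0.5 0. 0. 0.5] -> outcomes perfectly correlated
```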

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely it was one of the design parameters for the 3½ inch design that it fit into a man's shirt pocket.

In IT, the “floppy diskette” (the first of which operated on the WORM (write once, read many, ie "read only" after being written) principle) appeared in 1971, the name doubtless soon clipped to “floppy” although the first known use of this dates from 1974.  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobytes) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus.  It was used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabytes) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of a HD (high density) format and it was the 5¼ floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with 3½ inch floppy diskette (believed to be a HD (1.44 MB)).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy. In OS (operating system) GUIs (graphical user interface) however, it does endure as the "save" icon and all the evidence to date does suggest that symbolic objects like icons do tend to outlive their source, thus the ongoing use in IT of analogue, rotary dial phones in iconography and the sound of a camera's physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 KB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered the gesture of adding support at the BIOS (basic input output system) level so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but to attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read-only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) a final nail in the coffin for most.
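The oddity of the “1.44 MB” label is simple arithmetic; the sketch below uses the standard PC geometry of the high-density 3½ inch format (the track and sector figures are general background on that format rather than anything stated above) to show the number is neither a true megabyte nor a mebibyte:

```python
# Standard PC high-density 3.5 inch geometry: 2 sides x 80 tracks x 18 sectors x 512 bytes.
sides, tracks, sectors, sector_bytes = 2, 80, 18, 512
raw = sides * tracks * sectors * sector_bytes

print(raw)                   # 1474560 bytes
print(raw / 1_000_000)       # 1.47456  decimal megabytes (10**6)
print(raw / 2**20)           # ~1.406   mebibytes (2**20)
print(raw / (1000 * 1024))   # 1.44     the marketing figure (a 1000 x 1024 "megabyte")
```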

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released. Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable, tales of an installation process failing on the “final diskette” being legion and while some doubtlessly were apocryphal, "floppy failure" was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which since the 1970s had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production of them in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called the 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Friday, September 13, 2024

Barracuda

Barracuda (pronounced bar-uh-koo-duh)

(1) Any of several elongated, predaceous marine fishes of the genus Sphyraena, certain species of which are used for food. The large fish are notoriously voracious and are found world-wide in tropical & sub-tropical waters; the collective noun is "battery".

(2) In slang, a treacherous, greedy person (obsolete).

(3) In slang, one who uses harsh or predatory means to compete.

(4) A car produced by the Plymouth division of Chrysler Corporation in three generations between 1964-1974 (as both Barracuda and 'Cuda).

1670-1680: From American Spanish, thought derived from customary use in the Caribbean, borrowed from the Latin American Spanish barracuda, perhaps from a Cariban word or, most likely, from the Valencian-Catalan barracó (snaggletooth), first recorded as barracoutha.  There was the suggestion barracó may come from Latin, in which the word barra could be used to mean "bar", the idea being this was a reference to the elongated, bar-like shape of the fish; the theory is regarded as speculative.  Barracuda is a noun and barracudalike is an adjective; the noun plural is barracuda or barracudas.

The plural of fish is an illustration of the inconsistency of English.  As the plural form, “fish” & “fishes” are often (and harmlessly) used interchangeably but in zoology, there is a distinction: fish is (1) the noun singular & (2) the plural when referring to multiple individuals from a single species while fishes is the noun plural used to describe different species or species groups.  The differentiation is thus similar to that between people and peoples yet different from the use adopted when speaking of sheep and, although opinion is divided on which is misleading (the depictions vary), those born under the zodiac sign Pisces are referred to variously as both fish & fishes.  So, for most folk, the best advice if a plural of "barracuda" is needed is to (1) use whichever produces the most elegant sentence and (2) be consistent in use.  However, ichthyologists (and probably zoologists in general) will note the barracuda genus "Sphyraena" consists of 29 species and will use "barracuda" if speaking of many fish of the one species and "barracudas" if fish of more than one species are involved.

The danger presented by barracuda in open water is well documented.  The US Navy's heavy cruiser USS Indianapolis (CA-35) was the warship which in July 1945 delivered to Tinian Naval Base the critical components for "Little Boy", the atomic bomb (a uranium device, for decades a genuine one-off, all other nuclear weapons built with plutonium until (it’s suspected) the DPRK (North Korea) used uranium for at least one of its tests), and it was torpedoed and sunk by an Imperial Japanese Navy submarine.  Because of wartime circumstances, the sinking remained unknown for some four days and of the crew of 1195, only 316 survived of the 890 who made it into the water, many of the rest taken by “sharks and five-foot long barracudas”.

Barracuda (1977) was a US horror movie set on the Florida coast.  The plot-line involved the inhabitants of a small town being menaced by batteries of barracuda which have become highly aggressive because of chemical intervention by a former military doctor who has gone mad while conducting secret government research into hypoglycaemia and its effect on human behavior.  The film was not well-reviewed and critics noted the "derivative & dubious plot, poorly executed special effects and lack of focus on the title character (the fish)".

The Plymouth Barracuda & 'Cuda, 1964-1974

While the 1964 Ford Mustang is credited with creating the pony-car market, it was actually the Plymouth Barracuda which came first, released seventeen days earlier.  Ford used the approach of draping a sexy new body over an existing, low-cost platform and drive-train and Chrysler chose the same route, using the sub-compact Valiant as Ford were using their Falcon.  In the years to come, there would be many who adopted the method, often with great success, and on both sides of the Atlantic other manufacturers would create their own "pony cars".  Despite the chronology, it's the Mustang which deserves the credit for the linguistic innovation, the term "pony car" an allusion to the equine association in the Ford's name and a nod also to the thing being (in US terms at the time) a "smaller" car.  It was only after the Mustang had both created and defined the segment that the Barracuda came to be called a pony car.

1965 Ford Mustang "notchback".

Unfortunately, despite the project having been in the works for years, a sudden awareness Ford were well advanced meant Chrysler’s lower-budget development was rushed.  Despite the Valiant’s platform and drive-train being in many aspects technically superior to the less ambitious Falcon, Plymouth’s Barracuda was a bit of a flop, outsold by its competitor initially by around ten to one, numbers which got worse as "Mustangmania" overtook the land.  While the Mustang got what was called “the body from central casting”, from the windscreen forward, the Barracuda retained the sheet-metal from the mundane Valiant, onto which was grafted a rear end which was adventurous but stylistically disconnected from the front.

1964 Plymouth Barracuda.

It was an awkward discombobulation although, with the back-seat able to be folded down to transform the rear passenger compartment into a large luggage space, it was clever, practical design.  Although in the years to come such lines would be used for "liftbacks" and "hatchbacks", even during the design process it was never envisaged that the rear window might be made to open.  At the time, the matter of installing the big, heavy piece of glass and its edging was thought challenge enough without adding the engineering of the necessary hinges and body-mounting points.  Although not a stressed panel, the glass did contribute to structural rigidity which was good but it also produced much heat-soak into the interior; driving an early Barracuda on a hot, sunny day could be a "sticky" experience, vinyl upholstery a standard fitting and air-conditioning expensive and a generation away from becoming commonplace.

1971 Jensen FF Mark III, one of 15 built.

The novelty of the Barracuda's rear-end was a giant window which, at 14.4 square feet (1.34 m²), was at the time the largest ever installed in a production car.  In 1966, even grander glazing was seen on the Jensen Interceptor, styled by Italy’s Carrozzeria Touring, but there it was aesthetically successful, the lines of the big trans-Atlantic hybrid more suited to such an expanse of glass.  Unlike Plymouth, Jensen took advantage of the possibilities offered and had the glass double as a giant, glazed trunk (boot) lid.  It didn't quite create one of the shooting brakes so adored by the gentry but it did enhance the practicality.  Using Chrysler's big-block V8s and (but for a handful built with manual gearboxes) TorqueFlite automatic transmission, the Interceptor was no thoroughbred but it offered effortless performance and the bullet-proof reliability for which the US power-trains of the era were renowned.

1968 Plymouth Barracuda hardtop.

The extraordinary success of the Mustang nevertheless encouraged Chrysler to persist and the Barracuda, though still on the Valiant platform, was re-styled for 1967, this time with the vaguely Italianesque influences (noticed probably more by Americans than Italians) seen also in 1966 with the release of the second series of Chevrolet’s doomed, rear-engined Corvair.  Although the rear-engine configuration proved a cul-de-sac, aesthetically, the later Corvairs were among the finest US designs of the era and, unusually, the lovely lines were implemented as successfully in four-door form as on the coupe.  Visually, the revised Barracuda didn't quite scale the heights achieved by Chevrolet but greatly it improved on the original and was offered with both notchback and convertible coachwork, as well as the fastback the Mustang had made popular but, because of the economic necessity of retaining some aspects of the Valiant’s structure, it wasn’t possible to realise the short-deck, long-hood look with which the Mustang had established the pony car design motif used still today.

1969 Pontiac Firebird Trans Am.

General Motors’ (GM) answer to the Mustang wasn’t as constrained by the fiscal frugality which had imposed so many compromises on the Barracuda, the Chevrolet Camaro and the substantially similar Pontiac Firebird both introduced in 1966 with a curvaceous interpretation of the short-deck, long-hood idea which maintained a relationship with GM’s then-voguish “coke bottle” designs.  In a twist on the pony car process, the Camaro and Firebird were built on an entirely new platform which would later be used for Chevrolet’s new competitor for the Valiant and Falcon, the Nova.  Just as the pedestrian platforms had restricted the freedom to design the Barracuda, so the Camaro’s underpinnings imposed compromises in space utilization on the Nova, a few inches of the passenger compartment sacrificed to fashion.  For 1967, Ford released an updated Mustang, visually similar to the original but notably wider, matching the Camaro and Firebird in easily accommodating big-block engines, not something Chrysler easily could do with the Barracuda.

1969 Plymouth 'Cuda 440.

However, this was the 1960s and though Chrysler couldn’t easily install a big-block, they could with difficulty and so they did, most with a 383 cubic inch (6.3 litre) V8 and, in 1969, in a package now called ‘Cuda (the name adopted for the high-performance versions), a few with the 440 (7.2 litre).  At first glance it looked a bargain, the big engine not all that expensive but having ticked the box, the buyer then found added a number of "mandatory options" so the total package did add a hefty premium to the basic cost.  The bulk of the big-block 440 was such that the plumbing needed for disc brakes wouldn’t fit so the monster had to be stopped with the antiquated drum-type and nor was there space for power steering, quite a sacrifice in a car with so much weight sitting atop the front wheels.  The prototype built with a manual gearbox snapped so many rear suspension components the engineers were forced to insist on an automatic transmission, the fluid cushion softening the impact between torque and tarmac.  Still, in a straight line, the things were quick enough to entice almost 350 buyers, many of whom tended to enjoy the experience a ¼ mile (402 metres) at a time, the drag-strip its native environment.  To this day the 440 remains the second-largest engine used in a pony car, only Pontiac's later 455 (7.5 litre) offering more displacement.

1968 Plymouth Barracuda convertible.

For what most people did most of the time (which included turning corners), the better choice, introduced late in 1967, was an enlarged version of Chrysler’s small-block V8 (LA), now bored-out to 340 cubic inches (5.6 litres); it wouldn’t be the biggest of the LA series but it was the best.  A high-revving, free-breathing thing from the days when only the most rudimentary emission controls were required, the toxic little (a relative term) 340 gave the Barracuda performance in a straight line not markedly inferior to the 440, coupled with markedly improved braking and cornering prowess.  One of the outstanding engines of the era and certainly one of Detroit's best small-block V8s, it lasted, gradually detuned, until 1973 by which time interest in performance cars had declined in parallel with the engineers’ ability economically to produce them while also complying with the increasingly onerous anti-pollution rules.

1968 Hemi Barracuda, supplied ex factory with un-painted black fibreglass.

Of course, for some even a 440 ‘Cuda wouldn't be enough and anticipating this, in 1968, Plymouth took the metaphorical shoehorn and installed the 426 cubic inch (6.9 litre) Street Hemi V8, a (slightly) civilised version of their racing engine.  Fifty were built (though one normally reliable source claims it was seventy) and with fibreglass panels and all manner of acid-dipping tricks to reduce weight, Plymouth didn’t even try to pretend the things were intended for anywhere except the drag strip.  The power-to-weight ratio of the 1968 Hemi Barracudas remains the highest of the era.  The things sometimes are described as "1968 Hemi 'Cudas" but in the factory documentation they were only ever referred to as "Hemi Barracuda" because the 'Cuda name wasn't introduced until the next season.  

1971 Plymouth 'Cuda coupe.

The third and final iteration of the Barracuda was introduced as a 1970 model and lasted until 1974.  Abandoning both the delicate lines of the second generation and the fastback body, the lines were influenced more by the Camaro than the Mustang and it was wide enough for any engine in the inventory.  This time the range comprised (1) the Barracuda which could be configured with either of the two slant sixes (198 (3.2) & 225 (3.7)) or one of the milder V8s, (2) the Gran Coupe which offered slightly more powerful V8s and some additional luxury appointments including the novelty of an overhead console (obviously not available in the convertible) and (3) the 'Cuda which was oriented towards high-performance and available with the 340, 383, 440 and 426 units, the wide (E-body) platform able to handle any engine/transmission combination.  Perhaps the best looking of all the pony cars, sales encouragingly spiked for 1970, even the Hemi ‘Cuda attracting over 650 buyers, despite the big engine increasing the price by about a third and it would have been more popular still, had not the insurance premiums for such machines risen so high.  With this level of success, the future of the car seemed assured although the reaction of the press was not uncritical, one review of the Dodge Hemi Challenger (the ‘Cuda’s substantially similar stable-mate) finding it an example of “…lavish execution with little thought to practical application”.  Still, even if in some ways derivative (and as the subsequent, second generation Chevrolet Camaro & Pontiac Firebird would at the time suggest, outdated), the styling (the team led by John Herlitz (1942–2008)) has since been acknowledged as a masterpiece and when the "retro" take on the Challenger was released in the next century, those were the lines reprised, the new Mustang and Camaro also following the 1960s, not the 1970s.

1970 Plymouth Barracuda with 225 cubic inch (3.7 litre) slant-6 (left) and 1970 Plymouth Barracuda Gran Coupe (right).

It's the most powerful (the Hemis and triple-carburetor 440s) of the third generation Barracudas which are best remembered but production of those things (produced only for 1970 & 1971) never reached four figures.  Of the 105,000 Barracudas (some 26,000 of which were 'Cudas) made between 1970-1974, most were fitted with more pedestrian power-plants like the long-serving 318 cubic inch (5.2 litre) V8 and the 198 & 225 (3.2 & 3.7) Slant-6, the latter pair serving what used to be called the "grocery-getter" market (which in those less-enlightened times was known also as the “secretary's” or “women's” market); the sales breakdown for the other pony cars (Mustang, Camaro, Firebird, Challenger & Javelin) all revealed the same trend to some degree.  The Gran Coupe was the “luxury” version of the Barracuda, the engine options limited to the 225, 318 & 383 but with a better-trimmed interior (something welcome in what was otherwise a quite austere environment of hard, unforgiving plastic) and some exterior bling including body sill, wheel lip and belt-line moldings.  The most notable fitting in the Gran Coupe was the overhead console, something earlier seen in the Ford Thunderbird.  It was a fairly large fitting for its limited utility (little more than an overhead light and low-fuel & door-ajar warning lights) although other manufacturers would later extend the functionality of such consoles.  The overhead console wasn't available in the convertible which was still sold as a "Gran Coupe", Plymouth using "coupe" as just another model name, applying it to two and four-door sedans as well as the blinged-up Gran pair.

1970 Plymouth AAR 'Cuda in "Lemon Twist" over black.

In 1970, there was a run of “AAR ‘Cudas”, a promotional model which tied in with the cars campaigned in the Trans-Am series by the “All American Racers” (AAR) team run by US driver Dan Gurney (1931-2018).  Unlike the earlier cars produced in volume to fulfil the homologation requirements for eligibility in the Trans-Am (the Chevrolet Camaro Z28 of 1967 (which in the factory’s early documents appeared as both Z-28 & Z/28) and Ford’s Boss 302 Mustang of 1969), the AAR ‘Cudas were built in a more permissive regulatory environment, the requirement to homologate an engine within the 5.0 litre (305 cubic inch) limit having been dropped, the teams instead permitted to “de-stroke” larger mass-produced units.  The change was made explicitly to tempt Chrysler to compete, removing the expensive business of developing a special engine, exactly what Chevrolet and Ford had earlier been compelled to do, and the spirit of compromise was at the time in the air, NASCAR (National Association for Stock Car Auto Racing) recently having nudged its 7.0 litre (quoted as 427 cubic inch) limit to 430 to accommodate Ford’s new 429 (the 385 series V8).  So, although homologated, the AAR ‘Cudas didn’t have as close a relationship with what Gurney’s operation ran on the circuits as that enjoyed by the earlier Z28 Camaros and Boss Mustangs.

Underbody of 1970 Plymouth AAR 'Cuda in "Lemon Twist" over black.

The much admired side exhausts emulated the look of the (unlawful) "cut-out" systems some hot-rodders used but the AAR units were fully ducted, using special mufflers with inlets & outlets both at the front.  Something of an affectation and probably an inefficiency in terms of gas-flow, they were undeniably a sexy look and AMG in the twenty-first century would adopt the "cut-out" look for the Mercedes-Benz G55 & G63, although without the convoluted path.

They did however look the part, equipped with a black fibreglass hood (bonnet) complete with lock-pins and a functional scoop, rear & (optional) front spoilers and a very sexy “side exhaust system” exiting just behind the doors.  Uniquely, the 340 in the road-going cars ran a triple carburetor induction system (unlike the actual 5.0 litre race engines which were limited to a single four-barrel) and was rated at 290 gross (SAE (Society of Automotive Engineers)) horsepower, a somewhat understated figure arrived at apparently because that was what was quoted for the Camaro Z28 and Boss 302 Mustang.  The engine genuinely was improved, the block a “special run” cast from an iron alloy with a higher nickel content and including extra metal to permit the race teams to install four-bolt main bearings (none of the AAR road cars was so configured).  Just to make sure buyers got the message, the front tyres were fat Goodyear E60x15s while the rears were even beefier G60x15s, a mix which was a first for Detroit and produced a pronounced forward rake.  So even if the AAR ‘Cudas really weren’t “race-ready”, they looked like they were, which was of course the point of the whole exercise, and they proved popular, Plymouth making 2724 (all coupes), 1604 of which were fitted with the TorqueFlite 727 automatic transmission, something not seen on the Trans-Am circuits but ideally suited to street use.  Dodge’s companion “homologation special” was the Challenger T/A in an identical configuration and of the 2400 coupes made, 1411 were automatics.

1970 Plymouth AAR 'Cuda with dealer-fitted (or reproduction) front "chin" spoiler (option code J78) (left) and 1970 Plymouth AAR 'Cuda with standard rear "ducktail" spoiler (mandatory option J82) (right).

The black ABS plastic rear "ducktail" spoiler (mandatory option code J82) was standard on the AAR 'Cudas (and differed from the "wing" style unit optional on other 'Cudas) while the pair of front "chin" spoilers (J78) was optional.  The chin spoilers were not fitted by the factory but supplied as a "dealer-install kit" shipped in the car's trunk (boot), the result being some variation in the mounting position of cars so configured.  The chin spoilers are available as reproductions (some even including the original Mopar part-number) and because they were dealer-installed it can be hard to tell whether they are original equipment, the slight variations in the positioning of the originals further muddying the waters.  For the “originality police”, for whom “matching numbers” is the marker of the highest form of collectability, the small ABS protuberances are thus a challenge because while a rare dealer receipt or shipping list from 1970 can prove provenance, an alleged authenticity can be difficult to disprove because there are now documented techniques by which plastic can be “aged”, à la the tricks art forgers once used to make a recent painting appear centuries old.  Scientific analysis presumably could be applied to determine the truth; there’s no record of the originality police ever having resorted to that but it may happen because in the collector market the difference in value between “original” and not can be significant.

1970 Plymouth Barracuda Option M46 detail sheet (left) and 1970 Plymouth Barracuda with M46 (or reproduction) rear (non-functional) quarter-panel (sill) scoop (right).

The reproduction of obscure and once rarely ordered options has meant there doubtlessly are more AAR ‘Cudas with the chin spoilers than were ever sold in that form and even the less desirable Barracudas are serviced by the industry.  In 1970 there was option code M46 which included (1) an elastomeric (an elastomer being a rubbery material composed of long, chain-like molecules (polymers) capable of recovering their original shape after suffering an impact) rear quarter-panel (sill) air scoop in front of the rear wheels, (2) matte black lower-body trim with white and red pinstripes, (3) a rear-panel black-out (similar to that used on the ‘Cuda), complemented with chrome trim from the Gran Coupe (which, despite the name, was available also as a convertible) and (4) blacked-out front & rear valences.  The package was offered only for the 1970 Barracuda and Chrysler’s records indicate fewer than 450 were built but the reproduction scoops are sometimes seen even on later models, including ‘Cudas on which they were never available.  Unlike the AAR’s chin spoilers, option code M46 was factory-fitted so authenticity can be verified by the fender tag.  Unlike the spoilers (which would have had some aerodynamic effect), option M46 was purely a “dress-up”, the quarter-panel scoop “non-functional”, merely emulating the “rear-brake cooling ducts” sometimes used on race cars or exotic machines.

1971 Plymouth 'Cuda convertible.

Circumstances conspired to doom the ‘Cuda, the 426 Hemi, the Challenger and almost the whole muscle car ecosystem.  Some of the pony cars would survive but for quite some time mostly only as caricatures of their wild predecessors.  Rapidly piling up were safety and emission control regulations which were consuming an increasing proportion of manufacturers’ budgets but just as lethal was the crackdown by the insurance industry on what were admittedly dangerously overpowered cars which, by international standards, were extraordinarily cheap and often within the price range of the 17-25 year old males most prone to high-speed accidents on highways.  During 1970, the insurance industry looked at the data and adjusted the premiums.  By late 1970, even where it was possible to buy insurance for a Hemi ‘Cuda and its ilk, it was prohibitively expensive and sales collapsed from around 650 in 1970 to barely more than a hundred the next year, of which only a dozen-odd were convertibles.  Retired with the Hemi was the triple carburetor option for the 440; 1971 was the last time such a configuration would appear on a US-built vehicle.

It was nearly over.  Although in 1972 the Barracuda & Challenger were granted a stay of execution, the convertible and the big-block engines didn’t re-appear after 1971 and the once vibrant 340 was soon replaced by a more placid 360.  Sales continued to fall, soon below the point where the expensive-to-produce E-body was viable, production of both Barracuda and Challenger ending in 1974.  From a corporate point-of-view, the whole E-body project had proved a fiasco: not only did it turn out to be labour-intensive to build, it was only ever used by the Barracuda & Challenger, a financial death sentence in an industry where production-line rationalization was achieved by "platform-sharing".  Even without the factors which led to the extinction however, the first oil crisis, which began in October 1973, would likely have finished them off, the Mustang having (temporarily) vacated that market segment while the Camaro and Firebird survived only because they were cheaper to build, so GM could profitably maintain production at lower levels.  Later in the decade, GM would be glad of that for the Camaro and Firebird enjoyed long, profitable Indian summers.  That career wasn't shared by the Javelin, American Motors’ belated pony car which, although actually more successful than the Barracuda, outlived it only by months.

1971 Hemi 'Cuda convertible at 2021 auction.  Note the "gills" on the front fender, an allusion to the "fish" theme although anatomically recalling a shark more than a barracuda.  

It was as an extinct species that the third generation ‘Cudas achieved their greatest success... as used cars.  In 2014, one of the twelve 1971 Hemi ‘Cuda convertibles sold at auction for US$3.5 million and in 2021, another attracted a bid of US$4.8 million without reaching the reserve.  In the collector market, numbers do "bounce around a bit" and while the "post-COVID" ecosystem was buoyant, by 2024 things appeared more subdued but, like Ferrari's Dino 246GT & GTS, the 1971 Hemi 'Cuda convertible remains a "litmus-paper" car regarded as indicative of the state of the market.  The next time one is offered for sale, the fall of the hammer will be watched with interest.

Sphyraena barracuda (great barracuda).

The barracuda, most notably Sphyraena barracuda (the great barracuda), can grow quite large, lengths of 3-5 feet (0.9-1.5 metres) being common, but specimens have been verified at just over 6 feet (1.8 metres) and weighing in excess of 100 lb (45 kg), although most caught by recreational fishers tend to be around 20-30 lb (9-14 kg).  They’re fast, powerful predators, making them a much sought-after target for the more adventurous anglers, attracted by their aggressive strikes, impressive speed and challenging fights, most hunting done in warmer coastal waters.  The techniques employed include trolling, casting with artificial lures and live bait fishing but because of their sharp teeth and aggressive nature, specialized equipment such as wire leaders is often used to prevent them cutting through fishing lines.  Among recreational fishers, the pursuit is often on the basis of “the thrill of the chase” because the species can pose genuine health risks if eaten, the danger being ciguatera poisoning, caused by a toxin which accumulates in the fish’s flesh when they consume smaller, contaminated fish.

Hofit Golan (b 1985; left) and Lindsay Lohan (b 1986; right) fishing off Sardinia, July 2016 (left).  Fortunately perhaps, Ms Lohan didn’t hook a barracuda and caught something less threatening.  Apparently also fishing for “the thrill of the chase” (right), she posted on Instagram: “Bonding with nature. I let my little friend swim away after.”