
Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of the B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably have become heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it was highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” transmitted through the water rather than the blast itself which could breach a hull, the same principle exploited by the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb, 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
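For those who like the arithmetic made explicit, the MC/HC distinction is just the ratio of explosive filler to gross weight; a minimal sketch (in Python, using only the figures quoted above) reproduces the percentages.

```python
# Charge-to-weight ratio: explosive filler as a share of gross bomb weight.
# Figures (in lb) are those quoted in the paragraph above.
bombs = {
    "Tallboy (MC)":     {"gross": 12_000, "filler": 5_200},
    "Grand Slam (MC)":  {"gross": 22_000, "filler": 9_100},
    "Blockbuster (HC)": {"gross": 4_000,  "filler": 3_000},
}

for name, figures in bombs.items():
    ratio = figures["filler"] / figures["gross"]
    print(f"{name}: {ratio:.0%} of gross weight is explosive")
# Tallboy ~43%, Grand Slam ~41%, Blockbuster 75%: the MC designs trade
# explosive volume for the thick casing needed to survive penetration.
```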

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over the BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have allowed for those with the resources is that the so-called “panacea targets” (ie critical infrastructure and such once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) being “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head” and these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades, the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir Erinnerungen (Memories or Reminiscences, published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”,  “busted arse country”, “drug bust”, “cloud bust”, belly-busting, bust one's ass (or butt), bust a gut, bust a move, bust a nut, bust-down, bust loose, bust off, bust one's balls, bust-out, sod buster, bust the dust, myth-busting and trend-busting.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration but it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Sunday, January 19, 2025

Rat

Rat (pronounced ratt)

(1) In zoology, any of several long-tailed rodents of the family Muridae, of the genus Rattus and related genera, distinguished from the mouse by being larger.

(2) In (scientifically inaccurate) informal use, any of the numerous members of several rodent families (eg voles & mice) that resemble true rats in appearance, usually having a pointy snout, a long, bare tail, and body length greater than 5 inches (120 mm).

(3) In hairdressing, a wad of shed hair used as part of a hairstyle; a roll of material used to puff out the hair, which is turned over it.

(4) In the slang of certain groups in London, vulgar slang for the vagina.

(5) As “to rat on” or “to rat out”, to betray a person or party, especially by telling their secret to an authority or enemy; to turn someone in.

(6) One of a brace of rodent-based slang terms to differentiate between the small-block (mouse motor) and big-block (rat motor) Chevrolet V8s built mostly in the mid-late twentieth century but still available (as "crate" engines) from US manufacturers.

(7) As RAT, a small turbine that is connected to a hydraulic pump, or electrical generator, installed in an aircraft and used as a power source.

(8) Slang term for a scoundrel, especially men of dubious morality.

(9) In the criminal class and in law enforcement, slang for an informer.

(10) In politics, slang for a person who abandons or betrays his party or associates, especially in a time of trouble.

(11) Slang for a person who frequents a specified place (mall rat, gym rat etc).

(12) In hairdressing, a pad with tapered ends formerly used in women's hair styles to give the appearance of greater thickness.

(13) In the slang of blue-water sailors, a place in the sea with rapid currents and crags where a ship is prone to being broken apart in stormy weather.

(14) In zoology (in casual use), a clipping of muskrat.

Pre 1000: From the Middle English ratte, rat & rotte, from the Old English ræt & rætt, from the Proto-Germanic rattaz & rattō (related also to the West Frisian rôt, the German Ratz & Ratte and the Swedish råtta & the Dutch rat), of uncertain origin but perhaps from the primitive Indo-European rehed- (to scrape, scratch, gnaw), the root reflected also in the Latin rodere (to gnaw).  Zoological anthropologists however point out it’s possible there were no populations of rats in the Northern Europe of antiquity, and the Proto-Germanic word may have referred to a different animal.  The attestation of this family of words dates from the twelfth century.  Some of the Germanic cognates show considerable consonant variation such as the Middle Low German ratte & radde and the Middle High German rate, ratte & ratze, the irregularity perhaps symptomatic of a late dispersal of the word, although some etymologists link it with the Proto-Germanic stem raþō (nom); ruttaz (gen), the variations arising from the re-modellings in the descendants.

Mall rats.  In North America and other developed markets, there is now less scope for habitués because changing consumer behavior has resulted in a dramatic reduction in the volume of transactions conducted in physical stores and some malls are being either abandoned or re-purposed (health hubs and educational facilities being a popular use).  

The human distaste for these large rodents has made rat a productive additive in English.  Since the twelfth century it’s been applied (usually to a surname) to persons either held to resemble rats or share with them some characteristic or perception of quality with them.  The specific sense of "one who abandons his associates for personal advantage" is from the 1620s, based on the belief that rats leave a ship about to sink or a house about to fall, and this led to the meaning "traitor” or “informant" although, perhaps surprisingly, there is no reference to rat in this sense prior to 1902 whereas the modern-sounding sense of associative frequency (mall-rat, gym-rat etc) was noted as early as 1864, firstly as “dock-rat”.  Dr Johnson dates “to smell a rat”, based on the behaviour of cats, to the 1540s.  Sir Boyle Roche (1736-1807), was an Irish MP famous for mangled phrases and mixed metaphors, of the best remembered of which was “I smell a rat; I see him forming in the air and darkening the sky; but I'll nip him in the bud".  There’s the rat-terrier (1852), the rat-catcher (1590s), the rat-snake (1818), rat-poison, (1799), the rat trap (late 1400s), the rat-pack (1951) and rat-hole which in 1812, based on the holes gnawed in woodwork by rats meant “nasty, messy place”, the meaning extended in 1921 to a "bottomless hole" (especially one where money goes).  Ratfink (1963) was juvenile slang either coined or merely popularized by US custom car builder Ed "Big Daddy" Roth (1932-2001), who rendered a stylised rat on some of his creations, supposedly to lampoon Mickey Mouse.

Cricket's most infamous rat (mullygrubber), MCG (Melbourne Cricket Ground), 1 February, 1981.  In the 1970s brown & beige really had been a fashionable color combination but this was the combo's death knell.

Rat has a specific meaning in the cricketing slang of the West Indies, referring to a ball which, after being delivered by the bowler, rather than bouncing off the pitch at some angle, instead runs along the ground, possibly hitting the stumps with sufficient force to dislodge the bails, dismissing the batsman, the idea being of a rat scurrying across the ground.  In Australian slang, the same delivery is called a mullygrubber which, although it sounds old-fashioned, is said to date only from the 1970s, the construct thought based on the dialectal rural term mully (dusty, powdery earth) + grub(ber) in the sense of the grubs which rush about in the dirt if disturbed in such an environment.  Such deliveries are wholly serendipitous (for the bowler) and just bad luck (for the batsman) because it's not possible for such a ball to be delivered on purpose; they happen only because of the ball striking some crack or imperfection in the pitch which radically alters its usual course to a flat trajectory.  If a batsman is dismissed as a result, it's often called a "freak ball" or "freak dismissal".  Of course if a ball is delivered underarm a rat is easy to effect but if a batsman knows one is coming, while it's hard to score from, it's very easy to defend against.  The most infamous mullygrubber was bowled at the Melbourne Cricket Ground (MCG) on 1 February 1981 when, with New Zealand needing to score six (by hitting the ball, on the full, over the boundary) off the final delivery of the match, the Australian bowler sent down an underarm delivery, the mullygrubber denying the batsman the opportunity to score and securing an Australian victory.  Although then permissible within the rules, it was hardly in the spirit of the game and consequently, the regulations were changed.

The Ram Air Turbine

Ram Air Turbine (RAT) diagram.

The Ram Air Turbine (RAT) is a small, propeller-driven turbine connected to a hydraulic pump, or electrical generator, installed in an aircraft to generate emergency power.  In an emergency, when electrical power is lost, the RAT drops from the fuselage or wing into the air-stream where it works as a mini wind-turbine, providing sufficient power for vital systems (flight controls, linked hydraulics and flight-critical instrumentation).

Vickers VC10 in BOAC (British Overseas Airways Corporation (1939-1974)) livery.

Built between 1962-1970, the VC10 was fast and a favorite with passengers because the rear-engine layout guaranteed a quiet cabin, but only 54 were made and, in a market dominated by Boeing's epoch-making 707 (1956-1978), success proved elusive.  Even before the 747 in 1969 ushered in the wide-body era it was clear the elegant VC10 was a cul-de-sac but the airframe enjoyed a long career.  The RAF (Royal Air Force) had some configured as VIP transports and the last of those used as platforms for in-flight re-fueling weren't retired until 2013.

Most modern commercial airliners are equipped with RATs, the first being installed on the Vickers VC10 in the early 1960s and the big Airbus A380 has the largest RAT propeller in current use at 64 inches (1.63 metres) but most are about half this size.  It’s expected as modern airliners begin increasingly to rely on electrical power, either propeller sizes will have to increase or additional RATs may be required, the latter sometimes the desirable choice because of the design limitations imposed by the height of landing gear.  A typical large RAT can produce from 5 to 70 kW but smaller, low air-speed models may generate as little as 400 watts.  Early free-fall nuclear weapons used RATs to power radar altimeters and firing circuits; RATs being longer-lasting and more reliable than batteries.

A RAT deployed.

The airliner manufacturers have been exploring whether on-board fuel-cell technology can be adapted to negate the need for a RAT, at least in the smaller, single-aisle aircraft where the weight of such a unit might be equal to or less than the RAT equipment.  The attraction of housing a fuel-cell in an airliner's wing-body fairing is it would be a step towards the long-term goal of eliminating an airliner's liquid-fuelled auxiliary power unit (a small turbine).  Additionally, if the size-weight equation could be achieved, there’s the operational advantage that a fuel-cell is easier to test than a RAT because, unlike the RAT, the fuel-cell can be tested without having to power-up most of the system.  The physics would also be attractive, the power from a fuel-cell being higher at lower altitudes whereas the output of a RAT declines as airspeed decreases, a potentially critical matter given it’s during the relatively slow approach to a landing that power is needed to extend the trailing-edge wing flaps and operate other controls.

If the weight and dimensions of the fuel cell are at least "comparable" to a RAT and the safety and durability testing is successful, at least on smaller aircraft, fuel-cells might be an attractive option for new aircraft although, at this stage, the economics of retro-fitting are unlikely to be compelling.  Longer term research is also looking at a continuously running fuel cell producing oxygen-depleted exhaust gas for fuel-tank inerting (a safety system that reduces the risk of combustion in aircraft fuel tanks by lowering the oxygen concentration in the ullage (the space above the fuel) to below the level needed to support a fire, typically by replacing oxygen with an inert gas like nitrogen), and water for passenger amenities, thereby meaning an aircraft could be operated on the ground without burning any jet-fuel, the fuel-cell providing power for air conditioning and electrical systems.

1944 Messerschmitt Me 163 Komet  (1944-1945).

The only rocket-powered fighter ever used in combat, the Messerschmitt Me 163 Komet had a small RAT in the nose to provide electrical power.  The early prototypes of the somewhat more successful (and much more influential) Messerschmitt Me 262 jet fighter also had a propeller in the nose for the first test flights but it wasn't a RAT; it was attached to a piston engine which was there as an emergency backup because of the chronic unreliability of the early jet engines.  It proved a wise precaution, the jets failing on more than one occasion.

Suzuki's air-cooled 1974 GT380 Sebring with Ram Air System (left) and water-cooled 1975 GT750 (right).

The other “Ram Air” was Suzuki’s RAS (Ram Air System), fitted to the GT380 Sebring (1972-1980) and GT550 Indy (1972-1977) as well as (off and on) several versions of the smaller two-cylinder models.  It wasn’t used on the water-cooled GT750 Le Mans (1972-1977) because the radiator acted to impede the airflow to the engine.

The GT380, GT550 and GT750 were two-stroke triples noted for an unusual 3-into-4 exhaust system in which the central header-pipe was bifurcated, thus permitting four tail-pipes.  There was no justification in engineering for this (indeed it added cost and weight) and it existed purely for visual effect, allowing an emulation of the look of the four-cylinder Hondas and Kawasakis.  Ironically, despite the additional metal, the asymmetric 3-into-3 system on the Kawasaki triples (1969-1976) is better remembered although the charismatic (if sometimes lethal) qualities of the machines may be a factor in that; exhaust systems do exert a powerful fascination for motor-cyclists.  The RAS was nothing more than a cast aluminum shroud fixed atop the cylinder head to direct air-flow, enhancing upper cylinder cooling.  The “ram air” idea had been used in the 1960s by car manufacturers to “force feed” cool air directly into induction systems and when tested it did in certain circumstances increase power but whether Suzuki's RAS delivered more efficient cooling isn’t clear.  When the twin-cylinder GT250 Hustler (1971-1981 and thus pre-dating the pornography magazine Hustler, first published in 1974) was revised in 1976, the RAS was deleted and replaced by conventional fins without apparent ill-effect but the RAS was light, cheap to produce, maintenance-free and looked sexy so some advantages were certainly there.  Interestingly, the companion GT185 (1973-1978) retained the RAS for the model’s entire production.

Big and small-block Chevrolet V8s: the Rat and the Mouse

Small and big-block Chevrolet V8s compared, the small-block (mouse) to the left in each image, the big-block (rat) to the right.

Mouse and rat are informal terms used respectively to refer to the classic small (1955-2003) and big-block Chevrolet V8s (1958-2021).  The small-block was first named after a rodent although the origin is contested; either it was (1) an allusion to “mighty mouse” a popular cartoon character of the 1950s, the idea being the relatively small engine being able to out-perform many bigger units from other manufacturers or (2) an allusion to the big, heavy Chrysler Hemi V8s (the first generation (Firepower) 331 cubic inch (5.4 litre), 354 (5.8) & 392 (6.4) versions) being known as “the elephant”, the idea based on the widely held belief elephants are scared of mice (which may actually be true although the reason appears not to be the long repeated myth it’s because they fear the little rodents might climb up their trunk).  Zoologically, "bee" might have been a better choice; elephants definitely are scared of bees.  The mouse (small-block) and rat (big-block) distinction is simple to understand: the big block is externally larger although, counterintuitively, the internal displacement of some mouse motors was greater than some rats.  

1970 Chevrolet Chevelle SS 396 (with 402 cid V8).

Whether that seeming anomaly (actually common throughout the industry during the big-block era) amused or disturbed the decision-makers at Chevrolet isn't known but in 1970 when the small block 400 (6.6) was introduced, simultaneously the big-block 396 (6.5) was enlarged to 402 (6.6) but the corporation then muddied the waters by continuing to call the 402 a "Turbo-Jet 396" when fitted to the intermediate class Chevelle, the rationale presumably that "SS 396" had such strong "brand recognition".  Available since 1965, by 1969 the SS 396 Chevelle was finally out-selling the Pontiac GTO (which in 1964 had seeded the muscle car movement) so the attachment was understandable.  Further to confuse people, the 400 was advertised as the "Turbo-Fire 400" while if fitted to the full-size line, the 402 was called the "Turbo-Jet 400".  Presumably, the assumption was anyone understanding the 400 & 402 ecosystems would buy the one they wanted while those not in the know would neither notice nor care.  Nor was the deviation in displacement between what was on the badge and what lay beneath the hood (bonnet) exclusive to Chevrolet, there being a long list of things not quite what was on the label although the true specifications usually were listed in the documentation and even the advertising.  The variations occurred for a number of reasons but rarely was there an attempt to deceive, even if sometimes things were left unstated or relegated to the small print.
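The recurring “6.5” and “6.6” litre figures are nothing more than rounding of the cubic-inch displacements; a minimal sketch (using the standard conversion of 16.387 cc per cubic inch) shows why the 400 and 402 share a metric figure.

```python
# Convert nominal displacements from cubic inches (cid) to litres.
# 1 cubic inch = 16.387064 cc; results rounded to one decimal place.
CC_PER_CUBIC_INCH = 16.387064

for cid in (396, 400, 402):
    litres = cid * CC_PER_CUBIC_INCH / 1000
    print(f"{cid} cid = {litres:.1f} L")
# 396 -> 6.5 L, while both 400 and 402 round to 6.6 L, which is why the
# small-block "Turbo-Fire 400" and the big-block 402 ("Turbo-Jet 400")
# could wear the same number despite being entirely different engines.
```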

The “428 Cobra Matter”

That’s not to say there were no disputes about the difference between what was “in the tin” compared with what was “on the tin”.  In June 1969, a certain Mr Karl Francis “Fritz” Schiffmayer (1935-2010) of Lake Zurich, Illinois, wrote to Ford’s customer relations department complaining about the “427 Ford Cobra” he had purchased (as a new car) from a Chicago “Ford Dealer”.  What disappointed Mr Shiffmayer was the performance which didn’t match the widely publicized numbers achieved by many testers and, perhaps more to the point, he found his “$8500 Super Ford could barely keep up with” various $5000 Chevrolets.  For a Ford driver, few things could be more depressing.  Upon investigation, he discovered that despite “‘427’ signs all over the engine and the front fenders”, his car was not “a ‘427’ as advertised and labelled but a ‘428’”.  Both V8s were around seven litres but were in many ways not comparable.

Mr Shiffmayer's letter to Ford, 23 June 1969.

Notionally, Mr Shiffmayer got more than he paid for (ie 428 v 427 is nominally an extra cubic inch) and had he bought a dozen (12) bread rolls from the bakery and been supplied a “baker’s dozen” (13) there’d have been no grounds for complaint because bread rolls are a “fungible” (ie functionally identical) so getting 13 is always better than getting 12 at the same price.  However, the 427 and 428 engines, although from the same FE (Ford-Edsel) family and externally similar (until closely inspected), were very different internally with the former notably more oversquare (ie big-bore) and fitted with cross-bolted main bearings; additionally, the 427s used in the Cobras featured “side-oiling”, a more extensive system of lubrication which afforded priority delivery of oil to the bottom-end, making the engine more robust and better suited to the extreme demands of competition.  By contrast, the Cobra’s 428 was a modified version of the “Police Interceptor 428”, a high-output edition of a powerplant usually found in Ford’s full-sized line including luxury cars and station wagons where its smoothness and effortless low-speed torque was appreciated.  The “Police Interceptor” specification was literally that: the engine used by law enforcement in highway patrol vehicles and, for street use, it offered a useful lift in performance but it was not suitable for racetracks.  Later, Ford would “mix & match” the 427 & 428 to create the 428 Cobra Jet, the 427’s heads, intake manifold and some other “bolt-on” bits & pieces creating a combination of power and torque close to ideal for ¼ mile (402 m) runs down drag strips although even then Ford cheated, under-rating the output so the cars would be placed in a different category.  In its first year of competition, the 428 Cobra Jet Mustangs dominated their drag racing class which prompted the sanctioning body to change the rules, imposing their own nominal output ratings rather than accepting those of the manufacturer.  Still, even the Cobra Jet 428 remained suitable only for street and strip because ¼ mile runs were done in a straight line and, without the cross-bolting and enhanced lubrication, it wouldn’t have matched the 427’s ability to endure the extreme lateral forces encountered on high-speed circuits.
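To illustrate what “more oversquare” means in practice, the sketch below computes displacement from bore and stroke; the dimensions used are the commonly cited figures for the FE 427 (about 4.23 x 3.78 inches) and 428 (about 4.13 x 3.98 inches) and are offered as approximations rather than being drawn from the documents reproduced here.

```python
import math

# Displacement from bore and stroke (inches): cylinder area x stroke x count.
# The bore/stroke figures below are the commonly cited dimensions for the
# Ford FE 427 and 428 (approximate; not taken from the documents above).
def displacement_cid(bore: float, stroke: float, cylinders: int = 8) -> float:
    return math.pi / 4 * bore ** 2 * stroke * cylinders

engines = {
    "427 (oversquare, big-bore)": (4.23, 3.78),
    "428 (smaller bore, longer stroke)": (4.13, 3.98),
}

for name, (bore, stroke) in engines.items():
    cid = displacement_cid(bore, stroke)
    print(f"{name}: ~{cid:.0f} cid, bore/stroke ratio {bore / stroke:.2f}")
# Both work out to roughly 425-427 actual cubic inches; the nominal labels
# were rounded, the meaningful differences being the 427's bigger bore,
# cross-bolted mains and side-oiling rather than the quoted displacement.
```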

AC Shelby Cobra CSX3209 after 427 transplant.

That “427 Cobras” with 428 engines even existed was a product of circumstances rather than planning.  Although now million dollar collectables, it’s sometimes forgotten the 427 Cobra was a commercial failure and that meant production numbers never reached the levels required for homologation to be granted for competition in the category for which it was intended; thus, as well as not selling as well as the small-block predecessors on which the model’s reputation was built, the seven litre version never matched their success on the track.  When it came time to build the second batch of 100 427 Cobras, the engine was in short supply because of the intricacies in construction; coupled with the wider bore being at the limit the block would accommodate (at the foundry, with a slight shifting of the casting cores, a 427 block would have to be scrapped), it was expensive to produce and inconvenient for Ford to schedule in the small batches the sales supported.  The cheap, mass-produced 428 Police Interceptor was both readily available and half the cost so it was an attractive alternative for Shelby and that it bolted straight in without needing any changes made it more desirable still; thus 428-powered “427s”.  For the final run of 48, Shelby procured from Ford genuine 427 side-oilers so the 100 428s were a minority of the big-blocks used and many have since been converted (“rectified” some prefer to say) with the substitution of a 427.  Interestingly, four of the 428s were fitted with automatic transmissions which actually made them more suitable for street use but nobody seems subsequently to have done this as a modification.

Shelby American's reply to Mr Shiffmayer, 21 July 1969.

As it was, Mr Shiffmayer decided to persevere and kept Shelby Cobra CSX3209 until he died, in the 1970s replacing the 428 with a specially built “tunnel port” (a trick with the pushrods to optimize the fluid dynamics of the fuel-air flow) 427.  Whether he was impressed with the reply (Ford referred his letter to Shelby American) he received in response to his complaint isn’t known but it’s an interesting document for a number of reasons:

(1) “…during the five year existence of the Cobra, three engines were used, the 289, the 427 and the 428.”  Actually, the first 75 used the 260.

(2) “Only a very few of the 959 Cobras built contained the 427 engine.”  Actually of the 998 built (in fairness this wasn’t in 1969 the agreed “final count” but it’s hard to understand how 959 was calculated) more than 150 had the 427 and whether this constitutes “very few” is debatable but it’s also not relevant to the complaint.

(3) “The 427 is a nomenclature such as the GT-500 is for the Shelby car.  It does not relate to the cubic displacement of the engine.  We are sorry that this misunderstanding occurred.”  Actually, when the Cobra 427 was released, “427” was a direct reference to displacement.  The “GT500” label was never likely to cause a “misunderstanding” because (1) there was no Ford 500 cid engine and (2) the GT500 was always advertised as being equipped with the 428.

Indisputably as labeled: 1966 Shelby 427 S/C Cobra (CSX 3040).  In 2018, it sold at auction for US$2,947,500.

So, the letter from Shelby really wasn’t a great deal of help (it was dated the day after man set foot on the moon so perhaps the writer's attention was divided).  Were such a case now to go to court several matters would need to be considered:

Was it notorious (ie widely known; common knowledge) in the circles of potential purchasers of such a car that some were powered by 427s and some by 428s and the differences between the two were well documented?  According to some sources, it was only after the “428 Cobra matter” began to attract comment that sales literature was updated to reflect the changed specification while others maintain publication was concurrent with production.

Was the fact the car had only “427” badges an indication of which engine was fitted or just a “model name” à la the Shelby GT500 (which used a 428) & GT350 (which used either a 289 or 302 (and later a 351))?  That originally (in 1965) “427” was a reference to the 427 engine seems incontestable but the question would be whether this changed to a mere “model name” when the 428 was adopted.  It would seem the evidential onus of proof of that would rest with Shelby American.

When making the decision to purchase, did the buyer rely on representations from an authority (in this case a “Ford dealer”) which might reasonably have been expected to (1) possess and (2) communicate all relevant facts?  In that matter, the court would need to consider whether, in the circumstances, there is any substantive difference between a “Ford dealer” and a “Shelby franchised dealer”.  This would be decided by (1) any competing claims from the parties and (2) what documents were supplied prior to or at the point of purchase.

Is it relevant that in 1966-1967 when Ford offered both the 427 & 428 in the Galaxie, that car was sold as the “7 Litre” (they really did use the French spelling) irrespective of which seven litre (427 or 428) V8 was fitted?  Given that, should the Cobra have been thus labelled and was the continued use of the 427 badge a misrepresentation in 428-powered cars?

Was the dealer aware of the buyer’s background?  Mr Shiffmayer was (1) an engineer with a degree in mechanical engineering from the University of Wisconsin and (2) he was not only an owner of a 289 Shelby Cobra but also raced it with notable success.  If the dealer was aware of those facts that doesn’t absolve them of a responsibility fully to disclose all relevant information but a court could consider it a mitigating factor.  If the dealer was aware of those facts, what would then have to be considered is whether it would have been reasonable for it to be assumed the buyer either knew of the mechanical details or could reasonably have been expected to know.

Evidence: Shelby American "Shelby Cobra 427" spec sheet listing the 428 as the engine, thereby suggesting "427" was a model name rather than a reference to a specific engine.  The significance of this document rests on whether it appeared before or after Shelby American began selling 428 powered Cobras.