
Friday, July 21, 2023

Gasoline

Gasoline (pronounced gas-uh-leen)

(1) A volatile, flammable liquid mixture of hydrocarbons, obtained from petroleum and used as fuel in internal-combustion engines or as a solvent.

(2) In the slang of drug users, marijuana, especially if notably potent (also as gas and there’s evidence both gas and gasoline have been used of other drugs).

(3) In slang, a cocktail made by mixing a spirit with an energy drink (the original believed to be a combination of vodka & Red Bull).

As used to describe the “light, volatile liquid obtained from distillation of petroleum”, gasoline dates from 1864 and was a variant of Gasolene which in the UK had been trade-marked the year before.  The word gasolene was from a trade-marked brand of petroleum-derived lighting oil, registered in 1862, which was based on the surname of English publisher and tea & coffee merchant John Cassell (1817–1865) who branched out into lighting fuel, marketed as both Cazeline & Cazzoline.  His publishing house Cassell & Co endures today as an imprint of the Octopus Publishing Group.  The surname Cassell was from the Anglo-Norman castel (a cognate of the English castle), from the Old French castel, from the Latin castellum, a diminutive of castrum.  The -eline suffix was from the Ancient Greek ἔλαιον (élaion) (oil, olive oil), from ἐλαία (elaía) (olive).  Etymologists speculate the spelling of gasolene (and thus gasoline) may have been influenced by Gazeline, an Irish product which was a clone of Cazzoline; either the promoters liked the assumed association with “gas” or they simply found it a more attractive word.  It’s thought the general construct gas-o-line was built with the “o” representing the Latin oleum (oil) and the ending a borrowing from the chemical suffix -ine.  The alternative form gasolene is extinct in every market except Jamaica.  Gasoline is a noun & adjective and gasolinic is an adjective; the noun plural is gasolines.

Moderne BV-Aral Tankstelle (modern BV-Aral gas station), Bochum, FRG (Federal Republic of Germany (West Germany)), 1958.  The cars are an Opel Rekord (left), a Volkswagen Type 14 (Karmann Ghia) coupé (centre) and a Volkswagen Type 1 (Beetle) (right).  In the background stands the head office of the oil company BV-Aral AG.

In the US, the shortened form “gas” was in common use by at least 1897 but on the pattern of use typically found in other words, it’s likely it was around almost as soon as gasoline went on sale.  The “gas station” (place to fill up one’s automobile (“gassing up”) with gasoline by use of a “gas pump”) was recorded in California in 1916 and was in national use by the early 1920s.  The “gas pedal” (the accelerator) was first recorded in 1908 and is still used even in markets where the term petrol is preferred, as in the phrase “step on the gas” (depress the accelerator (ie go faster)) which is used generally to suggest increasing speed or effort and is not confined to automobiles.  The term gas-guzzler (a car with a high fuel consumption) was coined in 1973 after the first oil shock and in 1978 the US federal government imposed the first stage of its long-running “gas-guzzler tax”.  The noun gasohol (a gasoline with a small percentage of ethanol) was coined in 1975; the mix was another reaction to the increase in the oil price and occasional shortages of the era.  To “pour gasoline on the fire” is a suggestion some action is making an already bad situation worse.  The term Avgas (the construct being av(iation) + gas) was coined during the First World War (1914-1918) when it was found the mix used in automobiles was unsuitable for aircraft which needed a mixture with higher specific energy (ie high octane).  The use in North America (and a handful of other places) of “gas” to refer to what is otherwise generally known as “petrol” sometimes mystifies because in many markets the usual distinction for road transport is between vehicles fueled by diesel, petrol & gas (usually liquid petroleum gas (LPG) or compressed natural gas (CNG)).

Entertainment Tonight (ET) deconstructs Lindsay Lohan’s dance moves at a New Jersey gas station, October 2019.  According to ET, the routine was executed between gas pumps 3 & 4.

In chemistry, gas is matter in an intermediate state between liquid and plasma that can be contained only if it is fully surrounded by a solid (or in a bubble of liquid, or held together by gravitational pull); it can condense into a liquid, or can (in rare cases) become a solid directly by deposition.  The common synonym is vapor (also as vapour).  The word was a borrowing from the Dutch gas which was coined by the Brussels-based chemist & physician Jan Baptist van Helmont (1580–1644), from the Ancient Greek χάος (kháos) (chasm, void, empty space) and there may also have been some influence from geest (breath, vapour, spirit).  More speculatively, there were also the writings of the Swiss physician, alchemist, lay theologian, and philosopher of the German Renaissance Theophrastus von Hohenheim (circa 1493-1541 and known usually as Paracelsus) who wrote of kháos in the occultist’s sense of “proper elements of spirits” or “ultra-rarified water”, both of which accorded with van Helmont's definition of gas which he introduced to the world in Ortus medicinae, vel opera et opuscula omnia (The Origin of Medicine, or Complete Works (1648)) with the words Hunc spiritum, incognitum hactenus, novo nomine gas voco (“This vapor, hitherto unknown, I call by a new name: ‘gas’.”).

Lindsay Lohan gassing up her Porsche, Malibu, California, April 2020.

The use in science in the modern sense dates from 1779 and it was adopted for specific applications as technologies emerged or were commercialized: To describe a “combustible mix of vapors” the term “coal gas” was first used in 1794; the use in medicine for the anesthetic nitrous oxide was from 1794 (made famous in dentistry as “laughing gas” although the laughter was induced by impurities introduced in the early production processes rather than the inherent properties of N2O); “poison gas” was from 1900.  The meaning “intestinal vapors” emerged in 1882 while the not unrelated sense of “empty talk” was from 1847 (meaning something like “hot air”) although more positively, by 1953 “it’s a gas” meant “something exciting or excellent”, “a gasser” in 1944 meaning much the same.  James Joyce (1882–1941) in Dubliners (1914) used gas to mean “fun, a joke”, an Anglo-Irish form thought linked to the use of laughing gas in dentistry.  In drag racing “gassers” (so named because they were fueled by gasoline rather than methanol or nitromethane) were the most common of the highly modified road cars in the early days of the sport but the National Hot Rod Association (NHRA) retired the category in 1972 and split the participation of gasoline-powered units into a number of classes.

Art Deco gas station, Beverly Hills, Los Angeles, California, 1931.

The “gas-works” was first described in 1914 and the term was a little misleading because they were actually bulk-storage facilities from which gas was distributed either by fixed lines or cylinders delivered to the premises.  The kitchen appliance the “gas-oven” was mentioned first in 1851 although “gas-stove” by then had been in use for three years.  The notorious “gas chambers” used by the Nazis in their mass-murder programmes are most associated with the attempt to exterminate the Jews of Europe but the first were actually built in 1939, as part of Aktion T4 which involved the killing of those with physical and intellectual disabilities.  These early facilities used carbon monoxide and were built within Germany and served also to murder other prisoners and although by later standards inefficient, they were adequate for the numbers involved.  As territories to the east were occupied, similar structures were built and there were even experiments with “mobile chambers”, large air-tight van coachwork added to truck chassis into which the exhaust gases were ducted.  Again, these worked but by 1941 the Nazis wished to exterminate millions and the most efficient method was found to be scaled-up chambers (disguised as shower rooms) into which the hydrogen cyanide-based anti-vermin fumigant Zyklon B was introduced, permitting a throughput at the most productive death camps of some 5,500 a day, sometimes for months at a time.  The term “gas chamber” was widely used during the post-war hearings conducted by the International Military Tribunal (IMT) at Nuremberg (1945-1946) but as a method of judicial execution, many nations had by then used them at various times and the US only recently abandoned use of the method.

Roadsters line up to gas up, Gasoline Alley, Indianapolis Motor Speedway, May 1960.  This was one of the official postcards sold in the speedway's shop.

Gasoline Alley is the name of the garage area at the Indianapolis Motor Speedway.  That wasn’t the original name but in the 1920s, “gasoline alley” was the drivers’ slang for the forecourt at the back of the garages where the cars were taken to refuel.  Whether linked or not, there was in the era a popular newspaper comic strip called Gasoline Alley and the use of the name soon extended to the strip dividing the two rows of garages.  It caught the public imagination and the facility managers in the early 1950s added signage which meant the whole garage area became associated with the term.  As a result of the reconstructions necessitated by fires, modernization & expansion, Gasoline Alley is not recognizable compared to its original appearance but the name remains, even though actual gasoline is now rarely pumped, the open-wheel cars having switched first to methanol (1965) and later (2006) to ethanol; it’s only when other categories use the track that gasoline is in the tanks.  If the sport is compelled to convert to electric (or hopefully hydrogen) propulsion, the name is unlikely to change.

Rod Stewart (b 1945), Gasoline Alley (1970).

Saturday, March 30, 2024

Swirl

Swirl (pronounced swurl)

(1) A twist, as of hair around the head or of trimming on a hat; a whorl or curl.

(2) Any curving, twisting line, shape, or form.

(3) A descriptor of a state of confusion or disorder.

(4) A swirling movement; whirl; eddy; to turn or cause to turn in a twisting spinning fashion (used especially of running water).

(5) In fishing, the upward rushing of a fish through the water to take the bait.

(6) To move around or along with a whirling motion; a whirl; an eddy.

(7) To feel dizzy or giddy (the idea of a “spinning head”).

(8) To cause to whirl; twist.

(9) To be arranged in a twist, spiral or whorl.

(10) Figuratively, to circulate, especially in a social situation.

(11) In AAVE (African-American Vernacular English), to in some way mingle interracially (dating, sex, marriage etc) (dated; now rare).

(12) In internal combustion engines (ICE), as “swirl chamber”, a now generic term for a type of combustion chamber design.

1375-1425: From the late (northern) Middle English swirlen (to eddy, swirl) which was probably from the Old Norse svirla (to swirl), a frequentative form of Old Norse sverra (to swing, twirl).  It was cognate with the Scots swirl & sworl (to eddy, swirl), the Norwegian Nynorsk svirla (to whirl around; swirl), the Swedish sorla (to murmur, buzz) and the Dutch zwirrelen (to swirl).  Related forms included the dialectal German schwirrlen (to totter), the West Frisian swiere (to reel, whirl), the Dutch zwieren (to reel, swing around), the German Low German swirren (to whizz, whirl or buzz around), the German schwirren (to whirr, whizz, buzz), the Swedish svirra (to whirr about, buzz, hum), the Danish svirre (to whizz, whirr) and the English swarm.  The construct may be understood as the Germanic root swir- + -l- (the frequentative suffix).  Swirl is a noun & verb, swirled is a verb & adjective, swirling is a noun, verb & adjective, swirly is a noun & adjective, swirler is a noun and swirlingly is an adverb; the noun plural is swirls.

In English, the late (northern) Middle English verb swirlen (to eddy, swirl) seems originally to have come from a Scottish word, the origin of which is undocumented but etymologists seem convinced of the Scandinavian links.  The sense of a “whirling movement” emerged in the early nineteenth century although the meaning “a twist or convolution (in hair, the grain of wood etc)” was in use by 1786.  The verb as a transitive in the sense of “give a swirling or eddying motion to” was in use in the early sixteenth century but it may by then long have been in oral use, one text from the fourteenth century containing an example; the source of that may have been Germanic (such as the Dutch zwirrelen (to swirl) or the Norwegian Nynorsk svirla (to whirl around; swirl)) or it may have evolved from the English noun.  The intransitive sense (have a whirling motion, form or whirl in eddies) dates from 1755.  The adjective swirly existed by 1785 in the sense of “twisted or knotty” but by the middle of the next century it had come also to describe anything “whirling or eddying”, applied especially to anything aquatic.  By 1912, it was used also to mean “full of contortions or twists” although “swirling” in this sense had by then been in (gradually increasing) use for a century.

Of curls & swirls: Lindsay Lohan with curls (left) and swirls (right).

In hairdressing, although customers sometimes use the words “curl” and “swirl” interchangeably, to professionals the use should be distinct.  A swirl is a movement or pattern in which hair is styled or arranged, typically with a rounded or circular pattern and swirls can be natural (the pattern at the crown of the head where the hair grows in a circular direction) or stylized (the look deliberately created, most obvious in “up-dos” or the formal styles associated with weddings and such).  The end result is a wide vista and the swirl is more a concept than something which exists within defined parameters.  A curl is (1) a type of hair texture or (2) the act of creating a curl with techniques using tools and/or product.  Some people (and there’s a strong ethnic (ie genetic) association) naturally have curly hair due to the shape of their follicles and within the rubric of what used to be called the ulotrichous, hairdressers classify curls as three types: (1) tight (small, corkscrew-like structures), (2) medium (looser curls with a softer appearance) and (3) loose (long spirals with a large diameter).  Some commercial product also lists “ringlets” as a type but as tight, well-defined spirals, they’re really a descriptive variation of the tight or medium.  So, the essential difference is that a swirl is a pattern or movement of the hair, while a curl describes texture or shape and while a swirl is a matter of arrangement, a curl demands changing the hair’s natural texture or shape.  Swirls are very much set-piece styles associated with formal events while curls are a popular way to add volume, texture, and movement to the hair.

In internal combustion engines (ICE), the “swirl chamber” is a now generic term used to describe a widely-used type of combustion chamber in which, upon introduction, the fuel-air mixture “swirls around” prior to ignition.  The design is not new, Buick’s straight-8 “Fireball” cylinder head using a simple implementation as long ago as the 1920s and it would serve the corporation into the 1950s.  The critical aspect of the engineering was the interaction between a receded exhaust valve and a raised section in the top of the piston which “pushed” most of the fuel-air mixture into what was a comparatively small chamber, producing what was then called a “high-swirl” effect, the “Fireball” moniker gained by virtue of the actual combustion “ball of fire” being smaller in volume than was typical at the time.  The benefit of the approach was two-fold: (1) a reduction in fuel consumption because less was required per power-stroke and (2) a more consistent burning of the poor quality fuel then in use.  As fuel improved in quality and compression ratios rose (two of the dominant trends of the post-war years), the attraction of swirl chambers diminished but the other great trend was the effective reduction in the cost of gasoline (petrol) and as cars became larger & heavier and roads more suited to higher speeds, the quest was for power.

Swirling around: The swirl process in a diesel combustion chamber.

Power in those years usually was gained by increased displacement & combustion chamber designs optimized for flow; significantly too, many popular designs of combustion chamber (most notably those in the so-called “wedge” heads) were cheaper to produce and in those years, few gave much thought to air-pollution.  The cars of the 1950s & 1960s had really toxic exhaust emissions.  By the mid 1960s however, the problem of air pollution in US cities was so obvious and the health effects were beginning to be publicized, as was the contribution to all of this by motor vehicles.  Regulations began to appear, California passing the first statute in 1961 (because of the high vehicle population and certain geographical & climatic phenomena, Los Angeles & San Francisco were badly affected by air pollution) and the manufacturers quickly agreed to adopt this standard nationally, fearing other states might begin to impose more onerous laws.  Those however arrived by mid-decade and although there was no specific road-map, few had any doubts the rules would become stricter as the years passed.  The industry’s only consolation was that these laws would be federal legislation so they would need to offer only one specification for the whole country (although the time would come when California would decide things should be tougher and by the 1970s there were “Californian cars” and “49 state cars”).  K Street wasn’t the force then it later became and the manufacturers conformed with (relatively) little protest.

Fuel was still cheap and plentiful but interest in swirl chambers was revived by the promise of cleaner burning engines.  Because it wasn’t new technology, the research attracted little attention outside of the engineering community but in 1970, German-born Swiss engineer Michael May (b 1934) demonstrated a Ford (Cologne) Capri with his take on the swirl chamber in a special cylinder head.  In a nod to the Buick original, May nick-named his head design the “Fireball” (professional courtesy a thing among engineers).  What Herr May had done was add a small groove (essentially a channel surrounding the intake valve) to the chamber, meaning during the last fraction of a second of piston movement, the already swirling fuel-air mixture got a final nudge in the right direction: instead of there being a randomness to the turbulence of the mix, the shape was controlled and was thus able to be lower in volume (a smaller fireball) and precisely controlled at the point at which the spark triggered combustion; May called this a “higher swirl”.  Not only did this reduce exhaust emissions but it also cut fuel consumption for a given state of tune so designers could choose their desired path: more power for the same fuel consumption or the same power for less and within a short time, just about the whole world was taking great interest in fuel consumption.

Detail of the original "flathead" cylinder head of the Jaguar V12 (left) and the later "Fireball" head with swirl chambers (right).

A noted use of May’s design was its adoption in 1981 on Jaguar’s infamously thirsty V12 (1971-1997), an innovation celebrated by the addition of the HE (High Efficiency) label for the revised power-plant.  The notion of “high efficiency” was comparative rather than absolute and the V12 remained by most standards a thirsty beast but the improvement could be in the order of 40% (depending on conditions) and it was little worse than the similar displacement Mercedes-Benz V8s of the era which could match the Jaguar for power but not the turbine-like smoothness.  Threatened with the axe because of its profligate ways, the V12 was saved by the swirl chambers and it survived another sixteen years which included two severe recessions.  Debuting even before the Watergate scandal, it lasted until the Monica Lewinsky affair.  In the decades since, computer simulations and high-speed photography have further enhanced the understanding of swirl & turbulence, the small fireballs now contained in the center of the chamber, preventing heat from radiating to the surrounding surfaces and ensuring the energy (heat) is expended on pushing the piston down to begin the next cycle rather than wasted in heating metal.  The system is popular also in diesel engines.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (“bunkermen” the plural) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in their construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January, 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid January, 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably became heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) thick and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of a Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, by exploding a short distance from its target, it could be highly damaging because it took advantage of one of the properties of water (quite strange stuff according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” transmitted through the water rather than the blast itself which could breach a hull, the same principle underlying the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were in 1945 built with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and its crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945. 

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the proportion of explosive accordingly was reduced compared with that typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of the gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  
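The quoted percentages follow directly from the stated weights; as a quick sketch (figures taken from the text above, in lb):

```python
# Charge-to-weight ratios for the bombs discussed above:
# (explosive charge, gross weight) in lb, as quoted in the text.
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge, gross) in bombs.items():
    # .0% renders the ratio as a whole-number percentage
    print(f"{name}: {charge / gross:.0%} of gross weight is explosive filler")
```

Running it reproduces the article's figures: 43% for the Tallboy, 41% for the Grand Slam and 75% for the Blockbuster, illustrating why the first two were "medium capacity" despite their bulk.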
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The word “camouflet” has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare and mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed, for those with the resources, is that so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be attacked, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which of the two is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area is rendered too geologically unstable to be used again for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts” there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) being “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust” although both terms meant “sculpture of upper torso and head”; these proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.

Thursday, June 22, 2023

Gullwing & Gull-wing

Gullwing & Gull-wing (pronounced guhl-wing)

(1) In aviation, an airplane wing that slants briefly upward from the fuselage and then extends horizontally outward (ie a short upward-sloping inner section and a longer horizontal outer section).

(2) Of doors, a door hinged at the top and opening upward (applied usually to cars but can be used in aviation, aerospace and architecture).

(3) Anything having or resembling (extended and partially extended) wings of a gull (and many other birds).

(4) In electronic hardware, a type of board connector for a small outline integrated circuit (SOIC).

(5) In historic admiralty jargon, a synonym of goose wing (a sail position).

Gull is from the Middle English gulle, from the Brythonic, from the Proto-Celtic wēlannā (seagull) and was cognate with the Cornish guilan, the Welsh gwylan, the Breton gouelan and the Old Irish faílenn.  The noun gull was used (in a cook-book!) to describe the shore bird in the 1400s, probably from the Brythonic Celtic; it was related to the Welsh gwylan (gull), the Cornish guilan and the Breton goelann, all from the Old Celtic voilenno-.  Gull replaced the Old English mæw.

The verb form meaning “to dupe, cheat, mislead by deception” dates from the 1540s, an adaptation by analogy from the earlier (1520s) meaning “to swallow”, ultimately from the sense of “throat, gullet” from the early 1400s.  The meaning rested on the idea of someone so gullible as to “swallow whatever they’re told”.  As a cant term for “dupe, sucker, credulous person”, it’s noted from the 1590s and is of uncertain origin but may be from the verb meaning “to dupe or cheat”.  Another possibility is a link to the late fourteenth century Middle English gull & goll (newly hatched bird) which may have been influenced by the Old Norse golr (yellow), the link being the hue of the bird’s down.

Wing was from the late twelfth century Middle English winge & wenge (forelimb fitted for flight of a bird or bat), applied also to the part of certain insects which resembled a wing in form or function, from the Old Norse vængr (wing of a bird, aisle etc), from the Proto-Germanic wēinga & wēingan-.  It was cognate with the Danish vinge (wing), the Icelandic vængur (wing), the West Frisian wjuk (wing) and the Swedish vinge (wing), of unknown origin but possibly from the Proto-Germanic we-ingjaz (a suffixed form of the primitive Indo-European root we- (blow), source of the Old English wawan (to blow)).  It replaced the native Middle English fither, from the Old English fiþre & feðra (plural (and related to the modern feather)), from the Proto-Germanic fiþriją, which merged with fether, from the Old English feþer, from the Proto-Germanic feþrō.  The meaning “either of two divisions of a political party, army etc” dates from circa 1400; the use in architecture was first recorded in 1790 and applied figuratively almost immediately.  The slang sense of “earn (one's) wings” dates from the 1940s, from the wing-shaped badges awarded to air cadets on graduation.  To be “under (someone's) wing” (protected by (someone)) is recorded from the early thirteenth century; the phrase “on a wing and a prayer” is the title of a 1943 song about landing a damaged aircraft.

A gull in flight (left), inverted gull-wing on 1944 Vought Corsair (centre) & gull-wing on 1971 Piaggio P.136 (Royal Gull) (right).

In aviation, the design actually pre-dates powered flight (1903) by half a millennium, appearing in the speculative drawings of flying machines by Leonardo da Vinci (1452-1519) and others, an inevitable consequence of being influenced by the flapping wings of birds.  There were experiments, circa 1911, to apply the gull-wing principle to some of the early monoplanes in a quest to combine greater surface area with enhanced strength but it wasn’t until the 1920s it began widely to be used, firstly in gliders for some aerodynamic advantage and later in powered aircraft.  In powered aircraft, the gull-wing offered little aerodynamically but had structural advantages in that it allowed designers more easily to ensure (1) increasingly larger propellers would have sufficient clearance, (2) undercarriage length could be reduced (and consequently strengthened) and (3) wing-spans could slightly be reduced, a thing of great significance when operations began on aircraft carriers, the gull-wing being especially suited to the folding-wing model.  Depending on the advantage(s) sought, designers used either a classic gull-wing or the inverted gull-wing.  For all purposes, the correct form is the hyphenated gull-wing, the exception being the (1954-1957) Mercedes-Benz 300 SL coupé; only the 1950s Mercedes-Benz are called Gullwings.

1945 Jamin-Bouffort JB.

Cars with gull-wing doors had been built before Mercedes-Benz started making them at (small) scale and the principle was known in both aviation and marine architecture.  One was the 1945 Jamin-Bouffort JB, the creation of French aeronautical engineer Victor-Albert Bouffort (1912-1995) who had a long history of clever, practical (and sometimes unappreciated) designs.  The Jamin-Bouffort JB was a relatively small three-wheeler built using some of the techniques of construction used in light aircraft, the gull-wing doors the most obvious novelty.  Anticipating the 1950s boom in micro-cars, there was potential but with European industry recovering from the war, most effort was directed to resuming production of pre-war vehicles using surviving tooling and there was little interest in pursuing anything which required development time.  Monsieur Bouffort would go on to design other concepts ahead of their time, some of his ideas adopted by others decades after his prototypes appeared.

Bugatti Type 64 with gull-wing body fabricated using original conceptual sketches on 1939 Type 64 chassis.

In 1939, Jean Bugatti drew up plans for the Type 64, a vehicle with gull-wing doors, his sketches an example of the great interest being shown by European manufacturers in aerodynamics, then usually called streamlining.  Although two Type 64s were completed in 1939, neither used the gull-wing doors and it would be another eighty-odd years before Bugatti’s design was realised when collector & president of the American Bugatti Club, Peter Mullin (b 1941), arranged the fabrication of the coachwork, based on the original drawings.  Built in exactly the same way craftsmen would have accomplished the task in 1939, the body was mounted on the surviving Type 64 chassis (64002), united for the first time with what Jean Bugatti called papillon (butterfly) doors which all (except the French) now call gull-wings.

Mercedes-Benz and the gull-wing.

1952 Mercedes-Benz 300 SL prototype (W194).

By 1951, although the Wirtschaftswunder (the post-war German “economic miracle”) still lay ahead, structural changes (the most important being the retreat by the occupying forces in the western zones from the punitive model initially adopted and the subsequent creation in 1948 of the stable deutschmark) had already generated an economy of unexpected strength and Mercedes-Benz was ready to make a serious return to the circuits.  Because the rules then governing Formula One didn’t suit what it was at the time practical to achieve, the first foray was into sports car racing, the target the ambitious goal of winning the Le Mans 24 hour race, despite a tight budget which precluded the development of new engines or transmissions and dictated the use of as much already-in-production componentry as possible.

1952 Mercedes-Benz 300 SL prototype (W194).

It was the use of a production car engine designed not for ultimate power but smoothness, reliability and a wide torque band which ultimately dictated the use of the gull-wing doors.  The engine was the 3.0 litre (183 cubic inch) M186 straight-six used in the big 300 limousines and its two door derivatives, of advanced design by the standards of the time but also big, tall and heavy, attributes not helpful to race-car designers who prefer components which are compact, light and able to be mounted low in a frame.  A new engine not being possible, the factory instead created a variation, the M194, which used the triple-carburetor induction of the 300S coupés in an improved cylinder head, the innovation being that the iron block was installed canted at a 50° angle, thereby solving the problem of height and permitting a lower bonnet line.  Using the existing gearbox, it was still a heavy engine-transmission combination, especially in relation to its modest power-output and such was the interest in lightness that, initially, the conventional wet-sump was retained so the additional weight of the more desirable dry-sump plumbing wouldn’t be added.  It was only later in the development-cycle that dry-sump lubrication was adopted.

1952 Mercedes-Benz 300 SL space-frame (W194).

The calculations made suggested that with the power available, the W194 would be competitive only if lightness and aerodynamics were optimized.  Although the relationship between low drag and down-force was still not well understood, despite the scientific brain-drain to the US and USSR (forced and otherwise) in the aftermath of the war, the Germans still had a considerable knowledge-base in aerodynamics and this was utilized to design a small, slippery shape into which the now slanted straight-six would be slotted.  There being neither the time nor the money to build the car as a monocoque, the engineers knew the frame had to be light.  A conventional chassis was out of the question because of the weight and they knew from the pre-war experience of the SSKL how expensive and difficult it was to reduce mass while retaining strength.  The solution was a space-frame; made from thin steel tubing, it was light, weighing only between 50-70 kg (110-155 lb) in its various incarnations, yet impressively stiff, and the design offered the team the opportunity to use either closed or open bodies as race regulations required.

However, as with many forms of extreme engineering, there were compromises, the most obvious of which being that the strength and torsional rigidity was in part achieved by mounting the side tubes so high that the use of conventionally opening doors was precluded.  In a race car, that was of no concern and access to the cockpit in the early W194s was granted by what were essentially top-hinged windows which meant ingress and egress were not elegant and barely even possible for those of a certain girth but again, this was thought hardly to matter in a race car.  In this form, the first prototypes were built, without even the truck-like access step low on the flanks which had been in the original plans.

1952 Mercedes-Benz 300 SL production coupé (W194).

Actually, it turned out having gull-wing windows instead of gull-wing doors did matter.  Although the rules of motorsport’s pettifogging regulatory body, the Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation), were silent on the types and direction of opening of doors and, in the pre-war era, scrutineers had tolerated some essentially fake doors, they still raised objections during inspection for the 1952 Mille Miglia.  The inspectors were forced to relent when unable to point to any rule actually being broken but it was clear they’d be out for revenge and the factory modified the frame to permit doors extending down the flanks, thereby assuming the final shape which would come to define the gull-wing door.  Relocating some of the tubing to preserve strength added a few kilograms but forestalled any retrospective FIA nit-picking.  To this day, the FIA's legions of bureaucrats seem not to realise why they’ve for so long been regarded as impediments to competition and innovation.

The W194 at Le Mans, 1952.

First tested on the Nürburgring and Hockenheimring in late 1951, the W194, now dubbed 300 SL for promotional purposes, was in March 1952 presented to the press on the Stuttgart to Heilbronn autobahn.  In those happy days, there was nothing strange about demonstrating race cars on public highways.  The SL stood for Super Leicht (super light), reflecting the priority the engineers had pursued.  Ten W194s were built for the 1952 season and success was immediate: second in the Mille Miglia; a trademark 1-2-3 result in the annual sports car race in Bern and, the crowning achievement, a 1-2 finish in the twenty-four hour classic at Le Mans.  Neither usually the most powerful nor the fastest car in the races it contested, the 300 SL nevertheless so often prevailed because of a combination of virtues.  Despite the heavy drive-train, it was light enough not to impose undue stress on tyres, brakes or mechanical components, the limousine engine was tough and durable and the outstanding aerodynamics returned surprisingly good fuel economy; in endurance racing, reliability and economy compensate for a lot of absent horsepower.

As a footnote (one to be noted only by the subset of word nerds who delight in the details of nomenclature), for decades, it was said by many (even normally reliable sources) that SL stood for Sports Leicht (sports light) and the history of the Mercedes-Benz alphabet soup was such that it could have gone either way (the SSKL (1929) was the Super Sports Kurz (short) Leicht (light)) and from the 1950s on, for the SL, even the factory variously used Sports Leicht and Super Leicht.  It was only in 2017 it published a 1952 paper (unearthed from the corporate archive) confirming the correct abbreviation is Super Leicht.  Sports Leicht Rennsport (Sport Light Racing) seems to be used for the SLRs because they were built as pure race cars, the W198 and later SLs being road cars but there are references also to Super Leicht Rennsport.  By implication, that would suggest the original 300 SL (the 1951 W194) should have been a Sport Leicht because it was built only for competition but given the relevant document dates from 1952, it must have been a reference to the W194 which is thus also a Sport Leicht.  Further to muddy the waters, in 1957 the factory prepared two lightweight cars based on the new 300 SL Roadster (1957-1963) for use in US road racing and these were (at the time) designated 300 SLS (Sports Leicht Sport), the occasional reference (in translation) as "Sports Light Special" not supported by any evidence.  The best quirk of the SLS tale however is the machine which inspired the model was a one-off race-car built by Californian coachbuilder ("body-man" in the vernacular of the West Coast hot rod community) Chuck Porter (1915-1982).  Porter's SLS was built on the space-frame of a wrecked 300 SL gullwing (purchased for a reputed US$500) and followed the lines of the 300 SLR roadsters as closely as the W198 frame (taller than that of the W196S) allowed.  
Although it was never an "official" designation, Porter referred to his creation as SL-S, the appended "S" standing for "scrap".

Porsche 356 SL, in pit lane, Le Mans, 1951.

Unlike their neighbours in Stuttgart, Porsche have never had any doubt what they meant by "SL".  After the end of World War II (1939-1945), it took some time for Herr Professor Ferdinand Porsche (1875–1951) to extricate himself from the clutches of the allied authorities which had arrested him as a war criminal but once free, he moved to the town of Gmünd in Carinthia in southern Austria where he embarked on a project to build his own cars, the first being a small roadster using many components from the Volkswagen (Type 1; Beetle) with which he was familiar.  However, upon consideration of the realities of even small-scale series production and the market potential of various body styles, it was decided to create a rear-engined, closed coupé and in just under three months, Porsche’s small team designed what came to be known as the “Gmünd Coupé”, the first leaving the modest works in June 1948.  By 1950 over 50 had been built (including a half-dozen-odd cabriolets), all with hand-formed aluminum bodies and the layout and shape remain identifiable in the rear-engined (and the mid-engined Boxster and Cayman) Porsches produced in 2025.

Porsche 356 SL, Le Mans, 1951.

When in 1950 Porsche relocated his operation to Zuffenhausen, production resumed with bodies of steel rather than aluminum but eleven of the Gmünd chassis were shipped north and used by the factory for their competition programme; they were converted to Sport Leicht (Sports Light) specification and named 356 SL (internally the 356/2 3000 series), fitted with 1086 cm3 (66 cubic inch), air-cooled, flat four engines (rated at 46 horsepower), enlarged fuel tanks, louvered quarter-window covers, fender skirts (spats), streamlined belly fairings and an aluminium body.  The factory entered three 356 SLs in the 1951 Le Mans 24 hour endurance classic and while two crashed, one won the 751-1100 cm3 class (46-67 cubic inch). 

W194 roadsters at Nürburgring, 1952.

After the triumph of the 300 SL at Le Mans, the team won at the Eifelrennen and was then entered in a race on the Nürburgring; to shed some weight, engineers converted three of the coupés to roadsters, emulating the body of one of the original ten which had been open-topped from the start.  To avoid any unpleasantness with the FIA, the section of the doors extending into the side of the car was retained and a smaller windscreen was installed to improve aerodynamics and afford the driver some protection from the weather and from bugs unfortunate enough to be caught in the path.  The roofectomy reduced weight by about 100 kg (220 lb) which presumably helped, the four cars finishing in the first four places.

Winning W194, 1952 Carrera Panamericana Mexico, the protective metal struts were an ad-hoc addition after a bird strike.

One final adventure for the year yielded a perhaps unexpected success.  In November 1952, the factory entered two coupés and two roadsters in the third Carrera Panamericana Mexico, a race of 3100 kilometres (1925 miles) over five days and eight stages, their engines now bored out to 3.1 litres (189 cubic inches), increasing power from 175 bhp (130 kW) to 180 bhp (135 kW).  The cars finished 1-2-3 although the third was disqualified for a rule violation and the winning car endured the intrusion at speed of a vulture through the windscreen.  Unlike the 300 SL, the unlucky bird didn’t survive.  There was however one final outing for the W194.  In 1955 it won the Rally Stella Alpina, the last time the event would be run in competitive form, one of many cancelled in the wake of the disaster at Le Mans that year in which 84 died and almost two hundred were injured.  Coincidentally, that accident involved the W194’s successor, the 300 SLR.

1953 300 SL Prototype.

The 300 SL was re-engineered for the 1953 season, the bodywork now made from magnesium, lighter even than aluminum; the revised design saw the car return to the wind-tunnel, after which it gained a new front section which not only reduced drag but also improved cooling by optimizing airflow to the radiator and engine compartment.  Power rose too.  Again drawing from wartime experience with the DB60x V12 aero-engines used in many German warplanes, direct fuel-injection was introduced, boosting output from 180 bhp (135 kW) to 215 bhp (158 kW).  Nor were the underpinnings neglected: the rear suspension was improved (somewhat) with the addition of the low-pivot single-joint swing axle (which would later appear on some production 300 SLs) while the transmission was flanged on the rear axle, not quite a transaxle but much improving the weight distribution.  The wheelbase was shortened by 100 millimetres (4 inches) and 16-inch wheels were adopted.  Even disc brakes were considered but the factory judged them years from being ready and it wouldn’t be until 1961 that they appeared on a Mercedes-Benz, more than half a decade after others had proved the technology on road and track.  There was however one exception: a disc brake had been installed between propeller shaft and differential on the high-speed truck built in 1954 to carry the Grand Prix cars between the factory and circuits in Europe.

The revised 300 SL however was never raced, the factory’s attention now turning to the Formula One campaign which, with the W196, would so successfully be conducted in 1954-1955, an off-shoot of which would be the W194’s replacement, the W196S sports car which would be based on the Grand Prix machine and dubbed, a bit opportunistically, the 300 SLR (Sport Leicht Rennsport (Sport Light Racing)).  Such was the impression made by the futuristic W194 that it would inspire production of the road-going 300 SL Gullwing (W198), 1400 of which were built during 1954-1957 (including 29 with aluminium bodies).

1955 Mercedes-Benz 300 SL (W198) (1954-1957).

Although the public found them glamorous, the engineers at Mercedes-Benz had never been enamoured of the 300 SL’s gull-wing doors, regarding them as a necessary compromise imposed by the high side-structure of the space-frame which supported the body.  The doors were never intended for use on road-cars; it was the guarantee of the US importer of Mercedes-Benz to underwrite the sale of a thousand gull-wing coupés that saw the 300 SL Gullwing enter production in 1954.  The sales predictions proved accurate and of the 1400 built, some 80% were delivered to North American buyers.  The W198 300 SL was the model which became entrenched in the public imagination as “the Gullwing” and it’s the only instance where the word doesn’t need to be hyphenated.  Glamorous those doors may have been, they did impose compromises.  The side windows didn’t roll down, ventilation was marginal and air-conditioning didn’t exist; in a hot climate, one really had to want to drive a Gullwing.  There was also the safety issue: some drivers took the precaution of carrying a hammer in case, in a roll-over, the inability to open the doors made the windscreen the only means of escape.  Roll-overs were perhaps more likely in a Gullwing than in many other machines, the nature of the swing axles sometimes inducing unwanted behavior in what was one of the fastest cars on the road although, in fairness, on the tyres available in the 1950s that was less of an issue than it would become on later, stickier rubber.

In the US in 1956, a "fully optioned" Gullwing would have been invoiced at US$8894.00 and apart from the car itself, some of those options would prove a good investment, items like the knock-off wheels adding by the 2020s at least tens of thousands to the selling price and even the fitted luggage attracts a premium.  The options affecting the mechanical specification (notably the camshaft and choice of final drive ratio) had a significant influence on the character of the car, the former raising the rated horsepower from 220 to 240 and in its more powerful form the top speed would have been 140-155 mph (225-250 km/h) depending on the gearing.  Those serious about speed could opt for the "package" of the aluminum body with "full competition equipment" including the Rudge wheels and upgraded engine & suspension, supplied with two complete axles in a choice of gear ratios.  That package listed at US$9300.00 which in retrospect was another reasonable investment given the aluminum Gullwing from Rudi Klein's "junkyard collection" sold at auction in October 2024 for US$9,355,000 (and that for a vehicle needing restoration).  Still, all things are relative and average annual income in the US in 1956 was about US$3600 and while it was possible to buy what would now be called a "house & land package" for around US$8000, the typical house sold for more than twice that.  The 300 SL would still have been a sound investment because although one would have incurred insurance, maintenance, running and storage costs over the decades, a well maintained, original, 1956 aluminum Gullwing would now sell for well in excess of US$12 million while US$9600 placed in the S&P 500 index would by 2024 be worth some US$9.4 million (assuming all dividends were re-invested).  There have been better investments than aluminum Gullwings but not many.
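The comparison can be checked with a little compound-interest arithmetic.  Below is a minimal back-of-envelope sketch using the dollar figures quoted above (US$9300 for the competition package, US$12 million as the value suggested for a well-kept aluminum car, and US$9600 growing to US$9.4 million in the index); the exact percentages are illustrative, not audited returns:

```python
# Implied compound annual growth rate (CAGR): what annual return turns
# a 1956 purchase price into a 2024 value over 68 years?
def cagr(start: float, end: float, years: int) -> float:
    """Annualized growth rate implied by start -> end over the given years."""
    return (end / start) ** (1 / years) - 1

years = 2024 - 1956  # 68 years

# US$9600 in the S&P 500 (dividends reinvested) -> ~US$9.4 million
sp500 = cagr(9_600, 9_400_000, years)

# US$9300 aluminum Gullwing package -> "well in excess of US$12 million"
gullwing = cagr(9_300, 12_000_000, years)

print(f"S&P 500 implied CAGR:  {sp500:.1%}")   # ~10.7% a year
print(f"Gullwing implied CAGR: {gullwing:.1%}") # ~11.1% a year
```

The striking point is how close the two figures are: even one of the most celebrated collector cars of all time only narrowly outruns patient index investing once seven decades of compounding are taken into account.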

Mercedes-Benz 300 SLR (W196S), Stirling Moss & Denis Jenkinson, Mille Miglia, Italy, 1955.

The 300 SLR (W196S) was a sports car, nine of which were built to contest the 1955 World Sportscar Championship.  Essentially the W196 Formula One car with the straight-eight engine enlarged from 2.5 to 3.0 litres (152 to 183 cubic inches), the roadster is most famous for the run in the 1955 Mille Miglia in Italy which was won over a distance of 992 miles (1597 km) with an average speed of almost 100 mph (160 km/h); nothing like that has since been achieved.  There's infamy too attached to the 300 SLR: one was involved in the catastrophic crash and fire at Le Mans in 1955.
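The "almost 100 mph" claim is easily verified: Moss's winning time is recorded as 10 hours, 7 minutes and 48 seconds.  A quick sketch of the arithmetic (distance from the race records, the standard miles-to-kilometres factor):

```python
# Average speed for the 1955 Mille Miglia winner:
# 992 miles covered in 10 hours, 7 minutes, 48 seconds.
distance_miles = 992
elapsed_hours = 10 + 7 / 60 + 48 / 3600  # ~10.13 hours

avg_mph = distance_miles / elapsed_hours
avg_kmh = avg_mph * 1.609344  # exact miles -> km conversion

print(f"{avg_mph:.1f} mph ({avg_kmh:.1f} km/h)")  # ~97.9 mph (~157.6 km/h)
```

That works out to a shade under 98 mph, sustained for over ten hours on public roads, which is why the run has never been approached since.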

1955 300 SLR (W196S “Uhlenhaut” coupé). 

Two of the 300 SLRs were built with coupé bodies, complete with gull-wing doors.  Intended to be used in the 1955 Carrera Panamericana Mexico, they were rendered instantly redundant when both the race and the Mercedes-Benz racing programme were cancelled after the Le Mans disaster.  The head of the programme, Rudolf Uhlenhaut (1906-1989), added an external muffler to one of the coupés, registered it for road use (such things were once possible when the planet was a happier place) and used it for a while as his company car.  It was then the fastest road-car in the world, an English journalist recording a top speed of 183 mph (295 km/h) on a quiet stretch of autobahn but Herr Uhlenhaut paid a price for the only partially effective muffler, needing hearing aids later in life.  The two (rot (red) & blau (blue), the names based on their interior trim) for decades remained either in the factory museum or made an occasional ceremonial appearance at race meetings.  However, in a surprise announcement, in June 2022 it was revealed rot had been sold in a private auction in Stuttgart for a world-record US$142 million, the most expensive car ever sold.  The buyer's identity was not released but it's believed rot is destined for a collection in the Middle East.  It's rumoured also the same buyer has offered US$100 million should an authentic 1929 Mercedes-Benz SSKL ever be uncovered.

1970 Mercedes-Benz C-111 (1968-1970 (Wankel versions)).

Although the C-111 would have a second career in the late 1970s in a series of 5-cylinder diesel and V8 petrol engined cars used to set long-distance endurance records, it's best remembered in its original incarnation as the lurid-colored (safety-orange according to the factory) three and four-rotor Wankel-engined gullwing coupés, sixteen of which were built.  The original was a pure test-bed for the Wankel engine in which so many manufacturers once had so much hope.  The first built looked like a failed high-school project but the second and third versions were both finished to production-car standards with typically high-quality German workmanship.  Although from the school of functional brutalism rather than the lovely things they might have been had styling been out-sourced to the Italians, the gull-winged wedges attracted much attention and soon cheques were enclosed in letters mailed to Stuttgart asking for one.  The cheques were returned; apparently there had never been plans for production even had the Wankel experiment proved a success.  The C-111 was fast, the four-rotor version said to reach 300 km/h (186 mph), faster than any production vehicle then available.

1991 Mercedes-Benz C112.

The C112 was an experimental mid-engined concept car built in 1991.  Designed to be essentially a road-going version of the Sauber-built C11 Group C prototype race car developed for the 1990 World Sports-Prototype Championship, it was powered by the 6.0 litre (366 cubic-inch) M120 V12 used in the R129 SL and C140/W140 S-Class variously between 1991-2001.  The C112 does appear to have been what the factory always claimed it was: purely a test-bed for technologies such as the electronically-controlled spring & damper units (which would later be included on some models as ABC (active body control)), traction control, rear wheel steering, tyre-pressure monitoring and distance-sensing radar.  As an indication it wasn't any sort of prototype intended for production, it offered no luggage space but, like the C111 twenty years earlier, it’s said hundreds of orders were received.  It was 1991 however and with the world in the depths of a severe recession, not even that would have been enough for a flirtation with thoughts of a production model.  After the C112, thoughts of a gull-wing were put on ice for another two decades, the SLR-McLaren (2003-2009) using what were technically “butterfly” doors, hinged from the A-pillars.

2011 Mercedes-Benz SLS-AMG (2010-2014).

The factory’s most recent outing of the gull-wing door, the SLS, which used the naturally aspirated 6.2 litre (379 cubic inch) M159 DOHC V8, was produced between 2010-2014, a roadster version also available.  To allay any doubt, it was announced at the time of release that SLS stands for Super Leicht Sport (Super Light Sport) although such things are relative, the SLS a hefty 1600-odd kg (3,500 lb) although, in fairness, the original Gullwing wasn’t that much lighter and the SLS does pack a lot more gear, including windows which can be opened and air-conditioning.  In the way of modern marketing, many special versions were made available during the SLS’s relatively short life, even an all-wheel-drive electric version with a motor for each wheel.  Such is the lure of the gull-wing motif for Mercedes-Benz, it’s unlikely the SLS will be the last and a high-priced revival is expected to become a feature of the marketing cycle every couple of decades but we're unlikely to see any more V8s or V12s unless perhaps as a swan-song, AMG indicating recently they expect their 4.0 litre (244 cubic inch) V8 to remain in production for another ten years, Greta Thunberg (b 2003) and her henchmen the humorless EU bureaucrats permitting.

Lindsay Lohan at the Nicholas Kirkwood (b 1980; shoe designer) catwalk show with a prop vehicle (one of the gull-wing DMC DeLoreans modified closely to resemble the one used in the popular film Back to the Future (1985)), London Fashion Week, 2015.

Tesla Model X with falcon-wing doors.

Such was the allure of the 300 SL’s gull-wing doors that in the shadow they’ve cast for seventy-odd years, literally dozens of cars have appeared with the feature, some of questionable aesthetic quality, some well-executed and while most were one-offs or produced only in small runs, there’s been the occasional (usually brief) success and of late some Teslas have been so equipped, the novelty being that it’s the back doors which open upwards, the front units conventionally hinged.  Tesla calls them “falcon wings” because the design was influenced by the bird.  However, the biomimicry was (for obvious reasons) not an attempt to gain the aerodynamic advantages of the falcon’s wing shape which affords exceptional maneuverability in flight but simply an adoption of the specific bone structure.  Unlike the fixed structure of the classic gull-wing door, the Tesla’s falcon-wing is fitted with an additional central joint which permits the doors to be opened in cramped spaces, aiding passenger ingress and egress.  Some Tesla engineers have however admitted the attraction of them as a way to generate publicity and (hopefully) attract sales may have been considered during the design process.

Bricklin SV-1 (1974-1975, left) and DMC DeLorean (1981-1983, right).

Two of the best known of the doomed gull-wing cars were the Bricklin SV-1 and the DeLorean, both the creations of individuals with interesting histories.  Malcolm Bricklin’s (b 1939) first flirtation with the automotive business was his introduction into the US market of the Subarus, built by the Japanese conglomerate Fuji Heavy Industries.  Having successfully imported the company’s scooters for some years, the model Mr Bricklin in 1968 chose was the 360, a tiny, egg-shaped device which had been sold in Japan for a decade, the rationale underlying his selection being it was so small and light it was exempt from just about any regulations.  Although really unsuited to US motoring conditions it was (at US$1300) several hundred dollars cheaper than a Volkswagen Beetle and had a fuel consumption around a third that delivered by even the less-thirsty US-built cars so it found a niche and some ten-thousand were sold before that gap in the market was saturated.  Ever imaginative, Mr Bricklin then took his hundreds of unsold 360s and re-purposed them essentially as large dodgem-cars, renting unused shopping-mall car-parks as ad-hoc race tracks and offering “laps” for as little as $US1.00.  He advertised “no speed limits” to attract the youth market but given the little machines took a reported 56 seconds for the 0-60 mph (0-100 km/h) run, reaching any state's legal limit in a car-park would have been a challenge.  Mr Bricklin achieved further success with Subaru’s more conventional (in a front wheel drive (FWD) context) 1000 and the corporation would later buy out his US interests in what was thought to be a most lucrative transaction for both parties.

1969 Subaru 360 Deluxe.

His eponymous gull-winged creation was the SV-1 which, although nominally positioned as a “sports car”, was marketed also as a “safety-vehicle” (hence the SV).  It certainly contained all of the safety features of the time and in that vein was offered mostly in lurid “high visibility” colors although the prototypes for an up-market “Chairman” version were displayed in more restrained black or white.  It was ahead of its time in one way, being fitted with neither ash-trays nor cigarette lighters, Mr Bricklin not approving of smoking and regarding the distractions of lighting-up while at the wheel a safety hazard.  Whether in stable conditions the car could have succeeded is speculative but the timing was extraordinarily unlucky.  The V8-powered car arrived on the market in 1974 shortly after the first oil shock saw a spike in the price of gasoline and in the midst of the recession and stagflation which followed in the wake.  Between its introduction and demise, the cost of the SV-1 more than doubled and there were disruptions to the production process because supply problems (or unpaid bills, depending on who was asked) meant the AMC engine had to be replaced with a Ford power-plant.  By the time production ended, only some 3000 had been built, but, not discouraged, Mr Bricklin would go on to import Fiat sports cars and the infamous Yugo before being involved with a variety of co-ventures with Chinese partners.

1970 Pontiac GTO convertible.

John DeLorean (1925–2005) was a genuinely gifted engineer who emerged as one of the charismatic characters responsible for some of the memorable machines General Motors (GM) produced during its golden age of the 1950s & 1960s.  Under Mr DeLorean’s leadership, Pontiac in 1964 released the GTO which is considered (though contested by some) the first “muscle car” and the one responsible for the whole genre which would flourish for a crazy half-dozen years and in 1969 the Grand Prix which defined a whole market segment.  Apparently, the Grand Prix, produced at a low cost and sold at a high price was one of the most profitable lines of the era.  Given this, Mr DeLorean expected a smooth path to the top of GM but for a variety of reasons there were internal tensions and in 1973 he resigned to pursue his dream of making his own car.  It took years however to reach fruition because the 1970s were troubled times and like the Bricklin SV1, the DeLorean Motor Company’s (DMC) gull-winged DeLorean was released into a world less welcoming than had been anticipated.  By 1981, the world was again in recession and the complicated web of financing arrangements and agreements with the UK government to subsidize production could have worked if the design was good, demand was strong and the product was well-built, none of which was true.

1969 Pontiac Grand Prix 428 SJ.

As the inventory of unsold cars grew, so did the debt and desperate for cash, Mr DeLorean was persuaded (apparently without great difficulty) to become involved in the cocaine trafficking business which certainly offered fast money but his co-conspirator turned out to be an FBI informant who was a career criminal seeking a reduced sentence (on an unrelated matter) by providing the bureau with “a big scalp”.  At trial in 1984, Mr DeLorean was acquitted on all charges on the grounds of "entrapment" but by then DMC was long bankrupt.  In the years since, the car has found a cult following, something due more to its part in the Back to the Future films than any dynamic qualities it possessed.  It was competent as a road car despite the rear-engine configuration and the use of an uninspiring power-plant but, apart from the stainless-steel bodywork and of course the doors, it had little to commend it although over the years there have been a number of (ultimately aborted) revivals and plans remain afoot for an electric gull-wing machine using the name to be released in 2025.