
Friday, September 12, 2025

Vogue

Vogue (pronounced vohg)

(1) Something in fashion at a particular time or in a particular place.

(2) An expression of popular currency, acceptance, or favor.

(3) A highly stylized modern dance that evolved out of the Harlem ballroom scene in the 1960s, the name influenced by the fashion magazine; one who practiced the dance was a voguer who was voguing.

(4) In Polari, a cigarette or to light a cigarette (often in the expression “vogue me up”).

(5) The world's best-known women's fashion magazine, first published in 1892 and now owned by Condé Nast.

1565–1575: From the Middle English vogue (height of popularity or accepted fashion), from the Middle French vogue (fashion, success (literally, “wave or course of success”)), from the Old French vogue (a rowing), from voguer (to row, sway, set sail), from the Old Saxon wegan (to move) & wogōn (to sway, rock), a variant of wagōn (to float, fluctuate), from the Proto-Germanic wagōną (to sway, fluctuate), wēgaz (water in motion) & weganą (to move, carry, weigh), from the primitive Indo-European weǵh- (to move, go, transport (and an influence on the English way)).  The forms were akin to the Old Saxon wegan (to move), the Old High German wegan (to move), the Old English wegan (to move, carry, weigh), the Old Norse vaga (to sway, fluctuate), the Old English wagian (to sway, totter), the Proto-West Germanic wagōn, the German Woge (wave) and the Swedish våg.  A parallel development to the Germanic forms was the Spanish boga (rowing) and the Old Italian voga (a rowing), from vogare (to row, sail), of unknown origin, and the Italianate forms were probably some influence on the development of the verb.  Vogue, voguie & voguer are nouns (voguette an informal noun), voguing is a noun and adjective, vogued is a verb and vogueing & voguish are adjectives; the noun plural is vogues.  The noun voguie is a special use and is a synonym of fashionista ((1) one who creates or promotes high fashion (designers, editors, models, influencers etc) or (2) one who dresses according to the trends of fashion, or one who closely follows those trends).

All etymologists seem to concur the modern meaning is from the notion of being "borne along on the waves of fashion" and colloquially the generalized sense of "fashion, reputation" is probably from the same Germanic source.  The phrase “in vogue” (having a prominent place in popular fashion) was recorded as long ago as 1643.  The fashion magazine (now owned by Condé Nast) began publication in 1892 and young devotees of its advice (they are legion) are voguettes.  In linguistics, vogue words are those words & phrases which suddenly become popular (although not always neologisms) and then fade from use or become clichéd or hackneyed forms (wardrobe malfunction; awesome; problematic; at this point in time; acid test; in this space; parameters; paradigm etc).  Because it’s so nuanced, vogue has no universal synonym but words which tend to the same meaning (and can in some circumstances be synonymous) include latest, mod, now, rage, chic, craze, currency, custom, fad, favor, mode, popularity, practice, prevalence, style, stylishness, thing, trend & usage.

Lindsay Lohan cover, Vogue (Spanish edition), August 2009.

In Regional English, "vogue" could mean "fog or mist" and in Cornwall, the hamlet of Vogue in the parish of St Day gained its name from the Medieval Cornish vogue (a word for a medieval smelting furnace (ie "blowing house", the places generating much smoke)); civilization contributing to the increase in atmospheric concentrations of greenhouse gases is nothing new.  Clearly better acquainted with trademark law than geography, in early 2022 counsel for Condé Nast sent a C&D (cease and desist letter) to the inn-keeper of the village’s The Star Inn at Vogue pub, demanding the place change its name to avoid any public perception of a connection between the two businesses.  The owners of the venerable pub declined the request (cheekily suggesting they might send their own C&D to Vogue demanding the publication find a new name on the basis of usurpation (an old tort heard before the Court of Chivalry)).  Condé Nast subsequently apologized, citing insufficient investigation by their staff, and a framed copy of their letter now hangs on the pub's wall.  Honor apparently satisfied on both sides, the two Vogues resumed the peaceful co-existence which had prevailed since 1892.

1981 Range Rover In Vogue from the first run with the standard stylized steel wheels (left) and a later 1981 In Vogue with the three-spoke aluminum units.

Much of the 1970s was spent in what to many felt like a recession, even if there were only some periods in some places during which the technical definition was fulfilled, and the novel phenomenon of stagflation did disguise some of the effects.  Less affected than most (of course) were the rich, who had discovered a new status-symbol: the Range Rover.  Introduced in 1970, it had legitimized (though there were earlier ventures) the idea of the "luxury" four-wheel-drive (4WD) segment although the interior of the original was very basic, the floor-coverings rubber mats rather than carpets on the assumption that, as with the even more utilitarian Land Rovers, there would be a need to "hose out" the mud accumulated from a day's HSF (huntin', shootin' & fishin'); the car’s reputation was built more on its then unique blend of competence on and off-road.  So good was the Range Rover in both roles that owners, used to being cosseted in leather and walnut, wanted something closer to that to which they were accustomed and dealers received enquiries about an up-market version.

Lindsay Lohan at the opening of the Ninety years of Vogue covers exhibition, Crillon Hotel, Paris, 2009.

That had been Rover’s original intention.  The plan had been to release a basic version powered by four-cylinder engines and a luxury edition with a V8 but by 1970 time and development funds had run out so the car was released with the V8 power-train and the more spartan interior, although it was quickly apparent few owners took advantage of being able to hose out the mud.  Indeed, so skewed to urban buyers was the ownership profile it's likely the only time many ventured off the pavement was to find a good spot in the car parks of polo fields.  In something which must now seem remarkable, although already perceived as a "prestige" vehicle, for the first decade-odd the Range Rover was not available with either air-conditioning or an automatic transmission.  However, if the rich were riding out the decade well, British Leyland (which owned Rover) was not and it lacked the capital to devote to the project.  Others took advantage of what proved a profitable niche and those with the money (or spending OPM (other people's money)) could choose from a variety of limited-production and bespoke offerings including LWB (long-wheelbase) models, four-door conversions, six-wheelers and even open-topped versions from a variety of coach-builders such as Wood & Pickett and low-volume manufacturers like Switzerland’s Monteverdi, which anticipated the factory by a number of years with their four-door coachwork.

Rendez-vous à Biarritz, Vogue magazine, March 1981.  The eight page advertising supplement was for Lancôme and Jaeger fashion collections, the Wood & Pickett-trimmed Range Rover a "backdrop" which would prove a serendipitous piece of product placement. 

British Leyland was soon subject to one of the many re-organizations which would seek (without success) to make it a healthy corporation and one consequence was increased autonomy for the division making Range Rovers.  No longer compelled to subsidize less profitable arms of the business, attention was turned to the matter of a luxury model, demand for which clearly existed.  To gauge market reaction, in late 1980 the factory collaborated with Wood & Pickett to build a specially-equipped two-door model as a proof-of-concept exercise.  The prototype (HAC 414W) was lent to Vogue magazine, a crafty choice given the demographic profile of the readership and the by then well-known extent of women’s own purchasing power and influence on that of their husbands.  Vogue took the prototype to Biarritz to be the photographic backdrop for the images taken for the magazine’s co-promotion of the 1981 Lancôme and Jaeger fashion collections, published in an eight-page advertising spread entitled Rendez-vous à Biarritz in the March 1981 edition.  The response was remarkable and while Lancôme and Jaeger’s launch attracted polite attention, Vogue’s mailbox (which then received letters in envelopes with postage stamps) was overwhelmingly filled with enquiries about the blinged-up Range Rover (although "bling" was a linguistic generation away from use).

Vogue's Range Rover In Vogue (HAC 414W) in Biarritz, 1981, all nuts on board or otherwise attached.  The model name was a play on words, Range Rovers very much "in vogue" and this particular version substantially the one "in Vogue".

Rover had expected demand to be strong and the reaction to the Vogue spread justified their decision to prepare for a production run even before publication; the Range Rover In Vogue went on sale early in 1981, the limited-edition run closely replicating the photo-shoot car except for the special aluminum wheels, which were not yet in volume production.  Amusingly, the triple-spoke wheels (similar to the design Ford had used on the 1979 (Fox) Mustang) had been a problem in Biarritz, the factory supplying the wrong lug nuts which had a tendency to fall off, meaning the staff travelling with the car had to check prior to each shoot to ensure five were present on each wheel which would appear in the picture.  Not until later in the year would the wheels be ready so the In Vogues went to market with the standard stylized steel units, meaning the brochures had to be pulped and reprinted with new photographs and some small print: "Alloy wheels, as featured on the vehicle used by Vogue magazine will be available at extra cost through Unipart dealers later in 1981".  British Leyland's record-keeping was at the time as chaotic as much of its administration so it remains unclear how many were built.  The factory said the run would be 1,000, all in right hand drive (RHD) but many left hand drive (LHD) examples exist and it’s thought demand from the continent was such that another small batch was built, although this has never been confirmed.  The In Vogue’s exclusive features were:

Light blue metallic paint (the model-exclusive Vogue Blue) with wide body stripes in two shades of grey (not black as on the prototype).
High compression (9.35:1) version of the V8 (to provide more torque).
Higher high-gear ratio (0.996:1) in the transfer box (to reduce engine speed and thus noise in highway driving).
Air conditioning.
Varnished walnut door cappings.
Armrest between the front seats.
Map pockets on the back of the front seats (the rationale for not including the folding picnic tables so beloved by English coach-builders being that the design of the Range Rover's rear tailgate had made it the "de-facto picnic table").
Fully carpeted luggage compartment.
Carpeted spare wheel cover and tool-kit curtain.
Picnic hamper.
Stainless steel tailgate cap.
Black wheel hub caps.
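The effect of the taller transfer gearing can be sketched with some simple arithmetic; a minimal example, noting that only the 0.996:1 figure comes from the list above (the standard transfer ratio, final drive, gearbox ratio and tyre size are assumed purely for illustration):

```python
# Illustrative arithmetic: how a taller (numerically lower) transfer-box
# high ratio lowers engine speed at a given road speed.  Only the In
# Vogue's 0.996:1 ratio is from the text; every other figure is assumed.
def engine_rpm(speed_kmh, tyre_circumference_m, final_drive, gearbox, transfer):
    wheel_rpm = speed_kmh * 1000 / 60 / tyre_circumference_m
    return wheel_rpm * final_drive * gearbox * transfer

standard = engine_rpm(100, 2.5, 3.54, 1.0, 1.113)  # assumed standard ratio
in_vogue = engine_rpm(100, 2.5, 3.54, 1.0, 0.996)  # In Vogue's taller gearing
print(f"standard: {standard:.0f} rpm, In Vogue: {in_vogue:.0f} rpm "
      f"({100 * (1 - in_vogue / standard):.1f}% lower)")
```

Whatever the true standard ratio was, the principle holds: at any given highway speed the engine turns more slowly in proportion to the reduction in the transfer ratio, hence less noise.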


The "fitted picnic hamper".

Condé Nast would later describe the In Vogue’s custom picnic hamper as the car’s "pièce de résistance", which might have amused Rover's engineers, who would have put some effort into stuff they'd have thought "substantive".  Now usually written in English as "piece de resistance" (masterpiece; the most memorable accomplishment of one’s career or lifetime; one's magnum opus (great work)), the French phrase pièce de résistance (literally the "piece which has staying power") seems first to have appeared in English in Richard Cumberland's (1732–1811) novel Arundel (1789).  One can see the writer's point.  Although the walnut, additional torque and certainly the air conditioning would have been selling points, like nothing else, the picnic hamper would have delighted the target market.

Demand for the In Vogue far exceeded supply and additional production runs quickly were scheduled.  The most frequently made request was acceded to, the second series available with Chrysler's robust TorqueFlite automatic transmission, introduced at the same time as the debut of a four-door version, another popular enquiry, while the three-spoke wheels became standard equipment and equipment levels continued to rise, rear head restraints fitted along with a much enhanced sound-system.  In what was perhaps a nod to the wisdom of the magazine's editors, although a cooler replaced the hamper for the second run, for the third, buyers received both cooler and hamper.  The third series, launched in conjunction with the Daks autumn fashion collection at Simpson's of Piccadilly, included a digital radio, the convenience of central locking and the almost unnoticed addition of front mud flaps, so clearly there was an understanding that despite the Range Rover's well deserved reputation as a "Chelsea taxi", the things did sometimes see the mud and ladies didn't like the stuff getting on their dresses as they alighted.  In 1984, as "Vogue", it became the regular production top-of-the-range model and for many years served in this role (although, for licensing reasons, when sold in the US it was called the "Country").  For both companies, the In Vogue and subsequent Vogues turned out to be the perfect symbiosis.

Art and Engineering

Vogue, January 1925, cover art by Georges Lepape.

From the start, Vogue (the magazine) was of course about frocks, shoes and such but its influence extended over the years to fields as diverse as interior decorating and industrial design.  The work of Georges Lepape (1887-1971) has long been strangely neglected in the history of art deco but he was a fine practitioner whose reputation probably suffered because his compositions habitually were regarded as derivative or imitative which seems unfair given there are many who are more highly regarded despite being hardly original.  His cover art for Vogue’s edition of 1 January 1925 juxtaposed one of French artist Sonia Delaunay’s (1885–1979) "simultaneous" pattern dresses and a Voisin roadster decorated with an art deco motif.

1927 Voisin C14 Lumineuse.

One collector in 2015 was so taken with Lepape’s image that when refurbishing his 1927 Voisin C14 Lumineuse (literally “luminous”, an allusion to the Voisin’s greenhouse-inspired design which allowed natural light to fill the interior), he commissioned Dutch artist Bernadette Ramaekers to hand-paint a geometric triangular pattern in sympathy with that on the 1925 Vogue cover.  Ms Ramaekers took six months to complete the project and when the car was sold at auction in London in 2022, it realized Stg£202,500.  There are few designers as deserving of such a tribute as French aviation pioneer Gabriel Voisin (1880–1973) who made military aircraft during the First World War (1914-1918) and, under the name Avions Voisin, produced a remarkable range of automobiles between 1919-1939, encapsulating thus the whole inter-war period and much of the art deco era.  Because his designs were visually so captivating, much attention has always been devoted to his lines, curves and shapes but the underlying engineering was also interesting although some of his signature touches, like the (briefly in vogue) sleeve valve engine, proved a mirage.

Voisin's extraordinary visions:  1934 C27 Aérosport (left), 1934-1935 Voisin C25 Aérodynes (centre) & 1931 C20 Mylord Demi Berline (right).

Also a cul-de-sac was his straight-12 engine.  Slow-running straight-12 engines (there is even a straight-14 which displaces 25,340 litres (1,546,000 cubic inches) and produces 107,290 hp (80,080 kW)) are known at sea where they’re used in (very) big ships but on the road (apart from some less than successful military vehicles), only Voisin and Packard ever attempted them, the former making two, the latter, one.  Voisin’s concept was simple enough; it was two straight-6s joined together, end-on-end, the same idea many had used to make things like V12s (2 x V6s), straight-8s (2 x straight-4s), H16s (two flat-8s, one atop another) and even V24s (2 x V12s) but the sheer length of a straight-12 in a car presented unique problems in packaging and the management of the torsional vibrations induced by the elongated crankshaft.  Straight-12s were built for use in aircraft (Bristol's Type 25 Braemar II in 1919 using four of them!) where the attraction was the aerodynamic advantage conferred by the small frontal area but as engine speeds increased in the 1920s, so did the extent of the problem of crankshaft flex and the concept was never revived.

1934 Voisin C15 Saloit Roadster (left) and the one-off Packard straight-12, scrapped when the decision was taken not to proceed to production (right).

The length of the straight-12 meant an extraordinary amount of the vehicle’s length had to be devoted to housing just the engine and that resulted in a high number for what designers call the dash-to-axle ratio.  That was one of the many reasons the straight-12 never came into vogue and indeed was one of the factors which doomed the straight-8, a configuration which at least had some redeeming features.  Voisin must however have liked the appearance of the long hood (bonnet) because the striking C15 Saloit Roadster (which could have accommodated a straight-12) was powered by a straight-4, a sleeve valve Knight of 2500 cm³ (153 cubic inch).  The performance doubtlessly didn’t live up to the looks but so sensuous were those looks that many would forgive the lethargy.  The concept of a short engine in a lengthy compartment was revived by Detroit in the 1960s & 1970s, many of the truly gargantuan full-sized sedans and coupes built with elongated front & rear structures.  At the back, the cavernous trunks (boots) often could swallow four sets of golf clubs, which would have had some appeal to the target market, but much of the space under the hood was unused.  While large enough to accommodate a V16, the US industry hadn't made those since the last of the Cadillac V16s left the line in 1940 after a ten-year run.  While one of the reasons the V8 had supplanted the straight-8 was its relatively compact length, that virtue wasn't needed by the late 1950s when, in all directions, the sheet-metal grew well beyond what was required by the mechanical components, the additional size just for visual impact to enhance the perception of prestige and luxury in an era when bigger was better.  Dramatic though the look could be (witness the 1969 Pontiac Grand Prix), the packaging efficiency was shockingly wasteful.

The Dart which never was

Using one of his signature outdoor settings, Norman Parkinson (1913-1990) photographed model Suzanne Kinnear (b 1935) adorning a Daimler SP250, wearing a Kashmoor coat and Otto Lucas beret with jewels by Cartier.

The image appeared on the cover (left) of Vogue's UK edition in November 1959, the original's (right) color being "enhanced" in the Vogue pre-production editing tradition (women thinner, cars shinier).  The "wide" whitewall tyres were a thing at the time, even on sports cars, and were a popular option on US market Jaguar E-Types (there (unofficially) called XK-E or XKE) in the early 1960s.  The car on the Vogue cover was XHP 438, built on prototype chassis 100002 at Compton Verney in 1959; it's the oldest surviving SP250, the other two prototypes (chassis 100000 & 100001 from 1958) dismantled when testing was completed.  XHP 438 was the factory's press demonstrator and was used in road tests by Motor and Autocar magazines before being re-furbished (motoring journalists subjecting the press fleet to a brief but hard life) and sold.  Uniquely, when XHP 438 was first registered in England, it was as a "Daimler Dart".

More Issues Than Vogue sweatshirt from Impressions.

There was however an issue with the "Dart" name.  The SP250 was first shown to the public at the 1959 New York Motor Show and there the problems began.  Aware the little sports car was quite a departure from the luxurious but rather staid line-up Daimler had for years offered, the company had chosen the pleasingly alliterative “Dart” as its name, hoping it would convey the sense of something agile and fast.  Unfortunately, Chrysler’s lawyers were faster still, objecting that they had already registered Dart as the name for a full-sized Dodge so Daimler needed a new name and quickly; the big Dodge would never be confused with the little Daimler but the lawyers insisted.  Imagination apparently exhausted, Daimler’s management reverted to the engineering project name and thus the car became the SP250, which was innocuous enough even for Chrysler's attorneys, and it could have been worse.  Dodge had submitted their Dart proposal to Chrysler for approval and while the car found favor, the name did not and the marketing department was told to conduct research and come up with something the public would like.  From this the marketing types gleaned that “Dodge Zipp” would be popular and to be fair, dart and zip(p) do imply much the same thing but ultimately the original was preferred and Darts remained in Dodge’s lineup until 1976, for most of that time one of the corporation's best-selling and most profitable lines.  Cynically, between 2012 and 2016 the name was revived for an unsuccessful and unlamented FWD (front-wheel-drive) compact sedan.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had for millennia existed.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required, but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in construction; it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole two-level complex in Berlin and although it was only the lower layer which officially was so designated, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of B2-Spirit from below, showing the twin bomb-bay doors through which the GBU-57 are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.  That’s not a new sentiment, being one philosophers and others have for millennia expressed in various ways although since the advent of nuclear weapons, concerns understandably became heightened.  Like every form of military technology ever deployed, once the “genie is out of the bottle” the problem is there to be managed and at the dawn of the atomic age, delivering a lecture in 1936, the British chemist and physicist Francis Aston (1877–1945) (who created the mass spectrograph, winning the 1922 Nobel Prize in Chemistry for his use of it to discover and identify the isotopes in many non-radioactive elements and for his enunciation of the whole number rule) observed:

There are those about us who say that such research should be stopped by law, alleging that man's destructive powers are already large enough.  So, no doubt, the more elderly and ape-like of our ancestors objected to the innovation of cooked food and pointed out the great dangers attending the use of the newly discovered agency, fire.  Personally, I think there is no doubt that sub-atomic energy is available all around us and that one day man will release and control its almost infinite power.  We cannot prevent him from doing so and can only hope that he will not use it exclusively in blowing up his next door neighbor.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls over 2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.

USAF test-flight footage of Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that, in exploding a short distance from its target, it was highly damaging because it took advantage of one of the properties of water (quite strange stuff, according to those who study it): it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle behind the famous “bouncing bombs” used in the RAF’s “Dambuster” raid (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were built in 1945 with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and the crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August 1945.

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of its gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.
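The “capacity” percentages are simple to verify from the weights quoted; a minimal sketch in Python, using the pound figures as given in the text:

```python
# Explosive filler as a proportion of gross weight: the War Office's
# "capacity" classification referenced this ratio, not the bomb's size.
# Weights (lb) are those quoted in the text above.
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),   # (filler lb, gross lb)
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (filler, gross) in bombs.items():
    # Tallboy ~43%, Grand Slam ~41%, Blockbuster 75%
    print(f"{name:18s} {filler / gross:.0%}")
```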
The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna. 

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may be related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), between the seventeenth and nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, there has been much debate over BDA (battle damage assessment) and the issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have since allowed, for those with the resources, is that the so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure, and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for the blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure is sufficiently affected, the area may be rendered geologically too unstable to be used again for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although the Islamic Republic is a theocracy, those who command power are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.
Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified content provider Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary; in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in the ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses are wholly unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) meaning “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but it’s documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

Benito Mussolini (1883-1945; Duce (leader) & Prime-Minister of Italy 1922-1943) standing beside his “portrait bust” (1926).

The bust was carved by Swiss sculptor Ernest Durig (1894–1962) who gained posthumous notoriety when his career as a forger was revealed with the publication of his drawings which he’d represented as being from the hand of the French sculptor Auguste Rodin (1840-1917) under whom he claimed to have studied.  Mussolini appears here in one of the subsequently much caricatured poses which were a part of his personality cult.  More than one of the Duce's counterparts in other nations was known to have made fun of some of the more outré poses and affectations, the outstretched chin, right hand braced against the hip and straddle-legged stance among the popular motifs. 

“Portrait bust” in marble (circa 1895) of Otto von Bismarck (1815-1898; chancellor of the German Empire (the "Second Reich") 1871-1890) by the German sculptor Reinhold Begas (1831-1911).

In sculpture, what had been known as the “portrait statue” came after the 1690s to be known as the “portrait bust”, both terms meaning “sculpture of upper torso and head”.  These proved a popular choice for military figures because the aspect enabled the inclusion of bling such as epaulettes, medals and other decorations and, being depictions of the human figure, busts came to be vested with special significance by the superstitious.  In early 1939, during construction of the new Reich Chancellery in Berlin, workmen dropped one of the busts of Otto von Bismarck by Reinhold Begas, breaking it at the neck.  For decades the bust had sat in the old Chancellery and the building’s project manager, Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), knowing Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) believed the Reich Eagle toppling from the post-office building right at the beginning of World War I had been a harbinger of doom for the nation, kept the accident secret, hurriedly issuing a commission to the German sculptor Arno Breker (1900–1991) who carved an exact copy.  To give the fake the necessary patina, it was soaked for a time in strong, black tea, the porous quality of marble enabling the fluid to induce some accelerated aging.  Interestingly, in his (sometimes reliable) memoir (Erinnerungen (Memories or Reminiscences), published in English as Inside the Third Reich (1969)), even the technocratic Speer admitted of the accident: “I felt this as an evil omen”.

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and it has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter used the notion of “busting through”.

Saturday, May 17, 2025

Combat

Combat (pronounced kuhm-bat or kom-bat (verb); kom-bat (noun))

(1) To fight or contend against; vigorously to oppose.

(2) In military matters, certain parts or branches of the services which engage in armed conflict with enemy forces.

(3) An action fought between two military forces.

(4) As a descriptor (in the military, of weapons and weapons systems), a means to distinguish between an item designed specifically for use in combat as opposed to one intended for other purposes.

1535-1540: From the Middle English intransitive verb combat (to fight, struggle, contend), from the sixteenth century French combat, from the twelfth century Old French combattre, from the Late Latin combattere, the construct being com (with (each other) (an archaic form of cum)) + battuere (to beat, fight) (source of the modern English verb "batter").  The transitive sense dates from the 1580s; the figurative use from the 1620s.  The noun combat (a fight (originally especially "a fight between two armed persons" and later distinguished as single combat in the 1620s)), emerged in the 1560s and soon was applied in a general sense to "any struggle or fight between opposing forces".  Combat is a noun, verb & adjective, combater & combatant are nouns, combatted & combatting are verbs and combative is an adjective; the noun plural is combats.

Combative and dressed for combat: Lindsay Lohan in boxing gloves.

The phrase hors de combat (out of action; disabled; no longer able to fight (literally "out of combat")) was constructed from hors (out, beyond), from the Latin foris (outside (literally "out of doors")) + de (of) + combat.  It dates from 1757 and related originally to battlefield conduct (the principle of which would later be interpolated into the rules of war) and is now a literary and rhetorical device.  It shouldn't be confused with the French expression hors concours (out of competition) which, dating from 1884, is applied to works of art in an exhibition but not eligible to be awarded a prize.  Given the sometimes nasty battles waged in galleries, perhaps hors de combat might sometimes be as appropriate but in exhibitions it's most often used of works which have either already won a prize or have been awarded the maximum number provided for in the competition rules.  Other sporting competitions sometimes use hors concours to describe entries which don't conform with the rules of the event but are for a variety of reasons permitted to run (notably in motorsport).  The adjective combative (pugnacious, disposed to fight; aggressive, disposed to engage in conflict (though not necessarily violence)) is from 1819 and by the mid nineteenth century had become much associated with the long discredited pseudo-science of phrenology, the related forms being combatively and the earlier (1815) combativeness.  Combatant (contending, disposed to combat) was an adjective by the mid fifteenth century and a noun (one who engages in battle) by circa 1855, both from the Old French combatant (which survives in Modern French as combattant) (skilled at fighting, warlike) where it had also been a noun.  The spelling seems not to please everyone: the incorrect combatative is not uncommon.

The Combat: Woman Pleading for the Vanquished, oil on canvas by William Etty (1787-1849), National Gallery of Scotland.

Unusually for works in this tradition, The Combat is not a depiction of a historical or mythological event but a kind of morality tale exploring “the beauty and quality of mercy”.  Structurally, the picture is of a woman clutching a warrior who, with sword raised, seems poised to inflict a fatal strike on his fallen foe whose own blade lies shattered on the ground, the woman begging he be spared.  Praised for its technical accomplishment, The Combat also attracted the criticism that the ahistorical piece seemed just another of the artist’s opportunistic pretexts for painting more nude figures, long his favourite motif, but the painter dismissed the carping, reminding critics such imaginative works had a tradition dating from Antiquity, the Romans calling that school of composition “the Roman Visions, works not having their origin in history or poetry”.  Mr Etty certainly made a seminal contribution to the genre and he’s regarded as the first English painter of any skill to produce a substantial number of nudes, something which, predictably, has overshadowed his catalogue of estimable still lifes.  His life was solitary and in some ways strange and in much of the popular press his output was damned as “indecent” but when in 1828 he was proposed for membership of the Royal Academy, he was elected, defeating no less a figure than John Constable (1776–1837) by 18 votes to five, so his fellow artists rated him highly.

The Norton Commando 750 Combat

1968 Kawasaki 500 Mach III (H1).

British manufacturers once regarded competition from the far-east with little concern but by the late 1960s, Japanese motorcycles had become serious machines enjoying commercial success.  Kawasaki’s 500cm3 (H1, Mach III) two-stroke triple debuted in 1968 while Honda’s 750-Four was released a year later, the former fast but lethally unstable, the latter more refined.  Three years on, the release of Kawasaki’s 900 cm3 Z1 confirmed the maturity of the Japanese product and the era of British complacency was over though the realization was too late to save the industry.

Nothing ever quite matched the rawness of the original Kawasaki Mach III.  Riders of high performance machines had for decades distinguished between fast, well-balanced motorcycles and those which, while rapid, needed to be handled with caution if used in anything but a straight line and on a billiard table smooth surface but even in those circumstances the Mach III could be a handful, the engine's power band narrow and the entry to it sudden and explosive.  Probably the best comparison was something like the BRM grand prix car (1947-1955) which used a supercharged 1.5 litre (91 cubic inch) V16; it was only marginally responsive under 8000 rpm but at that point suddenly delivered its extraordinary power which could be as much as 500-600 horsepower.  Many Mach III owners were soon noting that while rear tyre life was short, the front lasted well because it spent so little time in contact with the road.  Adding to the trickiness, lacking the rigidity needed to cope with such stresses, the frame design meant there was something of a gyroscopic tendency under hard acceleration which could be at least disquieting and the consequences were often worse.  Still, nobody denied they were quick.  Clearly, only crazy people would buy such a thing but fortunately for Kawasaki (and presumably this was part of the product planning), by 1968 the Western world was populated as never before with males aged 17-25 (peak craziness years) with sufficient credit or disposable income to indulge the madness of youth.  It helped that under the Bretton Woods system (1944) of fixed exchange rates, at ¥360 to the US$, the Mach III was quite a bargain; on cost breakdown, nothing on two wheels or four came close and even at the time it was acknowledged there really were two identifiable generations of Mach IIIs: the ones built between 1968-1972 and those from 1973 until 1975 when production ended.
Not only was the power-band made a little wider (at the expense of the top-end) but a disk front brake was added, the swing-arm was extended and the frame geometry improved; while this didn’t wholly tame the challenging characteristics created by putting what was then the world’s most powerful two-stroke engine in what was essentially the light and not especially stiff frame used for their 350, it did mean the later Mach IIIs were a little more forgiving and not quite as quick.

1973 Kawasaki 750 Mach IV (H2).

As a design, the Mach III obviously had its flaws but as a piece of engineering, it exhibited typical Japanese soundness and attention to detail.  They borrowed much and while little was genuinely innovative, they had started with a clean sheet of paper and buyers found, unlike the British bikes, electrics were reliable and mechanical parts were not subject to the oil-leaks which the British had for decades claimed were endemic to the breed; far-eastern engineering was now mass-producing bikes a generation or more advanced.  However, the British industry was chronically under-capitalized so, lacking resources to develop new models, resorted to "improving" existing models.  While they were doing that, the Japanese manufacturers moved on and Kawasaki were planning something which would match the Mach III for performance but deliver it in a more civilized (and safer) manner.  This project was a four-stroke, four cylinder 750, developed while the Mach III was being toned down (a little); the good idea of a broader power band and a (slightly) stiffer frame was used on the Mach IV (750 H2), the ultimate evolution of the two-stroke triple which delivered the best of the Mach III experience while (somewhat) taming the worst of its characteristics.

1969 Honda 750-Four "Sandcast".  The crankcases of the early 750s are referred to as being sandcast but they were actually gravity cast.  The production method for the first batch was chosen because of uncertainty about demand.

However, in 1969 Honda, the largest in the Japanese industry and the company which in 1964 had stunned the Formula One community when their 1.5 litre V12 car won a Grand Prix, released the motorcycle which threatened the very existence of the new big Kawasaki and the four-stroke Honda 750-Four was for a generation to set the template for its genre, as influential for big motorcycles as the Boeing 707 had in 1957 been for commercial airliners.  Kawasaki reviewed this disturbing intrusion on their planning, concluding the Honda was a touring machine and that the Mach III had proved there was demand for machines orientated more to high performance.  The board looked at the demographic charts and decided to proceed, enlarging their project to 900cm3 which, with double overhead camshafts (DOHC), was tuned more for top-end power than the more relaxed, single cam (SOHC) Honda.  Released in 1972, almost a year after the Mach IV, the Z1 attracted praise for its quality and performance, all delivered while offering a stability the charismatic but occasionally lethal triples never approached.  Internally, Kawasaki did their bit to ensure a good reception for the Z1 by making sure it was just a bit quicker than the Mach IV over a quarter mile, the 750 never tuned to the extent possible although as some found, more horsepower quickly and cheaply was available.

1973 Kawasaki Z1.

The big Nortons, named Commando since 1967, had long been a benchmark for high-performance motorcycles and although the Mach III had (on paper) matched their speed, its handling characteristics were such that it could really be enjoyed only in a straight line and even then was best handled by cautious experts.  The Honda 750-Four and Kawasaki Z1 were both vastly better as road machines and clearly the future of the breed.  The long-serving big British twins, while their handling was still impeccable, were now outdated, no longer offered a performance premium and still leaked oil.  Norton’s response in 1972 was the hastily concocted Commando Combat, the engine tweaked in the usual British manner with a higher compression ratio, bigger carburetors, larger ports and a high-lift, long-duration camshaft.  These modifications, while the orthodox approach for racing engines, are not suitable for the road and the “peaky” Combat’s only advantage was great top-end power, though it was noted the clever isolastic engine mounting did work well to limit the extent to which the greater vibration was transmitted through the frame.  Unfortunately, the gains high in the rev-range compromised the low and mid-range performance, just where a road-bike most often operates.  Indeed, at points the torque-curve actually went the wrong way and the only obvious way to disguise this was to lower the gearing which (1) restricted the top-speed to something embarrassingly low and (2) meant even cruising speeds demanded high engine revolutions.  Sadly, it wasn’t possible for long to enjoy the pleasures of all that power because the Combat's specification exposed weaknesses in pistons, bearings and crankshafts.  In some cases, main bearing life could be as little as 4,000 miles (6,400 km) but a small number of engines succumbed to other failures long before.
As a consolation, even if the Combat wouldn’t keep going, it was easy to stop, the front disk brake (designed by Norton and built by Lockheed, it used a hard chrome-plated cast-iron rotor because the heat-dissipation qualities were superior to stainless steel) was among the best in the industry.
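Why lowered gearing forces higher engine revolutions at cruising speed can be sketched numerically.  All the figures below (wheel diameter, overall ratios, cruise speed) are illustrative round numbers chosen for the sketch, not Norton specifications:

```python
from math import pi

# Illustrative only: the relationship between road speed, overall gearing
# and engine rpm.  Every number here is an assumption, not Norton data.
WHEEL_DIAMETER_M = 0.66                  # assumed rolling diameter (~26 in)
CIRCUMFERENCE_M = pi * WHEEL_DIAMETER_M  # distance covered per wheel revolution

def rpm_at_speed(kmh: float, overall_ratio: float) -> float:
    """Engine rpm at a given road speed for an overall engine:wheel ratio."""
    wheel_rpm = (kmh * 1000 / 60) / CIRCUMFERENCE_M
    return wheel_rpm * overall_ratio

# Hypothetical "standard" vs "lowered" (numerically higher) top-gear ratios:
for ratio in (4.4, 4.9):
    print(f"overall ratio {ratio}: {rpm_at_speed(110, ratio):.0f} rpm at 110 km/h")
```

The numerically higher ratio turns the same cruising speed into roughly 10% more engine revolutions, which is the trade-off described above: the shortfall in mid-range torque is hidden, at the cost of a busier (and harder-worn) engine.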

So most of the things that were changed made things worse.  Other things stayed the same, including the oil leaks (the joke being the seals existed to keep the dirt out, not the fluids in) and the absence of electric starting, the right legs of Norton owners reputedly more muscular than the left.  For the engine's problems the solution lay in engineering and metallurgy: a combination of a self-aligning spherical roller bearing called a “superblend” and un-slotted pistons.  But by the time things were fixed, the fiasco had triggered irreparable damage to market perceptions and Norton quietly dropped the Combat, applying the improvements to their mainstream engines without trying to match its top-end power.  Despite the reputation, there are owners (many of whom with great success used their Combats in competition) who reported sterling reliability from their machines and the consensus is it was only a relatively small number of Combat engines which failed, but in mass-production a well-publicized consumer-level failure rate even well under 5% is enough to create reputational damage.  Norton went bankrupt within a few years but the name has been revived several times over the past decades.

For those who can remember how things used to be done: 1972 Norton Commando 750 Combat Roadster (left) and 1972 Norton Commando 750 Combat Interstate (with custom drilled disk brake, right).

Introduced in 1972, the Interstate model was a response (as the name suggests) to US demand and was distinguished by its larger fuel tank, some of the additional capacity gained by deleting the scalloped knee indentations seen on the Roadsters (which used a 2.2 imperial gallon (10 litre, 2.6 US gallon) tank).  The early Interstates were fitted with a 5.25 imperial gallon (23.9 litre, 6.3 US gallon) unit but in mid-year this was enlarged to a 5.5 imperial gallon (25 litre, 6.6 US gallon) device, the latter size carried over as an option when the Commando 850 was introduced in 1973, and this remained available until production ended in 1977, by which time only a handful of Roadsters were leaving the line.

1954 Norton Dominator 500 (left), 1967 Norton Atlas 750 (centre) and 1972 Norton Commando 750 Combat (right).

When introduced in 1949, the 497 cm3 (30.3 cubic inch) parallel twin was as good an engine as any then available on two wheels and a great success but that popularity was ultimately what doomed Norton in the 1970s.  Over the years enlarged and tuned for more power, it proved adaptable to new frame designs and was an engine which kept Norton in the front rank of high-performance motorcycles, but in not even half a decade between 1968 and 1972, the manufacturers in the Far East advanced further than the British industry had managed in twenty years.  In 1967, well aware of the antiquity of the machinery from which they were coaxing another generation, Norton's management had been surprised at both the positive critical reception of the Commando and the volume of orders being received and for a while the immediate future looked bright.  Perhaps it could have been, because the clever Isolastic engine mounting system absorbed much of the big twin's chronically insoluble vibration before it reached the rider and the Commando was a rewarding ride, but what it should have been was a stop-gap while something better was developed.  Instead, it proved but a stay of execution.

Isolastic-era advertising: The agencies never depicted women riding Norton Commandos but they were a fixture as adornments, usually with lots of blonde hair and a certain expression.  One reason they may not have been thought suitable to use as riders was the phenomenon known as “helmet hair” (in idiomatic use, the effects of helmet wearing on those with “big hair”), which, upon removing the helmet, manifested either as an unintended JBF or a bifurcated look in which the hair above the shoulders was flattened against the scalp while that beneath sat as the wind had determined.  There was also the challenge of kick-starting the big twins, the long-overdue electric start not installed until 1975.