
Tuesday, July 18, 2023

Delivery

Delivery (pronounced dih-liv-uh-ree (U) or dee-liv-er-ree (non-U))

(1) The carrying and turning over of letters, goods, etc to a designated recipient or recipients.

(2) A giving up or handing over; surrender.

(3) The utterance or enunciation of words.

(4) Vocal and bodily behavior during the presentation of a speech.

(5) The act or manner of giving or sending forth.

(6) The state of being delivered of or giving birth to a child; parturition.

(7) Something delivered.

(8) In commerce, a shipment of goods from the seller to the buyer.

(9) In law, a formal act performed to make a transfer of property legally effective.

(10) In printing, the part of a printing press where the paper emerges in printed form (also called delivery end).

(11) The act of rescuing or state of being rescued; liberation.

(12) In various ball sports, the act or manner of bowling or throwing a ball.

(13) In machinery design, the discharge rate of a compressor or pump.

1400–1450: From the late Middle English delyvere & delyvery, from the Anglo-Norman delivrée, from the Old French delivrer, from the Latin līberō, from līber (free), from the Old Latin loeber, from the Proto-Italic louðeros, from the primitive Indo-European hléwdheros, from hlewdh- (people) + the prefix de- (from the Latin dē, from the preposition dē (of, from)).  It was cognate with the Ancient Greek ἐλεύθερος (eleútheros), the Sanskrit रोधति (ródhati), the Dutch lieden, the German Leute and the Russian люди (ljudi) (people); the Old English æf- was a similar prefix.  The word was a noun use of the feminine past participle of delivrer (to deliver) with the suffix assimilated to –ery.  Delivery, deliverer, deliveree, deliverance & deliverability are nouns, deliver & delivered are verbs & adjectives, deliverable is a noun & adjective, delivering is a noun & verb; the noun plural is deliveries.

Delivery systems

The definition of delivery systems tends to be elastic, ranging from simple, single-use devices to entire trans-national human and industrial processes.  A hypodermic syringe can be thought of as a delivery system for a vaccine yet that vital machine is just one small, inexpensive part in the delivery system for a vaccination programme in response to a pandemic.  Such a global programme demands a delivery system with many human and mechanical components: research, development, testing, multi-jurisdiction legal & regulatory compliance, production, distribution, software, hardware, refrigeration, storage and administration, all before the first nurse has delivered even one injection.

The Manhattan Project's uranium-based Little Boy (left & dropped on Hiroshima) and the plutonium implosion-type Fat Man (right & dropped on Nagasaki).  So confident was the project team in the reliability of the uranium bomb that it wasn't tested prior to use; the world's first nuclear explosion was the "Trinity Test", conducted in the New Mexico desert on 16 July 1945, when a plutonium device was detonated.  For decades, as a uranium device, the Hiroshima bomb was a genuine one-off, all subsequent nuclear weapons being built using plutonium, but it's possible more recent entrants to the club such as the DPRK (North Korea) and Pakistan were attracted to uranium because of the speed and simplicity of construction.

Delivery systems can thus be very expensive and it's not uncommon for the cost vastly to exceed whatever it is they were created to deliver.  The Manhattan Project (1942-1947) which produced the first nuclear weapons officially cost some two billion dollars (US$2,000,000,000) at a time when a billion dollars was a lot of money.  Expressed as pre-pandemic (2018-2019) money, the A-bomb project probably cost the equivalent of some US$30 billion and somewhat more once adjusted for recent inflation.  Given the physics and engineering involved, the cost seems not exceptional but remarkably, the development of the best-known component of the bomb's delivery system was more expensive still.  Between the first studies in 1938 and its eventual commissioning in 1944, Boeing's B-29 Superfortress absorbed over three billion dollars even though, unlike the bomb which was revolutionary and startlingly new, conceptually the bomber was an evolution of the existing B-17.  It was however a collection of challenges in engineering which grew in extent and complexity as the project progressed and it was soon realized the initial specifications would need significantly to be upgraded to produce a machine which reliably could carry the desired bomb-load at the necessary altitude over the vast distances missions in the Pacific would demand.
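As a rough check on that arithmetic, a simple CPI-ratio conversion reproduces the order of magnitude; the sketch below is illustrative only and the index values are approximate annual averages assumed for the purpose, not figures from any project accounting:

    # Hypothetical illustration: scale the official US$2 billion figure to
    # pre-pandemic dollars using a consumer price index ratio.
    CPI = {1945: 18.0, 2019: 255.7}  # assumed approximate US CPI-U annual averages

    def adjust(amount, from_year, to_year):
        """Scale a dollar amount by the ratio of the two index readings."""
        return amount * CPI[to_year] / CPI[from_year]

    print(f"US${adjust(2e9, 1945, 2019) / 1e9:.0f} billion")  # ≈ US$28 billion

On those assumed index values the conversion lands a little under the US$30 billion cited above; deflators based on GDP or wages rather than consumer prices produce higher figures.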


The Boeing B-29 (Enola Gay) used to deliver "Little Boy" to Hiroshima.  It was one of the "Silverplate" run which integrated a number of weight-saving measures and aerodynamic improvements as well as the modified bomb-bay.

It was the B-29's engines which were the cause of much of the effort.  Early modelling suggested the use of six or even eight engines was viable in terms of a flyable airframe but the approach would so compromise the range and load capacity as to render the thing useless for the intended purpose, so the four-engine configuration had to be maintained.  Jet engines would have been the answer but at that stage of their development they lacked power and reliability and their fuel consumption was too high, so a new piston engine was needed and that it would need to be of larger capacity was obvious.  However, it needed also to be of a design which didn't significantly increase frontal area so the only solution was effectively to couple two engines, one sitting behind the other.  That delivered the necessary power and the weight increase could be tolerated but the arrangement induced a tendency to overheat because the rearward components received so much less of the cooling air-flow.  What made the consequences of that worse was the use of so much weight-saving but highly combustible magnesium and although ameliorated during development and in service, the inherent problem was never entirely solved; it was only in the post-war years, when a different engine was fitted, that the issue vanished.  As a quirk of history, although now thought of as the A-bomb's delivery system, the B-29 was obviously never designed with the bomb in mind and when the time came, it was found the weapon didn't fit in the bomb-bay.  The Royal Air Force's (RAF) Avro Lancaster could have carried it but the US military declined to consider that option and a special run (the "Silverplates") of B-29s was constructed with the necessary modifications.

The 18-cylinder, two-row Wright R-3350 Duplex-Cyclone radial used in the war-time B-29s (left) and the 28-cylinder four-row Pratt & Whitney R-4360 Wasp Major radial adopted post-war.    

However, although a wartime necessity, the big piston engines were a military cul-de-sac; one innovation in the B-29 which was influential was the use of what would now be understood as a "computer-directed" (not "computer-controlled" as is sometimes stated) fire control system which allowed two crew remotely to operate the four-turret defensive armament.  Systems like that, of which there were a few, were a reason the B-29 venture was so expensive but there have been analysts who have looked at the records of both it and the Manhattan Project and concluded the costs of the latter were probably understated because, as something for years top-secret until the bombing of Hiroshima was announced in August 1945, a significant proportion of the real expenses were charged elsewhere (notably distributed among the military's many other activities) to hide things from congressional view, everyone involved knowing that if something needs to be kept secret, the last people who should be told are politicians.  Estimates of the extent of the accounting sleight-of-hand have varied but it has been suggested it may have been as high as 25%.  In industry, such things are far from unknown.  It's long amused some that the failure of Ford's doomed Edsel (1958-1960) could be attributed to it being little more than a superficial variation of existing Ford & Mercury models (sharing engines, transmissions, platforms & assembly plants) yet when the brand was dumped Ford booked a loss of US$250 million (US$2.6 billion in 2023 dollars).  There were all sorts of corporate advantages in stating the loss as it was done and it involved things like charging the cost of developing one of the engines used wholly against the Edsel programme, even though it would serve in millions of Ford and Mercury models until 1976.

Beware of imitations: The US Boeing B-29 and the Soviet Tupolev Tu-4 clone.

One unintended beneficiary of the huge investment in the B-29 was the Soviet Air Force.  Three B-29s had fallen into Russian hands after emergency landings on Soviet territory and these, despite repeated requests, Moscow declined to return to their rightful owners, instead taking one apart and meticulously, part-by-part, duplicating every piece and from this assembling their own, which was released as the Tupolev Tu-4 (NATO reporting name: Bull).  In production between 1949-1952, the reverse-engineered clone remained on the active list of the Soviet military until 1964 and some were still in service with the Chinese PLA (People's Liberation Army) in 1987.  Although the Tupolev lacked some of the Boeing's advanced electronics, the Russian engineers managed to deliver an aircraft close in weight to the original despite not having access to some of the more exotic metals, although it was later admitted that, to achieve this, there were some compromises in the structural redundancies fitted.

The German V2 (one of the Vergeltungswaffen ("retaliatory weapons" or "reprisal weapons")), the world's first ballistic missile.  As a delivery system, although inaccurate, even in 1945 it would have been effective had a nuclear warhead been available but its small payload limited its application as a strategic weapon and it was able to be produced at scale only because of the use of expendable slave labor.

In a more conventional use of the spoils of war, the Americans were also the beneficiaries of the development of someone else's delivery system.  Nazi Germany's big V2 (A4) rockets were (more-or-less) perfected at a cost which after the war was revealed to be higher even than the official number booked against the Manhattan Project and that was not surprising given it was in its way just as ambitious.  In what was a hastily organised effort, the Allied occupation forces in 1945 rushed to grab as much of the material associated with the V2 as they could lay their hands on, train-loads of components, drawings, machine tools and test rigs sent westward from territory which, under the terms agreed at the Yalta Conference (February 1945), was to be handed to the Russians.  Just as significantly, there was a major round-up of German scientists, engineers and technicians who had worked on the V2, most of whom were anxious to be "rounded-up" by the Americans, the alternative being a career in Russian "employment".  The round-up (Operation Paperclip) remains controversial because matters like a Nazi past or complicity in the use of slave labor were often overlooked if an individual's contribution to the Cold War was thought to be of value but the V2 certainly saved the US from having to spend much money and perhaps a decade or more developing its own delivery system for nuclear warheads; not only were the ICBMs (intercontinental ballistic missiles) lineal V2 descendants, so was the Saturn V delivery system for the Apollo missions which enabled a dozen men to walk on the moon.

Post delivery: Lindsay Lohan's nursery in a theme of aquatic blue & white.

In humans, the female of the species is the final component of the delivery system and on 17 July 2023 Lindsay Lohan announced she had delivered a baby boy, named Luai (an Arabic name which can be translated as “shield” or “protector”).  The child’s career in commerce has already begun, Ms Lohan partnering with Nestig to design not only her nautically-flavored nursery, but also a collection of baby products inspired by the imagery of the sea.  The nursery is a functional space in that the brand’s Wave dresser is adaptable to dual-use as a changing table and Nestig's cloud crib is modular and may later be converted into a toddler bed.

Lindsay Lohan with Nestig Aviator Mobile.  The aviator mobile was said to be “designed in partnership with Lindsay Lohan” and “handcrafted and hand-assembled by artisans in Brazil from wood, felt and locally-sourced wool” each “thoughtfully packaged in a Nestig gift box” (US$85; attachment arm sold separately).

Tuesday, July 12, 2022

Googly

Googly (pronounced goo-glee)

(1) In cricket, a bowled ball that swerves in one direction and breaks in the other; an off-break bowled with a leg break action.  The delivery was once also known as the bosie or bosie-ball and is now commonly referred to as the wrong'un.

(2) Something protruding; exophthalmic (rare).

(3) A slang term for bulging eyes (archaic).

Circa 1880s: Apparently an invention of modern English but the origin is uncertain.  It may have been from the mid-nineteenth century use of goggle (to stare at admiringly or amorously) although google was during the same era used to describe the Adam's apple, derived presumably from the sense of eyes thought similarly protruding, either literally or figuratively, a meaning alluded to by a popular hero in a contemporary comic strip being named “Goo” (suggesting “sentimental, amorous”).  Googly (and a number of variant spellings) does appear around the turn of the twentieth century to have been in common use as an adjective applied to the eyes.  The numerical value of googly in Chaldean numerology is 2 and in Pythagorean numerology, 5.  Googly is a noun & adjective; the noun plural is googlies.

Bernard Bosanquet sending down a googly.

In cricket, a googly is a type of delivery bowled by a right-arm leg spin bowler.  It differs from the classic leg spin delivery in that the ball is propelled from the bowler's hand in a way that, upon hitting the pitch, it deviates in a direction opposite to that which the batter expects (ie towards the leg rather than the off stump).  Usually now called the wrong'un, it was once also called the bosie, the eponym referring to English cricketer Bernard Bosanquet (1877-1936) who is believed to have invented the action.  That the googly is Bosanquet’s creation is probably true in the sense that he was the one who practiced the delivery, learning to control and disguise it so it could be deployed when most useful.  However, cricket being what it is, it’s certain that prior to Bosanquet, many bowlers would occasionally (and perhaps unintentionally) have bowled deliveries that behaved as googlies but, being something difficult to perfect, few would have been able to produce it on demand.  What Bosanquet, uniquely at the time, did was add it to his stock repertoire, which inspired other bowlers to practice it.

The “googly problem” also exists in “twistor theory”, one of the many esoteric branches of theoretical physics understood only by a chosen few.  In twistor theory, the “googly problem” is nerd shorthand for what’s properly known as “the gravitational problem”, an allusion to certain observed behavior differing from that which would be predicted by the mysterious twistor theory, rather the same experience suffered by the batter in cricket who finds his leg stump disturbed by a ball he had every reasonable expectation would harmlessly go through to the keeper down the off side.  As one might expect of a work which involves a "googly problem", the author was an English mathematician, the Nobel laureate in physics, Sir Roger Penrose (b 1931).  It's presumed one of his pleasures has been explaining the googly reference to physicists from places not well acquainted with the charms of cricket.

Bosanquet appears to have perfected his googly between 1897-1901 and, as a noun, the word has been attached to it since shortly afterwards, documented in publications in England, Australia and New Zealand from circa 1903.  However, that was just the point at which a certain delivery, so distinctive as to demand an identifier, came to be regarded as the definitive googly, a word which had long been used to describe cricket balls which behaved erratically off the pitch, a use perhaps based on the long use of “google-eyed” to refer to a surprised expression (ie those of the bamboozled batter).  Some etymologists have pondered whether the construct might thus be the phonetic goo + guile but few have ever supported this.  Googly was by the late-nineteenth century a well-known word used adjectively to describe spin-bowling which proved effective but there’s no suggestion it implied a particular type of delivery.  Googly and the odd variant seem to have been a way of referring to balls which relied on a slow delivery and spin through the air to turn off the pitch as opposed to those bowled fast or at medium pace, using the seam on the ball to achieve any movement.  All the evidence suggests the word was used to convey some unusual behavior.

Match report, day three of the second test of the 1891-1892 Ashes series, published in the Australian Star (Sydney), 1 February 1892.

Here, in the one match report are deliveries described both as being googly (ie used as an adjective) and the googler (a noun) but there’s nothing here or anywhere else to suggest either is anything more specific than a reference to beguiling slow bowling.  Everything suggests the existence of both the noun and adjective was deliberate rather than some sloppy antipodean sub-editing.  Whatever the nuances of Briggs' bowling however, it must have been effective because in the same match he took what was then only the third hat-trick (a bowler taking wickets with three successive balls in the one innings) in Test cricket.  There has been the suggestion the adjectival use (apparently an Australian invention) as "googly ones" was an allusion to the idea of how a cricket ball might behave if egg-shaped, this based on the then widely-used local slang "googie" used for eggs.  Googie was from the Irish and Scottish Gaelic gugaí, gogaí & gogaidh (a nursery word for an egg).  Although wholly speculative, the notion has received support but more popular is the idea it was based on the use of googly in the manner of "googly-eyed", slang for eyes which bulged or were in some way irregular.

Match report of a club game, the Australian Star (Sydney), 1 February 1892.

The report of the test match in 1892 isn’t the first known reference to the googly, the word appearing (again adjectively) in The Leader (Melbourne) on 19 December 1885 in a match report of a club game although, despite noting the bowler’s success in taking two wickets, the writer seemed less than impressed with the bowling.  Although googly is now less common ("wrong'un" often preferred), it is at least gender-neutral and therefore unlikely to fall foul of the language police and be banned; the fate of batsman, fieldsman and all the positional variations of third man (though "silly point" and other "sillies" are still with us).  Nor is there any hint of the ethnic insensitivity which doomed the “chinaman” (a left-arm unorthodox spin), now dumped in the bin of words linked with colonial oppression.

Sunday, April 14, 2024

Legside

Legside (pronounced leg-sahyd)

(1) In the terminology of cricket (also as onside), in conjunction with “offside”, the half of the cricket field behind the batter in their normal batting stance.

(2) In the terminology of horse racing, in conjunction with “offside”, the sides of the horse relative to the rider.

Pre 1800s: The construct was leg + side.  Leg was from the Middle English leg & legge, from the Old Norse leggr (leg, calf, bone of the arm or leg, hollow tube, stalk), from the Proto-Germanic lagjaz & lagwijaz (leg, thigh).  Although the source is uncertain, the Scandinavian forms may have come from a primitive Indo-European root used to mean “to bend” which would likely also have been linked with the Old High German Bein (bone, leg).  It was cognate with the Scots leg (leg), the Icelandic leggur (leg, limb), the Norwegian Bokmål legg (leg), the Norwegian Nynorsk legg (leg), the Swedish lägg (leg, shank, shaft), the Danish læg (leg), the Lombardic lagi (thigh, shank, leg), the Latin lacertus (limb, arm), and the Persian لنگ (leng).  After it entered the language, it mostly displaced the native Old English term sċanca (from which Modern English ultimately gained “shank”) which was probably from a root meaning “crooked” (in the literal sense of “bent” rather than the figurative use of crooked Hillary Clinton).  Side was from the Middle English side, from the Old English sīde (flanks of a person, the long part or aspect of anything), from the Proto-Germanic sīdǭ (side, flank, edge, shore), from the primitive Indo-European sēy- (to send, throw, drop, sow, deposit).  It was cognate with the Saterland Frisian Siede (side), the West Frisian side (side), the Dutch zijde & zij (side), the German Low German Sied (side), the German Seite (side), the Danish & Norwegian side (side) and the Swedish sida (side).  The Proto-Germanic sīdō was productive, being the source also of the Old Saxon sida, the Old Norse siða (flank; side of meat; coast), the Danish & Middle Dutch side, the Old High German sita and the German Seite.  Legside is a noun & adjective.

A cricket field as described with a right-hander at the crease (batting); the batter will be standing with their bat held to the offside (there’s no confusion with the concept of “offside” used in football and the rugby codes because in cricket there’s no such rule).

In cricket, the term “legside” (used also as “leg side” or “on side”) is used to refer to the half of the field corresponding to a batter’s non-dominant hand (viewed from their perspective); the legside can thus be thought of as the half of the ground “behind” the batter while the “offside” is that in front.  This means that what is legside and what is offside is dynamic, depending on whether the batter is left or right-handed and because in a match it’s not unusual for one of each to be batting during an over (the basic component of a match, each over now consisting of six deliveries of the ball directed sequentially at the batters), as they change ends, legside and offside can swap.  This has no practical significance except that many of the fielding positions differ according to whether a left or right-hander is the striker.  That’s not the sole determinant of where a fielding captain will choose to set his field because what’s referred to as a “legside” or “offside” field will often be used in deference to the batter’s tendencies of play.  It is though the main structural component of field settings.  The only exception to this is when cricket is played in unusual conditions such as on the deck of an aircraft carrier (remarkably, it’s been done quite often) but there’s still a legside & offside, shifting as required between port & starboard just as left & right are swapped ashore.

The weird world of cricket's fielding positions.

Quite when legside & offside first came to be used in cricket isn't known but they’ve been part of the terminology of the sport since the rules of the game became formalized when the MCC (Marylebone Cricket Club) first codified the "Laws of Cricket" in what now seems a remarkably slim volume published in 1788, the year following the club’s founding.  There had earlier been rule books, the earliest known to have existed in the 1730s (although no copies appear to have survived) but whether the terms were then in use isn’t known.  What is suspected is legside and offside were borrowed from the turf where, in horse racing jargon, they describe the sides of the horse relative to the rider.  The use of the terms to split the field is reflected also in the names of some of the fielding positions, many of which are self-explanatory while some remain mysterious although presumably they must have seemed a good idea at the time.  One curious survivor of the culture wars which banished "batsman" & "fieldsman" to the shame of being microaggressions is "third man" which continues to be used in the men's game although in women's competition, all seem to have settled on "third", a similar clipping to that which saw "nightwatch" replace "nightwatchman"; third man surely can't last.  The names which follow the dichotomous description of the field (although curiously “leg” is an element of some and “on” of others) include the pairings “silly mid on & silly mid off” and “long on & long off”, while in other cases the “leg” is a modifier, thus “slip & leg slip” and “gully & leg gully”.  Some positions use different terminology depending on which side of the field they’re positioned, “point” on the offside being “square leg” on the other, while fractional variations in positioning mean there is a lexicon of terms such as “deep backward square leg” and “wide long off” (which experts will distinguish from a “wideish long off”).
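The swapping of position names as the striker changes can be captured in a few lines; what follows is a minimal, illustrative sketch only (the function name and the mirror table are assumptions, covering just a handful of the positions named above):

    # Illustrative sketch: mirror a right-hander's fielding-position name
    # for a left-handed striker; the table covers only a few positions.
    MIRROR = {
        "point": "square leg", "square leg": "point",
        "cover": "midwicket", "midwicket": "cover",
        "long off": "long on", "long on": "long off",
    }

    def position_name(right_hander_name, striker="right"):
        """Return the conventional name of the same physical fielding spot
        for the current striker; unlisted positions are left unchanged."""
        if striker == "right":
            return right_hander_name
        return MIRROR.get(right_hander_name, right_hander_name)

    print(position_name("point", striker="left"))  # -> square leg

As the paragraph above notes, the same physical fielder thus alternates between "point" and "square leg" as right- and left-handers swap ends during an over.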

Leg theory

Leg theory was a polite term for what came to be known as the infamous “bodyline” tactic.  In cricket, when bowling, the basic idea is to hit the stumps (the three upright timbers behind the batter), the object being to dislodge the bails (the pair of small wooden pieces which sit in grooves, atop the three).  That done, the batter is “dismissed” and the batting side has to send a replacement, this going on until ten batters have been dismissed, ending the innings.  In essence therefore, the core idea is to aim at the stumps but there are other ways to secure a dismissal such as a shot by the batter being caught on the full by a fielder, thus the attraction of bowling “wide of the off-stump” (the one of the three closest to the off side) to entice the batter to hit a ball in the air to be caught or have one come "off the edge" of the bat to be “caught behind”.  It was realized early on there was little to be gained by bowling down the legside except restricting the scoring because the batter safely could ignore the delivery, content they couldn’t be dismissed LBW (leg before wicket, where but for the intervention of the protective pads on the legs, the ball would have hit the wicket) because, under the rules, if the ball hits the pitch outside the line of the leg stump, the LBW rule can’t be invoked.
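Reduced to its bones, the legside exemption described above is a simple short-circuit in the decision.  Below is a minimal sketch (the parameter names are illustrative assumptions, and the real Law 36 adds further conditions, notably for balls struck outside the off stump when no shot is offered):

    # Simplified sketch of the LBW logic discussed above; not the full Law 36.
    def lbw_possible(pitched, struck_in_line, would_hit_stumps):
        """'pitched' is where the ball first bounced relative to the
        stumps: 'outside leg', 'in line' or 'outside off'."""
        if pitched == "outside leg":
            return False  # the legside exemption: never out, whatever follows
        return struck_in_line and would_hit_stumps

    print(lbw_possible("outside leg", True, True))  # -> False

This is why legside lines were long seen as merely "negative" bowling: one whole mode of dismissal is simply unavailable there.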

A batter can however be caught from a legside delivery and as early as the nineteenth century this was known as leg theory, practiced mostly by the slow bowlers who relied on flight in the air and spin off the pitch to beguile the batter.  Many had some success with the approach, the batters unable to resist the temptation of playing a shot to the legside field where the fielders tended often to be fewer.  On the slower, damper pitches of places like England or New Zealand, the technique offered little prospect for the fast bowlers who were usually more effective the faster they bowled but on the generally fast, true decks in Australia, there was an opportunity because a fast, short-pitched (one which hits the pitch in the bowler's half before rearing up towards the batter) delivery with a legside line would, disconcertingly, travel at upwards of 90 mph (145 km/h) towards the batter’s head.  The idea was that in attempting to avoid injury by fending off the ball with the bat, the batter would be dismissed, caught by one of the many fielders “packed” on the legside, the other component of leg theory.

Leg theory: Lindsay Lohan’s legs.

For this reason it came to be called “fast leg theory” and it was used off and on by many sides (in Australia and England) during the 1920s but it gained its infamy (and the more evocative “bodyline” label) during the MCC’s (the designation touring England teams used until the 1970s) 1932-1933 Ashes tour of Australia.  Adopted as a tactic against the Australian batter Donald Bradman (1908–2001) against whom nothing else seemed effective (the English noting on the 1930 tour of England he’d once scored 300 runs in a day off his own bat at Leeds), bodyline became controversial after a number of batters were struck high on the body, one suffering a skull fracture (this an era in which helmets and other upper-body protection were unknown).  Such was the reaction that the matter became a diplomatic incident, discussed by the respective cabinets in London and Canberra while acerbic cables were exchanged between the ACBC (Australian Cricket Board of Control) and the MCC.

Japanese leg theory: Zettai ryōiki (絶対領域) is a Japanese term which translates literally as “absolute territory” and is used variously in anime, gaming and the surrounding cultural milieu.  In fashion, it refers to that area of visible bare skin above the socks (classically the above-the-knee variety) but below the hemline of a miniskirt, shorts or top.

Japanese schoolgirls, long the trend-setters of the nation's fashions, like to pair zettai ryōiki with solid, fluffy leg warmers (also called "plushies").  So influential are they that the roaming pack in this image, although they've picked up the aesthetic, are not actually real schoolgirls.  So, beware of imitations: Tokyo, April 2024.

High-level interventions calmed things sufficiently for the tour to continue and it ended with the tourists winning the series (and thus the Ashes) 4-1.  The tour remains the high-water mark of fast leg theory because although it continued to be used when conditions were suitable, its effectiveness was stunted by batters adjusting their techniques and, later in the decade, the MCC updated their rule book explicitly to proscribe “direct attack” bowling (ie deliveries designed to hit the batter rather than the stumps), leaving the judgment of what constituted that to the umpires.  Although unrelated, being an attempt to counter the “negative” legside techniques which had evolved in the 1950s to limit scoring, further rule changes in 1957 banned the placement of more than two fielders behind square on the leg side, thus rendering impossible the setting of a leg theory field.  Despite all this, what came to be called “intimidatory short pitched bowling” continued, one of the reasons helmets began to appear in the 1970s and the rule which now applies is that only one such delivery is permitted per over.  It has never been a matter entirely about sportsmanship and within the past decade, the Australian test player Phillip Hughes (1988-2014) was killed when struck on the neck (while wearing a helmet) by a short-pitched delivery which severed an artery.

Tuesday, April 25, 2023

Breadvan

Breadvan (pronounced bred-vann)

(1) A delivery vehicle adapted for carrying loaves of bread or other bakery items for delivery to retail outlets, hotels, cafés etc.

(2) As the Ferrari 250 GT SWB “Breadvan”, a one-off vehicle produced in 1962.

(3) As a descriptor of the “breadvan” style applied to the rear of vehicles to seek aerodynamic advantages in competition, variously applied, mostly during the 1960s.

1840s: The construct was bread + van.  Bread was from the Middle English bred & breed (kind of food made from flour or the meal of some grain, kneaded into a dough, fermented, and baked), from the Old English brēad (fragment, bit, morsel, crumb), from the Proto-West Germanic braud, from the Proto-Germanic braudą (cooked food, leavened bread), from the primitive Indo-European berw- & brew- (to boil, to seethe).  Etymologists note also the Proto-Germanic braudaz & brauþaz (broken piece, fragment), from the primitive Indo-European bera- (to split, beat, hew, struggle) and suggest bread may have been a conflation of both influences.  It was cognate with the Old Norse brauð (bread), the Old Frisian brad (bread), the Middle Dutch brot (bread) and the German Brot (bread), the Scots breid (bread), the Saterland Frisian Brad (bread), the West Frisian brea (bread), the Dutch brood (bread), the Danish & Norwegian brød (bread), the Swedish bröd (bread), the Icelandic brauð (bread), the Albanian brydh (I make crumbly, friable, soft) and the Latin frustum (crumb).  It displaced the non-native Middle English payn (bread), from the Old French pain (bread), having in the twelfth century replaced the usual Old English word for "bread", which was hlaf.  Van was short for caravan, from the Middle French caravane, from the French caravane, from the Old French carvane & carevane (or the Medieval Latin caravana), from the Persian کاروان‎ (kârvân), from the Middle Persian kārawān (group of desert travelers), from the Old Persian, ultimately from the primitive Indo-European ker- (army).  Most famously, the word was used to designate a group of people travelling by camel or horse on the variety of routes referred to as the Silk Road and it reached the West after being picked up during the Crusades, from the Persian forms via the Arabic qairawan and connected ultimately to the Sanskrit karabhah (camel).  Breadvan (also as bread-van) is a noun; the noun plural is breadvans.

Horse-drawn breadvan.

The breadvans were at first horse-drawn and came into use in the 1840s.  These vans were used to transport freshly baked bread from bakeries to homes and businesses in cities and towns in the UK, Europe, North America and Australasia.  The breadvan was adopted as an efficiency measure, being a significant improvement on the traditional system of delivery which was usually a “baker’s boy” carrying baskets of bread on foot to customers.  The horse-drawn breadvans allowed bakers to increase production and expand their customer base.  The construction of the vehicles was not particularly specialized but they did need to be (1) waterproof to protect the goods from the elements and (2) secure enough that the bread wasn't accessible to opportunistic birds, dogs or thieves.

Breadvan: Morris Commercial J-type (1949-1961)

The Breadvans

Enzo Ferrari (1898-1988) is famous for the cars which carry his name but his imperious attitude to customers and employees alike led to a number of them storming out of Maranello and creating their own machines.  Some, like Lamborghini, survived through many ups & downs, others, like Bizzarrini, survived for a while and ATS (Automobili Turismo Sport) produced a dozen exquisite creations before succumbing to commercial reality.  Another curious product of a dispute with il Commendatore was the Ferrari 250 GT SWB “Breadvan”.

Ferrari 250 GT SWB “Breadvan”.

Ferrari’s 250 GTO became available to customers in 1962 and one with his name on the list was Count Giovanni Volpi di Misurata (b 1938), principal of the Scuderia Serenissima Republica di Venezia (ssR) operation but when Enzo Ferrari found out Volpi was one of the financial backers of the ATS project, he scratched the count’s name from the order book.  Through the back-channel deals which characterize Italian commerce, Volpi did obtain a GTO but, wishing to make a point, he decided to create something even better for his team’s assault on the 1962 Le Mans 24 hour endurance classic.  In the ssR stable was what was reputed to be the world’s fastest Ferrari 250 GT SWB (serial number 2819 GT) and it was decided to update this to a specification beyond even that of the GTO, a task entrusted to Giotto Bizzarrini (1926-2023) who, with remarkable alacrity, completed it in the workshops of noted coachbuilder Piero Drogo (1926–1973).

The Breadvan leading a 250 GT SWB, the car on which it was based.

The changes were actually quite radical.  To obtain the ideal centre of gravity, the 3.0 litre (180 cubic inch) V12 engine was shifted 4¾ inches (120 mm) rearward, sitting entirely behind the front axle and mounted lower, something permitted by the installation of a dry sump lubrication system.  Emulating the factory GTO, six downdraft, twin choke Weber carburetors sat atop the inlet manifold, the tuned engine generating a healthy 300 bhp.  Given the re-engineering, on paper the car was at least a match for the GTO except it lacked the factory machine’s five-speed gearbox, running instead the standard four-speed.  What really caught the eye however was Bizzarrini’s striking bodywork, the sharp nose so low a Perspex cover had to be fabricated to shield the cluster of a dozen velocity stacks of the Webers which protruded above the bonnet-line.  Most extraordinary however was the roofline which extended from the top of the windscreen to the rear where it was sharply cut off to create what remains perhaps the most extreme Kamm-tail ever executed.

The count was impressed, the creation matching the GTO for power while being 100 kg (220 lb) lighter and aerodynamically more efficient.  Accordingly he included it in the three-car Ferrari team he assembled for Le Mans along with the GTO and a 250 TR/61 and its appearance caused a sensation, the French dubbing it la camionnette (little truck) but it was the English nickname “Breadvan” which really caught on.  To a degree the count proved his point.  Under pressure from Ferrari the organizers forced the Breadvan to run in the prototype class against pure racing cars rather than against the GTOs in the granturismo category but in the race, it outpaced the whole GT field until, after four hours, a broken driveshaft forced its retirement.  It was campaigned four times more in the season, scoring two GT class victories and a class track record before being retired.  Volpi sold the car in 1965 for US$2,800 and its current value is estimated to be around US$30 million.  It remains a popular competitor on the historic racing circuit.

1965 Ford GT40 Mark I (left) 1966 Ford J Car (centre) & 1967 Ford GT40 Mark IV (right).

As well as the ATS, Lamborghinis and Bizzarrinis, Enzo Ferrari’s attitude to those who disagreed with him also begat the Ford GT40, a well-known tale recounted in the recent film Ford v Ferrari (20th Century Fox, 2019).  In 7.0 litre (427 cubic inch) form, the GT40 Mark II won at Le Mans in 1966 but Ford’s engineers were aware the thing was overweight and lacked the aerodynamic efficiency of the latest designs so embarked on a development program, naming the project the “J Car”, an allusion to the “Appendix J” regulations (one of the FIA’s few genuinely good ideas) with which it conformed.  The aluminum honeycomb chassis was commendably light and, noting the speed advantage gained by the Ferrari Breadvan in 1962, a similar rear section was fabricated, testing confirming the reduction in drag.  Unfortunately, the additional speed it enabled exposed the limitation of the breadvan lines: above a certain speed the large, flat surface acted like an aircraft’s wing and the ensuing lift provoked lethal instability; in one fatal crash during testing, the lightweight chassis also proved fragile.  The “J Car” and its breadvan were thus abandoned and a more conventional approach was taken for both the chassis and body of the GT40 Mark IV and it proved successful, in 1967 gaining the second of Ford’s four successive victories at Le Mans (1966-1969).

Ford Anglia, Rallye Monte-Carlo, 1962 (left) & Ford Anglia "breadvans" built for New Zealand Allcomer racing during the 1960s. 

Marketing opportunity for niche players: A Lindsay Lohan breadvan, Reykjavik, Iceland.

In the US, Ford spent millions of dollars on the GT40’s abortive breadvan but in New Zealand, it's doubtful the amateur racers in the popular “Allcomers” category ventured very far into three figures in the development of their “breadvans”.  In the Allcomer category, the “breadvans” (again a nod to Count Volpi’s 1962 Ferrari) were Ford Anglias with a rear section modified to gain some aerodynamic advantage.  The English Ford Anglia (1959-1968) had an unusual reverse-angle rear window, a design chosen to optimize the headroom for back-seat passengers.  That it did but it also induced some additional drag which, while of no great consequence at the speeds attained on public roads, did compromise the top speed, something of great concern to those who found the little machines otherwise ideal for racing.  In the spirit of improvisation for which New Zealand Allcomer racing was renowned, “breadvans” soon proliferated, fabricated variously from fibreglass, aluminum or steel (and reputedly even papier-mâché although that may be apocryphal) and the approach was successful, Anglias remaining competitive in some forms of racing well into the 1970s by which time some had, improbably, been re-powered with V8 engines.

Wednesday, March 9, 2022

Mankading

Mankading (pronounced man-kad-ing)

In cricket, the action of the bowler, during his delivery, effecting a run-out of the non-striking batsman.

1947: Named after Indian all-rounder Mulvantrai Himmatlal Mankad (1917-1978 and usually styled Vinoo Mankad) who ran out Australian batsman Bill Brown (1912-2008) on 13 December 1947 in the second test (Australia v India, Sydney).

Vinoo Mankad, Lord's, 1952

The mechanics of a Mankad are these: as a bowler enters his delivery stride, the non-striking batsman usually leaves the crease and moves towards the other end of the wicket, meaning it will take him less time to reach the other end if he and his batting partner choose to attempt a run.  If the non-striking batsman leaves the crease before the bowler has actually delivered the ball, the bowler may, rather than bowling the ball to the batsman on strike, use the ball to dislodge the bails at the non-striker’s end, thereby running-out the non-striker (said to be "out of his ground").  A long-standing convention is that, on the first instance of the bowler noticing it, he should warn the batsman to return behind the crease.  This has always been understood as a convention; nowhere is it mentioned either in the Laws of the Game of the ICC (the International Cricket Council, the old Imperial Cricket Conference) or in the MCC’s (Marylebone Cricket Club) guidance notes on the spirit of cricket.

Mankad’s example of this method of dismissal became so famous as to become eponymous.  During the second Australia v India test at the Sydney Cricket Ground (SCG), on 13 December 1947, Mankad ran out Bill Brown, the second time the bowler had dismissed the batsman in this fashion on that tour, having done it in an earlier match against an Australian XI and on that occasion he had first warned Brown.  There was, at the time, some unfavorable comment in the press suggesting bad sportsmanship but most, including the Australian captain, Sir Donald Bradman (1908-2001), defended Mankad and there seems to be a consensus that, given the history and having been warned, Brown was a bit of a dill for trying it on again; even Brown agreed.  Since this incident, a batsman dismissed in this fashion is said to have been "Mankaded".

It’s since been a troublesome thing although the ICC has made attempts to clarify matters, essentially by defining when the bowler may Mankad.  By 2011 rule 42.11 read: "The bowler is permitted, before releasing the ball and provided he has not completed his usual delivery swing, to attempt to run out the non-striker.  Whether the attempt is successful or not, the ball shall not count as one of the over.  If the bowler fails in an attempt to run out the non-striker, the umpire shall call and signal dead ball as soon as possible."  Mysteriously to some, but very much in the tradition of cricket, under the ICC's rules at this time the Mankad remained defined both as "lawful" and "unfair", which of course favored the bowler and in 2014, the World Cricket Council, an independent consultative body of former international captains and umpires, commenting on the issue, unanimously expressed a lack of sympathy with the batsman.  The Laws of Cricket were reissued in October 2017 with the relevant clause renumbered 41.16, permitting Mankading up to "…the instant when the bowler would normally have been expected to release the ball".

The latest (and presumably the last) attempt came as part of a new set of laws announced by the MCC in March 2022, to take effect in October.  Probably reflecting the reality imposed by 20/20 cricket in which margins tend to be tight and risky runs are accepted as an essential part of the game, Mankad dismissals will no longer be considered unfair play.  In the 20/20 game, Mankading has been far from the rare thing it remains in the longer forms; in that fast and furious world, a concept like "unfair" must seem something from a vanished world.  This the MCC seems to accept, explaining the rule revision by saying the Mankad "...is a legitimate way to dismiss someone and it is the non-striker who is stealing the ground. It is legitimate, it is a run-out and therefore it should live in the run-out section of the laws."

That should be the end of what has for decades been controversial: something within the rules but thought not in the spirit of the game.  That's always been explained by "unfairness" in this context being something subjective, the argument being that if a non-striker was out of his ground by an inch or two, then to Mankad him was unfair whereas if he's blatantly cheating by being several feet down the pitch, then (especially if he's already been warned), the Mankad is fair enough.  One can see the charm of that approach but the inherent problem always was where to draw the line and the ICC has finally removed all doubt: while the ball is in play, if the bails are dislodged by the ball and the batsman is not behind his crease, he will be given out.  The Mankad is now just another run-out.
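Under the 2022 reading, the decision thus reduces to the same test as any other run-out.  As a minimal sketch (the parameter names are assumptions for illustration, not the MCC's wording):

    # Illustrative sketch of the post-2022 position: the Mankad is just a
    # run-out, so no warning and no "spirit of the game" test applies.
    def mankad_out(ball_in_play, bails_dislodged_by_ball, non_striker_in_ground):
        """True if the non-striker is run out at the bowler's end."""
        return ball_in_play and bails_dislodged_by_ball and not non_striker_in_ground

    print(mankad_out(True, True, False))  # -> True: out, warned or not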

The four instances of Mankading in test cricket

Bill Brown by Vinoo Mankad, Australia v India, Sydney, 1947–1948

Ian Redpath by Charlie Griffith, Australia v West Indies, Adelaide, 1968–1969

Derek Randall by Ewen Chatfield, England v New Zealand, Christchurch, 1977–1978

Sikander Bakht by Alan Hurst, Pakistan v Australia, Perth, 1978–1979

Thursday, December 8, 2022

Bollard

Bollard (pronounced bol-erd)

(1) In nautical use, a thick, low post, usually of iron or steel, mounted either on the deck of a ship or on a wharf or the like, to which mooring lines from vessels are attached.

(2) Any small post to which lines, ropes etc are attached.

(3) A short post or block, usually deployed in an array and designed to exclude or divert motor vehicles from a road, lawn, pedestrian space etc, either as part of routine traffic management or as a security or anti-terrorism device (can be permanent or temporary).

(4) In mountaineering, an outcrop of rock or pillar of ice that may be used to belay a rope.

1300s: From the Middle English bollard, the construct probably being the Middle English bole (tree trunk) + -ard.  Bole was a mix of the Old English bula & bulla and the Old Norse boli, both from the Proto-Germanic bulô, from the primitive Indo-European bhel (to blow, swell up).  The –ard suffix was from the Middle English -ard, from the Old French -ard (suffix), from the Frankish -hard (hardy, bold), from the Proto-Germanic harduz (hard); it was used as a pejorative or diminutive suffix.  In 1844 it came to be used (first in the merchant marine, later by the Admiralty) to describe the strong, upright posts built into docks for fixing hawsers for mooring ships and after 1948 it began to be used in reference to the traffic control devices.  By the late 1950s, it was the word of choice to describe any upright device used either as a tethering point for ropes and cords or to restrict or direct vehicular or other traffic.  The security bollard (constructions in concrete or metal sufficiently large to prevent a vehicle from passing) began to appear in numbers as early as the 1940s although the specific phrase wasn’t in wide use until the 1980s in response to the increasing use of cars and trucks as delivery systems for large improvised explosive devices (IED).  Other derived terms include traffic bollard (a conical plastic device in distinctive colors used temporarily to divert motor vehicle traffic or to surround obstacles or dangerous sites) and bollard condition (the state of a ship with a propeller operating only to the extent of permitting near zero-speed maneuvering when moored).  Bollard is a noun; the noun plural is bollards.

Pedestrian crossing in Pompeii, Italy.  Most of the city was buried under volcanic ash and pumice when Mount Vesuvius erupted in 79 AD.

Structurally, bollards were part of Roman urban roadways although the function was different.  Roman pedestrian crossings used essentially the same design as today's zebra crossings except what we see as the white lines were elevated slabs of granite, allowing people to cross the road without their feet having to touch the mud and muck (Roman sewerage systems, though advanced by the standards of Antiquity, were neither as extensive nor as reliable as modern systems) which would often sit or flow through the streets; horse manure was ever-present.  The gap between the slabs was such that the wheels of horse-drawn or hand carts would fit between them.

Arco di Settimio Severo (1742), oil on canvas by Canaletto (Giovanni Antonio Canal, 1697–1768), a significant painter of the eighteenth century Venetian school.

Built in the imperial capital in the traditional Roman way with travertine, white marble & brick, held together using concrete mixed with their famously sticky cement, the Arco di Settimio Severo (Arch of Septimius Severus) sits at one end of the Roman Forum and was dedicated in 203 to commemorate the capture of Ctesiphon in 198 by Lucius Septimius Severus (145–211; Roman emperor 193-211).  Like many dictators (ancient & modern), Lucius had a fondness for triumphal architecture into which his name could be carved, another Arch of Septimius Severus being built in Leptis Magna, his city of birth (in present-day Libya).  Discovered as a ruin in 1928, it was re-constructed by Italian colonial archeologists and architects.

It’s not known when the line of bollards (visible through the arch in Canaletto’s painting of 1742) was installed although it’s more likely to be an aspect of Renaissance town planning than anything Medieval.  Although speculative, it’s thought the spacing between the bollards indicates the intention was to deny access to heavy traffic (ie anything horse-drawn) while permitting hand carts (an essential part of the home-delivery economy) to pass.

The catwalk re-imagined: Lindsay Lohan walking between the bollards for one of her well-publicized (and not infrequent) court appearances during her "troubled Hollywood starlet" phase, Los Angeles, February 2011.  The legal matters involved set no precedents and it was in that sense not a notable case but the white piece was a Glavis Albino bandage dress from Kimberly Ovitz's (b 1983) pre-fall collection (which listed at US$575) and almost as soon as the photographs appeared on-line, it sold out so there was that.  Here the bollards are used as stringers for the yellow plastic "Police Line: DO NOT CROSS" tape and the same function is served by the stanchions used for the velvet ropes which define the limits for photographers at red-carpet events.

Dealing with terrorism is of necessity a reactive business and in Western cities, bollards sometimes appeared within hours of news of the use of motor vehicles somewhere as an instrument of murder, either as a delivery system for explosives or as a brute-force device to run down pedestrians.  Because of the haste with which the things were deemed needed, it wasn’t uncommon for bollards initially to be nothing but re-purposed concrete blocks (left), often not even painted, the stark functionality of purpose limited to preventing vehicular access while permitting those on foot to pass with minimal disruption.  They’ve since become a fixture in the built environment, often in stylized shapes (centre & right) and urban designers have been inventive, many objects which function as bollards not being recognizably bollardish, instead integrated into structures such as city furniture or bus shelters.

LEDA Security's rendering of some of the possibilities of bollards as engineered street furniture. Where the space is available, even small green spaces can be installed and, with integrated drip-feed irrigation systems, maintenance is low.  It was beneath one of these installations Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022) was filmed while sprawled on the ground, conducting a late-night, profanity-laced (though quite friendly) telephone conversation with his (second) wife, the mother of two of his six children.  It was later confirmed Mr Joyce had been drinking.

Hard-working bollards doing their job at the liquor store.