
Wednesday, October 9, 2024

Decker

Decker (pronounced dek-er)

(1) Something (typically a bus, ship, aircraft, bed, sandwich et al) having a specified number of decks, floors, levels, layers and such; used usually in combination with a numerical or other expression indicating the number in the construction (double decker, triple decker, upper decker, five decker etc), sometimes hyphenated.

(2) As “table decker” an employee who “decks” (ie sets or adorns) a table used for entertaining (used also as a “coverer”) (archaic).  The idea lives on in the verb “bedeck” (to adorn).

(3) In boxing slang, a fighter with a famously powerful punch, able to “deck” an opponent (ie knock them to the canvas with a single punch).

(4) In historic naval slang, as “quarter-decker”, a label applied to officers known more for their attention to matters of etiquette or trivial regulations than competent seamanship or ability in battle.  It was an allusion to a warship’s “quarter deck” (the part of the spar-deck of a man-of-war (warship) between the poop deck and main-mast (and originally (dating from the 1620s), a smaller deck above the half-deck, covering about a quarter of the vessel’s LOA (length overall)).  In many navies, the quarter-deck was reserved as “a promenade for officers only”.

1785–1795: The construct was deck + -er.  Deck in this context was from the Middle English dekke (covering extending from side to side over part of a ship), from a nautical use of the Middle Dutch decke & dec (roof, covering), from the Middle Dutch decken, from the Proto-Germanic thakam (source also of the noun “thatch”), from the primitive Indo-European root steg & teg- (to cover); it was related also to the Old Dutch thecken, from the Proto-West Germanic þakkjan, from the Proto-Germanic þakjaną, and to the German Decke (covering, blanket).  The –er suffix was from the Middle English –er & -ere, from the Old English -ere, from the Proto-Germanic -ārijaz, thought most likely to have been borrowed from the Latin –ārius where, as a suffix, it was used to form adjectives from nouns or numerals.  In English, the –er suffix, when added to a verb, created an agent noun: the person or thing doing the action indicated by the root verb.  The use in English was reinforced by the synonymous but unrelated Old French –or & -eor (the Anglo-Norman variant -our), from the Latin -ātor & -tor, from the primitive Indo-European -tōr.  When appended to a noun, it created a noun denoting an occupation or describing the person whose occupation is the noun.  The noun double-decker was first used in 1835 of ships with two decks above the water line and this extended to land transport (trains) in 1867.  Decker is a noun & adjective; the noun plural is deckers.

Flight deck of the US Navy's Nimitz-class aircraft carrier USS Carl Vinson (CVN 70).

The reason ships, trains, buses, aircraft and such have “decks” while buildings have “floors” or “stories (or storeys)” is traceable to nautical history and the nomenclature used in shipbuilding.  English picked up “deck” from the Middle Dutch decke & dec (roof, covering) where the use had been influenced by the Old Norse þekja (to cover) and in early shipbuilding, a “deck” was the structure which covered the hull of the ship, providing both a horizontal “working surface” and enclosing the vessel, creating a space for stores, cargo or accommodation which was protected from the elements.  In that sense the first nautical decks acted as a “roof”.  As ships became larger, naval architects began to include multiple decks, analogous with the floors of buildings in that they fulfilled a similar function, providing segregated layers (ie the storeys in buildings) used for cannons, crew quarters, storage and such.  As the terminology of shipbuilding became standardized, each deck came to have a specific name depending on its purpose or position (main deck, flight deck, poop deck, gun deck etc).

Ford Mustang convertible (1965–1973) replacement floor pan (complete, part number 3648B) by Moonlight Drive Sheet Metal.

Until the nineteenth century, although the vehicles used on land became larger, they tended to get longer rather than higher, but the advent of steam propulsion made possible trains which ran on railways and these could pull carriages carrying freight or passengers.  The first “double decker” versions appeared in France in 1867; described as voitures à impériale (imperial cars), they were used on the Chemin de Fer de l'Ouest (Western Railway), the upper deck roofless and thus an “open-air experience”.  Rapidly the idea spread and double-deck carriages became common for both long-distance and commuter services.  An outlier in the terminology is car design; cars have a floor (sometimes called the “floor pan”) rather than a deck, presumably because there’s only ever one.  In the narrow technical sense there have been cars with “two floors” but they were better understood as having a “double-skinned” single floor, used for armor or to provide a space for something specialized such as hydrogen fuel-cells, the technique often called “sandwich construction”.

Boeing 314 Clipper flying boat cutaway (left) and front schematics of Boeing 747-300 (right).  Re-using elements of an earlier design for a bomber which had failed to meet the military’s performance criteria, Boeing between 1938-1941 built twelve 314 Clippers, long-range flying boats with the range to cross both the Atlantic and Pacific oceans.  Although used by the military during World War II, most of their service was with the two commercial operators Pan Am (originally Pan American Airways) and BOAC (British Overseas Airways Corporation).  Very much machines of the pre-war age, the last Clippers were retired from service between 1946-1948, the advances in aviation and the ground infrastructure built during war-time rendering them obsolete and prohibitively expensive to maintain.

Because train designers adopted the nautical terminology, it naturally came to be used also in buses and aircraft, the term “flight deck” (where the pilot(s) sit) common even before multiple decks appeared on flying boats and other long-distance airframes.  The famous “bubble” of the Boeing 747 (1968-2023) remains one of the best known decks and, although most associated with the glamour of first-class international travel, was designed originally as a freight compartment.  The multi-deck evolution continued and the Airbus A380 (2005-2021) was the first “double decker” with two passenger decks extending the full length of the fuselage, cargo & baggage carried in the space beneath, hence the frequent description of the thing as a “triple decker”.

Lindsay Lohan contemplating a three decker sandwich, now usually called a “club sandwich”.  Many menus do specify the number of decks in their clubs.

Deck came widely to be used of any raised flat surface on which people could walk or stand (balcony, porch, patio, flat rooftop etc) and of the floor-like covering of the horizontal sections or compartments of a ship, a use later extended to land transport (trains, buses etc) and, in the twentieth century, to aircraft.  A pack or set of playing cards can be called a deck as (less commonly) can the dealt cards which constitute the “hand” of each player, and the notion was extended to sets of just about anything vaguely similar (such as a collection of photographic slides).  Because slides tended to be called a “deck” only when in their magazine, this influenced the later use in IT when certain objects digitally were assembled for storage or use, and in audio and video use when cartridges or cassettes were loaded into “tape decks”.  In print journalism, a deck is a headline consisting of one or more full lines of text (applied especially to a sub-headline).  The slang use in the trade of illicit narcotics to describe the folded paper used for distributing drugs was a US regionalism.  There are dozens of idiomatic and other uses of deck, the best known including “all hands on deck”, “swab the decks”, “hit the deck”, “clear the decks”, “deck-chair”, “deckhand”, “deck shoes”, “flight deck”, “gun deck”, “observation deck”, “play with a full deck”, “promenade deck”, “re-arranging the deck chairs on the Titanic”, “decked out”, “stack the deck”, “sun deck”, “top deck” & “to deck someone”.

Schematic of the Royal Navy’s HMS Victory, a 104-gun first-rate ship of the line, laid down in 1759 and launched in 1765, most famous as Admiral Lord Nelson’s (1758-1805) flagship at the Battle of Trafalgar on 21 October 1805; it was on her deck that Nelson was killed in battle.  Uniquely, after 246 years on the active list, she is the world's oldest naval vessel still in commission.  Although the term wasn’t in use until the 1830s, Victory was a “five decker”, configured thus:

Orlop Deck: The lowest deck, mainly used for storage and ship's equipment.
Lower Gun Deck: The deck housing the heaviest cannons.
Middle Gun Deck: This deck contained another set of guns, slightly lighter than those on the lower gun deck.
Upper Gun Deck: The third level of guns, with even lighter cannons.
Quarterdeck and Forecastle: The uppermost decks, where the captain and officers usually directed the ship during battle.

The early meanings in English evolved from “covering” to “platform of a ship” because of the visual similarity and it’s thought the idea of a deck being a “pack of cards” (noted in the 1590s) was based on them being stacked like the decks of a multi-deck man-of-war (warship).  The tape-deck was first so described in 1949 and was a reference to the flat surface of the old reel-to-reel tape recorders.  The first deck chairs were advertised in 1844, an allusion to the use of such things on the decks of passenger ocean liners, and deck shoes were those with sturdy rubber soles suitable for use on slippery surfaces; the modern “boat shoes” are a descendant.  The old admiralty phrase “clear the decks” dated from the days of the tall-masted warships (the best known of which was the big “ship-of-the-line”) and was a reference to the need to remove from the main deck the wreckage resulting from an attack (dislodged masts, sails, spars etc) to enable the battle to be rejoined without the obstructions.  Being made of wood, the ships were hard to sink but highly susceptible to damage, especially to the rigging which, upon fragmentation, tended to fall to the deck.  It may have been an adaptation of the French army slang débarrasser le pont (clear the bridge).

Ford 302 cubic inch (4.9 litre) Windsor V8 with the standard deck (left) and the raised deck 351 (5.8) (right).  In production in various displacements between 1961-2000, the 221 (3.6), 255 (4.2), 260 (4.3), 289 (4.7) & 302 (4.9) all used what came retrospectively to be called the “standard deck” while the 351 (5.8) was the sole “raised deck” version.

For decades, it was common for US manufacturers to increase the displacement of their V8 engines by means of creating a “raised deck” version, the process involving raising the height of the engine block's deck surface (the surface where the cylinder heads bolt on).  What this allowed was the use of longer connecting rods while retaining the original heads and pistons which, in combination with a longer-stroke crankshaft, increased the displacement (the aggregate volume of all cylinders).  The industry slang for such things was “decker” and the technique was used with other block configurations but is best known from the use in the 1960s & 1970s for V8s because it’s those which tend to be fetishized.  The path to greater displacement lay either in lengthening the stroke or increasing the bore (or a combination of the two) and while there were general engineering principles (longer stroke = emphasis on more torque at the cost of reducing maximum engine speed; bigger bore = more power and higher engine speeds), there were limitations in how much a bore could safely be increased, including the available metal.  A bigger bore (ie increasing the internal diameter of the cylinder) reduces the thickness of the cylinder walls and if they become too thin, there can be problems with cooling, durability or even the structural integrity of the block.  The piston size also increases, which means the weight and thus the reciprocating mass increase, adding friction and wear and potentially compromising reliability, especially at high engine speeds.

Increasing the stroke will usually enhance the torque output, something of greater benefit to most drivers, most of the time, than the “top end power” most characteristic of the “big bore” approach.  In street use, most engines spend most time at low or mid-range speed and it’s here a longer stroke tends to produce more torque, so it has been a popular approach; the advantage for manufacturers is that creating a “decker” almost always is easier, faster and cheaper than arranging one which will tolerate a bigger bore, something which can demand a new block casting and sometimes changes to the physical assembly line.  With a raised deck, there can be the need to use different intake and exhaust manifolds and some other peripheral components but it’s still usually a cheaper solution than a new block casting.  Ford’s “thinwall” Windsor V8 was one of the longest-serving deckers (although the raised-deck version didn’t see out the platform’s life, the 351 (introduced in 1969) being retired in 1997).  Confusingly, during the Windsor era, Ford also produced other 351s which belonged to a different engine family.  Ford didn’t acknowledge the biggest Windsor's raised deck in its designation but when Chrysler released a decker version of the “B Series” big-block V8 (1958-1978), it was designated “RB” (Raised B) and produced between 1959-1979.
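The arithmetic behind the stroke-versus-bore trade-off is simple enough to sketch: displacement is the swept volume of one cylinder (π/4 × bore² × stroke) multiplied by the number of cylinders.  A minimal illustration in Python, using the commonly cited Windsor dimensions (a 4.00 inch bore for both engines, the stroke lengthened from 3.00 to 3.50 inches for the raised-deck unit), shows how the longer stroke alone accounts for the jump from 302 to 351 cubic inches:

import math

def displacement_cubic_inches(bore, stroke, cylinders=8):
    # Swept volume of one cylinder multiplied by the cylinder count.
    return (math.pi / 4) * bore ** 2 * stroke * cylinders

print(round(displacement_cubic_inches(4.00, 3.00), 1))  # ~301.6 cu in, marketed as the 302 (standard deck)
print(round(displacement_cubic_inches(4.00, 3.50), 1))  # ~351.9 cu in, marketed as the 351 (raised deck)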

1964 AEC Routemaster double decker Bus RM1941 (ALD941B) (left), two sightseeing AEC Routemasters in Christchurch, New Zealand (centre) and one of the "new" Routemasters, London 2023 (right).

London’s red, double-decker buses are one of the symbols most associated with the city and a fixture in literature, art and films needing something with which to convey the verisimilitude.  The classic example of the breed was the long-running AEC Routemaster, designed by the London Transport Board and built by the Associated Equipment Company (AEC) and Park Royal Vehicles.  The Routemaster entered service in 1956 and remained in production until 1968, changed over those years in many details but visually there was such continuity that it takes an expert (and buses are a thing so experts there are) to pick the model year.  They remained in regular service until 2005 although some were retained as “nostalgia pieces” on designated “tourist” routes until COVID-19 finally saw their retirement; since then, many have been repurposed for service around the world on sightseeing duties and other tourist projects.

Boris Johnson (b 1964; UK prime-minister 2019-2022) will leave an extraordinary political legacy which in time may come to be remembered more fondly than now seems likely but one of his most enduring achievements is likely to be the “New Routemaster” which had the typically bureaucratic project name “New Bus for London” but came to be known generally as the “Boris Bus”, the honor accorded by virtue of him championing the idea while serving as Mayor of London (2008-2016).  In truth, the original Routemaster, whatever its period charm, was antiquated years before it was withdrawn from service and although the doorless design made ingress and egress convenient, it was also dangerous and apparently a dozen passenger fatalities annually was not uncommon.  The Boris Bus entered service in 2012 and by 2024 almost 1200 were in service.

1930 Lancia Omicron with 2½ deck coachwork and a clerestoried upper windscreen (left) and a “three decker” bus in Pakistan (right).

The Lancia Omicron was a bus chassis produced between 1927-1936; over 600 were built in different wheelbase lengths with both two and three-axle configurations.  Most used Lancia's long-serving, six-cylinder commercial engine but, as early as 1933, some had been equipped with diesel engines which were tested in North Africa where they proved durable and, in the Sudan, Ethiopia, Libya and Algeria, once petrol-powered Omicron chassis were still being re-powered with diesel power-plants from a variety of manufacturers as late as the 1960s.  As is typical of bus use, coachbuilders fabricated many different styles of body but, in addition to the usual single and double deck arrangements, the Omicron is noted for a number of two and a half deck models, the third deck configured usually as a first-class compartment although in at least three which operated in Italy, they were advertised as “smoking rooms”, the implication presumably that the rest of the passenger compartment was smoke-free.  History doesn't record if the bus operators were any more enthusiastic about or successful in enforcing smoking bans than the usual Italian experience.  For a variety of reasons, buses with more than 2.something decks were rare and the Lancias and Alfa Romeos which first emerged in the 1920s were unusual.  However, the famously imaginative and inventive world of Pakistani commerce has produced a genuine “three decker” bus, marketed as the “limousine bus”.  What the designer did was take a long-distance, double decker coach and configure the space usually allocated to the luggage compartment as the interior of a long wheelbase (LWB) limousine, thereby creating a “first class” section, the four rows of seating accessible via six car-like (ie limousine) doors.

Tuesday, August 6, 2024

Thunk

Thunk (pronounced thunk)

(1) Onomatopoeic slang for sounds such as the impressive thud when the doors close on pre-modern Mercedes-Benz; representing the dull sound of the impact of a heavy object striking another and coming to an immediate standstill, with neither object being broken by the impact.

(2) In computer programming, a delayed computation (known also as a closure routine).

(3) In computing, in the Scheme programming language, a function or procedure taking no arguments.

(4) In computing, a specialized subroutine in operating systems where one software module is used to execute code in another or inject an additional calculation into another subroutine; a mapping of machine data from one system-specific form to another, usually for compatibility reasons, to allow a 16-bit program to run on a 32-bit operating system.

(5) In computing, to execute code by means of a thunk.

(6) As “get thunked” or “go thunk yourself”, an affectionate insult among the nerdiest of programmers.

(7) In colloquial use, a past tense form of think (the standard form being "thought").  Usually it's used humorously but, if erroneous, it's polite not to correct the mistake.

1876: The first documented instance as incorrect English is from 1876 but doubtlessly it had been used before and there’s speculation it may have been a dialectal form in one or more places before dying out.  There being no oral records and with nothing in writing prior to 1876, the history is unknown.  As an echoic of the sound of impact, it’s attested from 1952.  Although occasionally heard in jocular form, except in computing, thunk is non-standard English, used as a past tense or past participle of think.  The mistake is understandable given the existence of drink/drunk, sink/sunk etc so perhaps it’s a shame (like brung from bring) that it’s not a standard form except in computing.  The plural is thunks, the present participle thunking and the simple past and past participle thunked.  The numerical value of thunk in Chaldean Numerology is 4; the value in Pythagorean Numerology is 2.  Thunk & thunking are nouns & verbs, thunker is a noun and thunked is a verb; the noun plural is thunks.  The adjective thunkish is non-standard but is used in engineering and programming circles.

Getting thunked

The origin of the word as used to describe a number of variations of tricks in programming is contested, the earliest explanations dating from 1961 as onomatopoeic abstractions of computer programming.  One holds a thunk is the (virtual) sound of data hitting the stack (some say hitting the accumulator).  Another suggestion is that it’s the sound of the expression being unfrozen at argument-evaluation time.  The most inventive explanation is that the word was coined during an after-midnight programming session when it was realized the type of an argument in Algol 60 could be figured out in advance with a little compile-time thought, simplifying the evaluation machinery.  In other words, it had "already been thought of"; thus it was christened a "thunk", which is “the past tense of ‘think’ at two in the morning when most good programming is done on a diet of Coca-Cola and pizza”.
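The Algol 60 story describes what is now usually called call-by-name: instead of evaluating an argument at the call site, the compiler wraps it in a little piece of code to be run later, if and when the callee needs it.  A minimal sketch of the idea in Python (the names here are illustrative only, not drawn from any particular compiler or library):

def expensive_expression():
    # Stands in for an argument expression that may be costly (or have side effects).
    print("evaluating the argument...")
    return 6 * 7

def maybe_use(flag, arg_thunk):
    # The argument arrives as a thunk (a zero-argument callable) and is only
    # evaluated if the body actually needs it.
    if flag:
        return arg_thunk()   # forcing the thunk
    return None

maybe_use(False, lambda: expensive_expression())  # the expression is never evaluated
maybe_use(True, lambda: expensive_expression())   # prints the message and returns 42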


Door closing on 1967 Mercedes-Benz 230 S.  Until the 1990s, the quality of even the low-end Mercedes-Benz models was outstanding and the doors closed with a satisfying thunk.

Thunking as a programming concept does seem to have been invented in 1961 as “a chunk of code which provides an address”, a way of binding parameters to their formal definitions in procedure calls.  If a procedure is called with an expression in the place of a formal parameter, the compiler generates a thunk which computes the expression and leaves the address of the result in some standard location.  Its usefulness was such that it was soon generalized into: an expression, frozen with its environment for later evaluation if and when needed (that point being the closure), the process of unfreezing thunks called forcing.  As operating systems evolved into overlay-rich environments, the thunk became a vital stub-routine to load and jump to the correct overlay, Microsoft and IBM both defining the mapping of the 16-bit Intel environment with segment registers and 64K address limits whereas 32 & 64-bit systems had flat addressing and semi-real memory management.  Thunking permits multiple environments to run on the same computer and operating system and to achieve this, there’s the universal thunk, the generic thunk and the flat thunk, the fine distinctions of which only programmers get.  In another example of nerd humor, a person can be said to have their activities scheduled in a thunk-like manner, the idea being they need “frequently to be forced to completion”, especially if the task is boring.
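The generalized sense (an expression frozen together with its environment, then forced when needed) maps neatly onto closures with memoization; a rough sketch of that freeze/force cycle, again in illustrative Python rather than any real runtime, might look like this:

def make_thunk(expr):
    # Freeze a zero-argument expression; forcing evaluates it at most once.
    state = {"forced": False, "value": None}

    def force():
        if not state["forced"]:
            state["value"] = expr()   # unfreeze: evaluate in the captured environment
            state["forced"] = True
        return state["value"]

    return force

x = 10
t = make_thunk(lambda: x + 32)  # the environment (here, x) is captured with the expression
print(t())  # 42 (evaluated on this first force)
print(t())  # 42 (returned from the cache; the expression is not re-evaluated)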

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

So it’s a bit nerdy but modern operating systems need thunking because 8, 16, 32 and 64-bit routines may need to run both concurrently and interactively on the same machine (real or virtual).  If a 32-bit application sends something with precision data types to a 64-bit handler, without thunking the call will fail because the precise address can’t be resolved.  Although not literally true, it’s easiest to visualise thunking as a translation layer.

IBM OS/2 2.0 in shrink-wrap, 1992.

Thunking first entered consumer computing at scale with IBM’s OS/2 in 1987, an operating system still in surprisingly wide use and supported by IBM until early in the century.  Interestingly, although both OS/2 and its successor eComStation have been unavailable for years, in August 2017 a private project released ArcaOS, an x86 operating system derived from OS/2 and, for personal use, it retails at US$129.00.  Like OS/2, it has some features which are truly unique, such as the ability, for the handful of souls on the planet who either need or wish to do so, simultaneously to run multiple 8, 16 and 32-bit text-mode sessions (including those internally booting different operating systems in segregated memory) in their hundreds on the one physical machine.  First done in 1992 on OS/2 2.0, it’s still quite a trick and the on-line OS/2 Museum hosts an active community, development continuing.

Wednesday, January 10, 2024

Asymmetric

Asymmetric (pronounced a-sim-et-rick)

(1) Not identical on both sides of a central line; unsymmetrical; lacking symmetry.

(2) An asymmetric shape.

(3) In logic or mathematics, holding true of members of a class in one order but not in the opposite order, as in the relation “being an ancestor of”.

(4) In chemistry, having an unsymmetrical arrangement of atoms in a molecule.

(5) In chemistry, noting a carbon atom bonded to four different atoms or groups.

(6) In chemistry (of a polymer), noting an atom or group that is within a polymer chain and is bonded to two different atoms or groups that are external to the chain.

(7) In electrical engineering, of conductors having different conductivities depending on the direction of current flow, as of diodes.

(8) In aeronautics, having unequal thrust, as caused by an inoperative engine in a twin-engined aircraft.

(9) In military theory, a conflict where the parties are vastly different in terms of military capacity.  This situation is not in all circumstances disadvantageous to the nominally inferior party.

(10) In gameplay, where different players have different experiences.

(11) In cryptography, not involving a mutual exchange of keys between sender and receiver.

(12) In set theory, of a relation R on a set S: having the property that for any two elements of S (not necessarily distinct), at least one is not related to the other via R.
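In the notation of definition (12), the property can be written out as a worked restatement (not part of the original definition):

\forall a, b \in S:\ (a \mathbin{R} b) \implies \lnot (b \mathbin{R} a)

which is equivalent to saying there is no pair (a, b), including the case a = b, for which both a R b and b R a hold.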

1870–1875: The construct was a- + symmetric.  The a- prefix was from the Ancient Greek ἀ- (a-) (ἀν- (an-) if immediately preceding a vowel) and was added to stems to create the sense of "not, without, opposite of".  The prefix is referred to as an alpha privative and is used with stems beginning with consonants (except sometimes “h”); “an-“ is synonymous and is used in front of words that start with vowels and sometimes “h”.  Symmetric was from the Latin symmetria from Ancient Greek συμμετρία (summetría).  Symmetry was from the 1560s in the sense of "relation of parts, proportion", from the sixteenth century French symmétrie and directly from the Latin symmetria, from the Greek symmetria (agreement in dimensions, due proportion, arrangement), from symmetros (having a common measure, even, proportionate), an assimilated form of syn- (together) + metron (measure) from the primitive Indo-European me- (to measure).  The meaning "harmonic arrangement of parts" dates from the 1590s.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically.  In English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  Asymmetric & asymmetrical are adjectives, asymmetricity, asymmetricality, asymmetricalness & asymmetry are nouns and asymmetrically is an adverb; the noun plural is asymmetries.

The usually symmetrically attired Lindsay Lohan demonstrates the possibilities of asymmetry.

1975 Kawasaki 750 H2 Mach IV.

Manufacturers of triple-cylinder motorcycles traditionally used single (3 into 1) or symmetrical (3 into 2) exhaust systems (although, during the 1970s, Suzuki offered some of their "Ram-Air" models with a bizarre 3 into 4 setup, the centre cylinder’s header bifurcated) but in 1969 Kawasaki adopted an asymmetric addition for one of the memorable machines of the time.  The Kawasaki 500 H1 Mach III had two outlets to the right, one to the left and was a fast, lethally unstable thing which was soon dubbed the "widow maker".  Improvements to the Mach III made it a little more manageable and its successor, the 750 H2 Mach IV was claimed to be better behaved but was faster still and best enjoyed by experts, preferably in a straight line although, with a narrow power band which peaked with a sudden rush, even that could be a challenge.  The Kawasaki triples remain the most charismatic of the Japanese motorcycles.

1973 Triumph X-75 Hurricane.

Available only during 1972-1973 and produced in small numbers, the Triumph X75 Hurricane was typical of the motorcycles being produced by the British manufacturers which had neglected development and re-investment and consequently were unable adequately to respond to the offerings of the Japanese which had done both aplenty.  Whatever their charms, models like the X75 were being rendered obsolescent, some of the underlying technology dating back decades yet, without the capital to invest, this was as good as it got and some of the fudges of the era were worse.  The X-75 was however ahead of its time in one way, it was a “factory special”, a design influenced by what custom shops in the US had been doing as one-offs for customers and in the years ahead, many manufacturers would be attracted by the concept and its healthy profit margins.  The X-75 is remembered also for the distinctive asymmetric stack of three exhaust pipes on the right-hand side.

1986 Ferrari Testarossa (1984-1991) with monospecchio.

Some of Ferrari's early-production Testarossas were fitted with a single high-mounted external mirror, on the left or right depending on the market into which it was sold and although the preferred term was the Italian “monospecchio” (one mirror), in the English speaking-world it was quickly dubbed the “flying mirror" (rendered sometimes in Italian as “specchio volante” (an ordinary wing mirror being a “specchietto laterale esterno”, proving everything sounds better in Italian)).  The unusual placement and blatant asymmetry annoyed some and delighted others, the unhappy more disgruntled still if they noticed the vent on the right of the front spoiler not being matched by one to the left.  It was there to feed the air-conditioning’s radiator and while such offset singularities are not unusual in cars, many manufacturers create a matching fake as an aesthetic device: Ferrari did not.  The mirror’s curious placement was an unintended consequence of a European Union regulation (and it is doubtful many institutions have in a relatively short time created as many regulations of such collective length as the EU) regarding the devices, which was interpreted by the designers as requiring 100% rearward visibility.  Because of the sheer size of the rear bodywork necessitated by the twin radiators which sat behind the side-strakes (another distinctive Testarossa feature), the elevation was the only way this could be done but it later transpired the interpretation of the law was wrong, a perhaps forgivable mistake given the turgidity of EU legalese.

The Blohm & Voss BV 141

Focke-Wulf Fw 189 Eule (Owl)

In aircraft, designs have for very good reason (aerodynamics, weight distribution, flying characteristics, ease of manufacture et al) tended to be symmetrical, sometimes as an engineering necessity such as the use of contra-rotating propellers on some twin-engined airframes, a trick to offset the destabilizing effects of the torque when very potent power-plants are fitted.  There has though been the odd bizarre venture into structural asymmetry, one of the most intriguing being the Blohm & Voss BV 141, the most distinctive feature of which was an offset crew-capsule.  The BV 141 was a tactical reconnaissance aircraft built in small numbers and used in a desultory manner by the Luftwaffe (the German air force) during World War II (1939-1945) and although it was studied by engineers from many countries, none seem to have been inspired to repeat the experiment.  The origin of the curious craft lay in a specification issued in 1937 by the Reichsluftfahrtministerium (RLM; the German Air Ministry) which called for a single-engine reconnaissance aircraft, optimized for visual observation and, in response, Focke-Wulf offered their Fw 189 Eule (Owl) which, because of the then still novel twin-boomed layout, encountered some resistance from the RLM bureaucrats but it found much favor with the Luftwaffe and, over the course of the war, some nine-hundred entered service and it was used almost exclusively as the Germans' standard battlefield reconnaissance aircraft.  In fact, so successful did it prove in this role that the other configurations it was designed to accommodate, that of liaison and close-support ground-attack, were never pursued.  Although its performance was modest, it was a fine airframe with superb flying qualities and an ability to absorb punishment which, on the Russian front where it was extensively deployed, became famous, and captured examples provided Russian aeronautical engineers with ideas which would for years influence their designs.

The RLM had also invited Arado to tender but their Ar 198, although featuring an unusual under-slung and elongated cupola which afforded the observer a uniquely panoramic view, proved unsatisfactory in test-flights and development ceased.  Blohm and Voss hadn't been included in the RLM's invitation but anyway chose to offer a design which was radically different even by the standards of the innovative Fw 189.  The asymmetric BV 141 design was eye-catching, with the crew housed in an extensively glazed capsule offset to starboard of the centre-line and a boom offset to the left which housed the single engine in front with the tail to the rear.  Prototypes were built as early as 1938 and the Luftwaffe conducted operational trials over both the UK and USSR between 1939-1941 but, despite being satisfactory in most respects, the BV 141 was hampered by poor performance, a consequence of using an under-powered engine.  A re-design of the structure to accommodate more powerful units was begun but delays in development and the urgent need for the up-rated engines for machines already in production doomed the project and the BV 141 was abandoned in 1943.

Blohm & Voss BV 141 prototype with full-width rear elevators & stabilizers.

Production Blohm & Voss BV 141 with port-only rear elevator & stabilizer.

Despite the ungainly appearance, test-pilots reported the BV 141 was a nicely balanced airframe, the seemingly strange weight distribution well compensated by (1) component placement, (2) the specific lift characteristics of the wing design and (3) the choice of opposite rotational direction for crankshaft and propeller, the torque generated used as a counter-balance.  Nor, despite the expectation of some, were there difficulties in handling whatever behavior was induced by the thrust versus drag asymmetry, pilots all indicating some intuitive trimming was all that was needed to compensate for any induced yaw.  The asymmetry extended even to the tail-plane, the starboard elevator and horizontal stabilizer removed (to afford the tail-gunner a wider field of fire) after the first three prototypes were built; surprisingly, this was said barely to affect the flying characteristics.  Focke-Wulf pursued the concept, a number of design-studies (including a piston & turbojet-engine hybrid) initiated but none progressed beyond the drawing-board.

Asymmetric warfare

In the twenty-first century, the term “asymmetric warfare” became widely used.  The concept describes conflicts in which there are significant disparities in power, capability and strategies between opposing forces and although the phrase has become recently fashionable, the idea is ancient, based often on the successes which could be achieved by small, mobile and agile (often irregular) forces against larger, conventionally assembled formations.  Reports of such tactics are found in accounts of conflicts in Asia, Africa, the Middle East and Europe from as early as reliable written records exist.  The classic example is what came later to be called “guerrilla warfare”, hit-and-run tactics which probe and attack weak spots as they are detected, the ancestor of insurgencies, “conventional” modern terrorism and cyber-attacks.  However, even between conventional national militaries there have long been examples of the asymmetric, such as the use of small, cheap weapons like torpedo boats and mines which early in the twentieth century proved effective against the big, ruinously expensive Dreadnoughts.  To some extent, the spike in use of the phrase in the post-Cold War era happened because it captured the contrast between the nuclear weapon states’ capacity to destroy entire countries without one soldier setting foot on enemy territory and their vulnerability to low-tech, cleverly planned attacks.

Although the term “asymmetric warfare” encompasses a wide vista, one increasingly consistent thread is that it can be a difficult thing for "conventional" military formations to counter insurgencies conducted by irregular combatants who, in many places and for much of the time, are visually indistinguishable from the civilian population.  The difficulty lies not in achieving the desired result (destruction of the enemy) but managing to do so without causing an “excessive” number of civilian casualties; although public disapproval has meant the awful phrase “collateral damage” is now rarely heard, civilians (many of them women & children) continue greatly to suffer in such conflicts, the death toll high.  Thus the critique of the retaliatory strategy of the Israel Defence Force (IDF) in response to the attack by Hamas on 7 October 2023, Palestinian deaths now claimed to exceed 20,000; that number is unverified and will include an unknown number of Hamas combatants but there is no doubt the percentage of civilian deaths will be high, the total casualty count estimated early in January 2024 at some 60,000.  What the IDF appear to have done is settle on the strategy adopted by Ulysses S Grant (1822–1885; US president 1869-1877) in 1863 when appointed head of the Union armies: the total destruction of the opposing forces.  That decision was a reaction to the realization the previous approach (skirmishes and the temporary taking of enemy territory which was soon re-taken) was ineffectual and war would continue as long as the other side retained even a defensive military capacity.  Grant’s strategy was, in effect: destroy the secessionist army and the secessionist cause dies out.

In the US Civil War (1861-1865) that approach worked though at an appalling cost, the 1860s a period when ballistics had advanced to the point horrific injuries could be inflicted at scale but battlefield medical tools and techniques were barely advanced from Napoleonic times.  The bodies were piled high.  Grant’s success was influential on the development of the US military which eventually evolved into an organization which came to see problems as something not to be solved but overwhelmed by the massive application of force, an attitude which although now refined, permeates from the Pentagon down to platoon level.  As the US proved more than once, the strategy works as long as there’s little concern about “collateral damage”, an example of this approach being when the Sri Lankan military rejected the argument there was “no military solution” to the long running civil war (1983-2009) waged by the Tamil Tigers (the Liberation Tigers of Tamil Eelam (LTTE)).  What “no military solution” means is that a war cannot be won if the rules of war are followed so the government took the decision that if war crimes and crimes against humanity were what was required to win, they would be committed.

In the 1990s, a number of political and military theorists actually advanced the doctrine “give war a chance”, the rationale being that however awful conflicts may be, if allowed to continue to the point where one side gains an unambiguous victory, the dispute is at least resolved and peace can ensue, sometimes for generations.  For most of human history, such was the usual path of war but after the formation of the United Nations (UN) in 1945 things changed, the Security Council the tool of the great powers, all of which (despite their publicity) viewed wars as a part of whatever agenda they were at the time pursuing and, depending on this and that, their interests sometimes lay in ending conflicts and sometimes in prolonging them.  In isolation, such an arrangement probably could have worked (albeit with much “collateral damage”) but over the years, a roll-call of nations run by politicians appalled by the consequences of war began to become involved, intervening with peace plans, offering mediation and urging the UN to deploy “peacekeeping” forces, something which became an international growth industry.  Added to that, for a number of reasons, a proliferation of non-government organizations (NGO) were formed, many of which concerned themselves with relief programmes in conflict zones and while these benefited many civilians, they also had the effect of allowing combatant forces to re-group and re-arm, meaning wars could drag on for a decade or more.

In the dreadful events in Gaza, war is certainly being given a chance and the public position of both the IDF and the Israeli government is that the strategy being pursued is one designed totally “to destroy” not merely the military capacity of Hamas but the organization itself.  Such an idea worked for Grant in the 1860s and, as the Sri Lankan military predicted they would, end-game there was achieved in 2009 on the basis of “total destruction”.  However, Gaza (and the wider Middle East) is a different time & place and even if the IDF succeeds in “neutralizing” the opposing fighters and destroying the now famous network of tunnels and ad-hoc weapons manufacturing centres, it can’t be predicted that Hamas in some form won’t survive and in that case, what seems most likely is that while the asymmetry of nominal capacity between the two sides will be more extreme than before, Hamas is more likely to hone the tactics than shift the objective.  The IDF high command are of course realists and understand there is nothing to suggest “the Hamas problem” can be solved and being practical military types, they know if a problem can’t be solved it must be managed.  In the awful calculations of asymmetric conflict, this means the IDF calculate that while future attacks will happen, the more destructive the response now, the longer will be the interval before the next event.

Tuesday, January 2, 2024

Corduroy

Corduroy (pronounced kawr-duh-roi)

(1) A cotton-filling pile fabric with lengthwise cords or ridges.

(2) In the plural as corduroys (or cords), trousers made from this fabric.

(3) Of, relating to, or resembling corduroy.

(4) A method of building an improvised road or causeway, constructed with logs laid together transversely (used by military forces, those operating in swampy ground areas etc); the result was described originally as “ribbed velvet” and when intended for sustained use, earth was thrown into the gaps and compacted, thus rendering a relatively smooth, stable surface.

(5) To form such a “road” by arranging logs transversely.

(6) In ski-run maintenance, a pattern on snow resulting from the use of a snow groomer to pack snow and improve skiing, snowboarding and snowmobile trail conditions (corduroy thought a good surface for skiing).

(7) In Irish slang, cheap, poor-quality whiskey (based on the idea of the fabric corduroy not being “smooth”).

1776: Of uncertain origin.  There’s no consensus among etymologists but the most support seems to be for the construct being cord + duroy (one of a number of lightweight, worsted fabrics once widely produced and known collectively as “West of England Cloth”).  Cord (a long, thin, flexible length of twisted yarns (strands) of fiber) was from the late thirteenth century Middle English corde (a string or small rope composed of several strands twisted or woven together; bowstring, hangman's rope), from the Old French corde (rope, string, twist, cord), from the Latin chorda (string of a musical instrument, cat-gut), from the Doric Ancient Greek χορδά (khordá) (string of gut, the string of a lyre) and may be compared with the Ionic χορδή (khordḗ), from the primitive Indo-European ghere- (bowel; intestine).  The adjective cordless (of electrical devices or appliances working without a cord (ie powered by a battery)) dates from 1905 and was augmented later by “wireless” which described radios so configured.  Cordless is still a useful word now that the meaning of “wireless” in the context of computing has become dominant.  The curious use of cord as “a unit of measurement for firewood, equal to 128 cubic feet (4 × 4 × 8 feet), composed of logs and/or split logs four feet long and none over eight inches diameter (and usually seen as a stack 4 feet high by 8 feet long)” dates from the 1610s and was so-called because it was measured with a cord of rope marked with the appropriate measures.

Lindsay Lohan as fashion influencer: In burnt red corduroy pants & nude platform pumps, Los Angeles, December 2011 (left) and leaving Phillippe Restaurant after dinner with Woody Allen (b 1935), New York, May 2012 (right).  It's said that before he met Lindsay Lohan, the film director had never worn corduroy trousers and he still prefers brown, seemingly unable to escape the 1970s when he was perhaps at his happiest.

Until well into the twentieth century, the old folk etymology was still being published which held the word was from the French corde du roi (“cloth of the king”), a term which seems never to have been used in France, the French name for the fabric being velours côtelé (ribbed velvet).  It’s not impossible there’s some link with cordesoy, from the French corde de soie (“rope of silk” or “silk-like fabric”) because that form is documented in an advertisement for clothing fabrics dating from 1756.  The spelling corderoy appears in commercial use in 1772 but the modern “corduroy” became the standard form in the 1780s.  The origin of duroy is obscure and the earliest known use (in print) of the word appeared in the early seventeenth century; it may be from the French du roi (of the king), a 1790 trade publication in France including the term duroi (a woolen fabric similar to tammy).  So, although the case for cord + duroy seems compelling, etymologists note (1) duroy was fashioned from wool while corduroy was made with cotton and there’s no other history of the two words being associated, (2) grammatically, the compound should have been duroy-cord and (3) this does not account for the earlier corderoy.

None of that of course means cord + duroy was not the source and many English words have been formed in murky ways.  There’s also the possibility of some link with the English surname Corderoy (although there is no evidence of a connection); the name was also spelt Corderey & Cordurey, the origin lying in the nickname for “a proud person” (of French origin, it meant “king’s heart”).  Some are more convinced by a suggestion made in 1910 by the English philologist Ernest Weekley (1865-1954) who speculated corduroy began life as folk-etymology for the trade-term common in the sixteenth century: colour de roy, from the French couleur de roi (king’s colour) which originally was a reference to both a cloth in the rich purple associated with the French kings and the color itself.  Later, it came to signify a bright tawny colour and a cloth of this hue.  Corduroy as an adjective came into use after 1789 and the use to describe roads or causeways improvised by means of logs laid together transversely (usually to provide wheeled vehicles passage over swampy ground) dates from the 1780s.  Corduroy is a noun, verb & adjective, corduroying & corduroyed are verbs and corduroylike is an adjective; the noun plural is corduroys.

1975 Mercedes-Benz 450 SEL 6.9 trimmed in classic 1970s brown corduroy.

By the 1960s most Mercedes-Benz were being trimmed in their famously durable MB-Tex, a robust vinyl which was not only easy to maintain but closely resembled leather, lacking only the aroma (and third-party manufacturers soon made available aerosols for those who found the olfactory appeal too much to forgo).  However, leather and velour (or mohair in the most exclusive lines) were either standard or optional on more expensive models and when exported to first-world markets beyond Europe (such as North America or Australia), MB-Tex or leather was standard and the fabrics were available only by special order.  A notable exception was the Japanese market where buyers disliked the way their prized “car doilies” slid off the hide; they always preferred cloth.  The velour was certainly better suited to harsh northern European winters and testers would often comment on how invitingly comfortable were the seats trimmed in the fabric.  To add to the durability, the surfaces subject to the highest wear (ie those at the edges on which there was the most lateral movement during ingress and egress) used a corduroy finish (the centre panels also trimmed thus, just for symmetry).

Wednesday, December 6, 2023

Bedchamber

Bedchamber (pronounced bed-cheym-ber)

A now archaic word for bedroom; the alternative form was bed-chamber.

1325–1375:  From the Middle English bedchaumbre, the construct being bed + chamber.  Bed was from the Middle English bed or bedde, from the pre-1000 Old English bedd (bed, couch, resting-place; garden-bed, plot), from the Proto-Germanic badją (plot, grave, resting-place, bed) and thought perhaps derived from the Proto-Indo-European bhed (to dig).  It was cognate with the Scots bed and bede, the North Frisian baad and beed, the West Frisian bêd, the Low German Bedd, the Dutch bed, the German bett, the Danish bed, the Swedish bädd, the Icelandic beður and perhaps (depending on the efficacy of the Proto-Indo-European lineage) the Ancient Greek βοθυρος (bothuros) (pit), the Latin fossa (ditch), the Latvian bedre (hole), the Welsh bedd (grave), the Breton bez (grave).  Any suggestion of links to Russian or other Slavic words is speculative.

Chamber dates from 1175-1225 and was from the Middle English chambre, borrowed from Old French chambre, from the Latin camera, derived from the Ancient Greek καμάρα (kamára) (vaulted chamber); the meaning “room”, usually private, drawn from French use.  As applied to anatomy, use emerged in the late fourteenth century; it was applied to machinery in 1769 and to ballistics from the 1620s.  The meaning "legislative body" is from circa 1400 and the term chamber music was first noted in 1789, not as a descriptor of any musical form but to indicate that intended to be performed in private rooms rather than public halls.

The Bedchamber Crisis, 1839

A Lady of the Bedchamber, a position held typically by women of noble descent, is a kind of personal assistant to the Queen of England.  A personal appointment by the Queen, they’ve existed for centuries, their roles varying according to the relationships enjoyed.  Most European royal courts from time-to-time also adopted the practice.

The 1839 bedchamber crisis is emblematic of the shifting of political power from monarch to parliament.  Although the eighteenth-century administrative and economic reforms created the framework, it was the 1832 Reform Act which, in doing away with a monarch’s ability to stack parliaments with ample compliant souls, shattered a sovereign’s capacity to dictate election results and within two years the new weakness was apparent.  In 1834, William IV (1765–1837; King of the UK 1830-1837) dismissed the Whig Lord Melbourne (1779–1848; Prime Minister of the UK 1834 & 1835-1841) and appointed the Tory Sir Robert Peel (1788–1850; Prime Minister of the UK 1834–1835 & 1841–1846).  However, the King no longer enjoyed the electoral influence necessary to secure Peel a majority in the Commons and after being defeated in the house six times in as many weeks, the premier was obliged to inform the palace of his inability to govern, compelling the king to invite Melbourne to form a new administration, one which endured half a decade, out-living William IV.  The king's exercise in 1834 of the royal prerogative proved the last time the powers of the head of state would be invoked to sack a prime-minister until an Australian leader was dismissed in 1975 by the governor-general (and in a nice touch the sacked PM had appointed the clearly ungrateful GG).

Queen Mary's State Bed Chamber, Hampton Court Palace (1819) by Richard Cattermole (1795–1858).

By 1839, Melbourne felt unable to continue and the new Queen Victoria (1819–1901; Queen of the UK 1837-1901), reluctantly, invited Sir Robert Peel to assume the premiership, a reticence some historians attribute as much to her fondness for the avuncular Melbourne as her preference for his Whig (liberal) politics.  Peel, aware any administration he could form would be nominally in a minority, knew his position would be strengthened if there was a demonstration of royal support, so asked Victoria, as a gesture of good faith, to replace some of the Whig Ladies of the Bedchamber with a few of Tory breeding.  Most of the ladies were the wives or daughters of Whig politicians and Sir Robert’s request made sense in the world of 1839.

Victoria rejected his request and prevailed upon Melbourne to continue which he did, until a final defeat in 1841.  By then it was clear only Peel could command a majority in the Commons and he insisted on his bedchamber cull, forcing Victoria to acquiesce to the parliament imposing on her the most intimate of her advisors.  This is the moment in constitutional history where the precedent is established of the parliament and not the Crown determining the formation and fate of governments.  Since then, the palace can warn, counsel and advise but not compel.

A lady in, if not of, the bedchamber.  A recumbent Lindsay Lohan in The Canyons (IFC Films, 2013).