Thursday, June 27, 2024

Monocoque

Monocoque (pronounced mon-uh-kohk or mon-oh-kok (non-U))

(1) A type of boat, aircraft, or rocket construction in which the shell carries most of the stresses.

(2) A type of automotive construction in which the body is combined with the chassis as a single unit.

(3) A unit of this type.

1911: From the French monocoque (best translated as “single shell” or “single hull” depending on application), the construct being mono- + coque.  Mono was from the Ancient Greek μόνος (monos) (alone, only, sole, single), from the primitive Indo-European root men- (small, isolated).  Coque was from the Old French coque (shell), from the Latin coccum (berry) (with some influence from concha (conch, shell)), from the Ancient Greek κόκκος (kókkos) (grain, seed, berry).  In the early twentieth century, it was the French who were most dominant in the development of aviation.  Words like “monocoque”, “aileron”, “fuselage” and “empennage” are of French origin and endure in English because it’s a vacuum-cleaner of a language which sucks in anything from anywhere which is handy and manageable.  Monocoque is a noun; the noun plural is monocoques.

Noted monocoques

Deperdussin Monocoque, 1912.

A monocoque (sometimes referred to as structural skin) is a form of structural engineering in which loads and stresses are distributed through an object's external skin rather than a frame; the concept is most analogous to an eggshell.  Early airplanes were built using wood or steel tubes covered with starched fabric, the fabric contributing only a small part of the rigidity.  A monocoque construction integrates skin and frame into a single load-bearing shell, reducing weight and adding strength.  Although examples flew as early as 1911, airframes built as aluminium-alloy monocoques would not become common until the mid 1930s.  In a pure design where only function matters, almost anything can be made a stressed component, even engine blocks and windscreens.

Lotus 25, 1962.

In automotive design, the word monocoque is often misused, treated as a descriptor for anything built without a separate chassis.  In fact, most road vehicles, apart from a handful of expensive exotics, are built either with a separate chassis (trucks and some SUVs) or are of unibody/unitary construction in which box sections, bulkheads and tubes provide most of the structural integrity, the outer skin adding little or no strength or stiffness.  Monocoque construction was first seen in Formula One in 1962, rendered always in aluminium alloys until 1981 when McLaren adopted carbon-fibre.  A decade later, in 1992, the McLaren F1 followed the same principles, becoming the first road car built as a carbon-fibre monocoque.

BRM P83 (H16), 1966.

In 1966, there was nothing revolutionary about the BRM P83’s monocoque chassis.  Four years earlier, in the second season of the voiturette era, that revolution had been triggered by the Lotus 25, built with the first fully stressed monocoque chassis, an epoch still unfolding as materials engineering evolves; the carbon-fibre monocoques first seen in the 1981 McLaren MP4/1 soon became ubiquitous.  The P83 used a monocoque made from riveted Duralumin (the word a portmanteau of durable and aluminium), an orthodox construction for the time.  What was unusual, although it had been done before and would soon become an orthodoxy, was that the engine was a stressed part of the monocoque.

BRM Type 15 (V16), 1949.

The innovation was born of necessity.  Not discouraged by the glorious failure of the extraordinary V16 BRM had built (with much fanfare and precious little success) shortly after the war, the decision was again taken to join two V8s into one sixteen-cylinder unit.  Whereas in 1949 the V8s had been coupled at the centre to create a V16, for 1966 the engines were re-cast as 180° flat 8s with one mounted atop the other in an H configuration, a two-crankshaft arrangement not seen since the big Napier Sabre H24 aero-engines used in the last days of the war.  The design yielded the advantage that it was short, affording designers some flexibility in fore-and-aft placement, but little else.  It was heavy and tall, further exacerbating the high centre of gravity already created by the need to raise the engine so the lower exhaust systems would clear the ground.  Just as significantly, it was wide, too wide to fit into a monocoque socket, and thus the decision was taken to make the engine an integral, load-bearing element of the chassis.  There was no other choice.

BRM H16 engine and gearbox, 1966.
 
Structurally, it worked: the monocoque was strong and stable and, despite the weight and height, the P83 might have succeeded had the H16 delivered the promised horsepower, but the numbers were never realised.  The early power output was higher than the opposition's but it wasn’t enough to compensate for the drawbacks inherent in the design and, these being so fundamental they couldn’t be corrected, the only hope was even more power.  The path to power was followed and modest increases were gained but it was never enough, and time ran out before the plan to go from 32 to 64 valves could come to fruition, an endeavour some suggested would merely have “compounded the existing error on an even grander scale.”  Additionally, with every increase in power and weight, the already high fuel consumption worsened.

The H16 did win one grand prix, albeit in a Lotus rather than a BRM monocoque, but that was a rare success; of the forty times it started a race, twenty-seven ended prematurely.  The irony of the tale is that in the two seasons BRM ran the 440 horsepower H16 with its sixteen cylinders, two crankshafts, eight camshafts and thirty-two valves, the championship in both years was won by the Repco-Brabham, its engine with 320 horsepower, eight cylinders, one crankshaft, two camshafts and sixteen valves.  Adding insult to the exquisitely bespoke H16’s injury, the Repco engine was based on an old Oldsmobile block which General Motors had abandoned.  After two seasons the H16 venture was retired, replaced by a conventional V12.

The Mercedes-Benz SLR McLaren


Mercedes-Benz McLaren SLR Coupé (left), Roadster (centre) and Speedster (right).

The monocoque Mercedes-Benz SLR McLaren (C199 / R199 / Z199) was a joint development with McLaren Automotive and was available as a coupé (2003-2010), roadster (2007-2009) & speedster (2009).  Visually, the car was something of an evocation of the 300 SLR gullwing coupé, two of which were built in 1955 for use in competition but never used, one of the consequences of the disaster that year during the Le Mans 24 hour endurance classic when a 300 SLR crashed into the crowd, killing 84 and injuring dozens of others.  Footage of that event is widely available and to a modern audience it will seem extraordinary the race was allowed to continue.


Lindsay Lohan, Britney Spears and Paris Hilton in Ms Hilton's Mercedes-Benz McLaren SLR, outside the Beverly Hills Hotel, Los Angeles.  This was the occasion which produced the photograph which appeared on the infamous “Bimbo Summit” front page of Rupert Murdoch’s (b 1931) New York Post, 29 November 2006.

The 300 SLR (Sport Leicht Rennsport (Sport Light Racing)) which crashed was an open version and the model name was a little opportunistic because it was essentially the W196R Formula One car with a 3.0 litre straight-8 (the F1 rules demanded a 2.5), so the SLR, built to contest the World Sports Car Championship, was technically the W196S; it became the 300 SLR to cross-associate it with the 300 SL gullwing (W198, 1954-1957).  Nine were built, two of which were converted to SLR gullwings and, although never raced, they came to be dubbed the “Uhlenhaut coupés” because they were co-opted by racing team manager Rudolf Uhlenhaut (1906–1989) as high-speed personal transport, tales of his rapid trips between German cities soon the stuff of legend; even if a few myths developed, the cars could exceed 290 km/h (180 mph) so some at least were probably true.  That what was essentially a Grand Prix race car with a body and headlights could be registered for road use is as illustrative as the safety standards at Le Mans of how different the world of the 1950s was.  In 2022, one of the Uhlenhaut coupés was sold at auction to an unknown buyer (presumed to be Middle Eastern) for US$142 million, becoming by some margin the world’s most expensive used car.

As a footnote (one to be noted only by the subset of word nerds who delight in the details of nomenclature), for decades it was said by many, even normally reliable sources, that SL stood for Sport Leicht (sport light) and the history of the Mercedes-Benz alphabet soup was such that it could have gone either way (the SSKL (1929) was the Super Sport Kurz (short) Leicht (light) and from the 1950s on, for the SL, even the factory variously used Sport Leicht and Super Leicht).  It was only in 2017 that it published a 1952 paper (unearthed from the corporate archive) confirming the correct abbreviation is Super Leicht.  Sport Leicht Rennsport (Sport Light Racing) seems to have been used for the SLRs because they were built as pure race cars, the W198 and later SLs being road cars, but there are references also to Super Leicht Rennsport.  By implication, that would suggest the original 300 SL (the 1951 W194) should have been a Sport Leicht because it was built only for competition and, given the relevant document dates from 1952, it must have been a reference to the W194, which is thus also a Sport Leicht.  Further to muddy the waters, in 1957 the factory prepared two lightweight cars based on the new 300 SL Roadster (1957-1963) for use in US road racing and these were (at the time) designated 300 SLS (Sport Leicht Sport), the occasional reference (in translation) to "Sports Light Special" not supported by any evidence.  The best quirk of the SLS tale however is that the machine which inspired the model was a one-off race-car built by Californian coachbuilder ("body-man" in the vernacular of the West Coast hot rod community) Chuck Porter (1915-1982).  Porter's SLS was built on the space-frame of a wrecked 300 SL gullwing (purchased for a reputed US$500) and followed the lines of the 300 SLR roadsters as closely as the W198 frame (taller than that of the W196S) allowed.
Although it was never an "official" designation, Porter referred to his creation as SL-S, the appended "S" standing for "scrap".      

The SLR and its antecedents.

A Uhlenhaut coupé and a 300 SLR of course appeared for the photo sessions when, in 2003, the factory staged the official release of the SLR McLaren and, to make explicit the link with the past, the phrase “gullwing doors” appeared in the press kit documents no fewer than seven times.  Presumably, journalists got the message but they weren’t fooled and the doors have always, correctly, been called “butterflies”.  Unlike the machines of the 1950s which were built with an aluminium skin atop a space-frame, the twenty-first century SLRs were a monocoque (engineers say the sometimes heard “monocoque shell” is tautological) of reinforced carbon fibre.  Although the dynamic qualities were acknowledged and it was, by all but the measure of hyper-cars, very fast indeed, the reception it has enjoyed has always been strangely muted, testers seeming to find the thing rather “soulless”.  That seemed to imply a lack of “character” which really suggests an absence of obvious flaws, the quirks and idiosyncrasies which can at once enrage and endear.

The nature of monocoque.

The monocoque construction offered one obvious advantage in that the inherent stiffness was such that the creation of the roadster version required few modifications, the integrity of the structure such that not even the absence of a roof compromised things.  Notably, the butterfly doors were able to be hinged along the windscreen (A) pillars, such was the rigidity offered by carbon fibre, a material which might almost have been invented for the monocoque.  McLaren would later use a variation of this idea when it released the McLaren MP4-12C (2011-2014), omitting the top hinge, which allowed the use of frameless windows even on the roadster (spider) version.

The SLR Speedster (right) was named the Stirling Moss edition and was a homage to the 300 SLR (left) which in the hands of Sir Stirling Moss (1929–2020) and navigator Denis Jenkinson (1920–1996), won the 1955 Mille Miglia (an event run on public roads in Italy over a distance of 1597 km (992 miles)) at an average speed of 157.65 km/h (97.96 mph).

However, the minimalist (though very expensive) Speedster had never been envisaged when the monocoque was designed and, to ensure structural integrity, changes had to be made to strengthen what would otherwise have become points of potential failure, the removed windscreen frame and assembly having previously contributed much to rigidity.  Door sills were raised (recalling the space frame which in 1951 had necessitated the adoption of the original gullwing doors on the first 300 SL (W194)) and cross-members were added across the cockpit, integrated with a pair of rollover protection bars.  Designed for speed, the Speedster eschewed niceties such as air-conditioning, an audio system, side windows and sound insulation; this was not a car for Paris Hilton.  All told, despite the additional bracing, the Speedster weighed 140 kg (310 lb) less than the coupé.  The supercharged 5.5 litre V8 was carried over from the earlier 722 edition but the reduction in frontal area added a little to top speed, now claimed to be 350 km/h (217 mph), although the factory did caution that above 160 km/h (100 mph) the dainty wind deflectors would no longer contain the wind and a crash helmet would be required; even if the lack of air-conditioning might have been overlooked, that alone would have been enough for Paris Hilton to cross the Speedster off her list: she wouldn't want "helmet hair".  Only 75 were built, none apparently ever driven, all spending their time on display or on the auction block, exchanged between collectors.

Wednesday, June 26, 2024

Mutation

Mutation (pronounced myoo-tey-shuhn)

(1) In biology (also as “break”), a sudden departure from the parent type in one or more heritable characteristics, caused by a change in a gene or a chromosome.

(2) In biology, (also as “sport”), an individual, species, or the like, resulting from such a departure.

(3) The act or process of mutating; change; alteration.

(4) A resultant change or alteration, as in form or nature.

(5) In phonetics (in or of Germanic languages), the umlaut (the assimilatory process whereby a vowel is pronounced more like a following vocoid that is separated by one or more consonants).

(6) In structural linguistics (in or of Celtic languages), syntactically determined morphophonemic phenomena that affect initial sounds of words (the phonetic change in certain initial consonants caused by a preceding word).

(7) An alternative word for “mutant”

(8) In cellular biology & genetics, a change in the chromosomes or genes of a cell which, if occurring in the gametes, can affect the structure and development of all or some of any resultant off-spring; any heritable change of the base-pair sequence of genetic material.

(9) A physical characteristic of an individual resulting from this type of chromosomal change.

(10) In law, the transfer of title of an asset in a register.

(11) In ornithology, one of the collective nouns for the thrush (the more common forms being “hermitage” & “rash”)

1325–1375: From the Middle English mutacioun & mutacion (action or process of changing), from the thirteenth century Old French mutacion and directly from the Latin mūtātion- (stem of mūtātiō) (a changing, alteration, a turn for the worse), noun of action from the past-participle stem of mutare (to change), from the primitive Indo-European root mei- (to change, go, move).  The construct can thus be understood as mutat(e) + -ion.  Dating from 1818, the verb mutate (to change state or condition, undergo change) was a back-formation from mutation.  It was first used in genetics to mean “undergo mutation” in 1913.  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  The use in genetics in the sense of “process whereby heritable changes in DNA arise” dates from 1894 (although the term "DNA" (deoxyribonucleic acid) wasn't used until 1938, the existence of the structure (though not its structural detail) was first documented in 1869 after the identification of nuclein).  In linguistics, the term “i-mutation” was first used in 1874, following the earlier German form “i-umlaut”, the equivalent in English being “mutation”.  The noun mutagen (agent that causes mutation) was coined in 1946, the construct being muta(tion) + -gen.  The –gen suffix was from the French -gène, from the Ancient Greek -γενής (-genḗs).  It was appended to create a word meaning “a producer of something, or an agent in the production of something” and is familiar in the names of the chemical elements hydrogen, nitrogen, and oxygen.  From mutagen came the derived forms mutagenic, mutagenesis & mutagenize.
Mutation, mutationist, mutationism & mutability are nouns, mutable & mutant are nouns & adjectives, mutated & mutating are verbs & adjectives, mutational & mutationistic are adjectives and mutationally is an adverb; the noun plural is mutations.  For whatever reasons, the adverb mutationistically seems not to exist.

In scientific use the standard abbreviation is mutat and forms such as nonmutation, remutation & unmutational (used both hyphenated and not) are created as required and there is even demutation (used in computer modeling).  In technical use, the number of derived forms is vast, some of which seem to enjoy some functional overlap although in fields like genetics and cellular biology, the need for distinction between fine details of process or consequence presumably is such that the proliferation may continue.  In science and linguistics, the derived forms (used both hyphenated and not) include animutation, antimutation, backmutation, e-mutation, ectomutation, endomutation, epimutation, extramutation, frameshift mutation, hard mutation, heteromutation, homomutation, hypermutation, hypomutation, i-mutation, intermutation, intramutation, intromutation, macromutation, macromutational, megamutation, mesomutation, micromutation, missense mutation, mixed mutation, multimutation, mutationless, mutation pressure, nasal mutation, neomutation, nonsense mutation, oncomutation, paramutation, pentamutation, phosphomutation, point mutation, postmutation, premutation, radiomutation, retromutation, soft mutation, spirant mutation, stem mutation, stereomutation, ultramutation & vowel mutation.

Ginger, copper, auburn & chestnut are variations on the theme of red-headedness: Ranga Lindsay Lohan demonstrates the possibilities.

Red hair is the result of a mutation in the melanocortin 1 receptor (MC1R) gene responsible for producing the MC1R protein which plays a crucial role also in determining skin-tone. When the MC1R gene is functioning normally, it helps produce eumelanin, a type of melanin that gives hair a dark color.  However, a certain mutation in the MC1R gene leads to the production of pheomelanin which results in red hair.  Individuals with two copies of the mutated MC1R gene (one from each parent) typically have red hair, fair skin, and a higher sensitivity to ultraviolet (UV) light, a genetic variation found most often in those of northern & western European descent.

A mutation is a change in the structure of the genes or chromosomes of an organism and mutations occurring in the reproductive cells (such as an egg or sperm) can be passed from one generation to the next.  It appears most mutations occur in “junk DNA” and the orthodox view is these generally have no discernible effects on the survivability of an organism.  The term junk DNA was coined to describe those portions of an organism's DNA which do not encode proteins and were thought to have no functional purpose (although some may historically have had one).  The large volume of these “non-coding regions” surprised researchers when the numbers emerged because the early theories had predicted they would comprise a much smaller percentage of the genome.  The term junk DNA was intentionally dismissive and reflected the not unreasonable assumption that the apparently redundant sequences were mere evolutionary “leftovers” without an extant biological function of any significance.

However, as advances in computing power have enabled the genome to be explored further, it’s been revealed that many of these non-coding regions do fulfil some purpose including: (1) A regulatory function: the regulation of gene expression, influencing when, where and how genes are turned on or off; (2) As superstructure: some regions contribute to the structural integrity of chromosomes (notably telomeres and centromeres); (3) In RNA (ribonucleic acid) molecules: some non-coding DNA is transcribed into non-coding RNA molecules (such as microRNAs and long non-coding RNAs) which are involved in various cellular processes; (4) Genomic stability: it’s now clear there are non-coding regions which contribute to the maintenance of genomic stability and the protection of genetic information.  Despite recent advances, the term junk DNA is still in use in mapping but is certainly misleading for those not immersed in the science; other than in slang, in academic use and technical papers “non-coding DNA” seems now the preferred term and, where specific functions have become known, these regions are described accordingly.

There’s also now some doubt about the early assumptions that, of the remaining mutations, the majority have harmful effects and only a minority operate to increase an organism's ability to survive, something of some significance because a mutation which benefits a species may evolve by means of natural selection into a trait shared by some or all members of the species.  However, there have been suggestions the orthodox view was (at least to an extent) influenced by the slanting of the research effort towards diseases, syndromes and other undesirable conditions and that an “identification bias” may thus have emerged.  So the state of the science now is that there are harmful & harmless mutations but there are also mutations which may appear to have no substantive effect yet may come to be understood as significant, an idea which was explored in an attempt to understand why some people found to be infected with a high viral load of SARS-CoV-2 (the virus causing Covid-19) remained asymptomatic.

In genetics, a mutation is a change in the DNA sequence of an organism; mutations can occur in any part of the DNA and can vary in size and type.  Most are associated with errors during DNA replication, but mutations can also be a consequence of viral infection or of exposure to certain chemicals or radiation.  The classification of mutations has in recent years been refined into four categories:

(1) By the Effect on DNA Sequence:  These are listed as Point Mutations which are changes in a single nucleotide and include (1.1) Substitutions in which one base pair is replaced by another, (1.2) Insertions which describe the addition of one or more nucleotide pairs and (1.3) Deletions, the removal of one or more nucleotide pairs.

(2) By the Effect on Protein Sequence: These are listed as: (2.1) Silent Mutations which do not change the amino acid sequence of the protein, (2.2) Missense Mutations which change one amino acid in the protein, potentially affecting its function, (2.3) Nonsense Mutations which create a premature stop codon, leading to a truncated and usually non-functional protein and (2.4) Frameshift Mutations which result from insertions or deletions that change the reading frame of the gene, often leading to a completely different and non-functional protein.

(3) By the Effect on Phenotype: These are listed as (3.1) Beneficial Mutations which provide some advantage to the organism, (3.2) Neutral Mutations which have no apparent significant effect on the organism's fitness and (3.3) Deleterious Mutations which are harmful to the organism and can cause diseases or other problems.

(4) By the Mechanism of Mutation: These are listed as (4.1) Spontaneous Mutations which occur naturally without any external influence, due often to errors in DNA replication and (4.2) Induced Mutations which result from exposure to mutagens (environmental factors such as chemicals or radiation that can cause changes in DNA).
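The distinctions in categories (1) and (2) above can be illustrated in a few lines of code.  The following is a minimal sketch only (the function names are invented for illustration and the codon table covers just the handful of codons the example needs, not the full standard genetic code): it classifies a single-codon substitution as silent, missense or nonsense, and an insertion or deletion as in-frame or frameshift.

```python
# A minimal illustrative sketch, not a bioinformatics library.
# The table below is a tiny subset of the standard genetic code,
# containing only the codons this example uses.
CODON_TABLE = {
    "GAA": "Glu", "GAG": "Glu",   # both encode glutamic acid
    "GTA": "Val",                  # valine
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def classify_point_mutation(ref_codon: str, alt_codon: str) -> str:
    """Classify a codon substitution by its effect on the protein."""
    ref_aa = CODON_TABLE[ref_codon]
    alt_aa = CODON_TABLE[alt_codon]
    if alt_aa == "STOP":
        return "nonsense"   # premature stop codon truncates the protein
    if alt_aa == ref_aa:
        return "silent"     # amino acid unchanged
    return "missense"       # one amino acid replaced by another

def classify_indel(changed_bases: int) -> str:
    """An insertion/deletion shifts the reading frame unless its
    length is a multiple of three (one codon = three bases)."""
    return "in-frame indel" if changed_bases % 3 == 0 else "frameshift"

print(classify_point_mutation("GAA", "GAG"))  # silent: both codons encode Glu
print(classify_point_mutation("GAA", "GTA"))  # missense: Glu becomes Val
print(classify_point_mutation("GAA", "TAA"))  # nonsense: premature stop
print(classify_indel(2))                      # frameshift: 2 is not a multiple of 3
```

The in-frame/frameshift rule is why the insertion or deletion of a single base is usually far more disruptive than the substitution of one: every codon downstream of the change is re-read.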

Because of the association with disease, genetic disorders and disruptions to normal biological functions, in the popular imagination mutations are thought undesirable.  They are however a crucial part of the evolutionary process and life on this planet as it now exists would not be possible without the constant process of mutation which has provided the essential genetic diversity within populations and has driven the adaptation and evolution of species.  Although it will probably never be known if life on earth started and died out before beginning the evolutionary chain which endures to this day, as far as is known, everything now alive (and, empirically, that means in the entire universe) ultimately has a single common ancestor.  Mutations have played a part in the diversity which followed and, of all the species which have ever inhabited earth, only a tiny fraction remain, the rest extinct.

Nuclear-induced mutations

Especially since the first A-bombs were used in 1945, the idea of “mutant humans” created by the fallout from nuclear war or power-plants suffering a meltdown has been a staple for writers of science fiction (SF) and producers of horror movies, the special-effects and CGI (computer-generated imagery) crews ever imaginative in their work.  The fictional works are disturbing even though radiation-induced human mutations are not common, because radiation genuinely can cause changes in DNA, leading to mutations, and a number of factors determine the likelihood and extent of damage.  The two significant types of radiation are: (1) ionizing radiation, which includes X-rays, gamma rays, and particles such as alpha and beta particles.  Ionizing radiation has enough energy to remove tightly bound electrons from atoms, creating ions, and can directly damage DNA or create reactive oxygen species that cause indirect damage; in high doses, ionizing radiation can increase the risk of cancer and genetic mutations.  (2) Non-ionizing radiation, which includes ultraviolet (UV) light, visible light, microwaves, and radiofrequency radiation.  Because this does not possess sufficient energy to ionize atoms or molecules, although there is a risk of damage to DNA (seen most typically in some types of skin cancer), the risk of deep genetic mutations is much lower than that of ionizing radiation.  The factors influencing the extent of damage include the dose, the duration of exposure, the cell type(s) affected, a greater or lesser genetic predisposition and age.

Peter Dutton (b 1970; leader of the opposition and leader of the Australian Liberal Party since May 2022) announces the Liberal Party's new policy advocating the construction of multiple nuclear power-plants in Australia.

The prosthetic used in the digitally-altered image (right) was a discarded proposal for the depiction of Lord Voldemort in the first film version of JK Rowling's (b 1965) series of Harry Potter children's fantasy novels; it used a Janus-like two-faced head.  It's an urban myth Mr Dutton auditioned for the part when the first film was being cast but was rejected as being "too scary".  If ever there's another film, the producers might reconsider and should his career in politics end (God forbid), he could bring to Voldemort the sense of menacing evil the character has never quite achieved.  Interestingly, despite many opportunities, Mr Dutton has never denied being a Freemason.

On paper, while not without challenges, Australia does enjoy certain advantages in making nuclear part of the energy mix: (1) with abundant potential further to develop wind and solar generation, the nuclear plants would need only to provide the baseload power required when renewable sources were either inadequate or unavailable; (2) the country would be self-sufficient in raw uranium ore (although it has no enrichment capacity) and (3) the place is vast and geologically stable so in a rational world it would be nominated as the planet's repository of spent nuclear fuel and other waste.  The debate as it unfolds is likely to focus on other matters and nobody imagines any such plant can in the West be functioning in less than twenty-odd years (the Chinese Communist Party (CCP) gets things done much more quickly) so there's plenty of time to squabble and plenty of people anxious to join in this latest theatre of the culture wars.  Even National Party grandee Barnaby Joyce (b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022) has with alacrity become a champion of all things nuclear (electricity, submarines and probably bombs although, publicly, he seems not to have discussed the latter).  The National Party has never approved of solar panels and wind turbines because they associate them with feminism, seed-eating vegans, homosexuals and other symbols of all which is wrong with modern society.  While in his coal-black heart Mr Joyce's world view probably remains as antediluvian as ever, he can sniff the political wind in a country now beset by wildfires, floods and heatwaves and talks less of the beauty of burning fossil fuels.
Still, in the wake of Mr Dutton's announcement, conspiracy theorists have been trying to make Mr Joyce feel better, suggesting the whole thing is just a piece of subterfuge designed to put a spanner in the works of the transition to renewable energy generation, the idea being to protect the financial positions of those who make much from fossil fuels, these folks being generous donors to party funds and employers of "helpful" retired politicians in lucrative and undemanding roles.

Tuesday, June 25, 2024

Chair

Chair (pronounced cherr)

(1) A seat, especially if designed for one person, usually with four legs (though other designs are not uncommon) for support and a rest for the back, sometimes with rests for the arms (as distinct from a sofa, stool, bench etc).

(2) Something which serves as a chair or provides chair-like support (often used of specialized medical devices) and coined as required (chairlift, sedan chair, wheelchair etc).

(3) A seat of office or authority; a position of authority such as a judge.

(4) In academic use, a descriptor of a professorship.

(5) The person occupying a seat of office, especially the chairperson (the nominally gendered term “chairman” sometimes still used, even of female or non-defined chairs).

(6) In an orchestra, the position of a player, assigned by rank (1st chair, 2nd chair etc).

(7) In informal use, an ellipsis of electric chair (often in the phrase “Got the chair” (ie received a death sentence)).

(8) In structural engineering, the device used in reinforced-concrete construction to maintain the position of reinforcing rods or strands during the pouring operation.

(9) In glass-blowing, a glassmaker's bench having extended arms on which a blowpipe is rolled in shaping glass.

(10) In railroad construction, a metal block for supporting a rail and securing it to a crosstie or the like (mostly UK).

(11) To place or seat in a chair.

(12) To install in office.

(13) To preside over a committee, board, tribunal etc or some ad hoc gathering; to act as a chairperson.

(14) To carry someone aloft in a sitting position after a triumph or great achievement (mostly UK and performed after victories in sport).

(15) In chemistry, one of two possible conformers of cyclohexane rings (the other being boat), shaped roughly like a chair.

(16) A vehicle for one person; either a sedan chair borne upon poles, or a two-wheeled carriage drawn by one horse (also called a gig) (now rare).

(17) To award a chair to the winning poet at an eisteddfod (exclusive to Wales).

1250-1300: From the Middle English chayer, chaire, chaiere, chaere, chayre & chayere, from the Old French chaiere & chaere (chair, seat, throne), from the Latin cathedra (seat), from the Ancient Greek καθέδρα (kathédra), the construct being κατά (katá) (down) + ἕδρα (hédra) (seat).  It displaced the native stool and settle, which shifted to specific meanings.  The twelfth century modern French chaire (pulpit, throne) narrowed in meaning in the sixteenth century when the item of furniture came to be known as a chaise (chair).  Chair is a noun & verb and chaired & chairing are verbs; the noun plural is chairs.

The figurative sense of "seat of office or authority" emerged at the turn of the fourteenth century and originally was used of professors & bishops (there once being rather more overlap between universities and the Church).  That use persisted despite the structural changes in both institutions but it wasn’t until 1816 the meaning “office of a professor” was extended from the mid-fifteenth century sense of the literal seat from which a professor conducted his lectures.  Borrowing from academic practice, the general sense of “seat of a person presiding at a meeting” emerged during the 1640s and from this developed the idea of a chairman, although the earliest use of the verb form “to chair a meeting” appears as late as 1921.  Although sometimes cited as indicative of the “top-down” approach taken by second-wave feminism because it was in the 1980s that the term first attained general currency, chairwoman (woman who leads a formal meeting) had actually been in use since 1699, a coining apparently thought needed for mere descriptive accuracy rather than an early shot in the culture wars; chairman (occupier of a chair of authority) had been in use since the 1650s and by circa 1730 it had gained the familiar meaning “member of a corporate body appointed to preside at meetings of boards or other supervisory bodies”.  By the 1970s however, the culture wars had started and the once innocuous “chairwoman” was to some controversial, as was the gender-neutral alternative “chairperson” which seems first to have appeared in 1971.  Now, most seem to have settled on “chair”, which seems unobjectionable although presumably, linguistic structuralists could claim it’s a clipping of (and therefore implies) “chairman”.

Chairbox offers a range of “last shift” coffin-themed chairs, said to be ideal for those "stuck in a dead-end job, sitting on a chair in a cubicle".  The available finishes include walnut (left) and for those who enjoy being reminded of cremation, charcoal wood can be used for the seating area (right).  An indicative list price is Stg£8300 (US$10,400) for a Last Shift trimmed in velvet.

The slang use as a short form of electric chair dates from 1900 and was used to refer both to the physical device and the capital sentence.  In interior decorating, the chair-rail was a timber molding fastened to a wall at such a height as would prevent the wall being damaged by the backs of chairs.  First documented in 1822, chair rails are now made also from synthetic materials.  The noun wheelchair (also wheel-chair) dates from circa 1700, and one so confined is said sometimes to be “chair bound”.  The high-chair (an infant’s seat designed to make feeding easier) had probably been improvised for centuries but was first advertised in 1848.  The term easy chair (a chair designed especially for comfort) dates from 1707.  The armchair (also arm-chair), a "chair with rests for the elbows", although a design of long-standing, was first so-described in the 1630s and the name outlasted the contemporary alternative (elbow-chair).  The adjectival sense, in reference to “criticism of matters in which the critic takes no active part” (armchair critic, armchair general etc) dates from 1879.  In academic use, although in the English-speaking world the use of “professor” seems gradually to be changing to align with US practice, the term “chair” continues in its traditional forms: There are chairs (established professorships), named chairs (which can be ancient or more recent creations which acknowledge the individual, family or institution providing the endowment which funds the position), personal chairs (whereby the title professor (in some form) is conferred on an individual although no established position exists), honorary chairs (unpaid appointments) and even temporary chairs (which means whatever the institution from time-to-time says it means).

In universities, the term “named chair” refers usually to a professorship endowed with funds from a donor, typically bearing the name of the donor or whatever title they nominate and the institution agrees is appropriate.  On rare occasions, named chairs have been created to honor an academic figure of great distinction (usually someone with a strong connection with the institution) but more often the system exists to encourage endowments which provide financial support for the chair holder's salary, research, and other academic activities.  For a donor, it’s a matter both of legacy & philanthropy in that a named chair is one of the more subtle and potentially respectable forms of public relations and a way to contribute to teaching & research in a field of some interest or with a previous association.

Professor Michael Simons (official photograph issued by Yale University's School of Medicine).

So it can be a win-win situation but institutions do need to practice due diligence when naming or making appointments to named chairs, as a long-running matter at Yale University demonstrates.  In 2013, an enquiry convened by Yale found Professor Michael Simons (b 1957) guilty of sexual harassment and suspended him as Chief of Cardiology at the School of Medicine.  Five years on, the professor accused Yale of “punishing him again” for the same conduct in a gender-discriminatory effort to appease campus supporters of the #MeToo movement which had achieved national prominence.  That complaint was prompted when Professor Simons was in 2018 appointed to, and then asked to resign from, a named chair, the Robert W Berliner Professor of Medicine, endowed by an annual grant of US$500,000 from the family of renal physiologist Robert Berliner (1915-2002).  Professor Simons took his case to court and early in 2024 a federal court ruled in his favour, permitting him to move to trial; Yale’s motion for summary judgment was denied in part, the judge finding it appropriate that two of his complaints (one alleging gender discrimination in violation of Title VII of the Civil Rights Act (1964) and one under Title IX of the Education Amendments Act (1972)) should be heard before a jury.  The trial judge noted in his judgment that there appeared to have been a denial of due process in 2018 and that this happened at a time when (as was not disputed) Yale was “the subject of news reports criticizing its decision to reward a sexual harasser with an endowed chair”.

What the documents presented in Federal court revealed was that Yale’s handling of the matter had, even within the institution, not been without criticism.  In 2013 the University-Wide Committee on Sexual Misconduct found the professor guilty of sexual harassment and he was suspended (but not removed) as chief of cardiology at the School of Medicine.  Internal documents subsequently leaked to the New York Times (NYT) revealed there were 18 faculty members dissatisfied with that outcome and a week after the NYT sought comment from Yale, it was announced Simons would be removed from the position entirely; in November 2014, the paper reported that Yale had also removed him from his position as director of its Cardiovascular Research Center.  Simons alleges that these two additional actions were taken in response to public reaction to the stories published by the NYT but the university disputed that, arguing the subsequent moves were pursuant to the findings of an internal “360 review” of his job performance.  In 2018, Simons was asked to relinquish the Berliner chair on the basis he would be appointed instead to another endowed chair.  According to the documents Simons filed in Federal Court, this request came after “one or more persons … sympathetic to the #MeToo movement” contacted the Berliner family encouraging them to demand that the University remove Simons from the professorship, prompting Yale, “fearing a backlash from the #MeToo activists and hoping to placate them,” to begin exploring his removal from the chair.

School of Medicine, Yale University, New Haven, Connecticut, USA.

Later in 2018, Simons was duly appointed to another named chair, prompting faculty members, students and alumni to send an open letter to Yale’s president expressing “disgust and disappointment” at the appointment.  The president responded with a formal notice to Simons informing him he had 24 hours to resign from the chair, and Simons also alleges he was told by the president of “concerns” the institution had about the public criticism.  In October 2019, Simons filed suit against Yale (and a number of individuals) on seven counts: breach of contract, breach of the implied warranty of fair dealing, wrongful discharge, negligent infliction of emotional distress, breach of privacy, and discrimination on the basis of gender under Title VII of the Civil Rights Act of 1964 and Title IX of the Education Amendments of 1972.  Three of these (wrongful discharge, negligent infliction of emotional distress and breach of privacy) were in 2020 struck out in Federal Court and this was the point at which Yale sought summary judgment on the remainder.  This was partially granted but the judge held that the matter of gender discrimination in violation of Title VII and Title IX needed to be decided by a jury.  A trial date has not yet been set but the case will be followed with some interest.  While all cases are decided on the facts presented, it’s expected the matter may be an indication of the current state of the relative strength of “black letter law” versus “prevailing community expectations”.

Personal chair: Lindsay Lohan adorning a chair.

The Roman Catholic Church’s dogma of papal infallibility holds that a pope’s rulings on matters of faith and doctrine are infallibly correct and cannot be questioned.  When making such statements, a pope is said to be speaking ex cathedra (literally “from the chair” (of the Apostle St Peter, the first pope)).  Although ex cathedra pronouncements had been issued since medieval times, as a point of canon law, the doctrine was codified first at the First Ecumenical Council of the Vatican (Vatican I; 1869–1870) in the document Pastor aeternus (shepherd forever).  Since Vatican I, the only ex cathedra decree has been Munificentissimus Deus (The most bountiful God), issued by Pius XII (1876–1958; pope 1939-1958) in 1950, in which was declared the dogma of the Assumption; that the Virgin Mary "having completed the course of her earthly life, was assumed body and soul into heavenly glory".  Pius XII never made explicit whether the assumption preceded or followed earthly death, a point no pope has since discussed although it would seem of some theological significance.  Prior to the solemn definition of 1870, there had been decrees issued ex cathedra.  In Ineffabilis Deus (Ineffable God (1854)), Pius IX (1792–1878; pope 1846-1878) defined the dogma of the Immaculate Conception of the Blessed Virgin Mary, an important point because of the theological necessity of Christ being born free of sin, a notion built upon by later theologians as the perpetual virginity of Mary.  It asserts that Mary was "always a virgin, before, during and after the birth of Jesus Christ", explaining the biblical references to brothers of Jesus either as children of Joseph from a previous marriage, cousins of Jesus, or just folk closely associated with the Holy Family.

Technically, papal infallibility may have been invoked only the once since codification but since the early post-war years, pontiffs have found ways to achieve the same effect, John Paul II (1920–2005; pope 1978-2005) & Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022) both adept at using what was in effect a personal decree, a power available to one who sits at the apex of what is in constitutional terms an absolute theocracy.  Critics have called this phenomenon "creeping infallibility" and its intellectual underpinnings owe much to the tireless efforts of Benedict XVI while he was head of the Inquisition (by then called the Congregation for the Doctrine of the Faith (CDF) and now renamed the Dicastery for the Doctrine of the Faith (DDF)) during the late twentieth century.  The Holy See probably doesn't care but DDF is also the acronym, inter alia, for "drug & disease free" and (in gaming) "Doom definition file" and there's also the DDF Network which is an aggregator of pornography content.

The “chair” photo (1963) of Christine Keeler (1942-2017) by Hong Kong Chinese photographer Lewis Morley (1925-2013) (left) and Joanne Whalley-Kilmer (b 1961) in Scandal (1989, a Harvey Weinstein (b 1952) production) (centre).  The motif was reprised by Taiwanese-American photographer Yu Tsai (b 1975) in his sessions for the Lindsay Lohan Playboy photo-shoot; it was used for the cover of the magazine’s January/February 2012 issue (right).  Ms Lohan wore shoes for some of the shoot but these were still "nudes" because "shoes don't count"; everybody knows that. 

The Profumo affair was one of those fits of morality which from time-to-time would afflict English society in the twentieth century and was a marvellous mix of class, sex, spying & money, all things which make an already good scandal especially juicy.  The famous image of model Christine Keeler, nude and artfully positioned sitting backwards on an unexceptional (actually a knock-off) plywood chair, was taken in May 1963 at the height of the moral panic over the disclosure Ms Keeler simultaneously was enjoying the affection of both a member of the British cabinet and a Soviet spy.  John Profumo (1915-2006) was the UK’s Minister for War (the UK cabinet retained the position until 1964 although it was disestablished in the US in 1947) who, then 46, was found to be conducting an adulterous affair with the then 19 year old topless model at the same time she (presumably as her obviously crowded schedule permitted) fitted in trysts with a KGB agent attached to the Soviet embassy under the cover of naval attaché.  Although there are to this day differing interpretations of the scandal, there have never been any doubts this potential Cold-War conduit between Moscow and Her Majesty’s Secretary of State for War represented at least a potential conflict of interest.  The fallout from the scandal ended Profumo’s political career, contributed to the fall of Harold Macmillan’s (1894–1986; UK prime-minister 1957-1963) government and was one of a number of factors in the social changes which marked English society in the 1960s.

Commercially & technically, photography then was a different business and the “chair” image was the last shot on a 12-exposure film, all taken in less than five minutes at the end of a session which hurriedly had been arranged because Ms Keeler had signed a contract which included a “nudity” clause for photos to be used as “publicity stills” for a proposed film about the scandal.  As things turned out, the film was never released (not until Scandal (1989) would one appear) but the photograph was leaked to the tabloid press, becoming one of the more famous of the era although later feminist critiques would deconstruct the issues of exploitation they claimed were inherent.  Playboy’s editors would not be unaware of the criticism but the use of a chair to render a nude image SFW (suitable for work) remains in the SOP (standard operating procedures) manual.

Contact sheet from photoshoot, Victoria and Albert (V&A) Museum: exhibit E.2830-2016.

Before the “nude” part which concluded the session, two rolls of film had already been shot with the subject sitting in various positions (on the chair and the floor) while “wearing” a small leather jerkin.  At that point the film’s producers mentioned the “nude” clause.  Ms Keeler wasn’t enthusiastic but the producers insisted so all except subject and photographer left the room and the last roll was shot, some of the earlier poses reprised while others were staged, the last, taken with the camera a little further away with the subject in what Mr Morley described as “a perfect positioning”, was the “chair” shot.

The “Keeler Chair” (left) and an Arne Jacobsen Model 3107 (right).

Both the chair & the gelatin-silver print of the photograph are now in the collections of London’s Victoria and Albert (V&A) Museum (the photograph exhibit E.2-2002; the chair W.10-2013).  Although often wrongly identified as a Model 3107 (1955) by Danish modernist architect & furniture designer Arne Jacobsen (1902-1971), it’s actually an example of one of a number of inexpensive knock-offs produced in the era.  Mr Morley in 1962 bought six (at five shillings (50c) apiece) for his studio and it’s believed his were made in Denmark although the identity of the designer and manufacturer is unknown.  Unlike a genuine 3107, the knock-off has a handle cut-out (in a shape close to a regular trapezoid) high on the back, an addition both functional and a ploy typical of those used by knock-off producers seeking to evade accusations of copyright violation.  Structurally, a 3107 uses a thinner grade of plywood and a more subtle molding.  The half-dozen chairs in Mr Morley’s studio were mostly unnoticed office furniture until Ms Keeler lent one its infamy although they did appear in others of his shoots, including those from his session with television personality & interviewer Sir David Frost (1939–2013), and it’s claimed the same chair was used for both.  In London’s second-hand shops it’s still common to see the knock-offs (there were many) described as “Keeler” chairs and Ms Lohan’s Playboy shoot was one of many in which the motif has been used; it was the obvious choice of pose for Joanne Whalley-Kilmer’s promotional shots for the 1989 film in which she played Ms Keeler and was used also for the covers of the DVD & Blu-ray releases.

Old Smoky, the electric chair once used in the Tennessee Prison System, Alcatraz East Crime Museum.  "Old Sparky" seems to be the preferred modern term.

Crooked Hillary Clinton in pantsuit.

Although the numbers did bounce around a little, polling by politico.com found that typically about half of Republican voters believe crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) should be locked up while fewer than 2% think she should “get the chair”, apparently on the basis of her being guilty of something, although some might just find her “really annoying” and take the pragmatic view a death sentence would remove at least that problem from their lives.  The term “electric chair” is most associated with the device used for executions but is also common slang for other machinery including electric wheelchairs and powered (heated, cooled or moving) seats and chairs of many types.  First used in the US during the 1890s, like the guillotine, the electric chair was designed as a more humane (ie faster) method of execution compared with the then common hanging, where death could take minutes.  Now rarely used (and in some cases declared unconstitutional as a “cruel & unusual punishment”), technically it remains available in some US states, including as an option the condemned may choose in preference to lethal injection.

Electric Chair Suite (1971) screen print decology by Andy Warhol.

Based on a newspaper photograph (published in 1953) of the death chamber at Sing Sing Prison in New York, where US citizens Julius (1918-1953) & Ethel Rosenberg (1915-1953) were that year executed as spies, Andy Warhol (1928–1987) produced a number of versions of Electric Chair, part of the artist’s Death and Disaster series which, beginning in 1963, depicted imagery such as car crashes, suicides and urban unrest.  The series was among the many which exploited his technique of transferring a photograph in glue onto silk, a method which meant each varied in some slight way.  His interest was two-fold: (1) what is the effect on the audience of rendering the same image with variations and (2) if truly gruesome pictures repeatedly are displayed, is the effect one of reinforcement or desensitization?  His second question was later revisited as the gratuitous repetition of disturbing images became more common as the substantially unmediated internet achieved critical mass.  The first of the Electric Chair works was created in 1964.