
Friday, July 18, 2025

Mural

Mural (pronounced myoor-uhl)

(1) A large picture painted or affixed directly on a wall or ceiling.

(2) A greatly enlarged photograph attached directly to a wall.

(3) A wallpaper pattern representing a landscape or the like, often with very widely spaced repeats so as to produce the effect of a mural painting on a wall of average size; sometimes created as a trompe l'oeil (“deceives the eye”).

(4) Of, relating to, or resembling a wall.

(5) Executed on or affixed to a wall.

(6) In early astronomy, pertaining to any of several astronomical instruments that were affixed to a wall aligned on the plane of a meridian; formerly used to measure the altitude of celestial bodies.

1400–1450: From the late Middle English mural, from the Latin mūrālis (of or pertaining to a wall), the construct being mūr(us) (wall) + -ālis (the Latin suffix added to a noun or numeral to form an adjective of relationship; alternative forms were -āris, -ēlis, -īlis & -ūlis).  The Latin mūrālis was from the Old Latin moiros & moerus, from the primitive Indo-European root mei- (to fix; to build fences or fortifications) from which Old English picked up mære (boundary, border, landmark) and Old Norse gained mæri (boundary, border-land).  In the historic record, the most familiar Latin form was probably munire (to fortify, protect).  The sense of "a painting on a wall" seems to have emerged as late as 1915 as a clipping of "mural-painting" (a painting executed upon the wall of a building), a term in use since at least 1850 and derived from mural in its adjectival form.

The adjective intermural (between walls) dates from the 1650s, from the Latin intermuralis (situated between walls), the construct being inter- (between) + muralis (pertaining to a wall) from mūrus (wall).  The adjective intramural (within the walls (of a city, building etc)) dates from 1846, the construct being intra- (within) + muralis (pertaining to a wall) from mūrus (wall); it was equivalent to the Late Latin intramuranus and in English was used originally in reference to burials of the dead.  It first came to be used in relation to university matters by Columbia in 1871.  Mural is a noun, verb & adjective; muraled is a verb & adjective, muralist & muralism are nouns and muraling is a verb; the noun plural is murals.  The adjectives murallike, muralish & muralesque are non-standard and the adverb murally is unrelated, being a term from heraldry meaning “with a mural crown” and used mostly in the technical terms “murally crowned” & “murally gorged”.  A mural crown was a crown or headpiece representing city walls or towers and was used as a military decoration in Ancient Rome and later as a symbol in European heraldry; its most common representation was as a shape recalling the alternating merlons (raised structures extending the wall) atop a castle’s turret which provided defensive positions through which archers could fire.  The style remains familiar in some of the turrets which sometimes appear on the more extravagant McMansions and in the chess piece properly called the rook but also referred to as a castle.

Lindsay Lohan murals in the style of street art (graffiti): In hijab (al-amira) with kebab roll by an unknown street artist, Melbourne, Australia (left), the photograph the artist used as template (centre) and in a green theme in the Welcome to Venice mural by UK-born Californian street artist Jules Muck (b 1978) (right).  While a resident of Venice Beach, Ms Lohan lived next door to former special friend DJ Samantha Ronson (b 1977).

In multi-cultural Australia, the kebab roll has become a fixture in the fast-food scene with variations extending from vegan to pure meat, the term “kebab” something of a generic, meaning whatever the vendor decides it means.  Cross-culturally the kebab roll fills a particular niche as the standard 3 am snack enjoyed by those leaving night clubs, a place and time at which appetites are heightened.  After midnight, many kebab rolls are sold by street vendors from mobile carts and those in the Middle East will not be surprised to learn barbaric Australians sometimes add pineapple to their roll.  The photograph of Ms Lohan in hijab was taken during a “doorstop” (an informal press conference) after her visit in October 2016 to Gaziantep (known to locals as Antep), a city in the Republic of Türkiye’s south-eastern Anatolia Region.  The purpose of the visit was to meet with Syrian refugees being housed in Gaziantep’s Nizip district and the floral hijab was a gift from one of the residents who presumably assisted with the placement because there’s an art to a well-worn al-amira.  Ms Muck’s work was a gesture to welcome Ms Lohan on her move from Hollywood to Venice Beach and the use of green is a theme in many of her works.  Unfortunately, Ms Lohan’s time in Venice Beach was brief because she was compelled to return to New York City after being stalked by the Freemasons.

Mural montage: Donald Trump (b 1946; US president 2017-2021 and since 2025) osculating with Mr Putin (Vladimir Vladimirovich Putin; b 1952; president or prime minister of Russia since 1999), Benjamin Netanyahu (b 1949; Israeli prime minister 1996-1999, 2009-2021 and since 2022), Boris Johnson (b 1964; UK prime-minister 2019-2022), Francis (1936-2025; pope 2013-2025) and “Lyin’ Ted” Cruz (b 1970; US senator (Republican-Texas) since 2013).

Probably not long after the charcoal and ochre of the first cave paintings was seen by someone other than the artist, there emerged the calling of “art critic” and while the most common fork of that well-populated profession focuses on the aesthetic, art has also long been political.  The mural of course has much scope to be controversial because murals tend to be (1) big and (2) installed in public spaces, both aspects making the things highly visible.  Unlike a conventionally sized painting which, even if large, a curator can hang in some obscure spot or put into storage, the mural is just where it is and often part of the built environment; there it will be seen.  In art history, few murals have more intriguing tales than Michelangelo’s (Michelangelo di Lodovico Buonarroti Simoni; 1475–1564) ceiling and frescos (1508-1512) in the Vatican’s Sistine Chapel and although at the time of the commissioning and completion there were few theological or political squabbles, there were the Vatican’s usual personal and institutional tensions, cardinals and bishops with their own agendas (some financial) peeking and poking into why Julius II (1443–1513; pope 1503-1513) had handed the juicy contract to someone thought primarily a sculptor rather than a painter.

Sistine Chapel, The Vatican, Rome.

The political stoush came later.  At the time, the nudity had been noted and while some voices were raised in opposition, there was no attempt to censor the work because during the High Renaissance, depictions of nudity (on canvas, in marble etc) were all around, including in the Vatican, but decades later, during the sittings of the Council of Trent (1545–1563), critiques of “nakedness” in art became more vocal.  That was especially the case after the Counter-Reformation (circa 1550–circa 1670) produced a more severe Church, a development with many repercussions, one of which was the “fig-leaf campaign” in which an artist was commissioned to paint over (especially male) genitalia, the traditional “fig leaf” the preferred device.  Perhaps curiously, despite the early appearance of the motif in the art of Christendom, for centuries the fig leaf wasn’t “obligatory” although fig leaves appear often enough that at times they must have been at least “desirable” and in other periods and places clearly “essential”.  The later infamous “Fig Leaf Campaign” was initiated by Pope Paul IV (1476–1559; pope 1555-1559) and continued by his successors although it was most associated with the ruling against “lasciviousness” in religious art made in 1563 by the Council of Trent.  It was something very much in the spirit of the Counter-Reformation and it was Pius IV (1499–1565; pope 1559-1565) who commissioned artist Daniele da Volterra (circa 1509–1566) to paint over the genitalia Michelangelo had depicted in The Last Judgment on the chapel’s altar wall, extending his repertoire from strategically positioned leaves to artfully placed draperies or loincloths; to his dying day Romans nicknamed Volterra “Il Braghettone” (The Breeches Maker).  As late as the nineteenth century Greco-Roman statues from antiquity were still having their genitals covered with fig leaves (sometimes detachable, a trick the British Museum later adopted to protect Victoria’s (1819–1901; Queen of the UK 1837-1901) delicate sensibilities during her infrequent visits).  Another example of practical criticism was the edict by Pius IX (1792–1878; pope 1846-1878) that extant male genitalia on some of the classical statues adorning the Vatican should be “modified”, and that involved stonemasons, sculptors and other artisans receiving commissions to “modify or cover” as required, some fig leaves at the time added.  It is however a myth that popes would be seen atop a ladder, chisel in hand, hammering away for not only did they hire contractors to do the dirty work, what was done was almost always concealment rather than vandalism.

Then a work in progress, this is one of the few known photographs of Diego Rivera's mural in New York City's Rockefeller Center.  According to the Workers Age of 15 June, 1933, the image was "...taken surreptitiously by one of Rivera's aides..."

Still, no pope ever ordered Michelangelo’s ceiling painted over but not all artists were so fortunate.  On 9 May 1933 (by coincidence a day when the Nazis publicly were burning books), New York’s very rich Rockefeller family ordered Mexican artist Diego Rivera (1886-1957) to cease work on his mural depicting "human intelligence in control of the forces of nature", then being painted in the great hall of the 70-storey Rockefeller Center in New York City.  Taking photographs of the mural was also prohibited.  What incurred the family’s wrath was the artist having included a depiction of Bolshevik revolutionary comrade Vladimir Lenin (1870–1924; head of government of Russia or the Soviet Union 1917-1924) against a background of crowds of unemployed workers.  Comrade Lenin had not appeared in the sketch (entitled Man at the Crossroads Looking with Hope and High Vision to the Choosing of a New and Better Future) the artist had provided prior to the commission being granted.  Nelson Rockefeller (1908–1979; US vice president 1974-1977 and who earned immortality by having "died on the job") genuinely was a modern art fan-boy and attempted to negotiate a compromise but it was the depth of the Great Depression and other family members, knowing there was in the air talk of revolution (the Rockefeller family had much to lose), didn’t want the idle unemployed getting ideas.  The mural was covered by a canvas drape until February 1934 when, under cover of darkness, it was broken up and carted off to be dumped, the family dutifully having paid the artist his US$21,000 fee.

Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909, in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future (literally “afterhood”)) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The -ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin -ismus & -isma (from where English picked up -ize) and sometimes through the French -isme or the German -ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in Maison Martin Margiela futuristic eyewear, the house founded by Belgian designer Martin Margiela (b 1957).

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelations refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity dates from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse) created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, financial instruments used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometime “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.
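The zero-sum arithmetic of a futures position is simple enough to sketch.  The snippet below is illustration only: the contract size and prices are hypothetical round numbers, not drawn from any actual trade (Ms Clinton’s included).

```python
# Sketch of futures profit & loss (P&L); illustrative only, with
# hypothetical numbers (cattle futures are quoted in US$ per lb).
def futures_pnl(entry: float, exit: float, lbs_per_contract: int,
                contracts: int, long: bool = True) -> float:
    """P&L in dollars; a short position simply reverses the sign."""
    per_lb = (exit - entry) if long else (entry - exit)
    return per_lb * lbs_per_contract * contracts

# A long position profits when the price rises...
print(futures_pnl(0.60, 0.65, 40_000, 1))              # 2000.0
# ...and the counterparty's short loses exactly the same amount.
print(futures_pnl(0.60, 0.65, 40_000, 1, long=False))  # -2000.0
```

The symmetry is the point: every dollar booked by the winner is a dollar surrendered by the loser, which is why sustained success against professionals is so remarkable.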

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult to appreciate quite how optimistic was the Western world in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, now so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism had debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by Italian poet Filippo Tommaso Marinetti which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future and the path to follow to get there came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, quite literally, it was an extension of the human body, the rider at speed conforming to the shape of a structure fashioned for aerodynamic efficiency with hands and feet all directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although what filled much of the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1932 in Popular Flying magazine (released also as Popular Aviation and still in publication as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out).  That first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on and in the twenty-first century, the futurist architects seem now to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is a bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Tuesday, July 1, 2025

Sherry

Sherry (pronounced sher-ee)

(1) A fortified, amber-colored wine, originally from the Jerez region of southern Spain or any of various similar wines made elsewhere; usually drunk as an apéritif.  Technically, a white wine.

(2) A female given name, a form of Charlotte.

(3) A reddish color in the amber-brown spectrum.

1590-1600: A (mistaken singular) back formation from the earlier sherris (1530s), from the Spanish (vino de) Xeres ((wine from) Xeres).  Xeres is now modern-day Jerez (the Roman urbs Caesaris) in Spain, near the port of Cadiz, where the wine was made.  The official name is Jerez-Xérès-Sherry, one of Spain's wine regions, a Denominación de Origen Protegida (DOP).  The word sherry is an anglicisation of Xérès (Jerez) and the drink was previously known as sack, from the Spanish saca (extraction) from the solera.  In EU law, sherry has protected designation of origin status and, under Spanish law, to be so labelled, the product must be produced in the "Sherry Triangle", an area in the province of Cádiz between Jerez de la Frontera, Sanlúcar de Barrameda, and El Puerto de Santa María.  In 1933 the Jerez denominación de origen was the first Spanish denominación officially thus recognized, named D.O. Jerez-Xeres-Sherry and sharing the same governing council as D.O. Manzanilla Sanlúcar de Barrameda.  The name "sherry" continues to be used by US producers where, to conform to domestic legislation, it must be labeled with a region of origin (such as "Oregon Sherry") but such products can’t be sold in the EU (European Union) because of the protected status laws.  Both Canadian and Australian winemakers now use the term Apera instead of Sherry, although customers seem still to favor the original.  Sherry is a noun; the noun plural is sherries.

Sherry Girl (in bold two-copa ‘Sherry Stance’) and the ultimate sherry party.

Held annually since 2014 (pandemics permitting), Sherry Week is a week-long celebration of “gastronomical and cultural events” enjoyed by the “vibrant global Sherry community” which gathers to “showcase the wine’s incredible diversity, from the dry crispness of Fino to the velvety sweetness of Cream”.  Although the multi-venue Sherry Week is now the best known meeting on the Sherry calendar, worldwide, since 2014 some 20,000 events have taken place with the approval of the Consejo Regulador for Jerez-Xérès-Sherry and Manzanilla; to date there have been more than half a million attendees and in 2024 alone there were over 3,000 registered events in 29 countries in cities including London, Madrid, São Paulo, Tokyo, Buenos Aires, Auckland and Shanghai.  Daringly, the publicity for the 2025 gatherings introduced “Sherry Girl” whose “bold two-copa ‘Sherry Stance’” is now an icon for the drink.  Sherry Girl is new but dedicated sherryphiles will be pleased to learn the traditional “Sherry Ruta” (Sherry route) remains on the schedule, again in “multi-venue routes offering exclusive pairing experiences”, described as “not a typical wine crawl but a triumphant strut with tipples, tastings, and tapas”.  For the adventurous, participants are able to use the interactive venue map to curate their own Sherry Ruta in their city of choice.  The 2025 event will be held between 3-9 November.

Dry Sack, a sherry preferred by many because of its balance; straddling sweet and dry.  Purists tend to the dry finos while sweeter cream sherries are recommended for neophytes.

The name "sherry" continues to be used by US producers where, to conform to domestic legislation, it must be labeled with a region of origin such as Oregon Sherry but can’t be sold in the EU because of their protected status laws.  Both Canadian and Australian winemakers now use the term Apera instead of Sherry, although customers seem still to favor the original.  For the upper-middle class and beyond, sherry parties were a fixture of late-Victorian and Edwardian social life but the dislocations of the World War I (1914-1918) seemed to render them extinct. It turned out however to be a postponement and sherry parties were revived, the height of their popularity being enjoyed during the 1930s until the post-war austerity the UK endured after World War II (1939-1945) saw them a relic restricted moistly to Oxbridge dons, the genuinely still rich, Church of England bishops and such although they never quite vanished and those who subscribe to magazines like Country Life or Tatler probably still exchange invitations to each other's sherry parties.

For Sherry and Cocktail Parties, trade literature by Fortnum and Mason, Regent Street, Piccadilly, London, circa 1936.  The luxury department store, Fortnum & Mason, used the services of the Stuart Advertising Agency, which employed designers to produce witty and informative catalogues and the decorative art is illustrative of British commercial art in this period.

For the women who tended to be hostess and organizer, there were advantages compared with the tamer tea party.  Sherry glasses took less space than cups of tea with all the associated paraphernalia of spoons, milk and sugar and, it being almost impossible to eat and drink while balancing a cup and saucer and conveying cake to the mouth, the tea party demanded tables and chairs.  The sherry glass and finger-food was easier for while one must sit for tea, one can stand for sherry, so twice the number of guests could be asked.  Sherry parties indeed needed to be tightly packed affairs, the mix of social intimacy and alcohol encouraging mingling and they also attracted more men, for whom the offer of tea held little attraction.  The traditional timing between six and eight suited the male lifestyle of the time and they were doubtless more attracted to women drinking sherry than women drinking tea for while the raffish types knew it wasn't quite the "leg-opener" gin was renowned to be, every little bit helps.

In hair color and related fields, "sherry red" (not to be confused with the brighter "cherry red") is a rich hue on the spectrum from amber to dark brown: Lindsay Lohan (who would be the ideal "Sherry Girl" model) demonstrates on the red carpet at the Liz & Dick premiere, Los Angeles, 2012.

Sherry party planner.

Novelist Laura Troubridge (Lady Troubridge, née Gurney; 1867-1946), who in 1935 published what became the standard English work on the topic, Etiquette and Entertaining: to help you on your social way, devoted an entire chapter to the sherry party.  She espoused an informal approach as both cheap and chic, suggesting guests be invited by telephone or with “Sherry, six to eight” written on a visiting card and popped in an envelope.  She recommended no more than two-dozen guests, a half-dozen bottles of sherry, a couple of heavy cut-glass decanters and some plates of “dry and biscuity” eats: cheese straws, oat biscuits, cubes of cheddar.  This, she said, was enough to supply the makings of a “…jolly kind of party, with plenty of cigarettes and talk that will probably last until half past seven or eight”.

Cocktail Party by Laurence Fellows (1885-1964), Esquire magazine, September 1937.

The sherry party should not be confused with the cocktail party.  Cocktail parties in drawing rooms at which Martinis were served often were much more louche affairs.  Note the elegantly sceptical expressions on the faces of the women, all of whom have become inured to the tricks of “charming men in suits”.  For women, sherry parties were more welcoming places.

Monday, June 30, 2025

Bunker

Bunker (pronounced buhng-ker)

(1) A large bin or receptacle; a fixed chest or box.

(2) In military use, historically a fortification set mostly below the surface of the ground with overhead protection provided by logs and earth or by concrete and fitted with above-ground embrasures through which guns may be fired.

(3) A fortification set mostly below the surface of the ground and used for a variety of purposes.

(4) In golf, an obstacle, classically a sand trap but sometimes a mound of dirt, constituting a hazard.

(5) In nautical use, to provide fuel for a vessel.

(6) In nautical use, to convey bulk cargo (except grain) from a vessel to an adjacent storehouse.

(7) In golf, to hit a ball into a bunker.

(8) To equip with or as if with bunkers.

(9) In military use, to place personnel or materiel in a bunker or bunkers (sometimes as “bunker down”).

1755–1760: From the Scottish bonkar (box, chest (also “seat” in the sense of “bench”)), of obscure origin, but etymologists conclude the use related to furniture hints at a relationship with banker (bench).  Alternatively, it may be from a Scandinavian source such as the Old Swedish bunke (boards used to protect the cargo of a ship).  The meaning “receptacle for coal aboard a ship” was in use by at least 1839 (coal-burning steamships coming into general use in the 1820s).  The use to describe the obstacles on golf courses is documented from 1824 (probably from the extended sense “earthen seat” which dates from 1805) but perhaps surprisingly, the familiar sense from military use (dug-out fortification) seems not to have appeared before World War I (1914-1918) although the structures so described had existed for millennia.  “Bunkermate” was army slang for the individual with whom one shares a bunker while the now obsolete “bunkerman” (plural “bunkermen”) referred to someone (often the man in charge) who worked at an industrial coal storage bunker.  Bunker & bunkerage are nouns, bunkering is a noun & verb, bunkered is a verb and bunkerish, bunkeresque, bunkerless & bunkerlike are adjectives; the noun plural is bunkers.

Just as ships called “coalers” were used to transport coal to and from shore-based “coal stations”, it was “oilers” which took oil to storage tanks or out to sea to refuel ships (a common naval procedure) and these STS (ship-to-ship) transfers were called “bunkering” as the black stuff was pumped, bunker-to-bunker.  That the coal used by steamships was stored on-board in compartments called “coal bunkers” led ultimately to another derived term: “bunker oil”.  When in the late nineteenth century ships began the transition from being fuelled by coal to burning oil, the receptacles of course became “oil bunkers” (among sailors nearly always clipped to “bunker”) and as refining processes evolved, the fuel specifically produced for oceangoing ships came to be called “bunker oil”.

Bunker oil is “dirty stuff”, a highly viscous, heavy fuel oil which is essentially the residue of crude oil refining; it’s that which remains after the more refined and volatile products (gasoline (petrol), kerosene, diesel etc) have been extracted.  Until late in the twentieth century, the orthodox view of economists was its use in big ships was a good thing because it was a product for which industry had little other use and, as essentially a by-product, it was relatively cheap.  It came in three flavours: (1) Bunker A: Light fuel oil (similar to a heavy diesel), (2) Bunker B: An oil of intermediate viscosity used in engines larger than marine diesels but smaller than those used in the big ships and (3) Bunker C: Heavy fuel oil used in container ships and such which use VLD (very large displacement), slow-running engines with a huge reciprocating mass.  Because of its composition, Bunker C especially produced much pollution and although much of this happened at sea (unseen by most but with obvious implications), when ships reached harbor to dock, all the smoke and soot became obvious.  Over the years, the worst of the pollution from the burning of bunker oil greatly has been reduced (the work underway even before the Greta Thunberg (b 2003) era), sometimes by the simple expedient of spraying a mist of water through the smoke.

Floor-plans of the upper (Vorbunker) and lower (Führerbunker) levels of the structure now commonly referred to collectively as the Führerbunker.

History’s most infamous bunker remains the Berlin Führerbunker in which Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) spent much of the last few months of his life.  In the architectural sense there were a number of Führerbunkers built, one at each of the semi-permanent Führerhauptquartiere (Führer Headquarters) created for the German military campaigns and several others built where required but it’s the one in Berlin which is remembered as “the Führerbunker”.  Before 1944, when the air raids by the RAF (Royal Air Force) and USAAF (US Army Air Force) intensified, the term Führerbunker seems rarely to have been used other than by the architects and others involved in construction and it wasn’t a designation like Führerhauptquartiere, which the military and other institutions of state shifted between locations (rather as “Air Force One” is attached not to a specific airframe but to whatever aircraft in which the US president is travelling).  In subsequent historical writing, the term Führerbunker tends often to be applied to the whole, two-level complex in Berlin and although it was only the lower layer which officially was designated as that, for most purposes the distinction is not significant.  In military documents, after January 1945 the Führerbunker was referred to as Führerhauptquartiere.

Führerbunker tourist information board, Berlin, Germany.

Only an information board at the intersection of den Ministergärten and Gertrud-Kolmar-Straße, erected by the German Government in 2006 prior to that year's FIFA (Fédération Internationale de Football Association (International Federation of Association Football)) World Cup, now marks the place on Berlin's Wilhelmstrasse 77 where once the Führerbunker was located.  The Soviet occupation forces razed the new Reich Chancellery and demolished all the bunker's above-ground structures but the subsequent GDR (Deutsche Demokratische Republik (German Democratic Republic; the old East Germany) 1949-1990) abandoned attempts completely to destroy what lay beneath.  Until after the fall of the Berlin Wall (1961-1989) the site remained unused and neglected, “re-discovered” only during excavations by property developers, the government insisting on the destruction of whatever was uncovered and, sensitive still to the spectre of “Neo-Nazi shrines”, for years the bunker’s location was never divulged, even as unremarkable buildings (an unfortunate aspect of post-unification Berlin) began to appear on the site.  Most of what would have covered the Führerbunker’s footprint is now a supermarket car park.

The first part of the complex to be built was the Vorbunker (upper bunker or forward bunker), an underground facility of reinforced concrete intended only as a temporary air-raid shelter for Hitler and his entourage in the old Reich Chancellery.  Substantially completed during 1936-1937, it was until 1943 listed in documents as the Luftschutzbunker der Reichskanzlei (Reich Chancellery Air-Raid Shelter), the Vorbunker label applied only in 1944 when the lower level (the Führerbunker proper) was appended.  In mid-January 1945, Hitler moved into the Führerbunker and, as the military situation deteriorated, his appearances above ground became less frequent until by late March he rarely saw the sky.  Finally, on 30 April, he committed suicide.

Bunker Busters

Northrop Grumman publicity shot of a B-2 Spirit from below, showing the twin bomb-bay doors through which the GBU-57s are released.

Awful as they are, there's an undeniable beauty in the engineering of some weapons and it's unfortunate humankind never collectively has resolved exclusively to devote such ingenuity to stuff other than us blowing up each other.

The use in June 2025 by the USAF (US Air Force) of fourteen of its Boeing GBU-57 (Guided Bomb Unit-57) Massive Ordnance Penetrator (MOP) bombs against underground targets in Iran (twelve on the Fordow Uranium Enrichment Plant and two on the Natanz nuclear facility) meant “Bunker Buster” hit the headlines.  Carried by the Northrop B-2 Spirit heavy bomber (built between 1989-2000), the GBU-57 is a 14,000 kg (30,000 lb) bomb with a casing designed to withstand the stress of penetrating through layers of reinforced concrete or thick rock.  “Bunker buster” bombs have been around for a while, the ancestors of today’s devices first built for the German military early in World War II (1939-1945) and the principle remains unchanged to this day: up-scaled armor-piercing shells.  The initial purpose was to produce a weapon with a casing strong enough to withstand the forces imposed when impacting reinforced concrete structures, the idea simple in that what was needed was a delivery system which could “bust through” whatever protective layers surrounded a target, allowing the explosive charge to do damage where needed rather than wastefully being expended on an outer skin.  The German weapons proved effective but inevitably triggered an “arms race” in that as the war progressed, the concrete layers became thicker, walls of over 2 metres (6.6 feet) and ceilings of 5 metres (16 feet) being constructed by 1943.  Technological development continued and the idea extended to rocket-propelled bombs optimized both for armor-piercing and aerodynamic efficiency, velocity a significant “mass multiplier” which made the weapons still more effective.
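The “mass multiplier” effect is just the arithmetic of kinetic energy; as a rough sketch (ignoring the aerodynamics and the far more complicated mechanics of penetration):

\[
E_k = \tfrac{1}{2}mv^2
\]

The energy delivered rises only linearly with a bomb’s mass but with the square of its impact velocity, so doubling the speed of a given casing quadruples the energy available to do damage, which is why designers chased aerodynamic efficiency and supersonic terminal velocities.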

USAF test-flight footage of Northrop B-2 Spirit dropping two GBU-57 "Bunker Buster" bombs.

Concurrent with this, the British developed the first true “bunker busters”, building on the idea of the naval torpedo, one aspect of which was that in exploding a short distance from its target, it was highly damaging because it was able to take advantage of one of the properties of water (quite strange stuff according to those who study it), which is that it doesn’t compress.  What that meant was it was often the “shock wave” of the water rather than the blast itself which could breach a hull, the same principle exploited by the famous “bouncing bombs” of the RAF’s “Dambuster” raids (Operation Chastise, 17 May 1943) on German dams.  Because of the way water behaved, it wasn’t necessary to score the “direct hit” which had been the ideal in the early days of aerial warfare.

RAF Bomber Command archive photograph of an Avro Lancaster (built between 1941-1946) in flight with Grand Slam mounted (left) and a comparison of the Tallboy & Grand Slam (right), illustrating how the latter was in most respects a scaled-up version of the former.  To carry the big Grand Slams, 32 “B1 Special” Lancasters were built in 1945 with up-rated Rolls-Royce Merlin V12 engines, the removal of the bomb doors (the Grand Slam carried externally, its dimensions exceeding internal capacity), deleted front and mid-upper gun turrets, no radar equipment and a strengthened undercarriage.  Such was the concern with weight (especially for take-off) that just about anything non-essential was removed from the B1 Specials, even three of the four fire axes and the crew door ladder.  In the US, Boeing went through a similar exercise to produce the run of “Silverplate” B-29 Superfortresses able to carry the first A-bombs used in August, 1945.

Best known of the British devices were the so-called “earthquake bombs”, the Tallboy (12,000 lb; 5.4 ton) & Grand Slam (22,000 lb; 10 ton) which, despite the impressive bulk, were classified by the War Office as “medium capacity”.  The terms “Medium Capacity” (MC) & “High Capacity” (HC) referenced not the gross weight or physical dimensions but the ratio of explosive filler to the total weight of the construction (ie how much was explosive compared to the casing and ancillary components).  Because both had thick casings to ensure penetration deep into hardened targets (bunkers and other structures encased in rock or reinforced concrete) before exploding, the internal dimensions accordingly were reduced compared with the ratio typical of contemporary ordnance.  A High Capacity (HC) bomb (a typical “general-purpose” bomb) had a thinner casing and a much higher proportion of explosive (sometimes over 70% of total weight).  These were intended for area bombing (known also as “carpet bombing”) and caused wide blast damage whereas the Tallboy & Grand Slam were penetrative, with casings optimized for aerodynamic efficiency, their supersonic travel working as a mass-multiplier.  The Tallboy’s 5,200 lb (2.3 ton) explosive load was some 43% of its gross weight while the Grand Slam’s 9,100 lb (4 ton) absorbed 41%; this may be compared with the “big” 4,000 lb (1.8 ton) HC “Blockbuster” which allocated 75% of its gross weight to its 3,000 lb (1.4 ton) charge.  Like many things in engineering (not just in military matters) the ratio represented a trade-off, the MC design prioritizing penetrative power and structural destruction over blast radius.  The novelty of the Tallboy & Grand Slam was that as earthquake bombs, their destructive potential was able to be unleashed not necessarily by achieving a direct hit on a target but by entering the ground nearby, the explosion (1) creating an underground cavity (a camouflet) and (2) transmitting a shock-wave through the target’s foundations, leading to the structure collapsing into the newly created lacuna.
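The MC/HC classification is easily checked against the figures quoted above; a minimal sketch (weights in pounds, exactly as given in the text):

```python
# Charge-to-weight ratios for the bombs discussed above (weights in lb).
bombs = {
    "Tallboy (MC)":     (5_200, 12_000),
    "Grand Slam (MC)":  (9_100, 22_000),
    "Blockbuster (HC)": (3_000, 4_000),
}

for name, (charge, gross) in bombs.items():
    print(f"{name}: {charge / gross:.0%} of gross weight is explosive")

# Tallboy (MC): 43%...  Grand Slam (MC): 41%...  Blockbuster (HC): 75%...
```

The numbers make the trade-off plain: roughly three-fifths of an earthquake bomb’s weight was inert casing bought at the cost of blast, the price of punching through rock and concrete before detonation.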

The etymology of camouflet has an interesting history in both French and military mining.  Originally it meant “a whiff of smoke in the face” (from a fire or pipe) and in figurative use it was a reference to a snub or slight insult (something unpleasant delivered directly to someone); although the origin is murky, it may have been related to the earlier French verb camoufler (to disguise; to mask) which evolved also into “camouflage”.  In the specialized military jargon of siege warfare or mining (sapping), over the seventeenth to nineteenth centuries “camouflet” referred to “an underground explosion that does not break the surface, but collapses enemy tunnels or fortifications by creating a subterranean void or shockwave”.  The use of this tactic is best remembered from the Western Front in World War I, some of the huge craters now tourist attractions.

Under watchful eyes: Grand Ayatollah Ali Khamenei (b 1939; Supreme Leader, Islamic Republic of Iran since 1989) delivering a speech, sitting in front of the official portrait of the republic’s ever-unsmiling founder, Grand Ayatollah Ruhollah Khomeini (1900-1989; Supreme Leader, Islamic Republic of Iran, 1979-1989).  Ayatollah Khamenei seemed in 1989 an improbable choice as Supreme Leader because others were better credentialed but though cautious and uncharismatic, he has proved a great survivor in a troubled region.

Since aerial bombing began to be used as a strategic weapon, of great interest has been the debate over BDA (battle damage assessment) and this issue emerged almost as soon as the bunker buster attack on Iran was announced, focused on the extent to which the MOPs had damaged the targets, the deepest of which were concealed deep inside a mountain.  BDA is a constantly evolving science and while satellites have made analysis of surface damage highly refined, it’s more difficult to understand what has happened deep underground.  Indeed, it wasn’t until the USSBS (United States Strategic Bombing Survey) teams toured Germany and Japan in 1945-1946, conducting interviews, economic analysis and site surveys, that a useful (and substantially accurate) understanding emerged of the effectiveness of bombing.  What technological advances have allowed for those with the resources is that so-called “panacea targets” (ie critical infrastructure and such, once dismissed by planners because the required precision was for many reasons rarely attainable) can now accurately be targeted, the USAF able to drop a bomb within a few feet of the aiming point.  As the phrase is used by the military, the Fordow Uranium Enrichment Plant is a classic “panacea target” but whether even a technically successful strike will achieve the desired political outcome remains to be seen.

Mr Trump, in a moment of exasperation, posted on Truth Social of Iran & Israel: “We basically have two countries that have been fighting so long and so hard that they don't know what the fuck they're doing."  Actually, both know exactly WTF they're doing; it's just Mr Trump (and many others) would prefer they didn't do it.

Donald Trump (b 1946; US president 2017-2021 and since 2025) claimed “total obliteration” of the targets while Grand Ayatollah Khamenei admitted only there had been “some damage”; which is closer to the truth may one day be revealed.  Even modelling of the effects has probably been inconclusive because the deeper one goes underground, the greater the number of variables in the natural structure and the nature of the internal built environment will also influence blast behaviour.  All experts seem to agree much damage will have been done but what can’t yet be determined is what has been suffered by the facilities which sit as deep as 80 m (260 feet) inside the mountain although, as the name implies, “bunker busters” are designed for buried targets and it’s not always required for blast directly to reach the target.  Because the shock-wave can travel through earth & rock, the effect is something like that of an earthquake and if the structure sufficiently is affected, it may be the area can be rendered geologically too unstable again to be used for its original purpose.

Within minutes of the bombing having been announced, legal academics were being interviewed (though not by Fox News) to explain why the attacks were unlawful under international law and in a sign of the times, the White House didn't bother to discuss fine legal points like the distinction between "preventive & pre-emptive strikes", preferring (like Fox News) to focus on the damage done.  However, whatever the murkiness surrounding the BDA, many analysts have concluded that even if before the attacks the Iranian authorities had not approved the creation of a nuclear weapon, this attack will have persuaded them one is essential for “regime survival”, thus the interest in both Tel Aviv and (despite denials) Washington DC in “regime change”.  The consensus seems to be Grand Ayatollah Khamenei had, prior to the strike, not ordered the creation of a nuclear weapon but that all energies were directed towards completing the preliminary steps, thus the enriching of uranium to ten times the level required for use in power generation; the ayatollah liked to keep his options open.  So, the fear of some is the attacks, even if they have (by weeks, months or years) delayed the Islamic Republic’s work on nuclear development, may prove counter-productive in that they convince the ayatollah to concur with the reasoning of every state which since 1945 has adopted an independent nuclear deterrent (IND).  That reasoning was not complex and hasn’t changed since first a prehistoric man picked up a stout stick to wave as a pre-lingual message to potential adversaries, warning them there would be consequences for aggression.  Although a theocracy, those who command power in the Islamic Republic are part of an opaque political institution and in the struggle which has for some time been conducted in anticipation of the death of the aged (and reportedly ailing) Supreme Leader, the matter of “an Iranian IND” is one of the central dynamics.  Many will be following what unfolds in Tehran and the observers will not be only in Tel Aviv and Washington DC because in the region and beyond, few things focus the mind like the thought of ayatollahs with A-Bombs.

Of the word "bust"

The Great Bust: The Depression of the Thirties (1962) by Jack Lang (left), highly qualified porn star Busty Buffy (b 1996, who has never been accused of misleading advertising, centre) and The people's champion, Mr Lang, bust of Jack Lang, painted cast plaster by an unknown artist, circa 1927, National Portrait Gallery, Canberra, Australia (right).  Remembered for a few things, Jack Lang (1876–1975; premier of the Australian state of New South Wales (NSW) 1925-1927 & 1930-1932) remains best known for having in 1932 been the first head of government in the British Empire to have been sacked by the Crown since William IV (1765–1837; King of the UK 1830-1837) in 1834 dismissed Lord Melbourne (1779–1848; prime minister of the UK 1834 & 1835-1841).

Those learning English must think it at least careless that things can both be (1) “razed to the ground” (totally to destroy something (typically a structure), usually by demolition or incineration) and (2) “raised to the sky” (physically lifted upwards).  The etymologies of “raze” and “raise” differ but they’re pronounced the same so it’s fortunate the spellings vary but in other troublesome examples of unrelated meanings, spelling and pronunciation can align, as in “bust”.  When used in ways most directly related to human anatomy: (1) “a sculptural portrayal of a person's head and shoulders” & (2) “the circumference of a woman's chest around her breasts”, there is an etymological link but these uses wholly are unconnected with bust’s other senses.

Bust of Lindsay Lohan in white marble by Stable Diffusion.  Sculptures of just the neck and head came also to be called “busts”, the emphasis on the technique rather than the original definition.

Bust in the sense of “a sculpture of upper torso and head” dates from the 1690s and was from the sixteenth century French buste, from the Italian busto (upper body; torso), from the Latin bustum (funeral monument, tomb (although the original sense was “funeral pyre, place where corpses are burned”)) and it may have emerged (as a shortened form) from ambustum, neuter of ambustus (burned around), past participle of amburere (burn around, scorch), the construct being ambi- (around) + urere (to burn).  The alternative etymology traces a link to the Old Latin boro, the early form of the Classical Latin uro (to burn) and it’s thought the development in Italian was influenced by the Etruscan custom of keeping the ashes of the dead in an urn shaped like the person when alive.  Thus the use, common by the 1720s, of bust (a clipping from the French buste) to mean “a carving of the trunk of the human body from the chest up”.  From this came the meaning “dimension of the bosom; the measurement around a woman's body at the level of her breasts” and that evolved on the basis of a comparison with the sculptures, the base of which was described as the “bust-line”, the term still used in dress-making (and for other comparative purposes as one of the three “vital statistics” by which women are judged (bust, waist, hips), each circumference having an “ideal range”).  It’s not known when “bust” and “bust-line” came into oral use among dress-makers and related professions but the terms are documented since the 1880s.  Derived forms (sometimes hyphenated) include busty (tending to bustiness, thus Busty Buffy's choice of stage-name), overbust & underbust (technical terms in women's fashion referencing specific measurements) and bustier (a tight-fitting women's top which covers (most or all of) the bust).

The other senses of bust (as a noun, verb & adjective) are diverse (and sometimes diametric opposites) and include: “to break or fail”; “to be caught doing something unlawful / illicit / disgusting etc”; “to debunk”; “dramatically or unexpectedly to succeed”; “to go broke”; “to break in (horses, girlfriends etc)”; “to assault”; the downward portion of an economic cycle (ie “boom & bust”); “the act of effecting an arrest” and “someone (especially in professional sport) who failed to perform to expectation”.  That’s quite a range and that has meant the creation of dozens of idiomatic forms, the best known of which include: “boom & bust”, “busted flush”, “dambuster”, “bunker buster”, “busted arse country”, “drug bust”, “cloud bust”, “belly-busting”, “bust one's ass (or butt)”, “bust a gut”, “bust a move”, “bust a nut”, “bust-down”, “bust loose”, “bust off”, “bust one's balls”, “bust-out”, “sod buster”, “bust the dust”, “myth-busting” and “trend-busting”.  In the sense of “breaking through”, bust was from the Middle English busten, a variant of bursten & bresten (to burst) and may be compared with the Low German basten & barsten (to burst).  Bust in the sense of “break”, “smash”, “fail”, “arrest” etc was a creation of mid-nineteenth century US English and is of uncertain inspiration but most etymologists seem to concur it was likely a modification of “burst” effected with a phonetic alteration, though it’s not impossible it came directly as an imperfect echoic of Germanic speech.  The apparent contradiction of bust meaning both “fail” and “dramatically succeed” happened because the former was an allusion to “being busted” (ie broken) while the latter meaning used the notion of “busting through”.