
Saturday, July 5, 2025

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909 in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo (literally "futurism" and dating from circa 1909), the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be), which (as a noun) was the irregular suppletive future participle of esse (to be) from the primitive Indo-European bheue (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future, literally “afterhood”) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism & futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in Maison Martin Margiela (Martin Margiela, b 1957) Futuristic Eyewear.

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelation refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity was from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse), created a minor industry of (often self-described) futurologists.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it was separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II (1939-1945) when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics, nuclear physics etc stimulated interest in such progress.

Untouched: Crooked Hillary Clinton (b 1947; US secretary of state 2009-2013) & Bill Clinton (b 1946; US president 1993-2001) with cattle, 92nd Annual Hopkinton State Fair, Contoocook, New Hampshire, September 2007.

Futures, a financial instrument used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements and in both cases profits (and losses) can be booked against movement up or down.  Futures trading can be lucrative but is also risky, those who win gaining from those who lose, and those in the markets are usually professionals.  The story behind crooked Hillary Clinton's extraordinary profits in cattle futures (not a field in which she’d previously (or has subsequently) displayed interest or expertise) while “serving” as First Lady of Arkansas (1979–1981 & 1983–1992) remains murky but it can certainly be said that for an apparent “amateur” dabbling in a market played usually by experienced professionals, she was remarkably successful and while perhaps there was some luck involved, her trading record was such it’s a wonder she didn’t take it up as a career.  While many analysts have, based on what documents are available, commented on crooked Hillary’s somewhat improbable (and apparently sometimes “irregular”) foray into cattle futures, there was never an “official governmental investigation” by an independent authority and thus no adverse findings have ever been published.

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult to appreciate quite how optimistic the Western world was in the years leading up to World War I (1914-1918).  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, now so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by the Italian poet Filippo Tommaso Marinetti, which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or in the violence and conflict they delivered was something he left his readers to decide and there were those seduced by both, but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.  Assessment of Marinetti as a poet has always been colored by his reputation as a proto-fascist and some treat as "fake mysticism" his claim that his "visions" of the future, and of the path to follow to get there, came to him in the moment of a violent car crash.

Futurismo: Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended towards still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Motociclista (Motorcyclist, circa 1924), oil on canvas by Mario Sironi.

By the early twentieth century when the Futurism movement emerged, machines and mechanisms were already hundreds of years old (indeed the precursor devices pre-date Christ) but what changed was that the new generations of machines had become sexy (at least in the eyes of men), associated as they were with something beyond mere functionalism: speed and style.  While planes, trains & automobiles all attracted the futurists, the motorcycle was a much-favored motif because it possessed an intimacy beyond other forms of transportation in that, literally, it was more an extension of the human body, the rider at speed conforming to the shape of a structure fashioned for aerodynamic efficiency, hands and feet directly attached to the vital controls: machine as extension of man.

The Modern Boy No. 100, Vol 4, Week Ending 4 January, 1930.

The Modern Boy (1928-1939) was, as the name implies, a British magazine targeted at males aged 12-18 and the content reflected the state of mind in the society of the inter-war years, the 1930s a curious decade of progress, regression, hope and despair.  Although what filled much of the pages (guns, military conquest and other exploits, fast cars and motorcycles, stuff the British were doing in other peoples’ countries) would today see the editors cancelled or visited by one of the many organs of the British state concerned with the suppression of such things, it was what readers (presumably with the acquiescence of their parents) wanted.  Best remembered of the authors whose works appeared in The Modern Boy was Captain W.E. Johns (1893–1968), a World War I RFC (Royal Flying Corps) pilot who created the fictional air-adventurer Biggles.  The first Biggles tale appeared in 1932 in Popular Flying magazine (released also as Popular Aviation and still in publication as Flying) and his stories are still sometimes re-printed (although with the blatant racism edited out).  The first Biggles story had a very modern-sounding title: The White Fokker.  The Modern Boy was a successful weekly which in 1938 was re-launched as Modern Boy, the reason for the change not known although dropping superfluous words (and much else) was a feature of modernism.  In October 1939, a few weeks after the outbreak of World War II, publication ceased, Modern Boy like many titles a victim of restrictions by the Board of Trade on the supply of paper for civilian use.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture, it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past but later moved on and in the twenty-first century, the futurist architects seem now to be interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is a bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Thursday, November 28, 2024

Cereal & Serial

Cereal (pronounced seer-ee-uhl)

(1) Any plant of the grass family yielding an edible grain (wheat, rye, oats, rice, corn, maize, sorghum, millet etc).

(2) The grain from those plants.

(3) An edible preparation of these grains, applied especially to packaged, (often processed) breakfast foods.

(4) Of or relating to grain or the plants producing it.

(5) A hamlet in Alberta, Canada.

(6) As Ceres International Women's Fraternity, a women's fraternity focused on agriculture, founded on 17 August 1984 at the International Conclave of FarmHouse fraternity.

1590s: From the sixteenth century French céréale (having to do with cereal), from the Latin cereālis (of or pertaining to the Roman goddess Ceres), from the primitive Indo-European ker-es-, from the root ker- (to grow), from which Latin gained also sincerus (source of the English sincere) and crēscō (grow) (source of the English crescent).  The noun use of cereal in the modern sense of (a grass yielding edible grain and cultivated for food) emerged in 1832 and was developed from the adjective (having to do with edible grain), use of which dates from 1818, also from the French céréale (in the sense of the grains).  The familiar modern use (packaged grain-based food intended for breakfast) was a creation of US English in 1899.  If used in reference to the goddess Ceres, an initial capital should be used.  Cereal, cereology & cereologist are nouns and ceralic is an adjective; the noun plural is cereals.

Lindsay Lohan mixing Pilk.

Cereal is often used as a modifier (cereal farming, cereal production, cereal crop, non-cereal, cereal bar, pseudocereal, cereal dust etc) and a cereologist is one who works in the field of cereology (the investigation, or practice, of creating crop circles).  The term “cereal killer” is used of one noted for their high consumption of breakfast cereals although some might be tempted to apply it to those posting TikTok videos extolling the virtue of adding “Pilk” (a mix of Pepsi-Cola & Milk) to one’s breakfast cereal.  Pilk entered public consciousness in December 2022 when Pepsi Corporation ran a “Dirty Sodas” promotion for the concoction, featuring Lindsay Lohan.  There is some concern about the high sugar content in packaged cereals (especially those marketed towards children) but for those who want to avoid added sugar, Pepsi Corporation does sell “Pepsi Max Zero Sugar” soda and Pilk can be made using this.  Pepsi Max Zero Sugar contains carbonated water, caramel color, phosphoric acid, aspartame, acesulfame potassium, caffeine, citric acid, potassium benzoate & calcium disodium EDTA.

TikTok, adding Pilk to cereal and the decline of Western civilization.

A glass of Pilk does of course make one think of Lindsay Lohan but every mouthful of one’s breakfast cereal is something of a tribute to a goddess of Antiquity.  In 496 BC, Italy was suffering one of its periodic droughts, one particularly severe and lingering, the Roman fields dusty and parched.  As was the practice, the priests travelled to consult the Sibylline oracle, returning to the republic’s capital to report a new goddess of agriculture had to be adopted and sacrifices needed immediately to be made to her so rain would again fall on the land.  It was Ceres who was chosen and she became the goddess of agriculture and protector of the crops while the caretakers of her temple were the overseers of the grain market (they were something like the wheat futures traders in commodity exchanges like the Chicago Board of Trade (CBOT)).  It was the will of the goddess Ceres which determined whether a harvest was prolific or sparse and to ensure abundance, the Romans ensured the first cuttings of the corn were always sacrificed to her.  It’s from the Latin adjective cereālis (of or pertaining to the Roman goddess Ceres) that English gained “cereal”.

For millennia humanity’s most widely cultivated and harvested crop, cereal is a grass cultivated for its edible grain, the best known of which are rice, barley, millet, maize, rye, oats, sorghum & wheat.  Almost all cereals are annual crops (ie yielding one harvest per planting) although some strains of rice can be grown as a perennial, and an advantage of cereals is that the differential in growth rates and temperature tolerance means harvesting schedules can be spread from mid-spring until late summer.  Except for the more recent hybrids, all cereals are variations of natural varieties and the first known domestication occurred early in the Neolithic period (circa 7000–1700 BC).  Although the trend in cultivated area and specific yield tended over centuries to display a gradual rise, it was the “green revolution” (a combination of new varieties of cereals, chemical fertilizers, pest control, mechanization and precise irrigation which began to impact agriculture at scale in the mid twentieth century) which produced the extraordinary spike in global production.  This, coupled with the development of transport & distribution infrastructure (ports and bulk carriers), made possible the increase in the world population, now expected to reach around 10 billion by mid-century before declining.

Serial (pronounced seer-ee-uhl)

(1) Anything published, broadcast etc, in short installments at regular intervals (a novel appearing in successive issues of a magazine (ie serialized); a radio or TV series etc).

(2) In library & publishing jargon, a publication in any medium issued in successive parts bearing numerical or chronological designation and intended to be continued indefinitely.

(3) A work published in installments or successive parts; pertaining to such publication; pertaining to, arranged in, or consisting of a series.

(4) Occurring in a series rather than simultaneously (used widely: serial marriage, serial murderer, serial adulterer etc).

(5) Effecting or producing a series of similar actions.

(6) In IT, of or relating to the apparent or actual performance of data-processing operations one at a time (in the order of occurrence or transmission); of or relating to the transmission or processing of each part of a whole in sequence, as each bit of a byte or each byte of a computer word.

(7) In grammar, of or relating to a grammatical aspect relating to an action that is habitual and ongoing.

(8) In formal logic and mathematical logic (of a relation) connected, transitive, and asymmetric, thereby imposing an order on all the members of the domain.

(9) In engineering & mass-production (as “serial number”), a unique (to a certain product, model etc) character string (which can be numeric or alpha-numeric) which identifies each individual item in the production run.

(10) In music, of, relating to, or composed in serial technique.

(11) In modern art, a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process, in accordance with the principles of modularity.

(12) In UK police jargon, a squad of officers equipped with shields and other protective items, used for crowd and riot control.

1823: From the New Latin word seriālis, from the Classical Latin seriēs (series), the construct being seri(ēs) + -al on the Latin model which was seriēs + -ālis.  It was cognate to the Italian seriale.  The Latin seriēs was from serere (to join together, bind), ultimately from the primitive Indo-European ser- (to bind, put together, to line up).  The -al suffix was from the Middle English -al, from the Latin adjectival suffix -ālis (the third-declension two-termination suffix (neuter -āle) used to form adjectives of relationship from nouns or numerals) or the French, Middle French and Old French –el & -al.  It was used to denote the sense "of or pertaining to", an adjectival suffix appended (most often to nouns) originally most frequently to words of Latin origin, but since used variously and also was used to form nouns, especially of verbal action.  The alternative form in English remains -ual (-all being obsolete).  The –alis suffix was from the primitive Indo-European -li-, which later dissimilated into an early version of –āris and there may be some relationship with hel- (to grow); -ālis (neuter -āle) was the third-declension two-termination suffix and was suffixed to (1) nouns or numerals creating adjectives of relationship and (2) adjectives creating adjectives with an intensified meaning.  The suffix -ālis was added (usually, but not exclusively) to a noun or numeral to form an adjective of relationship to that noun. When suffixed to an existing adjective, the effect was to intensify the adjectival meaning, and often to narrow the semantic field.  If the root word ends in -l or -lis, -āris is generally used instead although because of parallel or subsequent evolutions, both have sometimes been applied (eg līneālis & līneāris).  Serial, serializer, serialization, serialism & serialist are nouns, serialize, serializing & serialized are verbs, serializable is an adjective and serially is an adverb; the noun plural is serials.

The “serial killer” is a staple of the horror film genre.  Lindsay Lohan’s I Know Who Killed Me (2007) was not well received upon release but it has since picked up a cult following.

The adjective serial (arranged or disposed in a rank or row; forming part of a series; coming in regular succession) seems to have developed much in parallel with the French sérial although the influence of one on the other is uncertain.  The word came widely to be used in English by the mid nineteenth century because the popular author Charles Dickens (1812–1870) published his novels in instalments (serialized); sequentially, chapters would appear over time in periodicals and only once the series was complete would a book appear containing the whole work.  The first use of the noun “serial” to mean “story published in successive numbers of a periodical” was in 1845 and that came from the adjective; it was a clipping of “serial novel”.  By 1914 this had been extended to film distribution and the same idea would become a staple of radio and television production, the most profitable form of which was apparently the “mini-series”, a term first used in 1971 although the concept had been in use for some time.  Serial number (indicating position in a series) was first recorded in 1866, originally of papers, packages and such, and it was extended to soldiers in 1918.  Surprisingly perhaps, given the long history of the practice, the term “serial killer” wasn’t used until 1981 although the notion of “serial events” had been used of seemingly sequential or related murders as early as the 1960s.  On that model, serial became a popular modifier (serial rapist, serial adulterer, serial bride, serial monogamist, serial pest, serial polygamy etc).

For those learning English, the existence of the homophones “cereal” & “serial” must be an annoying quirk of the language.  Because cereals are usually an annual crop, it’s reasonable if some assume the two words are related because wheat, barley and such are handled in a “serial” way, planting and harvesting recurrent annual events.  Doubtless students are told this is not the case but there is a (vague) etymological connection in that the Latin serere meant “to join together, to bind” and it was used also to mean “to sow” so there is a connection in agriculture: sowing seeds in fields.  For serial, the connection is structural (linking elements in a sequence, something demonstrated literally in the use in IT and in a more conceptual way in “serial art”) but despite the differences, both words in a way involve the fundamental act of creating order or connection.

Serial art by Swiss painter Richard Paul Lohse (1902–1988): Konkretion I (Concretion I, 1945-1946), oil on pavatex (a wood fibre board made from compressed industrial waste) (left), Zwei gleiche Themen (Two same topics, 1947), colored pencil on paper (centre) and Konkretion III (1947), oil on pavatex (right).

In modern art, “serial art” was a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process in accordance with the principles of modularity.  It was a concept the legacy of which was to influence (some prefer “infect”) other artistic schools rather than develop as a distinct paradigm but serial art is still practiced and remains a relevant concept in contemporary art.  The idea was of works based on repetition, sequences or variations of a theme, often following a systematic or conceptual approach; the movement was most active during the mid-twentieth century and a notable theme in Minimalism, Donald Judd (1928-1994), Andy Warhol (1928–1987), Sol LeWitt (1928-2007) (there must have been something “serial” about 1928) and Richard Paul Lohse (1902-1988) were all pioneers of the approach.  Because the techniques of the serialists were adopted by many, their style became interpolated into many strains of modern art so to speak of it now as something distinctive is difficult except in a historic context.  The embrace by artists of digital tools, algorithms and AI (Artificial Intelligence) technologies has probably restored a new sort of “purity” to serial art because generative processes are so suited to creating series of images, sculptures or digital works that explore themes like pattern, progression or variation, the traditional themes of chaos, order and perception represented as before.  In a way, serial art was just waiting for lossless duplication and the NFT (Non-fungible token) and more conservative critics still grumble the whole idea is little different from an architect’s blueprint which documents the structural framework without the “skin” which lends the shape its form.  They claim it's the engineering without the art.

Relics of the pre-USB age; there were also 25-pin serial ports.

In IT hardware, “serial” and “parallel” refer to two different methods of transmitting data between devices or components and the distinction lies in how data bits are sent over a connection.  In serial communication, data is transmitted one bit at a time over as little as a single channel or wire, something which in the early days of the industry was inherently slow although in modern implementations (such as USB (Universal Serial Bus) or PCIe (Peripheral Component Interconnect Express)) high speeds are possible.  Given what was needed in the early days, serial technology was attractive because the reduction in wiring reduced cost and complexity, especially over the (relatively) long distances at which serial excelled and with the use of line-drivers, the distances frequently were extended to hundreds of yards.  The trade-off was of course slower speed but these were simpler times.  In parallel communication, data is transmitted multiple bits at a time, each bit traveling simultaneously over its own dedicated channel, making it much faster than serial transmission.  Because more wires were demanded, the cost and complexity increased, as did the potential for interference and corruption but most parallel transmission was over short distances (25 feet (7½ metres) was “long-distance”) and the emergence of “error correcting” protocols made the mode generally reliable.  For most, it was the default method of connecting a printer and for large file sizes the difference in performance was discernible, the machines able to transmit more data in a single clock cycle due to simultaneous bit transmission.  Except for specialized applications or those dealing with legacy hardware (and in industries like small-scale manufacturing where such dedicated machines can be physically isolated from the dangers of the internet, parallel and serial ports and cables continue to render faithful service) parallel technology is effectively obsolete and serial connections are now almost universally handled by the various flavours of USB.
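The trade-off can be illustrated with a toy sketch in Python (a minimal illustration only; the function names are invented for this example and real buses add clocking, framing and error-correction on top of the raw bits):

def to_bits(byte: int) -> list[int]:
    """Break a byte into its 8 bits, least-significant bit first."""
    return [(byte >> i) & 1 for i in range(8)]

def serial_cycles(data: bytes) -> int:
    """One wire: one bit leaves per clock cycle."""
    return len(data) * 8

def parallel_cycles(data: bytes, wires: int = 8) -> int:
    """Several wires: 'wires' bits leave together each clock cycle."""
    return (len(data) * 8) // wires

message = b"futurism"
print(to_bits(message[0]))       # [0, 1, 1, 0, 0, 1, 1, 0] ('f' = 0x66)
print(serial_cycles(message))    # 64 cycles over 1 wire
print(parallel_cycles(message))  # 8 cycles over 8 wires

The same sixty-four bits move either way; parallel simply spends wires to save cycles, which is why it was faster over a desk-length printer cable but costlier and more prone to interference over distance.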

Wednesday, June 12, 2024

Reduction

Reduction (pronounced ri-duhk-shuhn)

(1) The act of reducing or the state of being reduced.

(2) The amount by which something is reduced or diminished.

(3) The form (result) produced by reducing; a copy on a smaller scale.

(4) In cell biology, as meiosis, especially the first meiotic cell division in which the chromosome number is reduced by half.

(5) In chemistry, the process or result of reducing (a reaction in which electrons are gained and valence is reduced; often by the removal of oxygen or the addition of hydrogen).

(6) In film production when using physical film stock (celluloid and such), the process of making a print of a narrower gauge from a print of a wider gauge (historically from 35 to 16 mm).

(7) In music, a simplified form, typically an arrangement for a smaller number of parts, such as an orchestral score arranged for a solo instrument.

(8) In computability theory, a transformation of one problem into another problem, such as mapping reduction or polynomial reduction.

(9) In philosophy (notably in phenomenology), a process intended to reveal the objects of consciousness as pure phenomena.

(10) In metalworking, the ratio of a material's change in thickness compared to its thickness prior to forging and/or rolling.

(11) In engineering, (usually as “reduction gear”), a means of energy transmission in which the original speed is reduced to whatever is suitable for the intended application.

(12) In surgery, a procedure to restore a fracture or dislocation to the correct alignment, usually with a closed approach but sometimes with an open approach.

(13) In mathematics, the process of converting a fraction into its decimal form or the rewriting of an expression into a simpler form.

(14) In cooking, the process of rapidly boiling a sauce to concentrate it.

(15) During the colonial period, a village or settlement of Indians in South America established and governed by Spanish Jesuit missionaries.

1475–1485: From the Middle English reduccion, from the Middle French reduction, from the Latin reductiōnem & reductiōn- (stem of reductiō (a “bringing back”)), the construct being reduct(us) (past participle of redūcere (to lead back)) + -iōn- (the noun suffix).  The construct in English was thus reduc(e) + -ion.  Reduce was from the Middle English reducen, from the Old French reduire, from the Latin redūcō (reduce), the construct being re- (back) + dūcō (lead).  The –ion suffix was from the Middle English -ioun, from the Old French -ion, from the Latin -iō (genitive -iōnis).  It was appended to a perfect passive participle to form a noun of action or process, or the result of an action or process.  Reduction, reductivism & reductionism are nouns, reductionist is a noun & adjective, reductional, reductionistic & reductive are adjectives; the noun plural is reductions.  Forms like anti-reduction, non-reduction, over-reduction, pre-reduction, post-reduction, pro-reduction, self-reduction have been created as required.

Actor Ariel Winter (b 1998), before (left) and after (right) mammaplasty (breast reduction).  Never has it satisfactorily been explained why this procedure seems to be lawful in all jurisdictions.

In philosophy & science, reductionism is an approach used to explain complex phenomena by reducing them to their simpler, more fundamental components.  It posits that understanding the parts of a system and their interactions can provide a complete explanation of the system as a whole, an approach which is functional and valuable in some cases and to varying degrees inadequate in others.  The three generally recognized classes of reductionism are (1) Ontological Reductionism, the idea that reality is composed of a small number of basic entities or substances, best illustrated in biology where life processes are explained by reducing things to the molecular level.  (2) Methodological Reductionism, an approach which advocates studying systems by breaking them into their constituent parts, much used in psychology where it might involve studying human behavior by examining neurological processes.  (3) Theory Reductionism which involves explaining a theory or phenomenon in one field by the principles of another, more fundamental field as when chemistry is reduced to physics or chemical properties are explained by the operation of quantum mechanics.  Reduction has been an invaluable component in many of the advances achieved in science in the last two-hundred-odd years and some of the process and mechanics of reductionism have actually been made possible by some of those advances.  The criticism of an over-reliance on reductionism in certain fields is that its very utility can lead to the importance of higher-level structures and interactions being overlooked; there is much which can’t fully be explained by the individual parts or even their interaction.  The diametric opposite of reductionism is holism which emphasizes the importance of whole systems and their properties that emerge from the interactions between parts.  In philosophy, reductionism is the position which holds a system of any level of complexity is nothing but the sum of its parts and an account of it can thus be reduced to accounts of individual constituents.  It’s very much a theoretical model to be used as appropriate rather than an absolutist doctrine but it does hold that phenomena can be explained completely in terms of relations between other more fundamental phenomena: epiphenomena.  A reductionist is either (1) an advocate of reductionism or (2) one who practices reductionism.

Reductionism: Lindsay Lohan during "thin phase".

The adjective reductive has a special meaning in Scots law pertaining to reduction of a decree or other legal device (ie something rescissory in its effect); dating from the sixteenth century, it’s now rarely invoked.  In the sense of “causing the physical reduction or diminution of something” it’s been in use since the seventeenth century in fields including chemistry, metallurgy, biology & economics, always to convey the idea of reducing a substance, object or some abstract quantum to a lesser, simplified or less elaborated form.  At that time, it came to be used also to mean “that can be derived from, or referred back to, something else” and although archaic by the early 1800s, its existence in historic texts can be misleading.  It wasn’t until after World War II (1939-1945) that reductive emerged as a derogatory term, used to suggest an argument, issue or explanation has been “reduced” to a level of such simplicity that so much has been lost as to rob things of meaning.  The phrase “reductio ad absurdum” (reduction to the absurd) is an un-adapted borrowing from the Latin reductiō ad absurdum and began in mathematics & formal logic, where it was a useful tool in deriving proofs.  In wider use, it has come to be used of a method of disproving a statement by assuming the statement is true and, with that assumption, arriving at a blatant contradiction; the synonyms are apagoge & “proof by contradiction”.
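The classic illustration (a standard textbook example rather than anything drawn from the sources above) is the proof that √2 is irrational: assume the opposite of what is to be proved and follow the consequences to a contradiction:

\[ \text{Assume } \sqrt{2} = \frac{p}{q} \text{ with } p, q \text{ integers sharing no common factor.} \]
\[ 2q^{2} = p^{2} \;\Rightarrow\; p \text{ even} \;\Rightarrow\; p = 2k \;\Rightarrow\; 2q^{2} = 4k^{2} \;\Rightarrow\; q^{2} = 2k^{2} \;\Rightarrow\; q \text{ even} \]

Both p and q turn out to be even, contradicting the assumption that they shared no common factor, so the assumption fails and √2 is irrational.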

Single-family houses (D-Zug) built in 1922 on the principle of architectural reductionism by Heinrich Tessenow in collaboration with Austrian architect Franz Schuster (1892–1972), Moritzburger Weg 19-39 (the former Pillnitzer Weg), Gartenstadt Hellerau, Dresden, Germany.

As a noun, a reductivist is one who advocates or adheres to the principles of reductionism or reductivism.  In art & architecture (and some aspects of engineering) this can be synonymous with the label “a minimalist” (one who practices minimalism).  As an adjective, reductivist (the comparative “more reductivist”, the superlative “most reductivist”) means (1) tending to reduce to a minimum or to simplify in an extreme way and (2) belonging to the reductivism movement in art or music.  The notion of “extreme simplification” (a reduction to a minimum; the use of the fewest essentials) has always appealed to some and appalled others attracted to intricacy and complexity.  The German architect Professor Heinrich Tessenow (1876-1950) summed it up in the phrase for which he’s remembered more than his buildings: “The simplest form is not always the best, but the best is always simple”, one of those epigrams which may not reveal a universal truth but is probably a useful thing to remind students of this and that lest they be seduced by the process and lose sight of the goal.  Tessenow was expanding on the principle of Occam's Razor, the reductionist philosophic position attributed to the English Franciscan friar & theologian William of Ockham (circa 1288–1347), written usually as Entia non sunt multiplicanda praeter necessitatem (literally “entities must not be multiplied beyond necessity”), which translates best as “the simplest solution is usually the best”.

Reductio in extrema

1960 Lotus Elite Series 1 (left) and at the Le Mans 24 Hour endurance classic, June 1959 (right): Lotus Elite #41 leads Ferrari 250TR #14. The Ferrari (DNF) retired after overheating, the Elite finishing eighth overall, winning the 1.5 litre GT class.

Weighing a mere 500-odd kg (1100 lb), the early versions of the exquisite Lotus Elite (1957-1963) enchanted most who drove them but the extent of the reductionism compromised the structural integrity and things sometimes broke when the cars were used under everyday conditions, which of course included potholed roads.  Introduced late in 1961, the Series 2 Elite greatly improved matters but some residual fragility was inherent to the design.  On the smooth surfaces of racing circuits however, it enjoyed an illustrious career, notable especially for success in long-distance events at the Nürburgring and Le Mans.  The combination of light weight and advanced aerodynamics meant the surprisingly powerful engine (a lightweight and robust unit which began life powering the water pumps of fire engines!) delivered outstanding performance, frugal fuel consumption and low tyre wear.  As well as claiming five class trophies in the Le Mans 24 hour race, the Elite twice won the mysterious Indice de performance (an index of thermal efficiency), a curious piece of mathematics actually intended to ensure that, regardless of other results, a French car would always win something.

Colin Chapman (1928–1982), who in 1952 founded Lotus Cars, applied reductionism even to the Tessenow mantra in his design philosophy: “Simplify, then add lightness.”  Whether at the drawing board, on the factory floor or on the racetrack, Chapman seldom deviated from his rule and while it lent his cars sparkling performance and delightful characteristics, more than one of the early models displayed an infamous fragility.  Chapman died of a heart attack which was a good career move, given the likely legal consequences of his involvement with John DeLorean (1925–2005) and the curious financial arrangements made with OPM (other people's money) during the strange episode which was the tale of the DMC DeLorean gullwing coupé.

1929 Mercedes-Benz SSKL blueprint (recreation, left) and the SSKL “streamliner”, AVUS, Berlin, May 1932 (right).

The Mercedes-Benz SSKL was one of the last of the road cars which could win top-line grand prix races.  An evolution of the earlier S, SS and SSK, the SSKL (Super Sports Kurz (short) Leicht (light)) was notable for the extensive drilling of its chassis frame to the point where it was compared to Swiss cheese; reducing weight with no loss of strength.  The SSK had enjoyed success in competition but even in its heyday was in some ways antiquated and although powerful, was very heavy, thus the expedient of the chassis-drilling intended to make it competitive for another season.  Lighter (which didn't solve but at least to a degree ameliorated the high tyre wear) and easier to handle than the SSK (although the higher speed brought its own problems, notably in braking), the SSKL enjoyed a long Indian summer and even on tighter circuits where its bulk meant it could be out-manoeuvred, sometimes it still prevailed by virtue of sheer power.  By 1932 however the engine’s potential had been reached and no more metal could be removed from the structure without dangerously compromising safety; in engineering (and other fields), there is a point at which further reduction becomes at least counter-productive and often dangerous.  The solution was an early exercise in aerodynamics (“streamlining” the then fashionable term), an aluminium skin prepared for the 1932 race held on Berlin’s AVUS (Automobil-Versuchs und Übungsstraße (automobile test and practice road)).  The reduction in air-resistance permitted the thing to touch 255 km/h (158 mph), some 20 km/h (12 mph) more than a standard SSKL, an increase the engineers calculated would otherwise have demanded another (unobtainable) 120 horsepower.  The extra speed was most useful at the unique AVUS which comprised two straights (each almost six miles (ten kilometres) in length) linked by two hairpin curves, one a dramatic banked turn.  The SSKL was the last of the breed, the factory’s subsequent Grand Prix machines all specialized racing cars.
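The arithmetic behind such estimates can be sketched (an approximation only, assuming aerodynamic drag dominates at these speeds and ignoring rolling resistance; the factory’s figure of 120 horsepower would have rested on its own drag measurements, not reproduced here).  Because the power absorbed by drag rises with the cube of speed:

\[ \frac{P_{2}}{P_{1}} = \left(\frac{v_{2}}{v_{1}}\right)^{3} = \left(\frac{255}{235}\right)^{3} \approx 1.28 \]

ie the streamlined skin let the unchanged engine do what would otherwise have demanded nearly thirty percent more power.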

Reduction gears: Known casually as "speed reducers", reduction gears are widely used in just about every type of motor and many other mechanical devices.  What they do is allow the energy of a rotating shaft to be transferred to another shaft running at a reduced speed (achieved usually by the use of gears (cogs) of different diameters).
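The relationship can be expressed in a few lines of Python (a minimal sketch; the function name and tooth counts are invented for illustration and friction losses are ignored):

def reduction(input_rpm: float, input_torque: float,
              driver_teeth: int, driven_teeth: int) -> tuple[float, float]:
    """Return (output_rpm, output_torque) for a simple two-gear reduction.

    The ratio is driven/driver: a small gear driving a larger one
    reduces speed and, ignoring friction, multiplies torque.
    """
    ratio = driven_teeth / driver_teeth
    return input_rpm / ratio, input_torque * ratio

# A 15-tooth pinion driving a 60-tooth gear: a 4:1 reduction.
rpm, torque = reduction(3000, 10, driver_teeth=15, driven_teeth=60)
print(rpm, torque)  # 750.0 40.0

The same principle scales from wristwatches to ships’ gearboxes: whatever speed is surrendered is returned (less friction) as torque.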

In chemistry, a reduction is the process or result of reducing (a reaction in which electrons are gained and valence is reduced; often by the removal of oxygen or the addition of hydrogen) and as an example, if an iron atom (valence +3) gains an electron, the valence decreases to +2.  Linguistically, it’s obviously counterintuitive to imagine a “reduced atom” is one which gains rather than loses electrons but the term in this context dates from the early days of modern chemistry, where reduction (and its counterpart: “oxidation”) were created to describe reactions in which one substance lost an oxygen atom and the other substance gained it.  In a reaction such as that between two molecules of hydrogen (2H2) and one of oxygen (O2) combining to produce two molecules of water (2H2O), the hydrogen atoms have gained oxygen atoms and were said to have become “oxidized,” while the oxygen atoms have “lost them” by attaching themselves to the hydrogens, and were thus “reduced”.  Chemically however, in the process of gaining an oxygen atom, the hydrogen atoms have had to give up their electrons and share them with the oxygen atoms, while the oxygen atoms have gained electrons, thus the seeming paradox that the “reduced” oxygen has in fact gained something, namely electrons.
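The bookkeeping can be set out explicitly (using the iron example and the hydrogen-oxygen reaction from the paragraph above):

\[ \mathrm{Fe^{3+} + e^{-} \rightarrow Fe^{2+}} \quad \text{(reduction: an electron is gained, valence falls from +3 to +2)} \]
\[ \mathrm{2H_{2} + O_{2} \rightarrow 2H_{2}O} \quad \text{(hydrogen oxidized, } 0 \rightarrow +1\text{; oxygen reduced, } 0 \rightarrow -2\text{)} \]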

Secretary of Defense, the younger (left) and elder (right).  Donald Rumsfeld (left) with Gerald Ford (1913–2006; US president 1974-1977) and George W Bush (George XLIII, b 1946; US president 2001-2009).

Donald Rumsfeld (1932–2021; US Secretary of Defense 1975-1977 & 2001-2006) may or may not have been evil but his mind could sparkle and his marvellously reductionist principles can be helpful.  His classification of knowledge was often derided but it remains a useful framework:

(1) Known unknowns.
(2) Known knowns.
(3) Unknown unknowns.
(4) (most intriguingly) Unknown knowns.

An expert reductionist, he reminded us also there are only three possible answers to any question and while there's a cultural reluctance to say “don’t know”, sometimes it is best:

(1) I know and I’m going to tell you.
(2) I know and I’m not going to tell you.
(3) Don’t know.

While (1) known unknowns, (2) known knowns and (3) unknown unknowns are self-explanatory, examples of (4) unknown knowns are of interest and a classic one was the first “modern” submarine, developed by the Germans during the last months of World War II (1939-1945).

German Type XXI Elektroboot (1945).

The course of World War II could have been very different had OKM (Oberkommando der Marine (the Kriegsmarine's (German Navy) high command)) followed the advice of the commander of its submarine arm and made available a fleet of 300 U-boats, rather than building a surface fleet which wasn’t large enough to be a strategic threat but was of sufficient size to absorb resources which, if devoted to submarines, could have been militarily effective.  With a fleet of 300, it would have been possible permanently to maintain around 100 at sea but at the outbreak of hostilities, only 57 active boats were on the navy’s list, not all of which were suitable for operations on the high seas, so in the early days of the conflict it was rare for the Germans to have more than 12 committed to battle in the Atlantic.  Production never reached the levels necessary for the numbers to achieve critical mass but even so, in the first two-three years of the war the losses sustained by the British were considerable and the “U-Boat menace” was such a threat that much attention was devoted to counter-measures; by 1943 the Allies could consider the battle of the Atlantic won.  The Germans’ other mistake was not building a true submarine, capable of operating underwater (and therefore undetected) for days at a time.

It was only in 1945 when Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945) and Grand Admiral Karl Dönitz (1891–1980; head of the German Navy 1943-1945, German head of state 1945) were assessing the “revolutionary” new design that they concluded there was no reason why such craft couldn’t have been built in the late 1930s because the engineering capacity and technology existed even then (although the industrial and labor resources did not).  It was a classic case of what Rumsfeld would later call an “unknown known”: the Germans in 1939 knew how to build a modern submarine but didn’t “know that they knew”.  Military analysts have concluded that such was the strength of the forces arrayed against Nazi Germany by 1945 that not even the new boats, deployed in numbers, could have turned the tide of war.  However, had the German navy in 1939-1940 had available even 100 such submarines (a third of what OKM calculated was the critical strategic number, given that at any point only a third would be at sea, the others either in transit or docked), the battle in the Atlantic would have been much more difficult for the British.

Mr Rumsfeld however neglected to mention another class of knowledge: the “lost known”, examples of which have from time to time appeared and there may be more still to be discovered.  The best known are associated with the knowledge lost after the fall of the Western Roman Empire in the fifth century, when Europe entered the early medieval period once popularly known as the “Dark Ages”.  The lost knowns included aspects of optics such as lens grinding; the orthodoxy long was that the knowledge was not “re-discovered” or “re-invented” until perfected in Italy during the late thirteenth century although it’s now understood lenses continued to be ground in the Islamic world throughout the medieval period and it’s suspected it was from Arabic texts the information reached Europe.

What really was a lost known was how the Romans of Antiquity made their long-lasting opus caementicium (concrete) so famously “sticky” and resistant to the effects of salt water.  Unlike modern concrete, made using Portland cement & water, Roman concrete was a mix of volcanic ash & lime, mixed with seawater, the latter ingredient inducing a chemical reaction which created a stronger and more durable substance.  When combined with volcanic rocks, it formed mineral crystalline structures called aluminum tobermorite which lent the mix great strength and resistance to cracking.  After the fall of Rome, the knowledge was lost and even when a tablet was discovered listing the mix ratios, caementicium couldn’t be replicated because the recipe spoke only of “water” and not “sea water”, presumably because that was common knowledge.  It was only modern techniques of analysis which allowed the “lost known” to again become a “known known”.