
Monday, December 30, 2024

Syncategorematic

Syncategorematic (pronounced sin-kat-i-gawr-uh-mat-ik or sin-kat-i-gor-uh-mat-ik)

(1) In traditional logic, of or relating to a word that is part of a categorical proposition but is not a term, as all, some, is (applying to expressions not in any of Aristotle's categories but which form meaningful expressions together with them, such as conjunctions and adverbs).

(2) In contemporary logic, of or relating to a word or symbol that has no independent meaning and acquires meaning only in the context of other words or symbols (as the word “of”).

1820–1830: From the Late Latin syncatēgorēmat-, stem of syncatēgorēma (part of a discourse that needs another word to become fully intelligible).  The construct was syn- + categorematic.  The syn- prefix (used also as syl- (if preceding an “l”) and sym- (if preceding a “b”, “m” or “p”)) was from the Ancient Greek συν- (sun-), from σύν (sún) (with, in company with, together with) and may be compared with the Sanskrit सम्- (sam-).  It was appended to create forms with the meanings (1) identical, (2) with, together, or (3) concomitant.  Categorematic was from the Ancient Greek κατηγόρημα (katēgorēma) (predicate; something that is affirmed), from the verb κατηγορέω (katēgoreō) (to accuse, assert, or predicate).  The Latinized version of this root (categorema) was adopted by English and was the source of the familiar word “category” (and derivatives).  With the mix of Greek and Latin influences, syncategorematic is one of those words the more fastidious purists dislike.

The suffix -ate was a word-forming element used in forming nouns from Latin words ending in -ātus, -āta, & -ātum (such as estate, primate & senate).  Those that came to English via French often began with -at, but an -e was added in the fifteenth century or later to indicate the long vowel.  It can also mark adjectives formed from Latin perfect passive participle suffixes of first conjugation verbs -ātus, -āta, & -ātum (such as desolate, moderate & separate).  Again, often they were adopted in Middle English with an –at suffix, the -e appended after circa 1400; a doublet of –ee.  The construct of the –atic suffix was –at(e) +‎ -ic; it was a doublet of –age and an alternative form of –tic.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  Used in linguistics (of a term), syncategorematic describes those terms which demand the appending of other terms to make a meaningful constituent of language; in the more arcane corners of the field, there are terms said to be (the comparative) “more syncategorematic” and (the superlative) “most syncategorematic”.
Syncategorematic & syncategorematical are adjectives, syncategoreme & syncategorematicity are nouns and syncategorematically is an adverb; the noun plural is syncategoremes or syncategorematicities (the latter apparently not in use).

Sentence: “The girl’s legs were under the table.”  The preposition “under” is syncategorematic because it describes the spatial relationship between her “legs” and the “table” but in isolation, “under” has no meaning.

It’s a long word and the definition might seem convoluted but in English syncategorematic words are in common, everyday use.  In traditional logic, it denotes a word unable to sustain useful meaning if standing alone; it must be joined to a categorematic term in order to enter a categorical proposition, “some”, “and” & “all” being among those used most frequently.  In modern logic, the sense was extended to any symbol with no independent meaning.  Some terms which seem grammatically categorematic are often classed as syncategorematic, especially those which are in some way “value loaded” attributive adjectives (such as “good” or “large”).  Neither “good” nor “large” possesses independent meaning whereas “red” and “is a Ferrari” are so vested; something can simply be “red” but there is nothing which can be understood from “large” as a stand-alone term.  Meaning is attained only when something is added because the meaning comes from the sense of being relative-to-something (a “large bacterium” may be big enough to be the biggest bacterium ever seen but, comparatively, it’s nothing close to the size of a “large elephant”).

Sentence: “A girl is sitting on the floor.”  Both “a” and “the” are syncategorematic; they modify the nouns “girl” and “floor” but do not, when standing alone, impart meaning.

The common examples of syncategorematic terms in English can be classified into a number of categories: (1) Logical Operators: (and, or, not, if, then) which structure logical relationships between propositions; (2) Articles: (the, a, an) which specify definiteness or indefiniteness but lack standalone meaning; (3) Prepositions: (in, on, at, by, with) which indicate relationships between nouns or pronouns but do not on their own impart meaning; (4) Conjunctions: (and, but, or, because) which connect clauses or ideas; (5) Quantifiers: (all, some, none, many, few) which express quantity or extent without referring to a specific entity; (6) Negations: (not, no) which modify the meaning of other terms or clauses and (7) Adverbs of Degree: (very, quite, somewhat, too) which modify adjectives or adverbs to indicate intensity or degree.

There are nuances in use such as the legendary exchange which gave language the word “laconic” (using few words; expressing much in few words).  In Antiquity, Laconia was the region inhabited and ruled by the Spartans, known for their brevity in speech, and in English the meaning “concise, abrupt” emerged in the 1580s (although laconical was created and went extinct a decade earlier).  The origin of this sense was when Philip II of Macedon (382–336 BC; king (basileus) of Macedonia 359-336) threatened the Spartans with the words: “If I enter Laconia, I will raze Sparta to the ground.”, to which the Spartans replied: “If.”  In that case, the meaning is derived from the context, a case in which “if” (usually syncategorematic) has meaning as a superficially single, stand-alone word.

Saturday, December 28, 2024

Macropterous & Brachypterous

Macropterous (pronounced muh-krop-ter-uhs)

(1) In zoology (mostly in ornithology, ichthyology & entomology), having long or large wings or fins.

(2) In engineering, architecture and design, a structure with large, untypical or obvious “wings” or “fins”.

Late 1700s: The construct was macro- + -pterous.  Macro is a word-forming element meaning “long, abnormally large, on a large scale”, from the French, from the Medieval Latin, from the Ancient Greek μακρός (makrós), a combining form of makrós (long) (cognate with the Latin macer (lean; meager)), from the primitive Indo-European root mak (long, thin).  In English it is used as a general purpose prefix meaning “big; large version of”.  The English borrowing from French appears as early as the sixteenth century but it tended to be restricted to science until the early 1930s when there was an upsurge in the publication of material on economics during the Great Depression (ie as “macroeconomy” and its derivatives).  It subsequently became a combining form meaning large, long, great, excessive et al, used in the formation of compound words, contrasting with those prefixed with micro-.  In computing, it covers a wide vista but describes mostly relatively short sets of instructions used within programs, often as a time-saving device for the handling of repetitive tasks, one of the few senses in which macro (although originally a clipping in 1959 of “macroinstruction”) has become a stand-alone word rather than a contraction.  Other examples of use include macrophotography (photography of objects at or larger than actual size without the use of a magnifying lens (1863)), macrospore (in botany, “a spore of large size compared with others” (1859)), macroeconomics (pertaining to the economy as a whole (1938)), macrobiotic (a type of diet (1961)), macroscopic (visible to the naked eye (1841)), macropaedia (the part of the Encyclopaedia Britannica where entries appear as full essays (1974)) and macrophage (in pathology, a "type of large white blood cell with the power to devour foreign debris in the body or other cells or organisms" (1890)).

The –pterous suffix was from the Ancient Greek, the construct being πτερ(όν) (pter(ón)) (feather; wing), from the primitive Indo-European péthr̥ (feather) and related to πέτομαι (pétomai) (I fly) (and, ultimately, the English feather) +‎ -ous.  In zoology (and later, by extension, in engineering and design), it was appended to words from taxonomy to mean (1) having wings and (2) having large wings.  Later, it was used also of fins.  The –ous suffix was from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns to denote (1) possession of, (2) presence of a quality in any degree, commonly in abundance or (3) relation or pertinence to.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃).  The comparative is more macropterous and the superlative most macropterous.  Macropterous & macropteran are adjectives and macropter & macroptery are nouns; the noun plural is macropters.

Google ngram: Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Brachypterous (pronounced bruh-kip-ter-uhs)

In zoology (mostly in ornithology & entomology), having short, incompletely developed or otherwise abbreviated wings (defined historically as structures which, when fully folded, do not reach to the base of the tail).

Late 1700s: The construct was brachy- + -pterous.  The brachy- prefix was from the Ancient Greek βραχύς (brakhús) (short), from the Proto-Hellenic brəkús, from the primitive Indo-European mréǵus (short, brief).  The cognates included the Sanskrit मुहुर् (múhur) & मुहु (múhu), the Avestan m̨ərəzu.jīti (short-lived), the Latin brevis, the Old English miriġe (linked ultimately to the English “merry”) and the Albanian murriz.  It was appended to convey (1) short, brief and (2) short, small.  Brachypterous & brachypteran are adjectives and brachyptery & brachypterism are nouns.  The comparative would be more brachypterous and the superlative most brachypterous but because of the nature of the base word, those forms would seem unnatural.  The noun brachypter does not mean “a brachypterous creature”; it describes a taeniopterygid stonefly of the genus Brachyptera.

The European Chinch Bug, which exists in both macropterous (left) and brachypterous (right) forms.  Of the latter, entomologists also use the term "micropterous" and use does seem interchangeable but within the profession there may be fine distinctions.

The difference in the use of macropterous (long wings or fins) and brachypterous (short wings) is accounted for less by the etymological roots than by the application and traditions of use.  In zoological science, macropterous was granted a broad remit and came to be used of any creature (from the fossil record as well as the living) with long wings (use most prevalent of insects) and of water-dwellers with elongated fins.  The word was applied first to birds & insects before being used of fish (fins being metaphorical “wings”) and in environmentally-specific function there is much overlap.  By extension, in the mid-twentieth century, macropterous came to be used in engineering, architecture and design, including of cars, airframes and missiles.

Brachypterous (short wings) is used almost exclusively in zoology, particularly entomology, the phenomenon being much more common among insects than among birds which, being heavier, rely for lift on wings with a large surface area.  Short-winged birds do exist but many are flightless (the penguin a classic example, its wings used in the water as fins (for both propulsion and direction)) and that descriptor prevails.  Brachypterous is less flexible in meaning because it is tied tightly to a specific biological phenomenon; essentially a “short fin” in a fish is understood simply as “a fin”.  Cultural and linguistic norms may also have been an influence in that while “macro-” is widely used as a prefix denoting “large; big”, “brachy-” has never entered general use and remains a tool in biology.  So, in common scientific use, there’s no recognized term specifically for “short fins” equivalent to brachypterous (short wings) although, other than tradition, there seems no reason why brachypterous couldn’t be used thus in engineering & design.  If so minded, the ichthyologists could coin “brachyichthyous” (the construct being brachy- + ichthys (fish)) or “brachypinnate” (the construct being brachy- + pinna (“fin” or “feather” in Latin)), both meaning “short-finned fish”.  Neither seems likely to catch on however, the profession probably happy with “short-fin” or the nerdier “fin hypoplasia”.

The tailfin: the macropterous and the brachypterous

Lockheed P-38 Lightning in flight (left) and 1949 Cadillac (right).

Fins had appeared on cars during the inter-war years when they were genuinely added to assist in straight-line stability, a need identified as speeds rose.  The spread to the roads came from the beaches and salt flats where special vehicles were built to pursue the world land speed record (LSR) and by the mid 1920s, speeds in these contests were exceeding 150 mph (240 km/h); at these velocities, straight-line stability could be a matter of life and death.  The LSR crews drew their inspiration from aviation and that field also provided the motif for Detroit’s early post-war fins, the 1949 Cadillac borrowing its tail features from the Lockheed P-38 Lightning (a US twin-boom fighter first flown in 1939 and built 1941-1945) although, despite the obvious resemblance, the conical additions to the front bumper bar were intended to evoke the image of speeding artillery shells rather than the P-38’s twin propeller bosses.

1962 Ford (England) Zodiac Mark III (left) and 1957 DeSoto Firesweep two-door hardtop (right).

From there, the fins grew although it wasn’t until 1956, when Chrysler released the next season’s range, that extravagance truly began.  To one extent or another, all Chrysler’s divisions (Plymouth, Dodge, DeSoto, Chrysler, Imperial) adopted the macropterous look and the public responded to what was being described in the press as “futuristic” or “jet-age” (Sputnik had yet to orbit the earth; “space-age” would soon come) with a spike in the corporation’s sales and profits.  The competition took note and it wasn’t long before General Motors (GM) responded (by 1957 some Cadillac fins were already there) although, curiously, Ford in the US was always tentative about the fin and their interpretation was always rather brachypterous (unlike their English subsidiary which added surprisingly prominent fins to the Mark III Zephyr & Zodiac (1962-1966)).

Macropterous: Lindsay Lohan with wings, generated with AI (artificial intelligence) by Stable Diffusion.

Even at the time the fins attracted criticism although it was mostly as part of a critique of the newer cars as becoming too big and heavy with a notable level of inefficiency (increasing fuel consumption and little (if any) increase in usable passenger space, most of the bulk consumed by the exterior dimensions, some created by apparently pointless styling features of which the big fins were but one).  The public continued to buy the big cars (one did get a lot of metal for the money) but there was also a boom in the sales both of imported cars (their smaller size among their many charms) and of the generally smaller offerings of the corporation which later became AMC (American Motors Corporation).  Chrysler and GM ignored Ford’s lack of commitment to the macropterous and during the late 1950s their fins continued to grow upwards (and, in some cases, even outwards) but, noting the flood of imports, they too decided to join the trend, introducing smaller ranges; whereas in 1955 the majors offered a single basic design, by 1970 there would be locally manufactured “small cars”, “sub-compacts”, “compacts” and “intermediates” as well as what the 1955 models (which mostly had been sized somewhere between a “compact” and an “intermediate”) evolved into (now named “full-size”, a well-deserved appellation).

1959 Cadillac with four-window hardtop coachwork (the body-style known also as the "flattop" or "flying wing roof") (left) and 1961 Imperial Crown Convertible (right).

It was in 1959-1961 that things became “most macropterous” (peak fin) and the high-water mark of the excess is considered by most to be the 1959 Cadillac, each of the towering fins adorned with a pair of taillights often described as “bullet lights” but, interviewed years later, a member of the General Motors Technical Center (opened in 1956 and one of the mid-century’s great engines of planned obsolescence) claimed the image they had in mind was the glowing exhaust from a rocket in ascent, then often seen in popular culture including film, television and advertising.  However, although a stylistic high, it was the 1961 Imperials which set the mark literally, the tips of those fins standing almost a half inch (12 mm) taller, and they were remembered too for the “neo-classical” touch of four free-standing headlights, something others in the industry declined to follow.

Tending to the brachypterous: As the seasons went by, the Cadillac's fins would retreat but would not for decades wholly vanish.

It’s an orthodoxy in the history of design that the fins grew to the point of absurdity and then vanished but that’s not what literally happened in all cases.  Some manufacturers indeed suddenly abandoned the motif but Cadillac, perhaps conscious of having nurtured (and in a sense “perfected”) the motif since its debut in the 1949 range, must have felt more attached because, after 1959, year after year, the fins became smaller and smaller although, decades later, vestigial fins were still obviously part of the language of design.  In Europe, others would also prune.

Macropterous to brachypterous.  Sunbeam Alpine: 1960 Series I (left) and 1966 Series V. 

Built in five series between 1959-1968, the Sunbeam Alpine’s fins would have seemed a good idea in 1957 when the lines were approved but the trend didn’t persist and with the release in 1964 of the revised Series IV, the effect was toned down, the restyling achieved in an economical way by squaring off the rake at the rear, this lowering the height of the tips.  Because the release of the Series IV coincided with the debut of the Tiger (an Alpine fitted initially with a 260 cubic inch (4.2 litre) V8 and later a 289 (4.7)), all the V8-powered cars used the “low fin” body.

Macropterous to brachypterous. 1961 Mercedes-Benz 300 SE Lang (Long) (left) and 1971 Mercedes-Benz 280 SE 3.5 coupé.

Regarded by some as a symbol of the way the Wirtschaftswunder (the post-war “economic miracle” in the FRG (Federal Republic of Germany, the old West Germany)) had ushered away austerity, the (slight) exuberance of the fins which appeared on the Mercedes-Benz W111 (1959-1968) & W112 (1961–1965) seemed almost to embarrass the company, offended by the suggestion it would indulge in a mere “styling trend”.  Although the public soon dubbed the cars the Heckflosse (literally “tail-fin”), the factory insisted they were Peilstege (parking aids or sight-lines (literally "bearing bars")), the construct being peil-, from peilen (take a bearing; find the direction) + Steg (bar), these marking the extent of the bodywork to assist while reversing.  That may have been true (the company has never been above a bit of myth-making) but when a coupé and cabriolet were added to the W111 & W112 range, the fins were noticeably smaller, achieving an elegance of line Mercedes-Benz has never matched.  Interestingly, à la Cadillac, when the succeeding sedans (W108-W109 (1965-1972) & W116 (1972-1979)) were released, both retained a small hint of a fin although by 1972 it wasn’t enough even to be called vestigial; the factory said the small deviation from the flat was there to increase structural rigidity.

Macropterous to brachypterous: 1962 Vanden Plas Princess 3 Litre (left) and 1967 Vanden Plas Princess 4 Litre R (right).

The Italian design house Pininfarina took to fins in the late 1950s and applied what really were variations of the same basic design to commissions from Fiat, Lancia, Peugeot and BMC (British Motor Corporation, a conglomerate created by merger in 1952 which brought together Morris, Austin (and soon Austin-Healey), MG, Riley, Wolseley & Vanden Plas under the one corporate umbrella).  There were several BMC “Farinas” sold under six badges and the ones with the most prominent fins were the “big” Farinas, the most expensive of which were the Princess 3 Litre (1959-1960), Vanden Plas Princess 3 Litre (1960-1964) and Vanden Plas Princess 4 Litre R (1964-1968); the “R” appended to the 4 Litre’s model name was to indicate its engine (which had begun life as a military unit) was supplied by Rolls-Royce, a most unusual arrangement.  The 4 Litre used the 3 Litre’s body with a number of changes, one of which was a change in the shape and a reduction in the size of the rather chunky fins.  Although the frumpy shell remained, the restyling was thought quite accomplished though obviously influenced by the Mercedes-Benz W111 & W112 coupés & cabriolets but if one is going to imitate, one should choose to emulate the finest.

Thursday, November 28, 2024

Cereal & Serial

Cereal (pronounced seer-ee-uhl)

(1) Any plant of the grass family yielding an edible grain (wheat, rye, oats, rice, corn, maize, sorghum, millet et al).

(2) The grain from those plants.

(3) An edible preparation of these grains, applied especially to packaged (often processed) breakfast foods.

(4) Of or relating to grain or the plants producing it.

(5) A hamlet in Alberta, Canada.

(6) As Ceres International Women's Fraternity, a women's fraternity focused on agriculture, founded on 17 August 1984 at the International Conclave of FarmHouse fraternity.

1590s: From the sixteenth century French céréale (having to do with cereal), from the Latin cereālis (of or pertaining to the Roman goddess Ceres), from the primitive Indo-European ker-es-, from the root ker- (to grow) from which Latin gained also sincerus (source of the English sincere) and crēscō (grow) (source of the English crescent).  The noun use of cereal in the modern sense (a grass yielding edible grain and cultivated for food) emerged in 1832 and was developed from the adjective (having to do with edible grain), use of which dates from 1818, also from the French céréale (in the sense of the grains).  The familiar modern use (packaged grain-based food intended for breakfast) was a creation of US English in 1899.  If used in reference to the goddess Ceres, an initial capital should be used.  Cereal, cereology & cerealogist are nouns and cerealic is an adjective; the noun plural is cereals.

Lindsay Lohan mixing Pilk.

Cereal is often used as a modifier (cereal farming, cereal production, cereal crop, non-cereal, cereal bar, pseudocereal, cereal dust etc) and a cereologist is one who works in the field of cerealogy (the investigation, or practice, of creating crop circles).  The term “cereal killer” is used of one noted for their high consumption of breakfast cereals although some might be tempted to apply it to those posting TikTok videos extolling the virtue of adding “Pilk” (a mix of Pepsi-Cola & Milk) to one’s breakfast cereal.  Pilk entered public consciousness in December 2022 when Pepsi Corporation ran a “Dirty Sodas” promotion for the concoction, featuring Lindsay Lohan.  There is some concern about the high sugar content in packaged cereals (especially those marketed towards children) but for those who want to avoid added sugar, Pepsi Corporation does sell “Pepsi Max Zero Sugar” soda and Pilk can be made using this.  Pepsi Max Zero Sugar contains carbonated water, caramel color, phosphoric acid, aspartame, acesulfame potassium, caffeine, citric acid, potassium benzoate & calcium disodium EDTA.

TikTok, adding Pilk to cereal and the decline of Western civilization.

A glass of Pilk does of course make one think of Lindsay Lohan but every mouthful of one’s breakfast cereal is something of a tribute to a goddess of Antiquity.  In 496 BC, Italy was suffering one of its periodic droughts, this one particularly severe and lingering, the Roman fields dusty and parched.  As was the practice, the priests travelled to consult the Sibylline oracle, returning to the republic’s capital to report a new goddess of agriculture had to be adopted and sacrifices needed immediately to be made to her so rain would again fall on the land.  It was Ceres who was chosen and she became the goddess of agriculture and protector of the crops while the caretakers of her temple were the overseers of the grain market (they were something like the wheat futures traders in commodity exchanges like the Chicago Board of Trade (CBOT)).  It was the will of the goddess Ceres which determined whether a harvest was prolific or sparse and to ensure abundance, the Romans ensured the first cuttings of the corn were always sacrificed to her.  It’s from the Latin adjective cereālis (of or pertaining to the Roman goddess Ceres) that English gained “cereal”.

For millennia humanity’s most widely cultivated and harvested crop, cereal is a grass cultivated for its edible grain, the best known of which are rice, barley, millet, maize, rye, oats, sorghum & wheat.  Almost all cereals are annual crops (ie yielding one harvest per planting) although some strains of rice can be grown as a perennial and an advantage of cereals is the differential in growth rates and temperature tolerance means harvesting schedules can be spread from mid-spring until late summer.  Except for the more recent hybrids, all cereals are variations of natural varieties and the first known domestication occurred early in the Neolithic period (circa 7000–1700 BC).  Although the trend in cultivated area and specific yield tended over centuries to display a gradual rise, it was the “green revolution” (a combination of new varieties of cereals, chemical fertilizers, pest control, mechanization and precise irrigation which began to impact agriculture at scale in the mid twentieth century) which produced the extraordinary spike in global production.  This, coupled with the development of transport & distribution infrastructure (ports and bulk carriers), made possible the increase in the world population, now expected to reach around 10 billion by mid-century before declining.

Serial (pronounced seer-ee-uhl)

(1) Anything published, broadcast etc, in short installments at regular intervals (a novel appearing in successive issues of a magazine (ie serialized); a radio or TV series etc).

(2) In library & publishing jargon, a publication in any medium issued in successive parts bearing numerical or chronological designation and intended to be continued indefinitely.

(3) A work published in installments or successive parts; pertaining to such publication; pertaining to, arranged in, or consisting of a series.

(4) Occurring in a series rather than simultaneously (used widely: serial marriage, serial murderer, serial adulterer etc).

(5) Effecting or producing a series of similar actions.

(6) In IT, of or relating to the apparent or actual performance of data-processing operations one at a time (in the order of occurrence or transmission); of or relating to the transmission or processing of each part of a whole in sequence, as each bit of a byte or each byte of a computer word.

(7) In grammar, of or relating to a grammatical aspect denoting an action that is habitual and ongoing.

(8) In formal logic and mathematical logic, (of a relation) connected, transitive, and asymmetric, thereby imposing an order on all the members of the domain.

(9) In engineering & mass-production (as “serial number”), a unique (to a certain product, model etc) character string (which can be numeric or alpha-numeric) which identifies each individual item in the production run.

(10) In music, of, relating to, or composed in serial technique.

(11) In modern art, a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process, in accordance with the principles of modularity.

(12) In UK police jargon, a squad of officers equipped with shields and other protective items, used for crowd and riot control.

1823: From the New Latin word seriālis, from the Classical Latin seriēs (series), the construct being seri(ēs) + -al on the Latin model which was seriēs + -ālis.  It was cognate to the Italian seriale.  The Latin seriēs was from serere (to join together, bind), ultimately from the primitive Indo-European ser- (to bind, put together, to line up).  The -al suffix was from the Middle English -al, from the Latin adjectival suffix -ālis (the third-declension two-termination suffix (neuter -āle) used to form adjectives of relationship from nouns or numerals) or the French, Middle French and Old French –el & -al.  It was used to denote the sense "of or pertaining to", an adjectival suffix appended (most often to nouns) originally most frequently to words of Latin origin, but since used variously and also used to form nouns, especially of verbal action.  The alternative form in English remains -ual (-all being obsolete).  The –alis suffix was from the primitive Indo-European -li-, which later dissimilated into an early version of –āris and there may be some relationship with hel- (to grow); -ālis (neuter -āle) was the third-declension two-termination suffix and was suffixed to (1) nouns or numerals creating adjectives of relationship and (2) adjectives creating adjectives with an intensified meaning.  The suffix -ālis was added (usually, but not exclusively) to a noun or numeral to form an adjective of relationship to that noun.  When suffixed to an existing adjective, the effect was to intensify the adjectival meaning, and often to narrow the semantic field.  If the root word ends in -l or -lis, -āris is generally used instead although because of parallel or subsequent evolutions, both have sometimes been applied (eg līneālis & līneāris).  Serial, serializer, serialization, serialism & serialist are nouns, serialing, serialize & serialed are verbs, serializable is an adjective and serially is an adverb; the noun plural is serials.

The “serial killer” is a staple of the horror film genre.  Lindsay Lohan’s I Know Who Killed Me (2007) was not well received upon release but it has since picked up a cult following.

The adjective serial (arranged or disposed in a rank or row; forming part of a series; coming in regular succession) seems to have developed much in parallel with the French sérial although the influence of one on the other is uncertain.  The word came widely to be used in English by the mid nineteenth century because the popular author Charles Dickens (1812–1870) published his novels in instalments (serialized); sequentially, chapters would appear over time in periodicals and only once the series was complete would a book appear containing the whole work.  The first use of the noun “serial” to mean “story published in successive numbers of a periodical” was in 1845 and that came from the adjective; it was a clipping of “serial novel”.  By 1914 this had been extended to film distribution and the same idea would become a staple of radio and television production, the most profitable form of which was apparently the “mini-series”, a term first used in 1971 although the concept had been in use for some time.  Serial number (indicating position in a series) was first recorded in 1866, originally of papers, packages and such, and it was extended to soldiers in 1918.  Surprisingly perhaps, given the long history of the practice, the term “serial killer” wasn’t used until 1981 although the notion of “serial events” had been used of seemingly sequential or related murders as early as the 1960s.  On that model, serial became a popular modifier (serial rapist, serial adulterer, serial bride, serial monogamist, serial pest, serial polygamy etc).

For those learning English, the existence of the homophones “cereal” & “serial” must be an annoying quirk of the language.  Because cereals are usually an annual crop, it’s reasonable if some assume the two words are related because wheat, barley and such are handled in a “serial” way, planting and harvesting being recurrent annual events.  Doubtless students are told this is not the case but there is a (vague) etymological connection in that the Latin serere meant “to join together, to bind” and it was used also to mean “to sow”, so there is a connection in agriculture: sowing seeds in fields.  For serial, the connection is structural (linking elements in a sequence, something demonstrated literally in the use in IT and in a more conceptual way in “serial art”) but despite the differences, both words in a way involve the fundamental act of creating order or connection.

Serial art by Swiss painter Richard Paul Lohse (1902–1988): Konkretion I (Concretion I, 1945-1946), oil on pavatex (a wood fibre board made from compressed industrial waste) (left), Zwei gleiche Themen (Two same topics, 1947), colored pencil on paper (centre) and Konkretion III (1947), oil on pavatex (right).

In modern art, “serial art” was a movement of the mid-twentieth century avant-garde in which objects or constituent elements were assembled in a systematic process in accordance with the principles of modularity.  It was a concept the legacy of which was to influence (some prefer “infect”) other artistic schools rather than develop as a distinct paradigm but serial art is still practiced and remains a relevant concept in contemporary art.  The idea was of works based on repetition, sequences or variations of a theme, often following a systematic or conceptual approach; the movement was most active during the mid-twentieth century and a notable theme in Minimalism, and Donald Judd (1928-1994), Andy Warhol (1928–1987), Sol LeWitt (1928-2007) (there must have been something “serial” about 1928) and Richard Paul Lohse (1902-1988) were all pioneers of the approach.  Because the techniques of the serialists were adopted by many, their style became interpolated into many strains of modern art, so to speak of it now as something distinctive is difficult except in a historic context.  The embrace by artists of digital tools, algorithms and AI (Artificial Intelligence) technologies has probably restored a sort of “purity” to serial art because generative processes are so well suited to creating series of images, sculptures or digital works that explore themes like pattern, progression or variation, the traditional themes of chaos, order and perception represented as before.  In a way, serial art was just waiting for lossless duplication and the NFT (Non-fungible token), and more conservative critics still grumble the whole idea is little different from an architect’s blueprint, which documents the structural framework without the “skin” which lends the shape its form.  They claim it's the engineering without the art.

Relics of the pre-USB age; there were also 25-pin serial ports.

In IT hardware, “serial” and “parallel” refer to two different methods of transmitting data between devices or components and the distinction lies in how data bits are sent over a connection.  In serial communication, data is transmitted one bit at a time over as little as a single channel or wire, which in the early days of the industry was inherently slow although in modern implementations (such as USB (Universal Serial Bus) or PCIe (Peripheral Component Interconnect Express)) high speeds are possible.  Given what was needed in the early days, serial technology was attractive because the reduction in wiring reduced cost and complexity, especially over the (relatively) long distances at which serial excelled and, with the use of line-drivers, the distances frequently were extended to hundreds of yards.  The trade-off was of course slower speed but these were simpler times.  In parallel communication, data is transmitted multiple bits at a time, each bit traveling simultaneously over its own dedicated channel, and this made it much faster than serial transmission at a given clock rate.  Because more wires were demanded, the cost and complexity increased, as did the potential for interference and corruption, but most parallel transmission was over short distances (25 feet (7½ metres) was “long-distance”) and the emergence of “error correcting” protocols made the mode generally reliable.  For most, it was the default method of connecting a printer and for large file sizes the difference in performance was discernible, the machines able to transmit more data in a single clock cycle due to simultaneous bit transmission.
Except for specialized applications or those dealing with legacy hardware (and in industries like small-scale manufacturing, where such dedicated machines can be physically isolated from the dangers of the internet, parallel and serial ports and cables continue to render faithful service), parallel technology is effectively obsolete and serial connections are now almost universally handled by the various flavours of USB.
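The difference in clock cycles between the two modes can be sketched with a toy model.  This is a hypothetical illustration only, counting idealized cycles for a payload; real serial links add start/stop bits, handshaking and error correction, all omitted here.

```python
# Toy comparison of serial vs parallel transfer: how many clock cycles
# each mode needs to move the same payload, under idealized assumptions.

def to_bits(data: bytes) -> list[int]:
    """Flatten a byte string into individual bits, most significant first."""
    return [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

def serial_clock_cycles(data: bytes) -> int:
    """Serial link: one bit per clock cycle over a single wire."""
    return len(to_bits(data))

def parallel_clock_cycles(data: bytes, lines: int = 8) -> int:
    """Parallel link: `lines` bits per clock cycle, one wire per bit."""
    return -(-len(to_bits(data)) // lines)  # ceiling division

payload = b"hello"
print(serial_clock_cycles(payload))    # 40 cycles (5 bytes x 8 bits)
print(parallel_clock_cycles(payload))  # 5 cycles on an 8-wire bus
```

The eight-fold advantage per cycle is why parallel printer cables once made sense for large files; in practice, the skew and interference between the extra wires is what capped parallel runs at short distances.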

Sunday, November 17, 2024

Now

Now (pronounced nou)

(1) At the present time or moment (literally a point in time).

(2) Without further delay; immediately; at once; at this time or juncture in some period under consideration or in some course of proceedings described.

(3) As “just now”, a time or moment in the immediate past (historically it existed as the now obsolete “but now” (very recently; not long ago; up to the present).

(4) Under the present or existing circumstances; as matters stand.

(5) Up-to-the-minute; fashionable, encompassing the latest ideas, fads or fashions (the “now look”, the “now generation” etc).

(6) In law, as “now wife”, the wife at the time a will is written (used to prevent any inheritance from being transferred to a person of a future marriage) (archaic).

(7) In phenomenology, a particular instant in time, as perceived at that instant.

Pre 900: From the Middle English now, nou & nu, from the Old English nū (at the present time, at this moment, immediately), from the Proto-West Germanic nū, from the Proto-Germanic nu, from the primitive Indo-European nu (now) and cognate with the Old Norse nu, the Dutch nu, the German nun, the Old Frisian nu and the Gothic nu.  It was the source also of the Sanskrit and Avestan nu, the Old Persian nuram, the Hittite nuwa, the Greek nu & nun, the Latin nunc, the Old Church Slavonic nyne, the Lithuanian nù and the Old Irish nu-.  The original senses may have been akin to “newly, recently” and it was related to the root of new.  Since Old English it has often been merely emphatic, without any temporal sense (as in the emphatic use of “now then”, though that phrase originally meant “at the present time”) and also (by the early thirteenth century) “at once”.  In early Middle English it often was written as one word.  The familiar use as a noun (the present time) emerged in the late fourteenth century while the adjective meaning “up to date” is listed by etymologists as a “mid 1960s revival” on the basis the word was used as an adjective with the sense of “current” between the late fourteenth and early nineteenth centuries.  The phrase “now and then” (occasionally; at one time and another) was in use by the mid 1400s, “now or never” having been in use since the early thirteenth century.  “Now” is widely used in idiomatic forms and as a conjunction & interjection.  Now is a noun, adjective & adverb; nowism, nowness & nowist are nouns; the noun plural is nows.

Right here, right now: Acid House remix of Greta Thunberg’s (b 2003) How dare you? speech by Theo Rio.

“Now” is one of the more widely used words in English and is understood to mean “at the present time or moment (literally a point in time)”.  However, it’s often used in a way which means something else: were one to say “I’ll do it now”, in the narrow technical sense that really means “I’ll do it in the near future”.  Even things which are treated as happening “now” really aren’t, such as seeing something.  Because light travels at a finite speed, it takes time for it to bounce from something to one’s eye, so just about anything one sees is an exercise in looking back to the past.  Even when reading something on a screen or page, one’s brain is processing something from a nanosecond (about one billionth of a second) earlier.  For most purposes, “now” is but a convincing (and convenient) illusion and even though, in a certain, special sense, everything in the universe is happening at the same time (now), it’s not something that can ever be experienced because of the implications of relativity.  None of this causes many problems in life but among certain physicists and philosophers, there is a dispute about “now” and there are essentially three factions: (1) that “now” happened only once in the history of the known universe and cannot again exist until the universe ends, (2) that only “now” can exist and (3) that “now” cannot ever exist.
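The nanosecond figure is easy to verify with back-of-envelope arithmetic: light covers about 30 cm (roughly the distance from a screen to the eye) in a nanosecond.  A minimal sketch, assuming a straight-line path and light in a vacuum:

```python
# Back-of-envelope check of the "reading is looking into the past" claim:
# the propagation delay for light over everyday distances.

C = 299_792_458  # speed of light in a vacuum, metres per second

def light_delay_ns(distance_m: float) -> float:
    """Time in nanoseconds for light to cover `distance_m` metres."""
    return distance_m / C * 1e9

# A screen about 30 cm from the eye: delay is almost exactly one nanosecond.
print(round(light_delay_ns(0.3), 2))  # 1.0
```

The same function shows why "now" stretches further with distance: sunlight is about eight minutes old on arrival, so even the daylight by which one reads is an image of the past.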

Does now exist? (2013), oil & acrylic on canvas by Fiona Rae (b 1963) on MutualArt.

The notion that “now” can have happened only once in the history of our universe (and according to the cosmological theorists variously there may be many universes (some which used to exist, some extant and some yet to be created) or our universe may now be in one of its many phases, each of which will start and end with a unique “now”) is tied up with the nature of time, the mechanism upon which “now” depends not merely for definition but also for existence.  That faction deals with what is essentially an intellectual exercise whereas the other two operate where physics and linguistics intersect.  Within the faction which says "now can never exist" there is a sub-faction which holds that to say “now” cannot exist is a bit of a fudge in that it’s not that “now” never happens but only that it can only ever be described as a particular form of “imaginary time”; an address in space-time in the past or future.  The purists however are absolutists and their proposition is tied up in the nature of infinity, something which renders it impossible ever exactly to define “now” because endlessly the decimal point can move so that “now” can only ever be tended towards and never attained.  If pushed, all they will concede is that “now” can be approximated for purposes of description but that’s not good enough: there is no now.

nower than now!: Lindsay Lohan on the cover of i-D magazine No.269, September, 2006.

The “only now can exist” faction find tiresome the proposition that “the moment we identify something as happening now, already it has passed”, making the point that “now” is the constant state of existence and that a mechanism like time exists only as a thing of administrative convenience.  The “only now can exist” faction are most associated with the schools of presentism or phenomenology and argue only the present moment (now) is “real” and that any other fragment of time can only be described, the past existing only in memory and the future only as anticipation or imagination; “now” is the sole verifiable reality.  They are interested especially in what they call “change & becoming”, making the point the very notion of change demands a “now”: events happen and things become in the present; without a “now”, change and causality are unintelligible.  The debate between the factions hinges often on differing interpretations of time: whether fundamentally it is subjective or objective, continuous or discrete, dynamic or static.  Linguistically and practically, “now” remains central to the human experience but whether it corresponds to an independent metaphysical reality remains contested.

Sunday, November 3, 2024

Anonymuncule

Anonymuncule (pronounced uh-non-uh-monk-u-elle)

An insignificant, anonymous writer

1859: A portmanteau word, the construct being anony(mous) + (ho)muncule.  Homuncule was from the Latin homunculus (a little man), a diminutive of homō (man).  Anonymous entered English circa 1600 and was from the Late Latin anonymus, from the Ancient Greek ᾰ̓νώνῠμος (anṓnumos) (without name), the construct being ᾰ̓ν- (an-) (“not; without; lacking” in the sense of the negating “un-”) + ὄνῠμᾰ (ónuma), an Aeolic & Doric dialectal form of ὄνομᾰ (ónoma) (name).  The construct of the English form was an- +‎ -onym +‎ -ous.  The an- prefix was, in this context, the form of the Ancient Greek privative ἀ- (a-) used with stems beginning either with vowels or "h"; it was used to create words having the sense opposite to the word (or stem) to which the prefix is attached.  The element -onym (word; name) came from the international scientific vocabulary, reflecting a New Latin combining form, from the Ancient Greek ὄνυμα (ónuma).  The –ous suffix was from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns to denote (1) possession of, (2) presence of a quality in any degree, commonly in abundance or (3) relation or pertinence to.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H2SO4) has more oxygen atoms per molecule than sulphurous acid (H2SO3).  The Latin homunculus (plural homunculi) enjoyed an interesting history.
In medieval medicine, it was used in the sense of “a miniature man”, a creature once claimed by the spermists (once a genuine medical speciality) to be present in human sperm while in modern medicine the word was resurrected for the cortical homunculus, an image of a person with the size of the body parts distorted to represent how much area of the cerebral cortex of the brain is devoted to it (ie a “nerve map” of the human body that exists on the parietal lobe of the human brain).  Anonymuncule is a noun; the noun plural is anonymuncules.

Preformationism: Homunculi in sperm (1695) illustrated by Nicolaas Hartsoeker who is remembered also as the inventor in 1694 of the screw-barrel simple microscope.

Like astrology, alchemy once enjoyed a position of orthodoxy among scientists and it was the alchemists who first popularized the homunculus, the miniature, fully formed human, a concept with roots in both folklore and preformationism (in biology, the theory that all organisms start their existence already in a predetermined form upon conception and that this form does not change in the course of their lifetime, as opposed to epigenesis, the theory that an organism develops by differentiation from an unstructured egg rather than by simple enlarging of something preformed).  It was Paracelsus (the Swiss physician, alchemist, lay theologian, and philosopher of the German Renaissance Theophrastus von Hohenheim (circa 1493-1541)) who seems to have been the first to use the word in a scientific paper, it appearing in his De homunculis (circa 1529–1532) and De natura rerum (1537).  As the alchemists explained, a homunculus (an artificial humanlike being) could be created through alchemy and in De natura rerum Paracelsus detailed his method.

A writer disparaged as an anonymuncule differs from one who publishes their work anonymously or under a pseudonym, the Chicago Tribune in 1871 explaining the true anonymuncule was a “little creature who must not be confounded with the anonymous writers, who supply narratives of current events, and discuss public measures with freedom, but deal largely in generalities, and very little in personalities.”  That was harsh but captures the place the species enjoys in the literary hierarchy (and it’s a most hierarchical place).  Historically, anonymuncules were those writers who published anonymously or under pseudonyms without achieving renown or even recognition, and there’s often the implication they are “mean & shifty types” who “hide behind their anonymity”.

Primary Colors: A Novel of Politics (1996), before and after the lifting of the veil.

Some however have good and even honourable reasons for hiding behind their anonymity although there is also sometimes mere commercial opportunism.  When former Time columnist Joe Klein (born 1946) published Primary Colors: A Novel of Politics (1996), the author was listed as “anonymous”, a choice made to avoid the political and professional risks associated with openly critiquing a sitting president and his administration.  Primary Colors was a (very) thinly veiled satire of Bill Clinton’s (b 1946; US president 1993-2001) 1992 presidential campaign and offered an insider's view of campaign life, showing both the allure and the moral compromises involved.  By remaining anonymous, Klein felt more able candidly to discuss the ethical dilemmas and personal shortcomings of his characters, something that would have been difficult had his identity been disclosed, the conflicts of interest as a working political journalist obvious.  Critically and commercially, the approach seems greatly to have helped the roman à clef (a work of fiction based on real people and events) gain immediate notoriety, the speculation about the author’s identity lying at the core of the book’s mystique.  Others have valued anonymity because their conflicts of interest are insoluble.  Remarkably, Alfred Deakin (1856-1919; prime minister of Australia 1903-1904, 1905-1908 & 1909-1910), even while serving as prime minister, wrote political commentaries for London newspapers including the National Review & Morning Post and, more remarkably still, some of his pieces were not uncritical of both his administration and his own performance in office.  Modern politicians should be encouraged to pursue this side-gig; it might teach them truthfulness and encourage them more widely to practice it.

For others, it can be a form of pre-emptive self-defense.  The French philosopher Voltaire (François-Marie Arouet; 1694–1778) wrote under a nom de plume because he held (and expressed) views which often didn’t please kings, bishops and others in power, and this at a time when such conduct was likely to attract persecution worse than censorship or disapprobation.  Mary Ann Evans (1819–1880) adopted the pseudonym George Eliot in an attempt to ensure her works would be taken seriously, avoiding the stigma associated with female authorship at the time.  George Eliot’s style of writing was however that of a certain sort of novelist and those women who wrote in a different manner were an accepted part of the literary scene; although Jane Austen’s name never appeared on her published works, when Sense and Sensibility (1811) appeared its author was listed as “A Lady”.  Although it was a success, all her subsequent novels were billed as: “By the author of Sense and Sensibility”, Austen's name never appearing on her books during her lifetime.  Ted Kaczynski (1942-2023), the terrorist and author of the Unabomber Manifesto (1995), had his own reasons (wholly logical but evil) for wanting his text to be read but his identity as the writer to remain secret.

Nazi poetry circle at the Berghof: Left to right, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945), Martin Bormann (1900–1945), Hermann Göring (1893–1946; leading Nazi 1922-1945, Hitler's designated successor & Reichsmarschall 1940-1945), and Baldur von Schirach (1907-1974; head of the Hitlerjugend (Hitler Youth) 1931-1940 & Gauleiter (district party leader) and Reichsstatthalter (Governor) of Vienna (1940-1945)), Berchtesgaden, Bavaria, Germany, 1936.  Of much, all were guilty as sin but von Schirach would survive to die in his bed at 67.

The "poet manqué" is a somewhat related term.  A poet manqué is an aspiring poet who never produced a single book of verse (although it’s used also of an oeuvre so awful it should never have been published and the poetry of someone Baldur von Schirach comes to mind.  The adjective manqué entered English in the 1770s and was used originally in the sense of “unfulfilled due to the vagary of circumstance, some inherent flaw or a constitutional lack”.  Because it’s so often a literary device, in English, the adjective does often retain many grammatical features from French, used postpositively and taking the forms manquée when modifying a feminine noun, manqués for a plural noun, and manquées for a feminine plural noun.  That’s because when used in a literary context (“poet manqué”, “novelist manqué” et all) users like it to remain inherently and obviously “French” and thus it’s spelled often with its diacritic (the accent aigu (acute accent): “é”) although when used casually (to suggest “having failed, missed, or fallen short, especially because of circumstances or a defect of character”) as “fly-half manqué”, “racing driver manqué” etc), the spelling manque” is sometimes used.

Manqué (that might have been but is not) was from the French manqué, past participle form of the sixteenth century manquer (to lack, to be lacking in; to miss), from the Italian mancare, from manco, from the Latin mancus (maimed, defective), from the primitive Indo-European man-ko- (maimed in the hand), from the root man- (hand).  Although it’s not certain, the modern slang adjective “manky” (bad, inferior, defective (the comparative mankier, the superlative mankiest)), in use since the late 1950s, may be related.  Since the 1950s, the use in the English-speaking world (outside of North America) has extended to “unpleasantly dirty and disgusting” with a specific use by those stationed in Antarctica where it means “being or having bad weather”.  The related forms are the noun mankiness and the adverb mankily.  Although it’s not an official part of avian taxonomy, bird-watchers (birders) in the UK decided “manky mallard” was perfect to describe a mallard bred from wild mallards and domestic ducks (they are distinguished by variable and uneven plumage patterns).  However, it’s more likely manky is from the UK slang mank which was originally from Polari mank and used to mean “disgusting, repulsive”.

No poet manqué:  In January 2017, Lindsay Lohan posted to Instagram a poem for her 5.2 million followers, the verse a lament of the excesses of IS (the Islamic State), whetting the appetite for the memoir which might one day appear (hopefully "naming names").  The critical reaction to the poem was mixed but the iambic pentameter in the second stanza attracted favorable comment:

sometimes i hear the voice of the one i loved the most
but in this world we live in of terror
who i am to be the girl who is scared and hurt
when most things that happen i cannot explain
i try to understand
when i'm sitting in bed alone at 3am
so i can't sleep, i roll over
i can't think and my body becomes cold
i immediately feel older.....
 
than i realise, at least i am in a bed,
i am still alive,
so what can really be said?
just go to bed and close the blinds,
still and so on, i cannot help but want to fix all of these idle isis
minds
because,
there has to be something i can figure out
rather than living in a world of fear and doubt
they now shoot, we used to shout.
 
if only i can keep trying to fix it all
i would keep the world living loving and small
i would share my smiles
and give too Many kisses