Wednesday, December 22, 2021

Radome

Radome (pronounced rey-dohm)

A dome-shaped device used as a protective housing for a radar antenna (although the word is loosely used and applied to structures of varied shapes in which radar equipment is installed).

1940–1945: A portmanteau word, a blend of ra(dar) + dome.  In electronics, radar is a device for determining the presence and location of an object by measuring the time for the echo of a radio wave to return from it and the direction from which it returns; in figurative use it refers to a means or sense of awareness or perception.  Dating from 1940-1945, radar was originally the acronym RADAR, a creation of US scientific English: RA(dio) D(etection) A(nd) R(anging).  In the way English does things, the acronym RADAR came to be used with such frequency that it became a legitimate common noun, the all lower-case “radar” now the default form.  Dating from 1505–1515, dome was from the Middle French domme & dome (a town-house; a dome, a cupola) (which persists in modern French as dôme), from the Provençal doma, from the Italian duomo (cathedral), from the Medieval Latin domus (ecclesiae; literally “house (of the church)”), a calque of the Ancient Greek οἶκος τῆς ἐκκλησίας (oîkos tês ekklēsías).  Radome is a noun & verb; the noun plural is radomes.

Radomes at the Pine Gap satellite surveillance base, some 11 miles (18 km) south-west of Alice Springs (population circa 34,000) in Australia's Northern Territory (left) and a random radome which was blown onto an Indianapolis street by a storm (right).

Officially, the operation in Alice Springs is operated jointly by the defence departments of the US and Australia and was once known as the Joint Defence Space Research Facility (JDSRF) but, presumably aware nobody was fooled, in 1988 it was renamed the Joint Defence Facility Pine Gap (JDFPG).  The Pine Gap facility is a restricted zone so it's not a tourist attraction, which is unfortunate because it's hard to think of any other reason to visit Alice Springs.  Fox News in June 2025 published pictures of the “random radome” which had “fallen from the sky” in Indianapolis, Indiana after a severe thunderstorm swept through the region, wind gusts as high as 65 mph (105 km/h, 56 knots) measured.  The spherical cap was reported as being the size of a “small shed” and it was “parked” neatly, the flat base next to the curb; in case any of the conspiracy theorists in the Fox News audience began to speculate about alien invasions or government plots, it was revealed the radome came from an installation at the nearby tech infrastructure company V2X.

Lindsay Lohan on the cover of Radar magazine, June-July 2007.  The last print-edition of Radar was in 2008; since 2009 it's existed in on-line editions.

Dating from the mid 1940s, the word radar began as the acronym RADAR (RA(dio) D(etection) A(nd) R(anging)), coined in the US and entering English as a word within a few years.  Specialized forms are created as needed (radar gun, radar zone, radar tower, radar trap et al); radar is a noun, verb & adjective; the noun plural is radars.  In English, whether a string of letters is an acronym, abbreviation, initialism or word is determined both by form and organic process and the strings can emerge in more than one category.  Although it wouldn’t for a few years be known as radar, the system first became well known (within a small community on both sides of the English Channel) in 1940 because the string of radar installations along the English coast played such a significant role in the Royal Air Force’s (RAF) defense during the Battle of Britain, the crucial air-war fought that summer.  What the radar did was provide sufficient notice of an attack to enable RAF Fighter Command to react to threats in the right place at the right time (altitude was always a problem to assess) by “scrambling” squadrons of aircraft on stand-by rather than having to maintain constant patrols in the sky, something which rapidly would have diminished resources.

Radomes don’t actually fulfil any electronic function as such.  They are weatherproof structures which are purely protective (and on ships where space is at a premium they also protect personnel from the moving machinery) and are thus constructed from materials transparent to radio waves.  The original radomes were recognizably dome-like but they quickly came to be built in whatever shape was most suitable to their location and application: pure spheres, planar panels and geodesic spheres are common.  When used on aircraft, the structures need to be sufficiently aerodynamic not to compromise performance, thus the early use of nose-cones as radomes; on larger airframes, dish-like devices have been fashioned.

North American Sabre:  F-86A (left) and F-86D with black radome (right).

Introduced in 1947, the North American F-86 Sabre was the US Air Force’s (USAF) first swept-wing fighter and the last trans-sonic platform used as a front-line interceptor.  Although as early as 1950 elements within the USAF were concerned it would soon be obsolete, it proved a solid, versatile platform and close to 10,000 were produced, equipping not only US & NATO forces but also those of a remarkable number of nations, some remaining in front-line service until the 1990s.  In 1952 the F-86D, which historians of military aviation regard as the definitive version, was introduced.  As well as the large number of improvements typical of the era, an AN/APG-36 all-weather radar system was enclosed in a radome which resembled an enlarged version of the central bosses previously often used on propellers.

What lies beneath a radome: Heinkel He 219 Uhu with radar antennae array.

The size of the F-86D’s radome is indicative also that the now familiar tendency for electronic components to become smaller is nothing new.  Only half a decade before the F-86D first flew, Germany’s Heinkel He 219 Uhu had entered combat as a night-fighter, its most distinctive feature the array of radar antennae protruding from the nose.  The arrangement was highly effective but, needing to be as large as they were, a radome would have been impossible.  The He 219 was one of the outstanding airframes of World War II (1939-1945) and of its type, at least the equal of anything produced by the Allies but it was the victim of the internal politics which bedevilled industrial and military developments in the Third Reich (1933-1945), something which wasn’t fully understood until some years after the end of hostilities.  Remarkably, although its dynamic qualities should have made volume production compelling, fewer than 300 were ever built, mainly because Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945): (1) was less inclined to allocate priorities to defensive equipment (attack always his preferred strategy) and (2) had been made distrustful of whatever the company did by the debacle of the Heinkel He 177 Greif heavy bomber (which he described as “the worst junk ever manufactured”).

Peak dagmar: 1955 Cadillac Series 62 Coupe de Ville.

As early as 1941, the US car industry had with enthusiasm taken to adorning the front of their vehicles with decorative conical devices intended to summon in the minds of buyers the imagery of speeding artillery shells, then something often seen in popular publications.  However, in the 1950s, the hardware of the jet-age became the motif of choice but the protuberances remained, some lasting even into the next decade.  They came to be known as “dagmars” because of the vague anatomical similarity to one of the early stars of television but the original inspiration really had been military field ordnance.  Cadillac actually abandoned the use of dagmars in their 1959 models (a rare example of restraint that year and not extended to the rest of the design) but concurrent with that, they also toured the show circuit with the Cadillac Cyclone (XP-74) concept car.

1959 Cadillac Cyclone (XP-74) concept car.

Although it was powered by the corporation’s standard 390 cubic inch (6.5 litre) V8, there was some adventurous engineering including a rear-mounted automatic transaxle and independent rear suspension (using swing axles, something not as bad as it sounds given the grip of the cross-ply tyres of the era) but few dwelt long on such things, their attention grabbed by features such as the bubbletop canopy (silver coated for UV protection) which opened automatically in conjunction with the electrically operated sliding doors.  The decorative rear skegs (borrowed from nautical use where they were functional) had been seen on earlier show cars (notably the 1959 "twin bubbletop" Firebird III) and they appeared on the 1961-1962 Cadillacs in two versions: skeg-short & skeg-long.

1958 Edsel Citation Convertible (left) and 1964 GM-X Stiletto, a General Motors (GM) "dream car" built for the 1964 New York World's Fair.

Most innovative however was a feature which wouldn’t reach volume production until well into the twenty-first century: borrowing from the North American F-86D Sabre, two radomes were fitted at the front, housing antennae for a radar-operated collision avoidance system (ROCAS) which fed to the driver information on objects which lay in the vehicle’s path, including distance and the distance required to brake, audible signals and warning lights part of the package.  Unfortunately, as was often the case with concept cars, the crash avoidance system didn't function, essentially because the electronics required for it to be useful would not for decades become available.  As the dagmars had, the Cyclone’s twin radomes attracted the inevitable comparisons but given the sensor and antennae technology of the time, two were apparently demanded although, had Cadillac more slavishly followed the F-86D and installed a single central unit, the response might have been even more ribald, the frontal styling of the doomed Edsel then still being derisively compared to female genitalia; cartoonists would have had fun with a Cyclone so equipped seducing an Edsel.  There's never been anything to suggest that in 1964 GM's designers were thinking of the anatomical possibilities offered by an Edsel meeting a Stiletto.

Tuesday, December 21, 2021

Funicular

Funicular (pronounced fyoo-nik-yuh-ler)

(1) Of or relating to a rope or cord, or its tension.

(2) Worked by a rope or the like.

(3) In physics and geometry, the curve an idealized hanging chain or cable assumes under its own weight when supported only at its ends (also known as a catenary).

(4) A type of cable car, usually described as a funicular railway, which tends to be constructed on steep slopes and consists of counterbalanced cars attached to either end of a cable passing round a driving wheel at the summit.

(5) Of or relating to a funicle.

(6) In medicine, of or pertaining to the umbilical cord.

(7) In botany, having a fleshy covering of the seed formed from the funiculus, the attachment point of the seed.

1655-1665: From funicle (a small cord), from the Latin funiculus (a slender rope), diminutive of funis (a cord, rope), of unknown etymology but possibly related to the Latin filum (thread), a doublet of file and (in anatomy) a filamentous anatomical structure.

The Funicular Railway

Castle Hill Funicular, Budapest, Hungary.  Opened in 1870, it ascends and descends 167 feet (51m) through a track of 312 feet (95m) in around ninety seconds.

A funicular railway employs (usually) two passenger vehicles pulled on a slope by the same cable which loops over a pulley wheel at the upper end of a track.  The vehicles are permanently attached to the ends of the cable and counterbalance each other.  They move synchronously: while one is ascending, the other descends.  The use of two counterbalanced vehicles is what distinguishes funiculars from other types of cable railways although more complex funiculars have been built using four.  The earliest were built in the mid-nineteenth century.

In 1943, Benito Mussolini (1883-1945; Duce (leader) & prime-minister of Italy 1922-1943) was deposed by a meeting of the Fascist Grand Council, a kind of senate he'd made the mistake of not dissolving when he had the chance.  In farcical circumstances, the Duce was arrested and spirited away and almost immediately, Fascism in Italy "burst like a bubble", a not inaccurate assessment but one which caused some embarrassment to Colonel-General Alfred Jodl (1890–1946; Chief of the Operations Staff OKW (Oberkommando der Wehrmacht (high command of the armed forces)) 1939-1945) who made the mistake of blurting it out in the presence of Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945).  Not wanting the contagion to spread, Hitler ordered Mussolini be rescued so he could be installed as a "puppet Duce" somewhere to try to preserve the illusion the "pact of steel" between the two fascist states remained intact.

Seeking a place to imprison the deposed Duce secure from any rescue attempt, the new Italian government locked him up at the Hotel Campo Imperatore, a mountain resort in Abruzzo accessible only by a funicular railway, judged (correctly) by the military authorities to be easily defensible against ground troops and without the facilities to support landings by aircraft.  However, a rapidly improvised operation using glider-borne Waffen-SS troops and a STOL (short take-off & landing) airplane staged a daring raid and freed the captive though it proved a brief reprieve, the Duce and his mistress executed by a mob less than two years later.

Fieseler Fi 156 Storch, Gran Sasso d'Italia massif, Italy, during the mission to rescue Mussolini from captivity, 12 September 1943.  The Duce is sitting in the passenger compartment.

The German liaison & communications aircraft, the Fieseler Fi 156 Storch (stork), was famous for its outstanding short take-off & landing (STOL) performance and low stalling speed of 30 mph (50 km/h) which enabled it almost to hover when faced into a headwind.  It was one of the classic aircraft designs of the era and so close to perfect it remained in production for years after the end of hostilities and re-creations are still often fabricated by those attracted by its close to unique capabilities.  The Storch’s ability to land in the length of a cricket pitch (22 yards (20.12 m)) made it a useful platform for all sorts of operations and while the daring landing for the mountain-top rescue-mission in Italy was the most famous, for all of the war it was an invaluable resource; it was the last Luftwaffe (German air force) aircraft to land in Berlin during the last days of the Third Reich.  In 1943, so short was the length of the strip of grass available for take-off that even for a Storch it was touch & go (especially with the Duce’s not inconsiderable weight added) but with inches to spare, the little plane safely delivered its cargo.

In one of the war's more obscure footnotes, it was the characteristics of the Fieseler Storch which led to what may have been the first appearance (in writing) for centuries of an old piece of English slang dating from the 1590s.  In sixteenth century England, the ability of the kestrel (a common small falcon) to hover in even a light breeze meant it came to be known (in certain circles) as "the windfucker" and the similar ability of the Storch was noted in one British wartime diary entry in which the folk-name for the bird was invoked to describe the little aircraft seemingly "hanging in the air".

Monday, December 20, 2021

Cache

Cache (pronounced kash)

(1) A hiding place (historically most associated with one in the ground) for ammunition, food, treasures etc.

(2) Anything so hidden (even if not necessarily in a cache).

(3) In computing (hardware & software), a temporary storage space or memory permitting fast access (as opposed to a call to a hard drive).  The term “cache storage” is still sometimes used.

(4) In Alaska and Northern Canada, a small shed elevated on poles above the reach of animals and used for storing food, equipment etc.

(5) To put in a cache; to conceal or hide; to store.

1585–1595: From the French cache, a noun derivative of cacher (to hide), from the unattested Vulgar Latin coācticāre (to stow away (originally “to pack together”)), frequentative of the Classical Latin coāctāre (constrain), the construct being coāct(us) (collected) (past participle of cōgere (to collect, compel)) + -icā- (the formative verb suffix) + -re (the infinitive suffix).  Cache is a noun & verb, cacheability is a noun, cacheable is an adjective and cached & caching are verbs; the noun plural is caches.

The bottom half of a bikini can be thought of as a cache-sexe.  Lindsay Lohan demonstrates, Los Angeles, 2009.

English picked up the word from French Canadian trappers who used it in the sense of “hiding place for stores” but more pleasing still was the early twentieth century French noun cache-sexe (slight covering for a woman's genitals), the construct being cacher (to hide) + sexe (genitals).  Cache can be confused with the (unrelated though from the same Latin source) noun “cachet”.  Dating from the 1630s in the sense of “a wax seal”, it was from the sixteenth century French cachet (seal affixed to a letter or document), from the Old French dialectal cacher (to press, crowd), from the Latin coāctāre (constrain).  In the eighteenth century the meaning (via the French lettre de cachet (letter under seal of the king)) shifted to “(letter under) personal stamp (of the king)”, thus the idea of a cachet coming by the mid-1800s to be understood as “a symbol of prestige”.  In that sense it has since the mid-twentieth century become entrenched in English though not all have approved.  Henry Fowler (1858–1933) was about as fond of foreign affectations as he was of literary critics and in his A Dictionary of Modern English Usage (1926) he maintained: (1) the only use English had for “cachet” was as the apothecaries used it to describe “a capsule containing a pharmaceutical preparation”, (2) the more common “stamp” & “seal” were preferable for stuff stuck on envelopes and (3) phrases like “a certain cachet” or “the cachet of genius” were clichés of literary criticism and the critics were welcome to them.  Interestingly, in English, cachet did find a niche as a (wholly un-etymological) variant of cache: it means “a hidden location from which one can observe birds while remaining unseen”.
The origins of this are thought to allude to such places being both hiding places (thus a cache) and cramped, the irregular –et in (cach)et a use of the suffix –et which was from the Middle English -et, from the Old French –et & its feminine variant -ette, from the Late Latin -ittus (and the other gender forms -itta & -ittum).  It was used to form diminutives, loosely construed.

Cachet is pronounced ka-shey or kash-ey (the French being ka-she) but some sites report there are those who use one of the English alternatives for cache; that’s obviously wrong but appears to be rare.  What is common (indeed it seems to have become the standard in some places) is kay-sh, something which really annoys the pedants.  However, a case can be made that kash should remain the standard while kay-sh should be used of everything particular to computers (disk cache, web cache et al), rather along the lines of the US spelling “program” being adopted when referring to software in places where programme is used for all other purposes.  Both seem potentially useful points of differentiation although while there's a chance of splitting the pronunciation of cache, it’s unlikely the Americans will take to programme.

Lindsay Lohan’s shoe stash.  She also has a handbag stash.

Cache may also be related to stash which is similar in meaning but usually conveys something quite disreputable, the verb dating from circa 1795 as underworld slang meaning “to conceal or hide”, the related forms being stashed & stashing.  The noun also was criminal slang meaning “hoard, cache, a collection of things stashed away” and was first observed in 1914 and, via popular literature, picked up in general English, often with the specific sense of “a reserve stock”.  The origin is unknown but most etymologists seem to have concluded it was a blend of either stick + cache or stow + cache.  Following the US use in the early 1940s (where most such adaptations began), stash is now most associated with drug slang (one’s stash of weed etc) but Urban Dictionary lists more recent co-options such as a stash being variously (1) “someone with whom one is involved but one has no intention of introducing to one’s friends or family”, (2) as “porn stash”, an obscure (or even hidden) place among the directory tree on one’s computer where one keeps one’s downloaded (or created) pornography (analogous with the physical hiding places used when such stuff was distributed in magazines), (3) a variety of the mechanics or consequences of sexual acts and (4) certain types of moustache (sometimes with modifiers).  Of the latter, as 'stache & stache, it’s long been one of the apheretic clippings of moustache ('tache, tache & tash the others).

So a cache is a hoard, stockpile, reserve or store of stuff, sometimes secreted from general view and often untouched for extended periods.  In modern computing, a cache is a busy place where much of what is stored is transitory and while there are now many variations of the caching idea (CPUs (Central Processing Units) & GPUs (Graphics Processing Units) have for years had multiple internal caches), the classic example remains the disk cache, a mechanism used temporarily to store frequently accessed or recently used data from a storage device, such as a HDD (Hard Disk Drive) or SSD (Solid-State Drive).  What the cache does is make things respond faster because accessing anything held in fast memory is many times faster than reading from a piece of physical media; fast, modern SSDs have reduced the margin but it still exists and at scale, remains measurable.
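The read-through idea behind a disk cache can be sketched in a few lines of Python.  This is an illustrative assumption, not any real driver's interface: the DiskCache class is invented for the purpose and a plain dict stands in for the slow physical media.

```python
from collections import OrderedDict

class DiskCache:
    """Minimal read-through cache: serve hot blocks from memory,
    fall back to the (slow) backing store on a miss."""
    def __init__(self, backing_store, capacity=4):
        self.store = backing_store      # a dict standing in for the disk
        self.capacity = capacity
        self.cache = OrderedDict()      # insertion order tracks recency
        self.hits = self.misses = 0

    def read(self, key):
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)          # mark as recently used
            return self.cache[key]
        self.misses += 1
        value = self.store[key]                  # the "slow" path
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)       # evict least recently used
        return value

disk = {f"block{i}": f"data{i}" for i in range(10)}
cache = DiskCache(disk, capacity=4)
for key in ["block1", "block2", "block1", "block3", "block1"]:
    cache.read(key)
print(cache.hits, cache.misses)  # → 2 3
```

Evicting the least recently used entry when the cache overflows is the classic policy; real disk caches add write-back, prefetching and block alignment, all omitted here.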

Caches started modestly enough but in the early days of PCs there were few means more effective at gaining speed unless you were a megalomaniac able to run a 4 MB (megabyte) RAMDrive (and such freaks did exist and were much admired).  However, caches grew with LANs (Local Area Networks), WANs (Wide Area Networks) and then the Web and as internet traffic proliferated, the behavior of caches could create something like the bottlenecks they were designed to avoid.  Thus something of a science of cache management emerged, necessitated because unlike many aspects of computer design, the problems couldn’t always be solved by increasing size; beyond a certain point, not only did the law of diminishing returns begin to apply but if caches were too big, performance actually suffered: they are a Goldilocks device.

New problems begat new jargon and the most illustrative was the “cache stampede”, a phenomenon witnessed in massively parallel computing systems handling huge volumes of requests for cached data.  For a cache to be effective, it needs to hold those pages which need most frequently to be accessed but if there’s an extraordinarily high demand for a single or a handful of URLs (Uniform Resource Locators (the familiar address.com etc)) and the requested page(s) in cache expire, there is a “stampede” of demand: the system can enter an internal loop as multiple servers simultaneously attempt to render the same content and in circumstances of high ambient load, congestion begins to “feed on itself”, shared resources becoming exhausted because they can’t be re-allocated as long as demand remains high.
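One common defence against a stampede is to let only one worker regenerate an expired entry while the rest wait and then reuse its result.  The Python sketch below shows the per-key locking idea under stated assumptions: the fetch and rebuild names are hypothetical and a real system would use a shared store rather than in-process dicts.

```python
import threading
import time

cache = {}          # key -> (value, expiry timestamp)
locks = {}          # key -> per-key lock so only one thread rebuilds an entry
locks_guard = threading.Lock()

def fetch(key, rebuild, ttl=60.0):
    """Return a cached value, regenerating it at most once on expiry."""
    now = time.monotonic()
    entry = cache.get(key)
    if entry and entry[1] > now:
        return entry[0]                      # still fresh: serve from cache
    with locks_guard:
        lock = locks.setdefault(key, threading.Lock())
    with lock:
        entry = cache.get(key)               # re-check: another thread may
        if entry and entry[1] > now:         # have rebuilt while we waited
            return entry[0]
        value = rebuild(key)                 # only one thread hits the origin
        cache[key] = (value, time.monotonic() + ttl)
        return value
```

Eight threads requesting the same expired key will queue on the per-key lock; the first regenerates the entry and the rest re-check and return the fresh copy, so the origin sees one request instead of eight.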

Another attractive term is cache-buster, software which prevents duplication within a cache.  It’s an important part of the modern model of internet commerce which depends for much revenue flow on the alignment of the statistics between publishers and marketers.  What a cache buster prevents is a browser serving the same cached file twice: if a user “accepts cookies”, the browser will track and save them, enabling the user to access the previously cached site whenever they return, which is good for speed but, if there have been changes to the site, the user may not see them.  The cache buster’s solution is simple brute-force: a random number appended to the ad-tag which means new ad-calls no longer have a link to the cached copy, compelling the browser to send a new request to the origin server.  This way, website owners can be assured the number of impressions registered by a marketing campaign will be very close to correct.
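The brute-force trick is easily illustrated: append a throwaway query parameter so each ad-call is, from the browser's point of view, a URL it has never seen before.  A minimal Python sketch, in which the parameter name cb and the example URL are arbitrary choices:

```python
import random
import time
from urllib.parse import urlparse, urlunparse, urlencode, parse_qsl

def bust(url):
    """Append a throwaway "cb" query parameter so the browser sees a
    never-before-cached URL and is forced back to the origin server."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    # timestamp + random digits: effectively unique per ad-call
    query.append(("cb", f"{int(time.time())}{random.randint(0, 99999):05d}"))
    return urlunparse(parts._replace(query=urlencode(query)))

print(bust("https://ads.example.com/tag?id=42"))
# e.g. https://ads.example.com/tag?id=42&cb=17000000001234
```

The original query string survives intact; only the disposable cb value changes from call to call, which is why the cached copy is never matched.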

Intel i486 CPUs (left) and Asus ROG Matrix GeForce RTX 4090 Platinum 24G GPU (right).

Progress: In 1989, Intel released the 80486 CPU (the name later standardized as i486 because pure numeric strings are almost impossible to trademark), acclaimed by the press at the time as “phenomenally faster” and while that may have been hyperbolic, in the brief history of the PC, impressionistically, few new chips “felt” so much faster.  Part of that was attributable to a Level 1 (L1) cache (8-16 KB depending on the version).  By 2023, Nvidia’s GeForce RTX 4090 GPU included an L1 cache with 128 KB per SM (Streaming Multiprocessor) and an L2 cache of 72 MB.

Sunday, December 19, 2021

Scientist

Scientist (pronounced sahy-uhn-tist)

A person who studies or practises any of the sciences or who uses scientific methods, especially in the physical or natural sciences.

1833: Modeled after artist, the construct was the Latin stem scientia (knowledge) + -ist.  Science was from the Middle English science & scyence, from the Old French science & escience, from the Latin scientia (knowledge), from sciens, the present participle stem of scire (to know).  The -ist suffix was from the Middle English -ist & -iste, from the Old French -iste and the Latin -ista, from the Ancient Greek -ιστής (-istḗs), from -ίζω (-ízō) (the -ize & -ise verbal suffix) and -τής (-tḗs) (the agent-noun suffix).  It was added to nouns to denote various senses of association such as (1) a person who studies or practices a particular discipline, (2) one who uses a device of some kind, (3) one who engages in a particular type of activity, (4) one who suffers from a specific condition or syndrome, (5) one who subscribes to a particular theological doctrine or religious denomination, (6) one who has a certain ideology or set of beliefs, (7) one who owns or manages something and (8) a person who holds very particular views (often applied to those thought most offensive).

Natural Philosopher versus Scientist

Founded in 1831 and modelled on the Gesellschaft Deutscher Naturforscher und Ärzte (Society of German Researchers and Physicians), the British Association for the Advancement of Science (BAAS) was formed as an organisation open to anyone interested in science, unlike the exclusive Royal Society (originally Royal Society of London for Improving Natural Knowledge), founded on 28 November 1660 and granted a royal charter by Charles II (1630–1685; King of Scotland 1649-1651, King of Scotland, England and Ireland 1660-1685).  The Royal Society is the world's oldest continuously existing scientific academy and its list of fellows ranges from Sir Isaac Newton (1642–1727) in 1672 to Elon Musk (b 1971) in 2018.  In an indication of the breadth of its attraction, at the meeting of the BAAS on 24 June 1834, unexpectedly in attendance was the poet Samuel Taylor Coleridge (1772-1834); he’d not left his home in Highgate Hill for years and would die within weeks.  That a poet should attend a meeting about science was not at the time a surprise, the division between science and the arts coming later, and Coleridge had previously written about the scientific method.

Scientists study all sorts of things.  Research like this can attract an Ig Nobel prize.

For most of history, those we would now think of as scientists had been called natural philosophers.  Coleridge declared true philosophers were those who sat in their armchairs and contemplated the cosmos; they did not scratch around in digs or fiddle with electrical piles.  The Cambridge don the Reverend William Whewell, an English polymath, responded by suggesting, by analogy with artist, they should be called scientists, adding that those studying physics could be styled physicists, the French having already applied physicien (physician) to the surgeons.  Etymologists once dated the word “scientist” from that meeting but it was later discovered Whewell had coined the term in 1833 and it first appeared in print a year later in his anonymous review of Mary Somerville's (1780-1872) On the Connexion of the Physical Sciences published in the Quarterly Review.  It took a while to catch on but was in wide use in the US by the late nineteenth century and the rest of the English-speaking world a few years later although as late as 1900 there were publishers which had scientist on their “not-acceptable” list.

Google ngram: Because of the way Google harvests data for their ngrams, they’re not literally a tracking of the use of a word in society but can be usefully indicative of certain trends (although one is never quite sure which trend(s)), especially over decades.  As a record of actual aggregate use, ngrams are not wholly reliable because: (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI should improve).  Where numbers bounce around, this may reflect either: (1) peaks and troughs in use for some reason or (2) some quirk in the data harvested.

Saturday, December 18, 2021

Freelance

Freelance (pronounced free-lans or free-lahns)

(1) A person who sells work or services by the hour, day, job etc, rather than working on some regular basis for one employer; also as freelancer or free-lancer; self-employed, free agent, unaffiliated.

(2) A person who contends in a cause or in a succession of various causes, as he or she chooses, without personal attachment or allegiance (applied often to politicians who tend to support several causes or parties without total commitment to any one).

(3) The act of working as a freelancer; used often as a modifier.

(4) Of or relating to freelancing or the work of a freelance.

(5) A mercenary soldier or military adventurer in medieval Europe, often of knightly rank, who offered his services to any state, party, or cause (retrospectively applied).

1820: The construct was free + lance.  Free was from the Middle English free, fre & freo, from the Old English frēo (free), from the Proto-West Germanic frī, from the Proto-Germanic frijaz (beloved, not in bondage), from the primitive Indo-European priHós (dear, beloved), from preyH- (to love, please); from this evolved the related modern English friend.  It was cognate with the West Frisian frij (free), the Dutch vrij (free), the Low German free (free), the German frei (free), the Danish, Swedish & Norwegian fri (free) and the Sanskrit प्रिय (priyá).  The verb is from the Middle English freen & freoȝen, from the Old English frēon, & frēoġan (to free; make free), from the Proto-West Germanic frijōn, from the Proto-Germanic frijōną, from the primitive Indo-European preyH-.  Germanic and Celtic are the only Indo-European language branches in which the primitive Indo-European word with the meaning of "dear, beloved" acquired the additional meaning of "free" in the sense of "not in bondage".  This was an extension of the idea of "characteristic of those who are dear and beloved" (those who were friends and others in the tribe as opposed to the unfree, those of other tribes, slaves etc).  The evolution was comparable with the Latin use of liberi to mean both "free persons" and "children of a family".  Lance was from the Middle English launce, from the Old French lance, from the Latin lancea.  Ultimate root was via the Celtic & Celtiberian, possibly from the primitive Indo-European plehk- (to hit) and related to the Ancient Greek λόγχη (lónkhē).  The hyphenated form (free-lance, free-lancer et al) is a correct alternative but should be used with the usual convention of English use: consistency within a document.  If an alternative hyphenated form is used for one word, it should be used for all where the option exists.

The first known instance is in Ivanhoe (1820) by Sir Walter Scott (1771–1832), used to describe a medieval mercenary warrior or "free-lance", the meaning lying in the notion of his arms (lance, sword etc) being freely available for hire and not sworn to any lord's service.  Scott’s description of them resembles that of the Italian condottieri (a leader or member of a troop of mercenaries).  It became a figurative noun circa 1864, most frequently used when applied to writers & journalists from 1882, the unhyphenated “freelance” attested from 1898.  The Oxford English Dictionary (OED) listed it as a verb in 1903 and in modern use the word has morphed into an adjective and an adverb, as well as the familiar derivative noun freelancer.  The sense of politicians who tended to go off on tangents and champion causes unrelated to the party platform they were hired to pursue has been known since the late nineteenth century, sometimes with the implication (drawn from Sir Walter Scott’s picture of soldiers for hire by anyone) of mercenary motives.

The emergence of the gig economy doesn’t indicate any great change in the understanding of freelancing, the category of “gig worker” being defined more by the method by which such workers obtain their work.  Gig worker encompasses just about any independent worker, including contingent and freelance workers, the convention of use being that they pick up their hours from a digital platform rather than through the historically conventional channels.  Gig workers were understood not to be hired as employees by the company with which they contract to do the job, instead being freelance contractors, each individual task a separate contract.  That idea has been challenged in several jurisdictions, some courts and tribunals finding, in some cases, the nature of the relationship between freelancer and platform and the pattern of work performed to be such that, within existing law, a conventional employer-employee relationship exists with all that implies.  This dispute is ongoing in many places and is played out within the micro political economy.

By contrast, a contingent worker is employed by a different company (usually a staffing agency or recruiter, often known as labour hire companies) than the one for which they’ll actually be performing the task.  The agency acts as the intermediary between the worker and the company, finding the jobs and the workers, billing the client and paying the contingent workers.  Contractors are different again and can be freelancers or contingent workers, the distinction being that they are deployed usually for fixed terms of longer duration than a gig.  There’s no precise definition but while a gig might last only minutes, a contract typically is measured in weeks or more.  The "freelance" status may be misleading in that some have been known to work exclusively for one entity (which might be an agent or other third party), although they certainly weren't formally on the payroll.  In most cases though the freelancers can be thought of as proto-gig economy workers in that, from an industrial relations viewpoint, they were independent contractors even if in some cases their entire income might come from the one entity (indeed, some had signed contracts of exclusivity on some negotiated basis).

Former Australian senator Cory Bernardi and animal welfare

Freelancing: Former Australian senator Cory Bernardi (b 1969; senator for South Australia 2006-2020) is a member of the Roman Catholic laity noted for leaving the Liberal Party in 2017 to form his own party, the Australian Conservatives.  Such creations, drawn often from the extremes of mainstream parties (of the left, right or single-issue operations) are usually short-lived, the political inertia and structural advantages the incumbents create for each other being hard to topple; it’s tough even to sustain co-existence.  So it proved for the Australian Conservatives, Bernardi in 2019 announcing he was dissolving and consequently deregistering the party.  Electoral support had proved not only elusive but barely detectable, although the senator claimed to be happy with the project’s outcomes, noting the Liberal-National Coalition's upset victory in 2019 was proof "common sense" had returned to national politics which was "all we, as Australian Conservatives, have ever sought to do".  Rarely has spin been so spun.

Bernardi’s most publicised contribution to political discourse happened in 2012 when he suggested allowing same-sex marriage would lead to “legalised polygamy and bestiality”.  "The next step, quite frankly, is having three people or four people that love each other being able to enter into a permanent union endorsed by society - or any other type of relationship" the senator was quoted as saying.  "There are even some creepy people out there... [who] say it is OK to have consensual sexual relations between humans and animals.  Will that be a future step?  In the future will we say, 'These two creatures love each other and maybe they should be able to be joined in a union'?  I think that these things are the next step."

His views attracted scant support from his colleagues, even several who opposed marriage equality distancing themselves from Bernardi’s view it was the thin end of the homosexual wedge, a step on a slippery slope of depravity descending to the violation of beasts of the field.  The backlash compelled Bernardi’s resignation as parliamentary secretary to the leader of the opposition, Tony Abbott (b 1957; Australian prime-minister 2013-2015).  In accepting the resignation, Mr Abbott described the comments as "ill-disciplined", adding they were “…views I don't share...” and “I think it's pretty clear that if you want to freelance, you can do so on the backbench.”

From the backbench, the freelancer released a short statement saying he had resigned "in the interests of the Coalition", one of his thoughts with which few disagreed.  Sadly, even from organisations like PETA (People for the Ethical Treatment of Animals) he never received any credit for his efforts to protect goats and other hapless creatures from the predations of packs of crazed gay marriage advocates.  Despite that, the former senator found his natural home working as one of the right-wing fanatics at Rupert Murdoch's (b 1931) Sky News where his thoughts are imparted to an appreciative audience which believes the ills of this world are the consequence of conspiracies by the Freemasons, the Jews, the Jesuits and the Secret Society of the Les Clefs d'Or.  His viewers agree with everything he says.

The paparazzi are the classic freelancers.

Friday, December 17, 2021

Adjunct

Adjunct (pronounced aj-uhngkt)

(1) Something added to another thing but not essential to it; an appendage; something attached to something else in a subordinate capacity.

(2) A person associated with lesser status, rank, authority, etc., in some duty or service; assistant; things joined or associated, especially in an auxiliary or subordinate relationship.

(3) In higher education, a person working at an institution but not enjoying full-time or permanent status (exact status can vary between institutions).

(4) In systemic English grammar, a modifying form, word, or phrase depending on some other form, word, or phrase, especially an element of clause structure with adverbial function; part of a sentence other than the subject, predicator, object, or complement; usually a prepositional or adverbial group.

(5) In reductionist English grammar, part of a sentence that may be omitted without making the sentence ungrammatical; a modifier.

(6) In the technical language of logic, another name for an accident.

(7) In brewing, an un-malted grain or grain product that supplements the main mash ingredient.

(8) In metaphysics, a quality or property of the body or mind, whether natural or acquired, such as colour in the body or judgement in the mind (archaic).

(9) In music, a key or scale closely related to another as principal; a relative or attendant key.

(10) In the syntax of X-bar theory, a constituent which is both the daughter and the sister of an X-bar.

(11) In rhetoric, as symploce, the repetition of words or phrases at both the beginning and end of successive clauses or verses: a combination of anaphora and epiphora (or epistrophe); also known as complexio.

(12) In category theory, one of a pair of morphisms which relate to each other through a pair of "adjoint functors".

1580-1590: From the Latin adjunctus (a characteristic, essential attribute), perfect passive participle of adiungō (join to) & adjungere (joined to).  The construct of adiungō was ad- (from the Proto-Italic ad, from the primitive Indo-European haed (near, at); cognate with the English at) + iungō (join); a doublet of adjoint.  The usual sense of "to join to" is now applied usually with a notion of subordination, but this is not etymological.  The first adjunct professor appears to have been appointed in 1826.  Adjunct is a noun, verb & adjective, adjunction, adjunctiveness, adjunctivity & adjuncthood are nouns, adjunctive is a noun & adjective and adjunctively & adjunctly are adverbs; the noun plural is adjuncts.

Although the title has existed for almost two centuries, neither the duties nor the nature of appointment of an adjunct professor has ever been (even variously) codified or consistently applied in a way that a generalised understanding of the role could be said to exist as it does for other academic ranks (tutor, lecturer, reader, professor et al).  The terms of appointment of adjunct professors vary between countries, between institutions within countries and even within the one institution.  In the academic swirl of titles there can also be adjunct lecturers, adjunct fellows etc and other adjectives are sometimes used; “contingent” and “sessional” are applied sometimes to appointments which appear, at least superficially, similar to adjunct appointments elsewhere.  Beyond the English-speaking world however, the term adjunct, in the context of education, is often just another rung in the academic hierarchy, used in a similar way to “assistant” & “associate”.

In the English-speaking world, it’s probably easiest to understand the title in relation to what it’s not and, grossly simplified, the most important distinction between an adjunct appointment and one unadorned is whether or not the appointee is paid.  In institutions where adjuncts are paid, as a general principle, that’s indicative of an appointment where the emolument package is structured to provide lesser compensation (lower salary, no health insurance, no permanent term etc) and perhaps a limitation of duties (eg a teaching role only, without the scope to undertake research).  If paid, an “adjunct” appointee is an employee.  Where the appointment is unpaid, while there are no set rules, there do seem to be conventions of use in that (1) a “visiting” professor is usually an eminent academic from another place granted a short-term appointment on some basis, (2) an “honorary” professor is someone from outside academia (but whose career path is within the relevant scholastic field) and the title is granted, sometimes in perpetuity, in exchange for services like the odd lecture (often about some very specialised topic where expertise is rare) whereas (3) an adjunct professor can be entirely unconnected with any traditional academic path and may be appointed in exchange for consultancy or other services although there’s often the suggestion donations to institutions can smooth the path to appointment.  If unpaid (even if able to claim “actual, defined or reasonable” expenses), an “adjunct” appointee is not an employee.

Billionaire Adjunct Professor Clive Palmer (b 1954) counts some small change.  House of Representatives, Parliament House, Canberra, Australia, 2016.

More than one university has bestowed the title adjunct professor on Australian businessman Clive Palmer.  Gold Coast’s Bond University noted the title was extended in recognition of his "goodwill, positive endeavours and support" of the institution.  In answer to a critic who suggested styling himself as “Professor Palmer” in documents associated with his commercial interests might not be in the spirit of the generally accepted use of the title, he replied that they were suffering from “academia envy” and should “take a cold shower”.

In law, adjunct relief should not be confused with injunctive relief.  Commonly known as “an injunction”, injunctive relief is a legal remedy which may be sought in civil proceedings; it can be something in addition to, or in place of, monetary damages and usually takes the form of a court order requiring a person or entity to do, or (more typically) to refrain from doing, certain things.  Injunctions are unusual in that even if a judge thinks an application for injunctive relief is without merit, the order will anyway be granted (lasting usually until the matter is resolved in a defended hearing) if the consequences of the act are irreversible and an award of damages would not be a remedy (such as demolishing a building, publishing something or euthanizing an animal).  Adjunct relief can however work in coordination with injunctive relief.  Adjunct relief is the term which describes a class of relief granted to a party in proceedings which is not the primary relief sought.  A typical example of adjunct relief is that in circumstances where the primary relief sought is the award of monetary damages, a plaintiff may also be awarded an injunction as a protection against future breaches.  In that sense, the injunction is adjunct to the primary remedy of damages.

The word adjunct is also used in contract law.  For a contract to be legally valid, and thus recognised and enforced by a court, it must contain a number of elements: (1) All parties must have the capacity to enter contracts and the purpose of the contract must be lawful, (2) An offer by one party, (3) Acceptance of the offer by another, (4) An intention between the parties that the agreement be legally binding, (5) Consideration (an exchange of value between the parties), and (6) Certainty of terms, which can extend only to acts which are not impossible.  Those principles are the same regardless of whether one is buying an apple at the market or a nuclear-powered aircraft-carrier but there can also be collateral contracts or adjunct clauses.

During her litigious phase, Lindsay Lohan became well-acquainted with the operation of the rules which apply when seeking injunctive relief.  In a brief few years, she sought injunctions against at least two stalkers (one said to be a Freemason), a company she claimed had based its "milkaholic" baby character on aspects of her life, a rap artist who mentioned her in his lyrics and a video game-maker she alleged had usurped her likeness for commercial purposes.  The courts granted relief against the stalkers but her record in seeking injunctive relief generally was patchy.

A collateral contract is a separate contract which exists only because the primary contract has been executed yet it remains separate from it, although the two will tend usually to operate in parallel.  Typically, a collateral contract is formed between one party to the main contract and a third party and it arises because one party has made a promise which has induced another to enter into the main contract.  Other circumstances can apply but the general principle is that a collateral contract relies upon the existence of a primary contract; the reverse does not apply.  If the main contract is breached, the injured party can seek remedies based on the collateral contract.  By contrast, an adjunct clause is a provision (which may only retrospectively be found by a court to be a clause) within the primary contract.  It’s thus not a separate contract and does not include the “essential terms” upon which the contract may stand or fall, adjunct clauses typically serving as a schedule of additional terms & conditions.  Importantly, if the subject of a dispute, the violation of adjunct terms may attract some form of compensation or an order for specific performance but not an invalidation of the contract.

Thursday, December 16, 2021

Amn't

Amn't (pronounced am-uhnt)

A non-standard (except in Irish & Scottish English) contraction of “am not”.

Circa 1600: Am is from the Middle English am & em, from the Old English eam & eom (am), from the Proto-Germanic immi & izmi (am) a form of the verb wesaną (to be; dwell), from the primitive Indo-European hiésmi (I am, I exist).  As a suffix, the contraction –n’t (not) negates the meaning of the clause in which it occurs (don’t, can’t et al).  In English, the suffix -n’t can be added only to auxiliary verbs (including dare and need in certain uses), as well as main verbs be (in almost all uses) and have (in some uses).  Indeed, in some dialects, not even all auxiliary verbs accept -n’t; for example, mayn’t is present in some dialects and absent in others.  Though verbs with -n’t are usually considered contractions of versions using the adverb not, grammatically they behave a bit differently; when subject and verb are inverted, "-n’t" remains attached to the verb, whereas "not" does not (compare: “Isn’t that difficult?” with “Is that not difficult?”)

Contractions

In English, other personal pronouns have two contracted forms that can be used in present-tense negative constructions, such as “we’re not” or “we aren’t”.  The first person singular however has only “I’m not”; “I amn’t” doesn’t exist.  That’s not wholly true because it’s long been in the dialectical English of Scotland and Ireland but it’s no longer part of Standard English because of shifts in pronunciation associated with a loss of favour generations ago.  Amn’t has a long history, the Oxford English Dictionary (OED) citing an example from 1691, but it was almost certainly known earlier, it and many other shortened forms such as can’t, don’t and shan’t seemingly arriving in the language circa 1600.  Amn’t however was never popular, most etymologists concluding there was some reluctance to sound “m” and “n” together in one syllable.  So, while centuries old, amn’t isn’t part of Standard English but is common in Ireland, used especially in colloquial speech though not limited to informal registers.  It’s also used in Scotland (alongside amnae and other variants) and, the OED notes, parts of northern England and the West Midlands, with even the occasional instance in Wales.  How amn’t came to be so geographically limited is not clear.  Another variant, an’t, probably supplanted it in general usage, again because speakers wanted to avoid sounding an “n” immediately after an “m”, so it was a natural development to simplify the consonant cluster.  The final “t” made it more likely the simplification would go to “ant” rather than “amt”, and this is the form which emerged in eighteenth century texts, where it appears as an’t.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

An’t (also spelt a’n’t), although said to be “phonetically natural and the philologically logical shortening”, fell from favour, but not before morphing in two significant ways.  It gave rise to ain’t, famous in its own right for reasons good and bad, and also began being spelt aren’t (by “orthographic analogy” in one etymologist’s memorable phrase), which is pronounced the same as an’t in non-rhotic accents.  This certainly explains “aren’t I”, which would otherwise seem a grammatical anomaly; its irregularity does sometimes offend the fastidious but it has become accepted in much of the English-speaking world.  In that sense, the Irish and Scottish dialects are the exception in retaining and favouring its ancestor, “amn’t I”, which James Joyce (1882–1941) used in Ulysses (1922); the younger Jonathan Swift (1667–1745) certainly liked it although, later in life, he would come to abhor just about every contraction.

Because it’s so rarely heard outside of Scotland & Ireland, the form amn’t has never been as controversial as ain't (often written as aint and occasionally variously as ain', a'n't, arn't, & ar'n't).  According to the authoritative Etymology Online, the first known appearance in print dates from 1706 (in the sense of “am not”) and that’s how it was used until early in the nineteenth century when in the Cockney dialect it began to be used as a generic contraction for “are not”, “is not” etc.  That was the downfall of “ain’t” as respectable English because it was picked up by authors wanting to spice their text with the flavour of “authentic working-class speech” and in class-conscious England, that was enough to see ain’t “banished from correct English”, though one interesting outlier was noted in the Dictionary of Americanisms (1848): “hain't” for "have not", recorded as “A contraction much used in common conversation in New England.”  However, while “ain’t” lacked the support of the genteel, in the idioms of popular culture it flourished: “it ain’t necessarily so”, “if it ain't broke, don't fix it” & “you ain't seen nothing yet”.

Henry Fowler (1858–1933) in his A Dictionary of Modern English Usage (1926) had no doubts about ain’t, condemning it as “merely colloquial, and as used for isn’t is an uneducated blunder and serves no useful purpose”.  Writing a century-odd ago, Henry Fowler long predated cultural relativism but one does wonder whether, were he writing today, noting the place ain’t has since claimed in popular song and idiomatic use, he might have been more forgiving.  He was however sympathetic to ain’t as a handy substitute for “am not” and lamented that amn’t remained trapped in its Gaelic silo, decrying the “…shamefaced reluctance” of the English to adopt the form, which “betrays the speaker’s sneaking fear” that the colloquially respectable and indeed almost universal “aren’t I” is “bad grammar” and that “ain’t I” will convict him of “low breeding”.