Saturday, March 11, 2023

Thumbnail

Thumbnail (pronounced thuhm-neyl)

(1) The (finger)nail of the thumb.

(2) As thumbnail sketch, anything quite small or brief, as a small drawing or short essay, a précis or summary.

(3) In printing, a small, rough dummy.

(4) In journalism, a half-column portrait in a newspaper (also called the porkchop).

(5) Something quite small or brief; concise.

(6) Concisely to describe (something or someone).

(7) In computing (on the graphical user interfaces (GUI) of operating systems), a small image used as a preview of the original which loads upon clicking the thumbnail.  Unlike an icon, which is (usually) a representative symbol, a thumbnail is a smaller copy of the original, larger image (although technically, a thumbnail can be constructed which reports a larger file size than the original).

1595–1605: The construct was thumb + nail.  Thumb was from the Middle English thombe, thoume & thoumbe, from the Old English þūma, from the Proto-West Germanic þūmō, from the Proto-Germanic þūmô, from the primitive Indo-European tūm- (to grow).  The spellings thum, thume & thumbe were still in use in the late seventeenth century but are all long obsolete.  Nail was from the Middle English nail & nayl, from the Old English næġl, from the Proto-West Germanic nagl, from the Proto-Germanic naglaz, from the primitive Indo-European hnogh- (nail).  The earliest known instance of the phrase “thumbnail sketch” in the sense of "drawing or sketch of a small size" (though usually not literally the size of a thumbnail) dates from 1852, the verb usage adopted in the 1930s.  Thumbnail is a noun & adjective; thumbnailer is a noun, thumbnailed is a verb & adjective and thumbnailing is a verb; the noun plural is thumbnails.

Fifteen images of Lindsay Lohan’s thumbnails.

The term "thumbnail sketch" began with architects, designers and artists who quickly would create small, conceptual sketches of their ideas so they could be tested without the time or effort required to render at full scale.  While it’s possible some may literally have been the size of an actual thumbnail, most would have been larger and the term was chosen just as something indicative of “smallness”.  The practice of architects and others creating small sketches was of course ancient and may even have been associated with prehistoric cave painting but it was in the mid-nineteenth century the term “thumbnail sketch” came to be used.  The use of the thumbnail sketch (including the companion “pencil test” in graphic design) is now universal in industries where images need to be created and the techniques learned proved useful in the 1980s when icons became widely used on the graphical user interfaces (GUI) of operating systems.  In text, in the 1950s, the thumbnail sketch came to be applied to any précis or summary and has always been prevalent in publishing and criticism (as brief plot summaries, reviews etc) and as short-form biographical data, especially when assembled in a list of those so profiled.
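In the computing sense, the essential operation is simple: sample the original image on a coarser grid.  The sketch below is purely illustrative (the function name and the pixel-grid representation are invented for the example; real software uses image libraries), showing how a thumbnail can be produced by nearest-neighbour sampling while preserving the aspect ratio:

```python
def make_thumbnail(pixels, max_w, max_h):
    """Downscale a 2D grid of pixel values by nearest-neighbour
    sampling, preserving aspect ratio, as GUI thumbnailers do."""
    h, w = len(pixels), len(pixels[0])
    scale = min(max_w / w, max_h / h, 1.0)   # never upscale
    tw, th = max(1, round(w * scale)), max(1, round(h * scale))
    # for each thumbnail pixel, copy the nearest source pixel
    return [[pixels[int(y * h / th)][int(x * w / tw)]
             for x in range(tw)]
            for y in range(th)]

# a 640x480 "image" of greyscale values
image = [[(x + y) % 256 for x in range(640)] for y in range(480)]
thumb = make_thumbnail(image, 128, 128)
print(len(thumb[0]), len(thumb))   # 128 96
```

Production thumbnailers add filtering (averaging neighbouring pixels) to avoid the blockiness nearest-neighbour sampling produces, but the downscaling principle is the same.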

Thumbnail sketches of recent Australian administrations

Kevin Rudd (right) & Cardinal Pell (left), 2010.

Kevin Rudd (b 1957; Australian prime-minister 2007-2010 & 2013): There have been few Australian prime-ministers who entered office with such goodwill as that enjoyed by Kevin Rudd and none who have so quickly squandered it all.  Mr Rudd’s win in 2007 was a testament to his personal popularity and a reasonable achievement given that, by any standards, on paper, the previous government shouldn’t have lost office, there being no crisis, an outstandingly good fiscal position, low unemployment and no serious scandals.  Essentially, the electorate seemed bored by a decade-odd of dull competence and Mr Rudd was new, presentable and in his nerdy, weird way, appealing and thus the country voted.  His honeymoon wasn’t noticeably short but he had the misfortune to be prime-minister when the global financial crisis (GFC) hit and while for many reasons, Australia was relatively unaffected, the stresses it induced revealed tensions in his government and his background as a public servant wasn’t useful whenever decisiveness was required; long used to providing advice to others who made decisions, his government stuttered under the weight of committees and boards of enquiry.  A contrast with this intellectual timidity was his reputation for arrogance and abrasiveness when dealing with his colleagues and this didn’t help him maintain their support; he lost an internal party vote in 2010 and the Australian Labor Party (ALP) chose another leader.  In 2023, it was announced Dr Rudd would be Australia’s next ambassador to the United States and there are rumors he’s negotiated a secret, back-channel deal whereby he reports directly to the prime-minister and not, as is usual, to the foreign minister.

Julia Gillard (left) & Kevin Rudd (right), 2013.

Julia Gillard (b 1961; Australian prime minister 2010-2013):  Julia Gillard is thus far the only woman to become Australia’s prime-minister and some of the treatment she endured in office might make a few women wonder if reaching the top of the greasy pole is worth the price to be paid.  That said, it’s still a good gig and many will try.  Metaphorically knifing her predecessor in the back meant her premiership didn’t start in the happiest of circumstances and it didn’t help that he made little attempt to conceal his thoughts on recent events.  The poison spread through the party and the healthy majority gained in 2007 was lost in the 2010 election, the Gillard government surviving only with the support of three independents, all of whom extracted their own price.  Bizarre as it might seem to some, Rudd returned for a while as foreign minister, an unhappy experience for many.  It couldn’t last and it didn’t, Mr Rudd resigning and unsuccessfully contesting the leadership.  Still, despite it all, on paper, the Gillard government managed things successfully in a tight parliament and although the actual achievements were slight, they probably exceeded expectations.  Ms Gillard is probably best remembered for her “misogyny” speech which deservedly went viral because it was highly entertaining although it did reveal someone sensitive to criticism and one wonders if she’d ever reviewed some of the things said about male politicians over the centuries.  It’s clearly a more sensitive age but nor did she appear to see any inconsistencies between the words spat at her and her use of “poodle” and “mincing” (with all that they imply) when decrying one of her male opponents.  As it was, Mr Rudd got his revenge, toppling her in 2013 and although his victory may have seemed pyrrhic (his second coming lasting three months-odd), he was probably content.

Tony Abbott (left) & Vladimir Putin (right) with koalas, 2014.

Tony Abbott (b 1957; Australian prime-minister 2013-2015): One man probably disappointed that Ms Gillard was replaced in 2013 was Mr Abbott because all the indications were the Liberal-National coalition’s victory in the 2013 election would have produced a landslide-scale majority rather than the merely comfortable one achieved against Mr Rudd.  Still, the majority was sufficient for Mr Abbott easily to pursue his objectives and he immediately set to reducing expenditure, cutting taxes, stopping irregular immigration (his famous “stop the boats” campaign lent three word slogans (3WS) a new popularity which endures to this day) and attacking trade unions.  He was a very different character from Mr Rudd but similarly inept in managing public perception of his government.  In his thoughts, there was a certainty of purpose Mr Rudd lacked but the core problem was that his world view seemed to have been set in stone by the Jesuits who taught him while he was training for the priesthood and while much had changed since the fourteenth century, he’d not moved on.  Thus created were the tensions which marked his government, which was split between technocratic realists, right-wing fanatics, a genuinely liberal wing and his coalition partners, the National Party, which was devoted to the horse trading necessary to extract the money required to pork-barrel their electorates.  Presiding over this lot as a leader whose thoughts were more akin to the old Democratic Labor Party (DLP) than anything from the third millennium, it’s probably remarkable Mr Abbott lasted as long as he did.  The 2014 budget which made big cuts was blamed by many for his demise and while it’s true it was badly designed and poorly explained, it does appear Mr Abbott, while one of the most formidably focused and effective opposition leaders, simply lacked the skills needed to be prime-minister.  In 2015, he lost an internal party ballot to the man he’d replaced in a similar vote in 2009.

Malcolm Turnbull (right) & Peter Dutton (left) roadside billboard (2016).

Malcolm Turnbull (b 1954; Australian prime minister 2015-2018):  Unusually great public optimism immediately surrounded Mr Turnbull’s accession to office.  So encouraging were the polls that he probably should have gone to an early election as Anthony Eden (1897-1977; UK prime-minister 1955-1957) did in 1955, thus avoiding the grinding down of energy inevitable in “fag-end” administrations.  Instead he delayed, making the same mistake as Gordon Brown (b 1951; UK prime-minister 2007-2010) and John Gorton (1911-2002; Australian prime-minister 1968-1971) and the early support evaporated, the government surviving the 2016 election with only a slender majority.  Being from the liberal wing, Mr Turnbull really wasn’t a good fit as leader of the modern Liberal Party; he’d been accepted only because he was rich, a virtue which in the party tends to mean other vices are overlooked (if not forgiven).  This allowed him sometimes to prevail but ultimately it was the corrosive and related issues of energy and an emissions reduction policy which proved his nemesis.  Even if the public didn’t fully understand the intricacies of the issue (and the especially complex mechanisms in the associated legislation), increasingly they were being persuaded by the science underlying climate change and just wanted the matter resolved.  The factions in the Liberal-National coalition had for more than a decade been torn asunder by climate policy and the divisions poisoned public perception of the government; Mr Abbott may have been wrong in how he handled the matter in 2013 but he was at least certain and decisive and was accordingly rewarded.  Support for Mr Turnbull eroded and in an amusingly chaotic leadership coup in 2018, he lost the leadership.
In retirement, he found common cause with Mr Rudd as they joined to complain about the undue influence Rupert Murdoch’s (b 1931) News Corporation exerts in Australian politics, especially the national daily The Australian which, despite a notionally small distribution, is highly effective in setting agendas, forcing other outlets to pursue News Corp's pet issues.

The Turnbull administration is remembered also for imposing the "bonk ban", a consequence of one of the many extra-parliamentary antics of "bonking Barnaby" (Barnaby Joyce, b 1967; thrice (between local difficulties) deputy prime minister of Australia 2016-2022 and known also within the beltway as "the beetrooter", a nickname explained as (1) an allusion to his often florid complexion and (2) the use of "root" in Australia to refer to sexual intercourse).  Mr Turnbull was a keen student of etymology and, having once worked as a journalist, was fond of the alliterative phrase so when writing his memoir (A Bigger Picture (2020)) he included a short chapter entitled "Barnaby and the bonk ban".  As well as the events which lent the text its title, the chapter was memorable for his inclusion of perhaps the most vivid thumbnail sketch of Barnaby Joyce yet penned:

"Barnaby is a complex, intense, furious personality.  Red-faced, in full flight he gives the impression he's about to explode.  He's highly intelligent, often good-humoured but also has a dark and almost menacing side - not unlike Abbott (Tony Abbott (b 1957; prime-minister of Australia 2013-2015)) - that seems to indicate he wrestles with inner troubles and torments."

Mr Turnbull and Mr Joyce in parliament, House of Representatives, Canberra, ACT.

The substantive matter was the revelation in mid-2017 the press had become aware (1) that Mr Joyce (a married man with four daughters) was conducting an affair with a member of his staff and (2) that the young lady was with child.  Mr Turnbull recorded that when asked, Mr Joyce denied both "rumors", which does sound like a lie but in the narrow sense may have verged on "the not wholly implausible" on the basis that, as he pointed out in a later television interview, the question of paternity was at the time “...a bit of a grey area”.  Mr Joyce and his mistress later married and now have two children so all's well that ends well (at least for them) and Mr Turnbull didn't so much shut the gate after the horse had bolted as install inter-connecting doors in the stables.  His amendments to the Australian Ministerial Code of Conduct (an accommodating document very much in the spirit of Lord Castlereagh's (1769–1822; UK foreign secretary 1812-1822) critique of the Holy Alliance) banned ministers from bonking their staff, which sounds uncontroversial but was silent on them bonking the staff of the minister in the office down the corridor.  So the net effect was probably positive in that staff having affairs with their ministerial boss would gain experience through cross-exposure to other portfolio areas although there's the obvious moral hazard in that they might be tempted to conduct trysts just to engineer a transfer in the hope of career advancement.  There are worse reasons for having an affair and a bonk for a new job seems a small price to pay.  It's been done before.

Scott Morrison (left) & Grace Tame (right), 2022.

Scott Morrison (b 1968; prime-minister 2018-2022): There are a few candidates who deserve to be regarded as Australia’s worst prime-minister (some of them quite recent) but the uniquely distinguishing feature of assessments of Mr Morrison’s term is that so many view it with such distaste.  His narrow victory in the 2019 election was a remarkable personal achievement but that proved the high-water mark of his administration.  Many critiques noted his lack of background, his experience limited to sales, marketing and slogans, which have their place but did seem to result in him viewing a democracy rather as a sales manager views his employer’s customer loyalty programmes: Just as only good customers are entitled to the benefits of membership, in the Morrison government it seemed only electorates which returned coalition members were deserving of funding.  That did change in the run-up to an election; then, marginal electorates which might elect coalition members attracted largess and while all parties do this, few have been so blatant or extreme as Mr Morrison.  He also blundered in foreign affairs, publicly and pugnaciously calling for an international enquiry into the origins of the SARS-CoV-2 virus responsible for the COVID-19 pandemic.  That was a good idea but it should have been handled through the usual channels, not as foghorn diplomacy and the assumption of most was he was looking forward to going to his church (one where they clap, sing, strum guitars and the preacher assures the congregation God approves of surf-skis and big TVs) and telling everyone he’d stood up to the Godless atheists in the Chinese Communist Party.  Then there were matters like the way a submarine contract was cancelled (costing the taxpayer a few hundred million) and the “robodebt” scandal (which turned out to be unlawful) which cost as-yet-uncertain millions more.
Robodebt also exposed the contrast between his attitude to poor people who might be entitled to small welfare payments and that towards corporations which benefited from COVID-19 payments intended for those suffering certain defined losses in revenue.  When it was pointed out many companies which had received millions actually increased their revenue during the pandemic, Mr Morrison made it clear they could keep the money.  Maybe poor people should become Liberal Party donors.

Thumbnails of Lindsay Lohan image files in a sub-directory.

Cammer

Cammer (pronounced kham-ah)

(1) A content-provider who uses a webcam to distribute imagery on some basis (applied especially to attractive young females associated with the early use of webcams).

(2) Slang for an engine produced in small numbers by Ford (US) in the mid-late 1960s.

(3) A general term for any camera operator (now less common because the use in the context of webcam feeds prevailed).

1964: A diminutive of single overhead cam(shaft).  Cam was from the sixteenth century Middle English cam, from the Dutch kam (cog of a wheel (originally, comb)) and was cognate with the English comb, the form preserved in modern Dutch compounds such as kamrad & kamwiel (cog wheel).  The association with webcams began in the mid-1990s, cam in that context a contraction of camera.  The Latin camera (chamber or bedchamber) was from the Ancient Greek καμάρα (kamára) (anything with an arched cover, a covered carriage or boat, a vaulted room or chamber, a vault) of uncertain origin; a doublet of chamber.  The English camera dates from 1708 and it was from the Latin that Italian gained camera and Spanish cámara, all ultimately from the Ancient Greek kamára; the Old Church Slavonic komora, the Lithuanian kamara and the Old Irish camra are all borrowings from Latin.  Cammer was first used in 1964 as oral shorthand for Ford’s 427 SOHC (single overhead camshaft) V8 engine, the alternative slang form being the phonetic “sock” and it became so associated with the one item that “cammer” has never been applied to other overhead camshaft engines.  The first web-cam feed (although technically it pre-dated the web) dates from 1991 and the first to achieve critical mass (ie “went viral”) was from 1996.  Cammer is a noun; the noun plural is cammers. 

Lindsay Lohan on webcam in Get a Clue (2002) a Disney Channel original movie.

The word came to be used for photographic devices as a clipping of the New Latin camera obscura (dark chamber), a black box with a lens that could project images of external objects, contrasted with the (circa 1750) camera lucida (light chamber), which used prisms to produce an image on paper beneath; it was used to generate an image of a distant object.  Camera was thus (circa 1840) adopted in nineteenth century photography because early cameras used a pinhole and a dark room.  The word was extended to filming devices from 1928.  Camera-shy (not wishing to be photographed) dates from 1890, the first camera-man (one who operates a camera) recorded in 1908.

jennicam.org (1996-2003)

It wasn’t the internet’s first webcam feed; that seems to have been one in 1991 aimed at a coffee machine in a fourth floor office at the University of Cambridge's computer science department, created by scientists based in a lab the floor below so they would know whether to bother walking up a flight of stairs for a cup.  In 1996 however, nineteen-year-old Jennifer Ringley (b 1976), from a webcam in her university dorm room, broadcast herself live to the whole world, 24/7.  With jennicam.org, she effectively invented "lifecasting" and while the early feed was of grainy, still, monochrome images (updated every fifteen seconds) which, considered from the twenty-first century, sounds uninteresting and hardly viral, it was one of the first internet sensations, attracting a regular following of four million which peaked at almost twice that.  According to internet lore, it more than once crashed the web, seven million being a high proportion of the web users at the time and the routing infrastructure then wasn't as robust as it would become.  Tellingly, Ms Ringley majored in economics, which explains the enticingly suggestive title "jennicam" whereas the nerds at Cambridge could think of nothing more catchy than "coffee pot camera".  

Jenni and pussy.

Although there were more publicized moments, jennicam.org was mostly a slideshow of the mundane: Jennifer studying at her desk, doing the laundry or brushing her teeth but it hinted at the realisation of earlier predictions, Andy Warhol's (1928–1987) fifteen minutes of fame and Marshall McLuhan's (1911-1980) global village.  While not exactly pre-dating reality television, jennicam.org was years before the genre became popular and was closer to real than the packaged products became.

The 1964 Ford 427 SOHC (the Cammer)

1964 426 HEMI in Plymouth race-car.

There was cheating aplenty in 1960s NASCAR (National Association for Stock Car Auto Racing) events but little so blatant as Chrysler in 1964 fielding their 426 HEMI, a pure racing engine, in what was supposed to be a series for mass-produced vehicles.  Whatever the legal position, it was hardly in the spirit of gentlemanly competition though in fairness to Chrysler, they didn't start it, NASCAR having for years been something of a parallel universe.  In 1957, the Automobile Manufacturers Association (AMA) had announced a ban on auto-racing and the public positions of General Motors (GM), Ford and Chrysler supported the stand, leaving the sport to dealers and privateers although factory support of these operations was hardly a secret.  NASCAR liked things this way, believing the popularity of their “stock cars” relied on the vehicles raced being close to (ie "in stock") what was available for purchase by the general public.  Additionally, they wished to maintain the sport as affordable even for low budget teams and the easy way to do this was restricting the hardware to mass-produced, freely available parts, thereby leveling the playing field.  The façade was maintained until the summer of 1962 when Ford announced it was going to "go racing".  Market research had identified the competitive advantage to be gained from motorsport in an era when, uniquely, the demographic bulge of the baby-boomers, unprecedented prosperity and cheap petroleum would coalesce, Ford understanding that in the decade ahead, a historically huge catchment of 17-25 year old males with high disposable incomes were there to be sold stuff and they’d likely be attracted to fast cars.  Thus began Ford's "Total Performance" era which would see successful participation in just about everything from rally tracks to Formula One, including four memorable victories at the Le Mans twenty-four hour classic.

1963 Chevrolet 427 "Mystery Motor"

The market leader, the more conservative GM, said they would "continue to abide by the spirit of the AMA ban" and, despite the scepticism of some, it seems they meant it because their racing development was halted though not without a parting shot, Chevrolet in 1963 providing their preferred team a 427 cubic inch (7 litre) engine that came to be known as the "mystery motor".  It stunned all with its pace but, being prematurely delivered, lacked reliability and, after a few races, having proved something, GM departed, saving NASCAR the bother of the inevitable squabble over eligibility.

Beware of imitations.  1961 Ford Galaxie Starliner & 1962 Starlift (brochure).

Ford stayed and cheated, though not yet with engines.  Their streamlined two-door, the 1961 Galaxie Starliner, possessed the aerodynamic qualities needed on the big ovals and was a successful race-car but, after early enthusiasm, sales dropped so it was replaced in 1962 with a more commercially palatable notchback roofline.  That sold well but lacked the slipperiness of the Starliner so performance on the track suffered.  To regain the lost aerodynamic advantage, Ford fabricated a handful of fibreglass detachable hard-tops which essentially transformed a Galaxie convertible back into a Starliner.  Not wishing to incur the expense of actually offering them as an option they knew few would buy, Ford gave the plastic roof the name “Starlift”, allocated a part-number and even mocked-up a brochure for NASCAR to read.  Only three had been built with one race won when NASCAR, not fooled, rapidly issued a ban.  After Ford took one of the black-balled Starlifts, now fitted with a 483 cubic inch (7.9 litre) engine, to the Bonneville salt flats and set a number of international speed records, NASCAR took the opportunity to impose a 7 litre (usually expressed as 427 cid) displacement limit, one rule that was easy to enforce.

1964 427 SOHC (Cammer).  Note the long timing chain.

Ford, which while enjoying great success in 1963 had actually adhered to the engine rules, responded to Chrysler’s 426 HEMI (which had dominated the 1964 season) within a remarkable ninety days with a derivation of their 427 FE which replaced the pushrod activated valves with two single overhead camshafts (SOHC), permitting higher engine speeds and more efficient combustion, thereby gaining perhaps a hundred horsepower.  The engine, officially called the 427 SOHC, was nicknamed the Cammer (although some, noting the acronym, called it the "sock").  The problem for NASCAR was that neither the 426 HEMI nor the 427 Cammer was in a car which could be bought from a showroom.

1964 Chrysler 426 HEMI DOHC Prototype.

Not best pleased, NASCAR was mulling over things when Chrysler responded to the 427 Cammer by demonstrating a mock-up of their 426 HEMI with a pair of heads using double overhead camshafts (DOHC) and four valves per cylinder instead of the usual two.  Fearing an escalating war of technology taking their series in an undesired direction, in October 1964, NASCAR cracked down and issued new rules for the 1965 season.  Although retaining the 427 cubic inch limit, engines now had to be mass-production units available for general sale and thus no hemi heads or overhead camshafts would be allowed.  The rule change had been provoked also by an increasing death toll as speeds rose beyond what was safe for both tyres and circuits.

1965 Ford 427 FE.

That meant Ford’s 427 FE was eligible but Chrysler’s 426 HEMI was not and a disgruntled Chrysler withdrew from NASCAR, shifting their efforts to drag-racing where the rules of the NHRA (National Hot Rod Association) were more accommodating (though it's not clear if Chrysler complied even with those; the NHRA welcomed them anyway).  In 1965, Chrysler seemed happy with the 426 HEMI's impact over the quarter-mile and Ford seemed happy being able to win just about every NASCAR race.  Not happy was NASCAR, which was watching crowds and revenue drop as the audience proved less interested in a sport where results had become predictable, their hope the rule changes would entice GM back to motor-sport not realised.

1966 Chrysler 426 Street HEMI

It was 1967 before everybody was (more or less) happy again.  Chrysler, which claimed it had always intended to make the 426 HEMI available to the general public and that the 1964 race programme had been just part of engineering development, for 1966 introduced the 426 Street HEMI, a detuned version of the race engine, a general-production option for just about any car in which it would fit.  NASCAR responded quickly, announcing the HEMI now complied with the rules and was welcome, with a few restrictions, to compete.  Ford assumed NASCAR needed them more than they needed NASCAR and announced they would be using the 427 Cammer in 1966.  NASCAR was now trapped by its own precedents, conceding only that Ford could follow Chrysler’s earlier path, saying the 427 Cammer would be regarded “…as an experimental engine in 1966… (to) …be reviewed for eligibility in 1967."   In other words, eligibility depended still on mass-production.

Ford, although unable easily to create a 427 Street Cammer, recalled the Starlift trick and announced the SOHC was now available as a production item.  That was, at best, economical with the truth, given not only could nobody walk into a showroom and buy a car with a 427 Cammer under the hood but it seemed at the time not always possible to purchase one even in a crate.  Realising the futility of kicking the can down the road, NASCAR decided to kick it to the umpire, hoping all sides would abide by the decision, referring the matter to the Fédération Internationale de l'Automobile (FIA), the world governing body for motor-sport.  Past-masters at compromise, the FIA approved the 427 Cammer but imposed a weight handicap on any car in which it was used.

Ford called that not just unfair but also unsafe, citing concerns at the additional stress the heavier vehicles would place on suspension and tyres, adding their cars couldn’t “… be competitive under these new rules."  Accordingly, Ford threatened to withdraw from NASCAR in 1966 but found the public’s sympathy was with Chrysler, which had done the right thing and made their engine available to the public.  Ford sulked for a while but returned to the fray in late 1966, the math of NASCAR’s new rules having choked the HEMI a little so the 427 FE remained competitive, resulting in the curious anomaly of the 426 Street HEMI running dual four-barrel induction while on the circuits only a single carburetor was permitted.  Mollified, Ford returned in force for 1967 and the arrangement, which ushered in one of the classic eras of motorsport, proved durable, the 427 FE used until 1969 and the 426 HEMI until the big block engines were finally banned after the 1974 season, three years after the last 426 Street HEMI was sold.

Ford 427 Cammer in 1967 Fairlane.

While the 426 HEMI DOHC never ran (the display unit's valve train was electrically activated), the 427 Cammer was produced for sale in crates and although the number made seems uncertain, most sources suggest it may have been as high as several hundred and it enjoyed decades of success in various forms of racing including off-shore power boats.  Whether it would ever have been reliable in production cars is questionable.  Such was Ford’s haste to produce the thing there wasn’t time to develop a proper gear drive system for the various shafts so it ended up with a timing-chain over six feet (1.8m) long.  For competition use, where engines are re-built with some frequency, that proved satisfactory but road cars are expected to run for thousands of miles between services and there was concern the tendency of timing-chains to stretch would impair reliability; tellingly, Ford never considered the 427 Cammer for a production car.  Production cars, unlike racing engines, attract warranties.  The 427 Cammer attracted a following and, even today, it’s possible to buy all the parts needed to build one.

Friday, March 10, 2023

Abnegate

Abnegate (pronounced ab-ni-geyt)

(1) To refuse or deny oneself (privileges, pleasure, rights, conveniences etc); reject; renounce.

(2) To relinquish; give up.

1650–1660: From the Latin abnegātus (denied), past participle of abnegāre (to deny), the construct being ab- + negate.  The ab- prefix was from the Latin ab-, from the primitive Indo-European hepo (off, away) and a doublet of apo- and off-.  The alternative prefixes were (1) a- (with root words starting with m, p, or v) & (2) abs- (with root words starting with c or t).  Ab- was used to convey (1) “from” & (2) “away from” & “outside of”.  Negate was from the Latin negātus, past participle of negāre (to deny, refuse, decline), reduced from nec-aiare (or some similar form), the construct being nec (not, nor) + aiere (to say).  Abnegate is a verb, abnegated & abnegating are verbs & adjectives, abnegation & abnegator are nouns; the most common noun plural is abnegations.

Abnegate should not be confused with abdicate.  Dating (perhaps surprisingly) only from 1541, abdicate was from the Latin abdicātus (renounced), perfect passive participle of abdicō (renounce, reject, disclaim), the construct being ab + dicō (proclaim, dedicate, declare), akin to dīcō (say).  Abdicate now (except informally) is used almost exclusively to refer to a reigning monarch renouncing their throne in favour of a successor (chosen or imposed) but was once applied with greater latitude.  Between the mid-sixteenth & early nineteenth centuries, it was used to mean “to disclaim and expel from the family” (as a parent might of a child) and when this is done now, one is said to have disowned (as a statement of family & social relations) or disinherited (at law in the matter of inheritance).  Between the mid-sixteenth & late seventeenth centuries it could mean “formally to separate oneself from or to divest oneself of”.  Between the early seventeenth & late eighteenth centuries, it could mean “to depose”, which meant (1) to remove from office suddenly and forcefully (ie what might now be thought a forced (or “constructive”) abdication) or (2) in law, to testify to or give evidence under oath (usually in writing).  Between the mid-sixteenth & late seventeenth centuries it could mean “to reject; to cast off; to discard” (an object, an association, an obligation etc).

The modern meaning has existed since the mid-sixteenth century (though not commonly used for another two-hundred odd years) and means “to surrender, renounce or relinquish, as sovereign power; to withdraw definitely from filling or exercising, as a high office, station, dignity”.  This can apply to anyone personally exercising sovereign authority (kings, queens, popes, tsars et al) and is the act of renouncing the throne (and thus sovereignty).  Procedurally, most monarchies have detailed administrative procedures (and abdication has of late assumed a new popularity) to ensure the transfer from old to new is legally identical in consequence to what happens in the case of a sovereign dying but the lawyers have previously resolved cases where formalities were lacking.  In the matter of James VII and II (1633–1701; King of England and King of Ireland (as James II) & King of Scotland (as James VII) 1685-1688) who left the throne in the circumstances of the Glorious Revolution of 1688, the act of “abandonment” or “forfeiture”, even in the absence of any formal mechanism, was held to be an abdication, albeit one that might (analogously with use in other aspects of law) be styled a “constructive abdication”.

Pope Benedict XVI in Popemobile (Mercedes-Benz ML 430 (W163)), 1600 Pennsylvania Avenue, Washington DC, 2008.

Although the term abdication is sometimes used of papal resignations, the Vatican is emphatic the word is not used in any official documents of the Church.  This imprecise use of abdication is attributable to the Holy See being (as well as the universal government of the multi-national Roman Catholic Church) the authority ruling the Vatican City State, a sovereign, independent territory since the Lateran Concordat of 1929.  The Pope is thus the ruler of both Vatican City State and the Holy See; collectively an absolute theocracy.  It’s thus a fine point and were the Holy See to prefer “abdicate” to “resign”, it would seem not a substantive change and the fact the office is elected and not dynastic is not significant, Holy Roman emperors and some early kings of England all having been elected.

Pope Benedict XVI in Popemobile, Seravalle stadium, San Marino, 2011.

What none can deny is that the Holy See has a long (if of late infrequent) history of precedent, five popes between the tenth & fifteenth centuries resigning with a further four between the third & eleventh centuries possibly having done so.  Mysteriously, there’s even another event which may or may not have been a resignation and indeed the subject may not even have been a pope but rather an anti-pope, somewhat analogous with the idea the MAGA Republicans have of Joe Biden (b 1942; US president since 2021) being an anti-president.  The revisions to canon law in 1917 and 1983 only clarified certain aspects of the resignation process and had no effect on anything definitional.  Thus, what Pope Benedict XVI (1927–2022; pope 2005-2013, pope emeritus 2013-2022) did when renouncing office in 2013 was an act of abnegation and not an abdication and that he chose subsequently to be styled pope emeritus remains of no legal or constitutional significance.

Herostratic

Herostratic (pronounced hera-strat-ick)

The act of seeking fame at any cost; desire for notoriety.

1640s: First noted in English in translations of fourth century documents, the construct was Herostratus + -ic.

Herostratus was a learned borrowing from the Latin Hērostratus, from the Ancient Greek Ἡρόστρατος (Hēróstratos), the construct being Ἥρᾱ (Hḗrā) (Greek goddess of marriage, women, and family) + στρᾰτός (stratós) (army, military force; band or body of men; common people).  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  Herostratical is an adjective and herostratically is an adverb.  The jocular noun herostratisphere is non-standard.

Attention seeking

In Asia Minor (near present-day Selçuk, Türkiye), in a sacred grove not far from the city of Ephesus, stood the Great Temple of Artemis (also known as the Temple of Diana), one of the seven wonders of the ancient world.  During the evening of 21 July, 356 BC, Herostratus (also called Erostratus) of Ephesus saturated the timber and fabric furnishings of the temple with gallons of oil and when all was thoroughly soaked, he set fires in many places, inside and out.  Within minutes, as he had planned, the fire was uncontrollable and the temple was doomed.  Coincidentally, on the day the temple was razed, Alexander the Great (356-323 BC) was born.

St. Paul Preaching in Ephesus Before the Temple of Artemis (1885), by Adolf Pirsch (1858-1929).

Herostratus was apparently a wholly undistinguished and previously obscure citizen, different from others only in his desire to be famous and the lengths to which he was prepared to go to achieve that fame.  As shocked Ephesians rushed to the fire, Herostratus met them and proudly proclaimed his deed, telling them his name would for all eternity be remembered as the man who burned down the Great Temple of Artemis and razed one of the wonders of the world.  Herostratus was, as he expected, executed for his arson.  In an attempt to deny him the fame he craved, the Ephesians passed the damnatio memoriae law, making it a capital crime ever to speak of him or his deed.  However, it proved impossible to suppress the truth about such an event; the historian Theopompus (circa 380–circa 315 BC) relates the story in his Philippica and it later appears in the works of the historian Strabo (circa 64 BC–circa 24 AD).  His name thus became a metonym for someone who commits a criminal act in order to become noted.  Subsequent attempts to erase names from history (tried on a grand scale by comrade Stalin and the Kim dynasty in the DPRK) seem always to fail.

It's unfortunate history didn't unfold so Android and iOS were available in 356 BC so Herostratus could have played Lindsay Lohan's The Price of Fame instead of turning to arson.  The game was said to be "a parody on celebrity culture and paparazzi" and enabled players to become world famous celebrities by creating an avatar which could "purchase outfits, accessories, toys and even pets".  Played well, he could have entered a virtual herostratisphere and the temple might stand today.  As Ms Lohan would understand, the tale of Herostratus reminds all that for everything one does, there's a price to be paid.

Like many of the tales from antiquity, the story of destruction by arson is doubted.  Various conjectures have been offered, some of which doubt the technical possibility of what Herostratus is said to have done, some claiming it was a kind of inside job by the temple’s priests who had their own reasons for wanting a new building and even a reference to the writings of Aristotle which offers a lightning strike as the catalyst for the conflagration.  However, whatever did or didn’t happen in 356 BC, the word herostratic, to describe one who seeks fame at any cost, has endured, the attempt to make his name unspeakable as doomed as the temple.

Thursday, March 9, 2023

Gully

Gully (pronounced guhl-ee)

(1) A small valley or ravine originally worn away by running water and serving as a drainage-way after prolonged or heavy rain.

(2) A ditch or gutter.

(3) In cricket, a position in the off-side field (some 30° behind square), between point and the widest of the slips (or wicket-keeper if no slip is set); the fielder occupying this position.

(4) In tenpin bowling, either of the two channels at the side of the bowling lane.

(5) To make gullies in the ground or an object.

(6) In hydrology, to form channels by the action of water.

(7) In slang, of or relating to the environment, culture, or life experience in poor urban neighborhoods; vulgar, raw, or authentic and sometimes used as an alternative to ghetto.

(8) In (US) slang, as gullywasher, an intense, but typically brief rain event, the form dating from 1887.

(9) In Scotland and northern England, a knife, especially a large kitchen or butcher’s knife (the alternative spelling gulley).

(10) In some parts of the English-speaking world, a synonym for valley, especially one heavily wooded; a deep, wide fissure between two buttresses in a mountain face, sometimes containing a stream or scree (although in most traditions gullies are usually dry, water flowing only after heavy rain or a sudden input of water from other drainage systems).

(11) In engineering slang, any channel-like structure which is available to be used for some purpose such as ducts or cables (applied to anything from computer motherboards to nuclear reactors).

(12) In engineering, a grooved iron rail or tram plate (mostly UK).

(13) In civil engineering, sometimes used as a descriptor for drop-kerbs, gutters etc.

(14) Of liquid, noisily to flow (obsolete).

(15) In South Asia (chiefly India but known also in Pakistan, Bangladesh & Sri Lanka), an alleyway or side street.

1530–1540: Etymologists have traced several possible sources of the word and it’s not impossible the word evolved independently in different places.  It may have been a variant of the Middle English golet (esophagus, gullet), from the Old French goulet (the French –et ultimately replaced by –y), from the Latin gula (throat) and the meaning-shift in the Middle English to "water channel, ravine" may have been influenced by the Middle English gylle, gille & galle (deep narrow valley, ravine), hence gill for some time being a synonym.  An alternative source suggested is the French goulet (neck of a bottle).  The use in South Asia is more certain, borrowed from the Hindi गली (galī) and the Urdu گَلی‎ (galī) with the spelling evolving under the Raj under the influence of English.  It was inherited from Ashokan Prakrit galī and was cognate with the Punjabi ਗਲੀ (galī) / گَلی‎ (galī), the Gujarati ગલી (galī), the Sindhi ڳَليِ / ॻली, the Marathi गल्ली (gallī) and the Bengali গলি (gôli), the Latin callis, the Italian calle and the Spanish calle (street, lane or path).  The first reference (in Scottish English) to the knife (the spelling gully or gulley) dates from circa 1575–1585, the origin unknown.  Gully is a noun & verb and gullied & gullying are verbs; the noun plural is gullies.

Historically, a gully was a natural formation of water flows which was usually dry except after periods of heavy rainfall or a sudden input of water from other drainage systems after more remote flooding or the melting of snow or ice.  Over the years the meaning has become less precise and other words are sometimes used to describe what are understood by many as gullies.  The noun ravine (long deep gorge worn by a stream or torrent of water) dates from 1760 and was from the mid seventeenth century French ravin (a gully), from the Old French raviner (to pillage; to sweep down, cascade), and the French ravine (a violent rush of water, a gully worn by a torrent), from the Old French ravine (violent rush of water, waterfall; avalanche; robbery, rapine).  Both the French noun and verb ultimately came from the Latin rapina (act of robbery, plundering (related to rapine and the source of much modern confusion because “rape” was long used in the sense of “pillage” or “kidnapping”)) with sense development influenced by the Latin rapidus (rapid).  Entries for ravine appear in early seventeenth century dictionaries with the meaning “a raging flood” whereas in fourteenth century Middle English, both ravin & ravine meant “booty, plunder, robbery”, this circa 1350-1500 borrowing of the Latin influenced French word.  Dating from 1832, the noun gulch (deep ravine), despite being of recent origin, is a mystery.  It may have been from the obsolete or dialectal verb gulsh (sink in to the soil) or "gush out" (of water), from the early thirteenth century Middle English gulchen (to gush forth; to drink greedily), the most evocative use of which was the mid thirteenth century gulche-cuppe (a greedy drinker).  Despite the vague similarities, etymologists maintain these forms had no etymological connection with gully.  
Other words (trench, culvert, crevasse, chasm, notch, chase, watercourse, channel, gutter, gorge etc), even when they have precise meanings in geography or hydrology, are also sometimes used interchangeably with gully.

Japanese manhole covers (マンホールの蓋 (Manhōru no futa)) can be delightful or functional (in a typically thoughtful Japanese manner, some include a locality map with directions) but usually provide little inspiration for those designing wheels.

In the nineteenth century, Modern German picked up Gully from English in the sense of “a road drain, a drainage channel” (synonym: Straßenablauf), the covering of a road drain or gully being Ablaufgitter & Ablaufdeckel.  One adaptation quickly coined was Gullydeckel (manhole cover), the construct being gully + deckel (an untypically economical construct in German given the usual forms for manhole were Kontrollschacht & Einstiegschacht), an alternative to Kanaldeckel (manhole cover).  Deckel (lid, cap, cover of a container) was an ellipsis of Bierdeckel (beer mat) and also used in humorous slang to mean “headwear, hat” although it was most productive in the formation of compounds with cap in the sense of “an artificial or arbitrarily imposed upper limit or ceiling” such as Preisdeckel (price cap), the common synonym being Deckelung (capping).

A German Gullideckel (left), a Mercedes-Benz “Gullideckel” aluminum wheel (centre) and a 1988 Mercedes-Benz 560 SL so equipped.

The alternative spelling was Gullideckel and it was this which was picked up to describe the design of aluminum wheel adopted by Mercedes-Benz in 1982.  The reference is explained by the wheel’s design bearing a similarity to that typically used by German manhole covers although Mercedes-Benz dryly explained their concerns were less artistic or a tribute to Teutonic urban hydrology than a reflection of the imperatives of optimizing the air-flow required for brake cooling and a reduction in drag compared to their earlier, long-serving design.  It was in the 1980s that the greatest improvement in the aerodynamic efficiency of cars was achieved and wheels were a significant, though often little-noticed part of the process.

Top row: Mercedes-Benz C111 at Hockenheimring, 1969 (left).  The C111 series was originally a rolling test bed for the evaluation of Wankel engines and it was on the C111 that the new wheels (then called “Premier”) were first shown although no production versions (centre) were ever made so wide.  The 6½ inch versions were first used on the 450 SEL 6.9 (right).  Bottom row: A bundt cake tin (left); like the wheels, the tins are made from aluminum but are always cast or pressed, not forged.  A ginger bundt cake (centre) and a lemon blueberry bundt cake with vanilla icing (right).

The earlier design used by Mercedes-Benz was apparently not inspired by any existing product but the public soon found nicknames.  Introduced in 1969 and soon an option throughout the range except the Grosser (the 600 (W100) 1963-1981) until 1986, the factory initially listed them as the “Premier Wheel” (ie the “top of the range”) but in the public imagination the nicknames prevailed.  First informally dubbed “Baroque” because of what was then considered an ornate design, the name which endured was “Bundt”, an allusion to the popular “bundt cakes”, a circular cake with a hole in the centre and there was certainly some resemblance.  Produced by the Otto Fuchs (pronounced fuks) Company of Meinerzhagen (near Cologne), the early versions were all painted silver (though not clear-coated) and available only in a 14 x 6-inch size, 5½ inch versions soon offered to suit the lower powered cars while in the mid-1970s, production began of 6½ inch versions to handle the tyres fitted to the much faster 450 SEL 6.9 (W116) and 450 SLC 5.0.  Demand for the bundt wheel option grew rapidly, forcing Fuchs to add a line of cast wheels in the same design, the casting process able to achieve both higher volumes and a lower unit cost.  The process of forging aluminum requires great heat and immense pressure (Fuchs used as much as 7,000 tons of force) and realigns the granular structure of the material in the direction of the flow, creating a more homogeneous and less porous micro-structure.  Forging renders aluminum as strong as steel for less weight and provides a notably higher resistance to fatigue and corrosion but the process is expensive.  Fuchs also manufactured small runs of a 15 x 7-inch version and today these are much sought after.  Such is the appeal of the style, specialists in the US have fabricated versions in both a 16 & 17-inch format although, being very expensive, they remain rare.
Today, factories often offer a variety of designs of aluminum wheels with some styles available only briefly but for over fifteen years, the bundt was the only one offered.

Fuchs wheels on Porsche 911s in matt metal (left), polished (centre) & with painted highlights (right).  The Porsche pedants chide any restorer who finishes the wheels in any manner other than that originally done at the factory.

Half a decade earlier, Mercedes-Benz's neighbors in Stuttgart had also designed an aluminum wheel.  Porsche had planned a 1965 release for its new 911 S, at that time the fastest, sportiest version of the 911 which had been on sale since 1963 and the distinctive five-spoke shape would first be sold in 1966 and remain on the option list until 1989, the popularity so enduring it’s since been reprised more than once.  Distinctive though it was, there were really only two requirements for the new wheel: It needed to be durable and light, strong enough to endure the stresses of the higher speed of the 911 S and delivering a reduction in un-sprung mass significant enough to enhance handling.  The design target was an aluminum wheel which weighed 3 kg (6½ lb) less than a steel wheel of the same dimensions.

Porsche had also used the Otto Fuchs Company, impressed by the foundry having developed a new manufacturing process which, instead of using a cast rim, manufactured it in one piece from an alloy made of 97% aluminum with the remainder composed mostly of magnesium, silicon, manganese & titanium, the technique still used by the company today.  The five-leaf clover design was based on nothing in particular and was done in-house by Porsche, the only change from the original prototype apparently a smoothing of the scalloped shape which first adorned the spokes.  The design proved adaptable, the original 15 x 4½-inch wide wheels growing eventually to eight inches when fitted to the rear of the 911 Turbo (930), the additional rubber required to tame the behavior of a machine which some labeled the “widow maker”.  Later designs have offered various specific improvements but none has matched the charm of the original and Fuchs have continued its manufacture for later model 911s, some in larger diameters to accommodate advances in suspension geometry and tyres.

Gas-burners: Lindsay Lohan using a gas-burner as an improvised cigarette lighter, Terry Richardson (b 1965) photo-shoot, 2012 (left), two of Mahle GmbH’s magnesium wheels (centre) and a 1971 Porsche 911 T so equipped (right).

The five-spoke wheel is a matter of particular interest to the originality police in the Porsche collector community and great attention is paid to date-stamping and paint, it being very important that where appropriate the wheels variously should either be unpainted or painted in a certain way in a certain color.  Deviations from what the factory did are not tolerated.  Although the five-leaf clover design never picked up an association with other circular shapes like manhole covers or cakes, there was another Porsche wheel which did.  Produced by Mahle GmbH and quickly dubbed “gas-burners”, they were available on the 911, 912 & 914-6 between 1970-1972 and although generally not thought as attractive as Fuchs’ creations, the “gas-burners” have a cult following based on pure functionality: Pressure cast in magnesium and available only in a 15 x 5½-inch format, at 4.3 kg (9½ lb) they’re said to be the lightest 15-inch wheel ever made, more svelte even than the 15 x 6-inch units Michelin rendered in glass fibre & resin for the Citroën SM (1970-1975).

Aunger magazine advertisement, 1974.  Not all wheels use an existing circular product as a model.  A style popular in the 1970s, it was known colloquially as the “jellybean”, “slotted” or “beanhole”.