
Wednesday, December 4, 2024

Snoot

Snoot (pronounced snoot)

(1) In slang, the nose (of humans, animals, geological formations, distant galaxies and anything else with a feature even vaguely “nose-like”).

(2) In slang, an alcoholic drink.

(3) In slang, a police officer (especially a plain-clothed detective, the use explained by the notion of police “sticking their noses into” things).

(4) In clothing, the peak of a cap.

(5) In photography and film production, a cylindrical or cone-shaped fitment on a studio light, used to control the area of the scene illuminated by restricting spill light.

(6) In informal use, a snob; an elitist individual; one who looks down upon those “not of the better classes”.

(7) In linguistics, a language pedant or snob; one who practices linguistic elitism (and distinct from a “grammar Nazi”).

(8) In engineering, as “droop snoot”, a design in which the nose of a machine is lowered (temporarily or permanently) for reasons of visibility or to optimize aerodynamics.

(9) To behave disdainfully toward; to condescend to (usually as “snooty”).

(10) To apply a snoot attachment to a light.

1861: From the Scots snoot (a variation of snout (nose or projecting feature of an animal)), from the Middle English snowte, from the Middle Dutch snute, ultimately from the Proto-West Germanic snūt, from the Proto-Germanic snūtaz, source also of the German Schnauze (the basis of schnauzer, a name for a type of dog); it is presumed the slang schnoz (a nose, especially a large one) is related.  Snoot is a noun & verb, snootiness, snooter & snootful are nouns, snooting & snooted are verbs, snooty, snootier & snootiest are adjectives and snootily is an adverb; the noun plural is snoots.

Lindsay Lohan's snoot.

The noun snootful dates from 1885 and was a synonym of skinful (to have imbibed as much liquor as one could manage).  It was based on the use of snoot to mean “an alcoholic drink” whereas skinful was an allusion to the time when wine was transported in containers made from animal skin (ie in original use skinful meant “the container is full”).  The adjective snooty (proud, arrogant) was first noted as university student slang in 1918 and presumably was in some way related to the earlier snouty (insolent, overbearing) which was in use by at least 1857, doubtlessly on the basis of “looking down one's nose at someone or something”.  In dialectal or slang use a snout (in the sense of “nose”) is not of necessity derogatory and in fields like engineering, cosmology, geography, geology or zoology, it is merely descriptive.  However, when used as a slang term for a snob (a snooty person), the sense is almost always negative although there are some elitists who are proud of their snootiness.  Those who don’t approve of barbarisms such as country & western music sometimes make sure their snootiness is obvious but as a general principle it’s usually better just to ignore such things.  The adjective snooty is in much more common use than the noun snoot and it appears often with a modifier such as “a bit snooty”.  That may seem strange because one is either snooty about someone or something or one isn’t, but there are degrees of severity with which one can allow one's snootiness to manifest (the comparative “snootier”, the superlative “snootiest”).

In engineering, “droop snoot” (also as “droop snout”) is used to describe a design in which the nose of a machine is lowered (temporarily or permanently) for reasons of visibility or to optimize aerodynamics.  The term was apparently first used among engineers in the late 1950s while working on the first conceptual plans for the Anglo-French supersonic airliner which became the Concorde, although the first known use in print dates from 1963 (“droop nose” appearing in the same era).  The idea wasn’t, however, developed originally for the Concorde: an experimental British supersonic test-bed with a droop nose had flown as early as 1954 and proved the utility of the concept by being the first jet aircraft to exceed 1,000 mph (1,600 km/h) in level flight, later raising the world speed record to 1,132 mph (1,822 km/h), exceeding the previous mark by an impressive 310 mph (500 km/h).  In aviation, the basic idea of a sloping nose had been around for decades and one of the reasons some World War II (1939-1945) Allied fighter pilots found targeting easier in the Hawker Hurricane than the Supermarine Spitfire was that the nose of the former noticeably tapered towards the front, greatly enhancing forward visibility.

How the Concorde's droop snoot was used.

On the Concorde, the droop snoot wasn’t a mere convenience.  The combination of the engineers’ slide-rules and wind tunnel testing had proved what the shape had to be to achieve the combination of speed and fuel economy (the latter an under-estimated aspect of the development process) but that shape also meant the pilots’ view was so obstructed during take-offs, landings and taxiing that safety was compromised.  The solution was the “droop nose” mechanism, which included a moving transparent visor that retracted into the nose before it was lowered.  At supersonic speeds, the temperatures are high and so are the stresses, so much attention was devoted to “fail-safe” systems including the droop snoot because a structural failure at Mach 2 would potentially be catastrophic for the entire airframe (and obviously every soul on board).  Thus, the hydraulic systems controlling the droop snoot’s movement were duplicated and, as a last resort, the pilots had access to a simple mechanical lever which would disengage the pins holding the structure in place, the apparatus afterwards gracefully (hopefully) descending into its lowered position by the simple operation of gravity.  Droop snoots appeared also on Soviet supersonic aircraft including the short-lived Tupolev Tu-144 (visually close to a Concorde clone) and the Sukhoi T-4 strategic bomber which never entered production.  Interestingly, the USAF’s (US Air Force) North American XB-70 Valkyrie (a Mach 3 experimental bomber) didn’t use a droop snoot because it was developed exclusively for high-altitude, high-speed strategic bombing missions and, being a military airplane, would only ever operate from large, controlled airbases where additional ground support systems (monitoring and guidance) negated the need for the mechanism.

1955 Ford Customline (left) and the 1967 “droop snoot” “Custaxie” (right), the construct being Cust(omline) + (Gal)axie, the unusual hybrid created by merging (some of) a 1955 Customline with a 427 cubic inch (7.0 litre) Ford Galaxie V8.  The bizarre machine won the 1967 New Zealand Allcomers (a wonderful concept) saloon car championship, the modifications to the nose reckoned to be the equivalent of an additional 40-50 horsepower.

At sub-supersonic speeds, throughout the 1960s race-cars proved the virtue of the droop snoot (though as a fixed rather than a moveable structure).  While sometimes weight-reduction was also attained, overwhelmingly the advantage was in aerodynamics and the idea began to spread to road cars, although it would be decades before the concept was no longer visually too radical for general market acceptance.

1972 Vauxhall Firenza coupé promotional material for the Canadian launch, a market in which the car was a disaster (left), and a 1975 High Performance (HP) Firenza “droopsnoot” (right).  GM in South Africa actually made a good car out of the Firenza coupé, building 100 (for homologation purposes) with the 302 cubic inch (4.9 litre) V8 used in the original Z/28 Chevrolet Camaro.  In South Africa, they were sold as the “Chevrolet Firenza”.

In 1973, officially, Vauxhall called their new version of the Firenza coupé the “High Performance (HP) Firenza” but the press, noting the Concorde (then still three years from entering commercial service), quickly dubbed it the “droopsnoot”, an obvious reference to the distinctive nosecone designed for aerodynamic advantage.  The advantages were real in terms of performance and fuel consumption but Vauxhall had the misfortune to introduce the model just as the first oil crisis began, which stunted demand for high-performance cars (BMW’s 2002 Turbo was another victim) and triggered a sharp recession which was a prelude to that decade’s stagflation.  Vauxhall had planned a build of some 10,000 a year but in the difficult environment, a paltry 204 were built.

A Ford Escort Mark 2 in the 1977 Rally of Finland (left) and a 1976 Escort RS2000  with the droop snoot (right).

In 1976, Ford launched their own take on the droop snoot, the Mark 2 Escort RS2000, featuring a similar mechanical specification to that of the Mark 1 but with a distinctive nosecone.  Ford claimed there was an aerodynamic benefit in the new nose but it was really a styling exercise designed to stimulate interest, the Escort being the corporation’s platform for rallying rather than something used on high-speed circuits.  It certainly achieved the desired results, the model proving popular.  Ford Australia even offered it with four doors as well as two, although emission regulations meant the additional horsepower on offer in Europe was denied to those down under.  Interestingly, although it was the range’s high-performance flagship, the factory rally team didn’t use the droop snoot version, those in competition using the standard, square-fronted body.

Godox Pro Snoot S-Type Mount SN-05

Friday, October 11, 2024

Floppy

Floppy (pronounced flop-ee)

(1) A tendency to flop.

(2) Limp; flexible; not hard, firm or rigid; hanging loosely.

(3) In IT, a clipping of “floppy diskette”.

(4) In historic military slang (Apartheid-era South Africa & Rhodesia (now Zimbabwe)), an insurgent in the Rhodesian Bush War (the “Second Chimurenga” (from the Shona chimurenga (revolution)), 1964-1979), the use a reference to the way they were (in sardonic military humor) said to “flop” when shot.

(5) In informal use, a publication with covers made from a paper stock little heavier and more rigid than that used for the pages, used mostly for comic books.

(6) In slang, a habitué of a flop-house (a cheap hotel, often used as permanent or semi-permanent accommodation by the poor or itinerant who would go there to “flop down” for a night) (archaic).

(7) In slang, as “floppy cats”, the breeders’ informal term for the ragdoll breed of cat, so named for their propensity to “go limp” when picked up (apparently because of a genetic mutation).

1855-1860: The construct was flop + -y.  Flop dates from 1595–1605 and was a variant of the verb “flap” (with the implication of a duller, heavier sound).  Flop has over the centuries gained many uses in slang and idiomatic form but in this context it meant “loosely to swing; to flap about”.  The sense of “fall or drop heavily” was in use by the mid-1830s and by 1919, in the wake of the end of World War I (1914-1918) (the conflict which wrote finis to centuries of dynastic rule by the Romanovs in Russia, the Habsburgs in Austria-Hungary and the Ottomans in Constantinople), it was used to mean “totally to fail”, although in the 1890s it had been recorded as meaning “some degree of failure”.  The comparative is floppier, the superlative floppiest.  Floppy is a noun & adjective, floppiness is a noun, flopped is a noun & verb, flopping is a verb, floppier & floppiest are adjectives and floppily is an adverb; the noun plural is floppies.  The adjective floppish is non-standard and used in the entertainment & publishing industries to refer to something which hasn’t exactly “flopped” (failed) but which has not fulfilled commercial expectations.

Lindsay Lohan in "floppy-brim" hat, on-set during filming of Liz & Dick (2012).  In fashion, many "floppy-brim" hats actually have a stiff brim, formed in a permanently "floppy" shape.  The true "floppy hats" are those worn while playing sport or as beachwear etc.

The word is used as a modifier in pediatric medicine (floppy baby syndrome; floppy infant syndrome) and as “floppy-wristed” (synonymous with “limp-wristed”) was used as a gay slur.  “Flippy-floppy” was IT slang for “floppy diskette” and unrelated to the earlier use of “flip-flop” or “flippy-floppy” which, dating from the 1880s, was used to mean “a complete reversal of direction or change of position” and used in politics to suggest inconsistency.  In the febrile world of modern US politics, to be labelled a “flip-flopper” can be damaging because it carries with it the implication that what one says can’t be relied upon and campaign “promises” might thus not be honored.  Whether that differs much from the politicians’ usual behaviour can be debated but still, few enjoy being accused of flip-floppery (definitely a non-standard noun).  The classic rejoinder to being called a flip-flopper is the quote: “When the facts change, I change my mind. What do you do, sir?”  That’s often attributed to the English economist and philosopher Lord Keynes (John Maynard Keynes, 1883-1946) but it was said originally by US economist Paul Samuelson (1915–2009), the 1970 Nobel laureate in economics.  In the popular imagination Keynes is often the “go to” economist for quote attribution in the way William Shakespeare (1564–1616) is a “go to” author and Winston Churchill (1875-1965; UK prime-minister 1940-1945 & 1951-1955) a “go to” politician, both credited with things they never said but might have said.  In phraseology, the quality of being “Shakespearian” or “Churchillian” is not exactly definable but is certainly recognizable.  In the jargon of early twentieth century electronics, a “flip-flop” was a reference to switching circuits that alternate between two states.

Childless cat lady Taylor Swift with her “floppy cat”, Benjamin Button (as stole).  Time magazine cover, 25 December 2023, announcing Ms Swift as their 2023 Person of the Year.  “Floppy cat” is the breeders' informal term for the ragdoll breed, an allusion to their tendency to “go limp” when picked up, a behavior believed caused by a genetic mutation.

The other use of flop in IT is the initialism FLOP (floating point operations per second).  Floating-point (FP) arithmetic is a way of handling big real numbers using an integer with a fixed precision, scaled by an integer exponent of a fixed base; FP doesn’t make possible anything which would not in theory be achievable using real numbers but it does make such calculations faster and practical, and the concept became familiar in the 1980s when Intel made available FPUs (floating point units, also known as math co-processors) which could supplement the CPUs (central processing units) of their x86 family.  The 8087 FPU worked with the 8086 CPU and others followed (80286/80287, 80386/80387, i486/i487 etc) until eventually the FPU for the Pentium range was integrated into the CPU, the early implementation something of a debacle still used as a case study in a number of fields including management and public relations.
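
To make the “integer scaled by an exponent” idea concrete, below is a minimal Python sketch (an illustration added here, not anything from the history above) which splits a double-precision number into the sign, exponent and fraction fields of its IEEE 754 representation:

    import math
    import struct

    x = 1234.5678  # an arbitrary "real" number

    # frexp() expresses x as m * 2**e with 0.5 <= |m| < 1
    m, e = math.frexp(x)
    print(f"{x} = {m} * 2**{e}")

    # The same structure is visible in the raw IEEE 754 double-precision bits:
    # 1 sign bit, 11 exponent bits, 52 fraction (mantissa) bits.
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF
    fraction = bits & ((1 << 52) - 1)
    print(f"sign={sign}, biased exponent={exponent}, fraction={fraction:#x}")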

FLOPS are an expression of specific performance and are used to measure those computations requiring floating-point calculations (typically in math-intensive work); for purposes of “benchmarking” or determining “real-world” performance under those conditions, it’s a more informative number than the traditional rating of instructions per second (IPS).  FLOPS became something of a cult in the 1990s when the supercomputers of the era first breached the trillion-FLOPS mark and, as speeds rose, the appropriate terms were created (a prefix-conversion sketch follows the list):

kiloFLOPS (kFLOPS): 10³
megaFLOPS (MFLOPS): 10⁶
gigaFLOPS (GFLOPS): 10⁹
teraFLOPS (TFLOPS): 10¹²
petaFLOPS (PFLOPS): 10¹⁵
exaFLOPS (EFLOPS): 10¹⁸
zettaFLOPS (ZFLOPS): 10²¹
yottaFLOPS (YFLOPS): 10²⁴
ronnaFLOPS (RFLOPS): 10²⁷
quettaFLOPS (QFLOPS): 10³⁰
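
As a minimal sketch (naming only; the figure used is illustrative, not a benchmark), choosing the prefix reduces to finding the largest power of ten a raw FLOPS count exceeds:

    PREFIXES = [
        (1e30, "quettaFLOPS"), (1e27, "ronnaFLOPS"), (1e24, "yottaFLOPS"),
        (1e21, "zettaFLOPS"), (1e18, "exaFLOPS"), (1e15, "petaFLOPS"),
        (1e12, "teraFLOPS"), (1e9, "gigaFLOPS"), (1e6, "megaFLOPS"),
        (1e3, "kiloFLOPS"),
    ]

    def describe_flops(flops: float) -> str:
        # walk from the largest prefix down and use the first which fits
        for scale, name in PREFIXES:
            if flops >= scale:
                return f"{flops / scale:.2f} {name}"
        return f"{flops:.0f} FLOPS"

    # the 1990s "trillion FLOP mark" mentioned above:
    print(describe_flops(1.2e12))  # 1.20 teraFLOPS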

In the mysterious world of quantum computing, FLOPS are not directly applicable because the architecture and methods of operation differ fundamentally from those of classical computers.  Rather than FLOPS, the performance of quantum computers tends to be measured in qubits (quantum bits) and quantum gates (the operations that manipulate qubits).  The architectural difference is profound and is explained with the concepts of superposition and entanglement: because a qubit simultaneously can represent both “0” & “1” (superposition) and qubits can be entangled (a relationship in which distance is, at least in theory, irrelevant), under such parallelism performance cannot easily be reduced to simple arithmetic or floating-point operations, which remain the domain of classical computers operating on the binary distinction between “0” (off) and “1” (on).
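
For those who prefer code to metaphor, a toy state-vector simulation (a pedagogical sketch using numpy, not how real quantum hardware is programmed) illustrates both ideas:

    import numpy as np

    zero = np.array([1, 0], dtype=complex)  # the |0> state
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

    plus = H @ zero  # superposition: (|0> + |1>) / sqrt(2)
    print(np.abs(plus) ** 2)  # [0.5 0.5] - equal probability of measuring 0 or 1

    # A two-qubit Bell state: entangled, ie not expressible as a
    # product of two independent single-qubit states.
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>) / sqrt(2)
    print(np.abs(bell) ** 2)  # outcomes 00 and 11 only, each with probability 0.5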

Evolution of the floppy diskette: 8 inch (left), 5¼ inch (centre) & 3½ inch (right).  The track of the floppy for the past half-century has been emblematic of the IT industry in toto: smaller, higher capacity and cheaper.  Genuinely it was one of the design parameters for the 3½ inch design that it fit into a man's shirt pocket.

In IT, the term “floppy diskette” first appeared in 1971 (soon doubtless clipped to “floppy” although the first known use of the clipping dates from 1974).  The first floppy diskettes were in an 8 inch (203 mm) format which may sound profligate for something with a capacity of 80 kB (kilobyte) but the 10-20 MB (megabyte) hard drives of the time were typically the same diameter as the aperture of a domestic front-loading washing machine so genuinely they deserved the diminutive suffix (-ette, from the Middle English -ette, a borrowing from the Old French -ette, from the Latin -itta, the feminine form of -ittus, used to form nouns meaning a smaller form of something).  They were an advance also in convenience because until they became available, the usual way to transfer files between devices was to hard-wire them together.  Introduced by IBM in 1971, the capacity was two years later raised to 256 kB and by 1977 to a heady 1.2 MB (megabyte) with the advent of a double-sided, double-density format.  However, even then it was obvious the future was physically smaller media and in 1978 the 5¼ inch (133 mm) floppy debuted, initially with a formatted capacity of 360 kB but by 1982 this too had been raised to 1.2 MB using the technological advance of the HD (high density) format, and it was the 5¼ floppy which would become the first widely adopted industry “standard” for both home and business use, creating the neologism “sneakernet”, the construct being sneaker + net(work), the image being of IT nerds in their jeans and sneakers walking between various (unconnected) computers and exchanging files via diskette.  Until well into the twenty-first century the practice was far from functionally extinct and it persists even today with the use of USB sticks.

Kim Jong-un (Kim III, b 1982; Supreme Leader of DPRK (North Korea) since 2011) with a 3½ inch floppy diskette (believed to be an HD (1.44 MB) unit).

The meme-makers use the floppy because it has become a symbol of technological bankruptcy.  In OS (operating system) GUIs (graphical user interfaces) however, it endures as the “save” icon and all the evidence to date suggests symbolic objects like icons tend to outlive their source, thus the ongoing use in IT iconography of the analogue, rotary-dial phone and of the sound of a camera’s physical shutter in smart phones.  Decades from now, we may still see representations of floppy diskettes.

The last of the mainstream floppy diskettes was the 3½ inch (89 mm) unit, introduced in 1983 in double density form with a capacity of 720 kB (although in one of their quixotic moves IBM used a unique 360 kB version for their JX range aimed at the educational market) but the classic 3½ was the HD 1.44 MB unit, released in 1986.  That really was the end of the line for the format because although in 1987 a 2.88 MB version was made available, few computer manufacturers offered even the gesture of adding support at the BIOS (basic input output system) level so adoption was infinitesimal.  The 3½ inch diskette continued in wide use and there was even the DMF (Distribution Media Format) with a 1.7 MB capacity which attracted companies like Microsoft, not because it wanted more space but as an attempt to counter software piracy; within hours of Microsoft Office appearing in shrink-wrap, copying cracks appeared on the bulletin boards (where nerds did stuff before the www (world wide web)).  It was clear the floppy diskette was heading for extinction although slightly larger versions with capacities as high as 750 MB did appear but, expensive and needing different drive hardware, they were only ever a niche product seen mostly inside corporations.  By the time the CD-ROM (Compact Disc-Read Only Memory) reached critical mass in the mid-late 1990s the once ubiquitous diskette began rapidly to fade from use, the release in the next decade of USB sticks (pen drives) the final nail in the coffin for most.
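
The quoted capacities all fall out of the disk geometry (heads × tracks × sectors per track × 512-byte sectors), as the minimal sketch below shows; the famous “1.44 MB” was always marketing arithmetic, the true figure being 1,474,560 bytes (1,440 × 1,024):

    # formatted capacity = heads x tracks x sectors-per-track x 512-byte sectors
    FORMATS = {
        '5.25" DD (360 kB)':  (2, 40, 9),
        '5.25" HD (1.2 MB)':  (2, 80, 15),
        '3.5" DD (720 kB)':   (2, 80, 9),
        '3.5" HD (1.44 MB)':  (2, 80, 18),
        '3.5" DMF (1.7 MB)':  (2, 80, 21),
    }

    for name, (heads, tracks, sectors) in FORMATS.items():
        total = heads * tracks * sectors * 512
        print(f"{name}: {total:,} bytes = {total / 1024:.0f} KiB")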

In the mid 1990s, installing OS/2 Warp 4.0 (Merlin) with the optional packs and a service pack could require a user to insert and swap up to 47 diskettes.  It could take hours, assuming one didn't suffer the dreaded "floppy failure".

That was something which pleased everyone except the floppy diskette manufacturers, who had in the early 1990s experienced a remarkable boom in demand for their product when Microsoft Windows 3.1 (7 diskettes) and IBM’s OS/2 2.0 (21 diskettes) were released.  Not only was the CD-ROM a cheaper solution than multiple diskettes (a remarkably labor-intensive business for software distributors) but it was also much more reliable; tales of an installation process failing on the “final diskette” were legion and while some doubtlessly were apocryphal, “floppy failure” was far from unknown.  By the time OS/2 Warp 3.0 was released in 1994, it required a minimum of 23 floppy diskettes and version 4.0 shipped with a hefty 30 for a base installation.  Few mourned the floppy diskette and users quickly learned to love the CD-ROM.

What lay inside a 3½ inch floppy diskette.

Unlike optical discs (CD-ROM, DVD (Digital Versatile Disc) & Blu-Ray) which were written and read with the light of a laser, floppy diskettes were read with magnetic heads.  Inside the vinyl sleeve was a woven liner impregnated with a lubricant, this to reduce friction on the spinning media and help keep the surfaces clean.

Curiously though, niches remained where the floppy lived on and it was only in 2019 the USAF (US Air Force) finally retired the use of floppy diskettes which, since the 1970s, had been the standard method for maintaining and distributing the data related to the nation’s nuclear weapons deployment.  The attractions of the system for the military were (1) it worked, (2) it was cheap and (3) it was impervious to outside tampering.  Global thermo-nuclear war being a serious business, the USAF wanted something secure and knew that once data was on a device in some way connected to the outside world there was no way it could be guaranteed to be secure from those with malign intent (ayatollahs, the Secret Society of the Les Clefs d'Or, the CCP (Chinese Communist Party), the Freemasons, those in the Kremlin or Pyongyang et al) whereas a diskette locked in a briefcase or a safe was, paradoxically, the state of the art in twenty-first century security, the same philosophy which has seen some diplomatic posts in certain countries revert to typewriters & carbon paper for the preparation of certain documents.  In 2019 however, the USAF announced that after much development, the floppies had been retired and replaced with what the Pentagon described as a “highly-secure solid-state digital storage solution” which works with the Strategic Automated Command and Control System (SACCS).

It can still be done: Although no longer included in PCs & laptops, USB floppy diskette drives remain available (although support for Windows 11 systems is said to be "inconsistent").  Even 5¼ inch units have been built.

It thus came as a surprise in 2024 to learn Japan, the nation which had invented motorcycles which didn’t leak oil (the British thought they’d proved that couldn’t be done) and the QR (quick response) code, finally was abandoning the floppy diskette.  Remarkably, even in 2024, the government of Japan still routinely asked corporations and citizens to submit documents on floppies, over 1000 statutes and regulations mandating the format.  The official in charge of updating things (in 2021 he’d “declared war” on floppy diskettes) in July 2024 announced “We have won the war on floppy disks!” which must have been satisfying because he’d earlier been forced to admit defeat in his attempt to defenestrate the country’s facsimile (fax) machines, the “pushback” just too great to overcome.  The news created some interest on Japanese social media, one tweet on X (formerly known as Twitter) damning the modest but enduring floppy as a “symbol of an anachronistic administration”, presumably as much a jab at the “tired old men” of the ruling LDP (Liberal Democratic Party) as the devices.  There may however have been an element of technological determinism in the reform because Sony, the last manufacturer of the floppy, ended production in 2011 so while many remain extant, the world’s supply is dwindling.  In some ways so modern and innovative, in other ways Japanese technology sometimes remains frozen, many businesses still demanding official documents be endorsed using carved personal stamps called 印鑑 (inkan) or 判子 (hanko); despite the government's efforts to phase them out, their retirement is said to be proceeding at a “glacial pace”.  The other controversial aspect of the hanko is that the most prized are carved from ivory and it’s believed a significant part of the demand for black-market ivory comes from the hanko makers, most apparently passing through Hong Kong, for generations a home to “sanctions busters”.

Sunday, November 12, 2023

Efficacy

Efficacy (pronounced ef-i-kuh-see)

(1) A capacity for producing a desired result or effect; effectiveness; an ability to produce a desired effect under ideal testing conditions.

(2) The quality of being successful in producing an intended result; effectiveness; a measure of the degree of ability to produce a desired effect.

1520-1530: From the Old French efficace (quality of being effectual, producing the desired effect), from the Late Latin efficācia (efficacy), from efficāx (powerful, effectual, efficient), genitive efficacis (powerful, effective), from the stem of efficere (work out, accomplish).  In eleventh century English, in much the same sense, was efficace, from the Old French eficace, from the same Latin root efficācia; there was also the early fifteenth century efficacite, from the Latin efficacitatem.  The sixteenth century adjective efficacious (certain to have the desired effect) was often used of medicines (presumably a favorite of apothecaries), the construct being the Latin efficaci-, stem of efficax, from the stem of efficere (work out, accomplish) + -ous.  The –ous suffix was from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃).  The noun inefficacy (want of force or virtue to produce the desired effect) dates from the 1610s, from the Late Latin inefficacia, from inefficacem (nominative inefficax), the construct being in- (not, opposite of) + efficax.

The most familiar related form in modern use is efficacious but in general use this is often used in a more nuanced way than the pass/fail dichotomy of "efficacy" familiar in medical trials.  In general use, efficacious is a "spectrum word" which describes degrees of the ameliorative effects of treatments; while the comparative is "more efficacious", a more common form is "quite efficacious" and the superlative "most efficacious" appears to be popular among the small subset of the population who use efficacious at all.  Efficacy, efficacity & efficaciousness are nouns, effectuate is a verb, effectual & efficacious are adjectives and efficaciously is an adverb; the noun plural is efficacies.

Clinical trials in the pharmaceutical industry

In the development of vaccines (and medicinal drugs in general), efficacy trials (sometimes called phase III or explanatory trials) determine the percentage reduction of disease in a vaccinated group of people compared to an unvaccinated group, under the most favorable conditions, typically with the subjects housed in a hospital equipped to handle intensive care patients (a worked example of the calculation follows the phase descriptions below).  Conducted on human subjects if tests on animals have proved satisfactory, it’s a purely clinical exercise, practiced since 1915, and can be done as a double-blind, randomized study if no safety concerns exist.  One potentially distorting aspect of both efficacy and (particularly) safety trials is a historic bias towards healthy young males as the subjects.  The antonym of the adjective efficacious is inefficacious but the word is rarely used when drug trials produce unsatisfactory results: the punchier "failed" is almost always used.  Under normal circumstances, the testing process can take many years, the industry usually referring to trials as phases:

Phase I: Safety Trial

Phase I trials are done to test a new biomedical intervention for the first time in a small group of people (typically 20-100) to evaluate safety.  Essentially, this determines the safe dosage range and identifies side effects.

Phase II: Efficacy Trial

Phase II trials are done to study an intervention in a larger group of people (several hundred or more depending on the product) to determine efficacy (ie whether it works as intended) and further to evaluate safety.

Phase III: Clinical Study

Phase III studies are done to study the efficacy of an intervention in large groups of trial participants (often thousands) by comparing the intervention to other standard or experimental interventions (or to non-interventional standard care).  Phase III studies are also used to monitor adverse effects and to collect information that will allow the intervention to be used safely.

Phase IV: Efficiency Study

Phase IV studies are done after the drug has been released and is being prescribed.  These studies are designed to monitor the effectiveness of the approved intervention in the general population and to collect information about any adverse effects associated with widespread use over longer periods of time.  They may also be used to investigate the potential use of the intervention in a different condition, or in combination with other therapies.
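
As an illustration of the arithmetic behind the phrase “percentage reduction of disease” (a minimal sketch with invented numbers, not data from any actual trial), efficacy is conventionally the reduction in the attack rate of the vaccinated group relative to the unvaccinated:

    def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
        arv = cases_vax / n_vax          # attack rate, vaccinated group
        aru = cases_placebo / n_placebo  # attack rate, unvaccinated group
        return (1 - arv / aru) * 100     # efficacy as a percentage reduction

    # eg 10 cases among 10,000 vaccinated vs 100 among 10,000 on placebo
    print(f"{vaccine_efficacy(10, 10_000, 100, 10_000):.0f}% efficacy")  # 90%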

Proven efficacy: Adderall.

Adderall and Mydayis are trade names for a combination drug called mixed amphetamine salts (a mix of four salts of amphetamine).  In all Western jurisdictions Adderall is a prescription medication; belonging to a class of drugs known as stimulants, it contains two active ingredients: amphetamine and dextroamphetamine.  As a prescribed medicine, primarily Adderall is used in the treatment of attention deficit hyperactivity disorder (ADHD), a neurobehavioral disorder characterized by symptoms such as inattention, hyperactivity, and impulsivity.  Adderall works by increasing the levels of certain neurotransmitters (most critically dopamine and norepinephrine) in the brain, both of which play some role in regulating attention, focus, and impulse control.  Beyond ADHD, Adderall is sometimes prescribed off-label for the treatment of narcolepsy, a sleep disorder characterized by excessive daytime sleepiness and sudden, unpredictable episodes of sleep.

Adderall also has something of a cult following among those who seek to experience some of its more desirable side-effects.  Like many of the earlier amphetamines (most famously Tenuate Dospan (diethylpropion or amfepramone), an appetite suppressant of legendary efficacy), Adderall can assist in weight-loss and can safely be used for this by most people but because of its potential for dependence, it should be taken (for whatever purpose) only under clinical supervision.  For those prescribed Adderall, in some circumstances, it may continue to be taken (at the prescribed level) even if one is in a substance rehabilitation facility as was the case in 2013 when Lindsay Lohan completed a 48-hour drug detox at the Betty Ford Clinic in Indio, California.  Ms Lohan was prescribed Adderall after being diagnosed with ADHD but the standard protocol used by rehab clinics is that doctors routinely re-evaluate (1) the ADHD diagnosis and (2) the efficacy of the treatment regime.  Depending on their findings, doctors can prescribe alternative drugs or cease drug intervention entirely.  Ms Lohan was quoted as saying she’d been using Adderall "for years", that she could not function without it and that her choice of rehab facility was one which would permit both smoking (tobacco) and the use of Adderall.

As she earlier explained it: “I have severe ADD. I can’t stand still.  So, I take Adderall for that; it calms me.”  Ms Lohan further noted she was not unaware there were those who took Adderall for its side-effects, notably weight loss or the ability to function effectively for extended durations without needing to sleep, but that she wasn’t someone who needed to regulate her weight and that her sleeping patterns were normal.  However, the cult, if anything, is growing and in the US shortages of Adderall were reported (not for the first time) in late 2022.  The US Food and Drug Administration (FDA) responded by issuing a statement noting that while there was nothing unusual about episodic shortages of generic drugs at any given time (because the profit margins are low and production is sometimes restricted in favor of higher-margin products), Adderall was “a special case because it is a controlled substance and the amount available for prescription is controlled by the Drug Enforcement Administration (DEA).”  The FDA added that because there had been “a tremendous increase in prescribing” driven by virtual medicine (e-consultations) and a general trend towards over-prescribing and over-diagnosing, the periodic shortages were likely to continue.  The FDA’s conclusion was that “if only those who needed these drugs got them, there probably wouldn't be a [stimulant medication] shortage” but the diagnosis of ADHD continues to grow and the desire for rapid weight-loss solutions remains strong.

Friday, July 7, 2023

Cruise

Cruise (pronounced krooz)

(1) To sail about on a pleasure trip (often as cruising).

(2) To sail about, as a warship patrolling a body of water.

(3) To travel about without a particular purpose or destination.

(4) To fly, drive, or sail at a constant speed that permits maximum operating efficiency for sustained travel.

(5) In aeronautics, the portion of aircraft travel at a constant airspeed and altitude between ascent and descent phases.

(6) To travel at a moderately fast, easily controllable speed.

(7) To travel about slowly, looking for customers or for something demanding attention.

(8) As cruise missile, an intermediate-range weapon.

(9) Among male homosexuals, actively to seek a casual sexual partner by moving about a particular area known to be frequented by those there for such purposes, a productive area being known as “cruisy” (“to troll” & “trolling” were once used as synonyms but those terms have now been claimed by their use on the internet).

(10) In informal use in the US military, a period spent in the Marine Corps.

(11) In casual use in sporting competition, easily to win.

1645-1655:  From the Dutch kruisen (to cross, sail to and fro), from kruis or cruis (cross), from the Middle Dutch cruce, from the Latin crux.  Root was the primitive Indo-European sker (to turn, to bend); etymologists suggest it may be cognate with the Latin circus (circle) and curvus (curve).  In English, it began to be used as a noun in 1706 in the sense of “a voyage taken in courses” and by 1906 as “a voyage taken by tourists on a ship".  It was related to the French croiser (to cross, cruise), the Spanish cruzar and the German kreuzen.  The alternative spelling cruize is obsolete.  Cruise & cruising are nouns & verbs, cruised is a verb, cruiser is a noun and cruisy is an adjective; the noun plural is cruises.

Cruiser in the sense of "one who or that which cruises" (agent noun from the verb cruise) is from the 1670s, probably borrowed from similar words in continental languages (such as the Dutch cruiser & French croiseur).  In older use, a cruiser was a warship built to patrol and protect the commerce of the state to which it belonged and to chase hostile ships; cruisers were the classic gun boats used by the European colonial powers for patrolling their empires.  In this use they were often compared to the frigates of old in that they possessed good speed and were employed to protect the trade-routes, to glean intelligence and to act as the “eyes of the fleet”; in casual use during the eighteenth century, the term was often applied to the ships of privateers (pirates).  Cruiser was used from 1903 to describe homosexuals “cruising for sex partners" (ie frequenting and lingering in places well-known for such things) and as a boxing weight class (cruiserweight) from 1920.  The meaning "police patrol car" is a 1929 adoption from American English.

Royal Navy battlecruiser HMS Hood entering Valletta harbor, Malta 1937.

In admiralty use, cruisers are now the largest of the conventional warships still in service.  Navies used the term “cruiser” more as a description of the tasks for which the ships were used than of the specific nature of their construction, the early cruisers being those ships used for long-range missions such as coastal raiding or scouting, and it was only in the late nineteenth century, as the fleets grew and became more specialized, that the classic model of the corvette / frigate / destroyer / cruiser / battleship evolved.  Even then there were distinctions such as light & heavy cruisers but the most interesting development in warship architecture was the battlecruiser, built essentially because the Dreadnought had created “a gap in the market”.  Battlecruisers were battleships with less armor, thereby gaining speed at the cost of greater vulnerability.  The theory was they would have the firepower to out-gun all but the battleships and those they could out-run with their greater speed.  The concept seemed sound and in December 1914, at the Battle of the Falkland Islands, two Royal Navy battlecruisers vindicated the theory when they chased and destroyed the German East Asia Squadron.  However, in 1916, the performance of the battlecruisers in the Jutland engagement forced the Admiralty to re-consider.  Jutland was the closest thing to the great battle of the fleets which had been anticipated for decades but proved anti-climactic, both sides ultimately choosing to avoid the decisive encounter which offered the chance of victory or defeat.  What it did prove was that the naval theorists had been right; the battlecruiser could not fight the battleship and if their paths threatened to cross, the less-armored vessel should retreat and rely on greater speed to make good her escape.  There were technical deficiencies in the British ships, without which perhaps three of their battlecruisers wouldn’t have been lost, but what happened at Jutland made it clear to the admirals that uneven contests between the big capital ships were to be avoided.  The consequence was that the battlecruiser became unfashionable and after the round of disarmament in the 1920s, none were built until, unexpectedly, the Soviet Navy commissioned four in the 1980s.  They proved the last of the breed.

Origin of cruise missiles

US Pershing II missiles at the Neu-Ulm military base, Swabia, Bavaria, in the then Federal Republic of Germany (the FRG, the old West Germany), 1984.

Carrying large warheads long distances, cruise missiles are guided weapons used against ground targets; they fly at both subsonic and supersonic speeds, remain in the atmosphere and, self-propelled for most of their flight, travel mostly at a constant speed.  In this they differ from ballistic missiles which fly in an arc, often reaching suborbital flight, with a final trajectory much like a bullet because, once the fuel is expended, the path from that point is determined by the speed and direction of launch and the force of gravity pulling towards Earth.  Both cruise and ballistic missiles can carry nuclear warheads but cruise missiles are most often equipped with conventional warheads.  Theorists and researchers were exploring the possibility of military missiles as early as 1908, described then as the aerial torpedo, envisaged as a remote-controlled weapon with which to shoot down airships bombing London, perceived then as the most credible airborne delivery system.  Between the first and second world wars, the major powers all devoted resources to research but few projects reached even the prototype stage.
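
The distinction can be reduced to schoolbook physics; a minimal sketch (idealized: no air resistance, flat Earth, illustrative figures only) contrasts the two flight regimes:

    import math

    g = 9.81  # gravitational acceleration, m/s^2

    def ballistic_range(v0: float, launch_angle_deg: float) -> float:
        # once the fuel is spent the arc is fixed by launch speed,
        # launch angle and gravity: R = v0^2 * sin(2*theta) / g
        theta = math.radians(launch_angle_deg)
        return v0 ** 2 * math.sin(2 * theta) / g

    def cruise_range(speed: float, flight_time: float) -> float:
        # a cruise vehicle simply flies at constant speed for its endurance
        return speed * flight_time

    print(f"{ballistic_range(2000, 45) / 1000:.0f} km")   # ~408 km
    print(f"{cruise_range(250, 30 * 60) / 1000:.0f} km")  # 250 m/s for 30 min = 450 km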

Annotated schematic of the V-1 (left) and a British Military Intelligence drawing dated 16 June 1944, three days after the first V-1 attacks on London (right).

First deployed in 1944, the German Vergeltungswaffe eins (“retaliatory weapon 1” or “reprisal weapon 1”, eventually known as the V-1) was the world’s first cruise missile.  One of the rare machines to use a pulse-jet, it emitted such a distinctive sound that those at whom it was aimed nicknamed it the “buzz-bomb” although it attracted other names including “flying bomb” and “doodlebug”.  In Germany, before Dr Joseph Goebbels (1897–1945; Reich Minister of Propaganda 1933-1945) decided it was the V-1, the official military code name was Fi 103 (the Fi stood for Fieseler, the original builder of the airframe and most famous for their classic Storch (Stork) short take-off & landing (STOL) aircraft) but there were also the code-names Maikäfer (maybug) & Kirschkern (cherry stone).  While the Allied defenses against the V-1 did improve over time, it was only the destruction of the launch sites and the occupation of territory within launch range that ended the attacks.  Until then, the V-1 remained a highly effective terror weapon but, like the V-2 and so much of the German armaments effort, bureaucratic empire-building and political intrigue compromised the efficiency of the project.

Lindsay Lohan on a cruise in the Maldives, January 2019.

The V-1 used a gyroscope guidance system and was fitted with an unusual triple-layer fuse system, the primary device and a backup augmented by a fail-safe designed to ensure the destruction of “duds” (weapons which fail to detonate) so they couldn’t be examined.  The accuracy of the thing was sufficient only for use against very large targets (such as the general area of a city, which made sprawling London ideal) while the range of 250 km (155 miles) was significantly less than that of a medium bomber carrying the same payload.  The main advantages were speed (although not sufficient to outrun the fastest of the low-altitude propeller-driven interceptors), expendability and economy of operation.  Indeed, it was probably the war’s outstanding delivery system in terms of cost per ton of explosive, able to carry a warhead of 850 kg (1,870 lb) to London at a tiny fraction of the cost of using manned aircraft for the same task, with the priceless additional benefit of not risking the loss of aircrew.  The production cost of a V-1 was also only a small fraction of that of the supersonic V-2 ballistic missile which carried a warhead of only similar size although, once launched, the V-2 was effectively invulnerable.  Unlike the V-2, the initial deployments of the V-1 required large, fixed launch ramps which were relatively easy to detect and susceptible to bombardment.  Later experiments produced much smaller launch facilities which provided for a greater rate of sustained fire.  Bomber-launched variants of the V-1 saw limited operational service near the end of the war, with the pioneering V-1's design reverse-engineered by the Americans as the Republic-Ford JB-2 cruise missile.

Luftwaffe Mistel aircraft (Focke-Wulf Fw 190 (upper) & Junkers Ju 88 (lower)), Merseburg, Germany, 1945.

The "cruise missile" project which was the best example of the improvisation which characterized much of the ad-hoc weapon development of war time was the Mistel (mistletoe) or Beethoven-Gerät (Beethoven Device) composite aircraft program which the Germans developed in 1943.  It was a rudimentary air-launched cruise missile, made by a piloted fighter aircraft being mounted atop an unpiloted bomber-sized aircraft, packed with explosives and the larger aircraft would be released to glide towards the target.  Calling it the mistletoe reveals a sense of humor mot usually associated with the Luftwaffe but it was known rather more evocatively as the Vati und Sohn (Daddy and Son) or the Huckepack (Piggyback).  Although built in the hundreds, by the time it was available for deployment, the scope for attacking large targets with manned aircraft had reduced and the need was for precision delivery, something for which the Mistel was ill-suited and success was limited.

Wednesday, March 23, 2022

Ouija

Ouija (pronounced wee-juh (sometimes wee-jee (US)))

(1) An instrument in the shape of a board on which is written the alphabet, the numbers 0-9 and the words "Yes", "No" & "Goodbye" (with occasional additions).  The board is used during a séance to contact spirits of the dead, the characters selected by means of a small, heart-shaped piece called a planchette upon which the participants collectively place their hands, the planchette then guided by the spirit(s) to the appropriate letter or number.

(2) As Ouija board, a small-scale replica of an aircraft carrier's flight and hangar decks, installed in the flight control room and manually updated with scale models as a communications fail-safe.  Used in every US carrier since WWII (although now in the throes of being replaced by electronic versions).

1891: A trademark name granted to the Kennard Novelty Company (US), a compound of the French oui (yes) and the German ja (yes).  Oui is from the Old French oïl, a compound of o (the affirmative particle) and il (he), akin to o-je (I), o-tu (thou), o-nos (we) and o-vos (you), all “yes” constructions built with pronouns.  O and òc are both from the Latin hoc (this) and may correspond to the Vulgar Latin construction hoc ille.  Ja is from the Middle High German ja, from the Old High German jā (yes), from the Proto-Germanic ja, ultimately from a primitive Indo-European form meaning “already”.  It was cognate with the Dutch ja, the English yea (yes) and the Latin iam (already).

Although Ouija, as a propriety brand-name, dates only from 1891, similar boards existed in China from circa 1100 BC and have long been part of occult and spiritual practice in the west, attaining great popularity in the mid-nineteenth century and again during WWI and its aftermath.

Analog Ouija Board on USS Ronald Reagan aircraft carrier.

Available for niche markets.