
Sunday, November 12, 2023

Efficacy

Efficacy (pronounced ef-i-kuh-see)

(1) A capacity for producing a desired result or effect; effectiveness; an ability to produce a desired effect under ideal testing conditions.

(2) The quality of being successful in producing an intended result; effectiveness; a measure of the degree of ability to produce a desired effect.

1520-1530: From the Old French efficace (quality of being effectual, producing the desired effect), from the Late Latin efficācia (efficacy), from efficāx (powerful, effectual, efficient), genitive efficacis (powerful, effective), from stem of efficere (work out, accomplish).  In eleventh century English, in much the same sense was efficace from the Old French eficace from the same Latin root efficācia; there was also the early fifteenth century efficacite from the Latin efficacitatem.  The sixteenth century adjective efficacious (certain to have the desired effect) was often used of medicines (presumably a favorite of apothecaries), the construct being the Latin efficaci-, stem of efficax from the stem of efficere (work out, accomplish)  + -ous.  The –ous suffix was from the Middle English -ous, from the Old French –ous & -eux, from the Latin -ōsus (full, full of); a doublet of -ose in an unstressed position.  It was used to form adjectives from nouns, to denote possession or presence of a quality in any degree, commonly in abundance.  In chemistry, it has a specific technical application, used in the nomenclature to name chemical compounds in which a specified chemical element has a lower oxidation number than in the equivalent compound whose name ends in the suffix -ic.  For example, sulphuric acid (H2SO4) has more oxygen atoms per molecule than sulphurous acid (H2SO3).  The noun inefficacy (want of force or virtue to produce the desired effect) dates from the 1610s, from the Late Latin inefficacia, from inefficacem (nominative inefficax), the construct being in- (not, opposite of) + efficax.

The most familiar related form in modern use is efficacious but in general use this is often used in a more nuanced way than the pass/fail dichotomy of "efficacy" familiar in medical trials.  In general use, efficacious is a "spectrum word" which describes degrees of the ameliorative effects of treatments; while the comparative is "more efficacious", a more common form is "quite efficacious" and the superlative "most efficacious" appears to be popular among the small subset of the population who use efficacious at all.  Efficacy, efficacity & efficaciousness are nouns, effectuate is a verb, effectual & efficacious are adjectives and efficaciously is an adverb; the noun plural is efficacies.

Clinical trials in the pharmaceutical industry

In the development of vaccines (and medicinal drugs in general), efficacy trials (sometimes called phase III or explanatory trials) determine the percentage reduction of disease in a vaccinated group of people compared to an unvaccinated group, under the most favorable conditions, typically with the subjects housed in a hospital equipped to handle intensive care patients.  Conducted on human subjects if tests on animals have proved satisfactory, it’s a purely clinical exercise, practiced since 1915, and can be done as a double-blind, randomized study if no safety concerns exist.  One potentially distorting aspect of both efficacy and (particularly) safety trials is a historic bias towards healthy young males as the subjects.  The antonym of the adjective efficacious is inefficacious but the word is rarely used when drug trials produce unsatisfactory results: the punchier "failed" is almost always used.  Under normal circumstances, the testing process can take many years, the industry usually referring to trials as phases:
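The "percentage reduction" described above is conventionally computed as VE = (ARU − ARV) / ARU × 100, where ARU and ARV are the attack rates in the unvaccinated and vaccinated groups.  That formula is standard epidemiology rather than anything stated in the trial descriptions here, so treat the following as an illustrative sketch only:

```python
def vaccine_efficacy(cases_vaccinated: int, n_vaccinated: int,
                     cases_unvaccinated: int, n_unvaccinated: int) -> float:
    """Percentage reduction of disease in the vaccinated group relative
    to the unvaccinated group: VE = (ARU - ARV) / ARU * 100."""
    arv = cases_vaccinated / n_vaccinated        # attack rate, vaccinated
    aru = cases_unvaccinated / n_unvaccinated    # attack rate, unvaccinated
    return (aru - arv) / aru * 100

# e.g. 10 cases among 10,000 vaccinated vs 100 among 10,000 unvaccinated
print(round(vaccine_efficacy(10, 10_000, 100, 10_000), 1))  # 90.0
```

A trial reporting "90% efficacy" is thus a statement about relative risk reduction under trial conditions, not a claim that 90% of recipients are protected in the general population.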

Phase I: Safety Trial

Phase I trials are done to test a new biomedical intervention for the first time in a small group of people (typically 20-100) to evaluate safety.  Essentially, this determines the safe dosage range and identifies side effects.

Phase II: Efficacy Trial

Phase II trials are done to study an intervention in a larger group of people (several hundred or more depending on the product) to determine efficacy (ie whether it works as intended) and further to evaluate safety.

Phase III: Clinical Study

Phase III studies are done to study the efficacy of an intervention in large groups of trial participants (often thousands) by comparing the intervention to other standard or experimental interventions (or to non-interventional standard care).  Phase III studies are also used to monitor adverse effects and to collect information that will allow the intervention to be used safely.

Phase IV: Effectiveness Study

Phase IV studies are done after the drug has been released and is being prescribed.  These studies are designed to monitor the effectiveness of the approved intervention in the general population and to collect information about any adverse effects associated with widespread use over longer periods of time.  They may also be used to investigate the potential use of the intervention in a different condition, or in combination with other therapies.

Proven efficacy: Adderall.

Adderall and Mydayis are trade names for a combination drug called mixed amphetamine salts (a mix of four salts of amphetamine).  Belonging to a class of drugs known as stimulants, in all Western jurisdictions Adderall is a prescription medication and it contains two active ingredients: amphetamine and dextroamphetamine.  As a prescribed medicine, Adderall is used primarily in the treatment of attention deficit hyperactivity disorder (ADHD), a neurobehavioral disorder characterized by symptoms such as inattention, hyperactivity, and impulsivity.  Adderall works by increasing the levels of certain neurotransmitters (most critically dopamine and norepinephrine) in the brain, both of which play some role in regulating attention, focus, and impulse control.  Beyond ADHD, Adderall is sometimes prescribed off-label for the treatment of narcolepsy, a sleep disorder characterized by excessive daytime sleepiness and sudden, unpredictable episodes of sleep.

Adderall also has something of a cult following among those who seek to experience some of its more desirable side-effects.  Like many of the earlier amphetamines (most famously Tenuate Dospan (diethylpropion or amfepramone), an appetite suppressant of legendary efficacy), Adderall can assist in weight-loss and can safely be used for this by most people but, because of its potential for dependence, it should be taken (for whatever purpose) only under clinical supervision.  For those prescribed Adderall, in some circumstances it may continue to be taken (at the prescribed level) even if one is in a substance rehabilitation facility, as was the case in 2013 when Lindsay Lohan completed a 48-hour drug detox at the Betty Ford Clinic in Indio, California.  Ms Lohan was prescribed Adderall after being diagnosed with ADHD but the standard protocol used by rehab clinics is that doctors routinely re-evaluate (1) the ADHD diagnosis and (2) the efficacy of the treatment regime.  Depending on their findings, doctors can prescribe alternative drugs or cease drug intervention entirely.  Ms Lohan was quoted as saying she’d been using Adderall "for years" and that she could not function without it, and her choice of rehab facility was one which would permit both smoking (tobacco) and the use of Adderall.

As she earlier explained it: “I have severe ADD. I can’t stand still.  So, I take Adderall for that; it calms me.”  Ms Lohan further noted she was not unaware there were those who took Adderall for its side-effects, notably weight loss or the ability to function effectively for extended durations without needing to sleep, but that she wasn’t someone who needed to regulate her weight and that her sleeping patterns were normal.  However, the cult if anything is growing and in the US shortages of Adderall were reported (not for the first time) in late 2022.  The US Food and Drug Administration (FDA) responded by issuing a statement noting that while there was nothing unusual about episodic shortages of generic drugs at any given time, because the profit margins are low and production is sometimes restricted to avoid a sacrifice in the opportunity cost vis-a-vis higher margin products, Adderall was “a special case because it is a controlled substance and the amount available for prescription is controlled by the Drug Enforcement Administration (DEA).”  The FDA added that because there had been “a tremendous increase in prescribing” because of virtual medicine (e-consultations) and a general trend towards over-prescribing and over-diagnosing, the periodic shortages were likely to continue.  The FDA’s conclusion was that “if only those who needed these drugs got them, there probably wouldn't be a [stimulant medication] shortage” but the diagnosis of ADHD continues to grow and the desire for rapid weight-loss solutions remains strong.

Monday, April 8, 2024

Virtual

Virtual (pronounced vur-choo-uhl)

(1) Being as specified in power, force, or effect, though not actually or expressly such; having the essence or effect but not the appearance or form.

(2) In optics, of an image (such as one in a looking glass), formed by the apparent convergence of rays that are prolonged geometrically, but not actually (as opposed to a real image).

(3) Being a focus of a system forming such images.

(4) In mechanics, pertaining to a theoretical infinitesimal velocity in a mechanical system that does not violate the system's constraints (applied also to other physical quantities); resulting from such a velocity.

(5) In physics, pertaining to a theoretical quality of something which would produce an observable effect if counteracting factors such as friction are disregarded (used often of the behavior of water if a factor such as friction were to be disregarded).

(6) In physics, designating or relating to a particle exchanged between other particles that are interacting by a field of force (such as a “virtual photon” and used also in the context of an “exchange force”).

(7) In digital technology, real, but existing, seen, or happening online or on a digital screen, rather than in person or in the physical world (actually an adaptation of an earlier use referring to political representation).

(8) In particle physics, pertaining to particles in temporary existence due to the Heisenberg uncertainty principle.

(9) In quantum mechanics, of a quantum state: having an intermediate, short-lived, and unobservable nature.

(10) In computing (of data storage media, operating systems, et al) simulated or extended by software, sometimes temporarily, in such a way as to function and appear to the user as a physical entity.

(11) In computing, of a class member (in object-oriented programming), capable of being overridden with a different implementation in a subclass.

(12) Relating or belonging to virtual reality (once often used as “the virtual environment” and now sometimes clipped to “the virtual”) in which with the use of headsets or masks, experiences to some degree emulating perceptions of reality can be produced with users sometimes able to interact with and change the environment.

(13) Capable of producing an effect through inherent power or virtue (archaic and now rare, even as a poetic device).

(14) Virtuous (obsolete).

(15) In botany, (literally, also figuratively), of a plant or other thing: having strong healing powers; a plant with virtuous qualities (obsolete).

(16) Having efficacy or power due to some natural qualities; having the power of acting without the agency of some material or measurable thing; possessing invisible efficacy; producing, or able to produce, some result; effective, efficacious.
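Sense (11), the object-oriented programming use, is easiest to see in code.  In C++ the keyword is literally "virtual"; in Python every method is effectively virtual, so a subclass override is dispatched at run time.  A minimal sketch (all class and method names invented for illustration):

```python
class Shape:
    def area(self) -> float:
        """A "virtual" method: subclasses are expected to override it
        with their own implementation."""
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side: float):
        self.side = side

    def area(self) -> float:   # overrides the base-class member
        return self.side ** 2

# Dispatch happens at run time: though typed as Shape, the Square
# implementation of area() is the one invoked.
shapes: list[Shape] = [Square(3.0)]
print(shapes[0].area())  # 9.0
```

The point of the mechanism is that code written against the base class keeps working when new subclasses supply their own implementations.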

1350–1400: From the Middle English virtual & vertual (there were other spellings, many seemingly ad hoc, something far from unusual), from the Old French virtual & vertüelle (persisting in Modern French as virtuel), from their etymon the Medieval Latin virtuālis, the construct being the Classical Latin virtū(s) (of or pertaining to potency or power; having power to produce an effect, potent; morally virtuous (and ultimately the source of the modern English “virtue” from the Latin virtūs (virtue))) + -ālis.  The Latin virtūs was from vir (adult male, man), ultimately from the primitive Indo-European wihrós (man) (the construct of which may have been weyh- (to chase, hunt, pursue) + -tūs (the suffix forming collective or abstract nouns)).  The –alis suffix was from the primitive Indo-European -li-, which later dissimilated into an early version of –āris and there may be some relationship with hel- (to grow); -ālis (neuter -āle) was the third-declension two-termination suffix and was suffixed to (1) nouns or numerals creating adjectives of relationship and (2) adjectives creating adjectives with an intensified meaning.  The suffix -ālis was added (usually, but not exclusively) to a noun or numeral to form an adjective of relationship to that noun.  When suffixed to an existing adjective, the effect was to intensify the adjectival meaning, and often to narrow the semantic field.  If the root word ends in -l or -lis, -āris is generally used instead although because of parallel or subsequent evolutions, both have sometimes been applied (eg līneālis & līneāris).  The alternative spellings vertual, virtuall and vertuall are all obsolete.  Virtual is a noun & adjective, virtualism, virtualist, virtualness, virtualization (also as virtualisation) & virtuality are nouns, virtualize (also as virtualise) is a verb and virtually is an adverb; the noun plural is virtuals.  The noun virtualosity is non-standard.

The special use in physics (pertaining to a theoretical infinitesimal velocity in a mechanical system that does not violate the system’s constraints) came into English directly from the French.  The noun use is derived from the original adjective.  Virtual is commonly used in the sense of being synonymous with “de facto”, something which can now be misleading because “virtual” has become so associated with the modern use related to computing.  In military matters it has been used as “a virtual victory” to refer to what would by conventional analysis be thought a defeat, the rationale being the political or economic costs imposed on the “winner” were such that the victory was effectively pyrrhic.  It was an alternative to the concept of “tactical defeat; strategic victory” which probably was a little too abstract for some.

"Virtual art galleries" range from portals which enable works to be viewed on any connected device to actual galleries where physical works are displayed on screens or in some 3D form, either as copies or with a real-time connection to the original.   

In computing, although “virtual reality” is the best known use, the word has for some time been used variously.  “Virtual memory” (which nerds insist should be called “virtual addressing”) is a software implementation which enables an application to use more physical memory than actually exists.  The idea dates from the days of the early mainframes, when the distinction between memory and storage space often wasn’t as explicit as it would later become, and it became popular in smaller systems (most obviously PCs) at a time when the unit cost of RAM (random access memory) hardware was significantly higher than that of the default storage medium, the HDD (hard disk drive).  Being electronic rather than mechanical, RAM was many orders of magnitude faster than the I/O (input/output) possible on hard disks, but allocating a portion of free disk space to emulate RAM (hence the idea of “virtual memory”) did make possible many things which would not run were a system able to work only with the installed physical RAM, and rapidly it became a mainstream technique.
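The mechanism described (disk space standing in for RAM) is implemented transparently by the operating system, but the flavour of it can be seen from user space with a memory-mapped file: bytes addressed as if they were in memory are actually backed by the disk.  A hedged sketch using Python's standard mmap module (real OS paging is far more elaborate than this):

```python
import mmap
import tempfile

# Create a disk file and address its contents as if it were memory --
# the same idea (disk backing "memory") that underlies paging, though
# here the mapping is explicit rather than kernel-managed.
with tempfile.TemporaryFile() as f:
    f.truncate(4096)                       # reserve one 4 KiB "page" on disk
    with mmap.mmap(f.fileno(), 4096) as mem:
        mem[0:5] = b"hello"                # looks like a memory write...
        print(mem[0:5])                    # ...but the bytes live in the file
```

Writes through the mapping go to the file, so a program can work with more "memory" than it has physical RAM, at the cost of disk-speed access when a page is not cached.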

There’s also the VPN (virtual private network), a technology which creates a secure and encrypted connection over a public network (typically the Internet); use is common to provide remote access to a private network or to establish a secure tunnel between two networks using the internet for transport.  The advantage of VPNs is they should ensure data integrity and confidentiality, the two (or multi) node authentication requirement making security breaches not impossible but less likely.  Widely used by corporations, VPNs are best known as the way traditionally used to evade surveillance and censorship in jurisdictions as diverse as the PRC (People’s Republic of China), the Islamic Republic of Iran and the UK, although this is something of an arms race, the authorities with varying degrees of enthusiasm working out ways to defeat the work-arounds.  VPNs often use an IP tunnel, a related concept, but the IP tunnel is a technique used to encapsulate one type of network packet within another type of network packet to transport it over a network that wouldn't normally support the type of packet being transported.  IP tunnels are particularly useful in connecting networks using different protocols and (despite the name) the utility lies in them being able to transport just about any type of network traffic (not just IP).  A modular technology, not all IP tunnels natively provide authentication & encryption but most support “bolt-ons” which can add either or both.  So, while all VPNs use some form of tunnelling (however abstracted), not all tunnels are VPNs.
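Encapsulation as described, one packet carried verbatim as the payload of another, can be sketched in a few lines.  This is a toy illustration only, not any real tunnelling protocol; the header layout and field names are invented:

```python
import struct

def encapsulate(inner_packet: bytes, proto_id: int) -> bytes:
    """Wrap an arbitrary inner packet in a toy outer header:
    2-byte protocol id + 2-byte length (network byte order),
    followed by the inner packet untouched."""
    return struct.pack("!HH", proto_id, len(inner_packet)) + inner_packet

def decapsulate(outer_packet: bytes) -> tuple[int, bytes]:
    """Strip the toy outer header and recover the inner packet."""
    proto_id, length = struct.unpack("!HH", outer_packet[:4])
    return proto_id, outer_packet[4:4 + length]

inner = b"\x45\x00\x00\x14"          # pretend this is an IP datagram
outer = encapsulate(inner, proto_id=0x0800)
assert decapsulate(outer) == (0x0800, inner)
```

The inner packet passes through unchanged, which is why (as the text notes) a tunnel can carry just about any traffic: the transit network only ever inspects the outer header.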

Microsoft really wanted you to keep their Java Virtual Machine.

Then there are “virtual machines”.  In personal computing, the machine came quickly to be thought of as a box to which a monitor and keyboard were attached and originally it did one thing at a time; it might be able to do many things but not simultaneously.  That situation didn’t last long but the idea of the connection between one function and one machine was carried over to the notion of the “virtual machine”, which was software existing on one machine but behaving functionally like another.  This could include even full-blown installations of the operating systems of several servers running on specialized software (sometimes in conjunction with hardware components) on a single server.  What made this approach practical was that it is not unusual for a server to be under-utilized for most of its life (critical components often recording 2-3% utilization for extended periods), thus the attraction of using one physical server rather than several.  Obviously, the economic case was also compelling, the cost savings of having one server rather than a number multiplied by reductions in electricity use, cooling needs, insurance premiums and the rent of space.  There was also trickery, Microsoft’s JVM (Java Virtual Machine) an attempt to avoid having to pay licensing fees to Sun Microsystems (later absorbed by Oracle) for the use of a Java implementation.  The users were mostly indifferent but while the hardware was fooled, the judges were not and the JVM was eventually declared an outlaw.

Operating a computer remotely (there are a few ways to do this) rather than being physically present is sometimes called “virtual” although “remote” seems to have become more fashionable (the form “telecommuting”, used as early as 1968, is as archaic as the copper-pair analogue telephone lines over which it was implemented although “telemedicine” seems to have survived, possibly because in many places voice using an actual telephone remains a part).  In modern use (and the idea of virtual as “not physically existing but made to appear by software” was used as early as 1959), there are all sorts of “virtuals” (virtual personal trainers, virtual assistants et al), the idea in each case being that the functionality offered by the “real version” is, in whole or in part, emulated by the “virtual version”, the latter at one time also referred to as a “cyberreal”, another word from the industry which never came into vogue.  “Virtual keyboards” are probably the most common virtual device used around the world, now the smartphone standard, the demise of the earlier physical devices apparently regretted only by those with warm memories of their Blackberries.  Virtual keyboards do appear elsewhere and they work, although obviously offer nothing like the tactile pleasure of an IBM Model M (available from ClickyKeyboards.com).  The idea of “a virtual presence” is probably thought something very modern and associated with the arrival of computing but it has history.  In 1766, in the midst of the fractious arguments about the UK’s reaction to the increasing objections heard from the American colonies about “taxation without representation” and related matters (such as the soon to be infamous Stamp Act), William Pitt (1708-1778 (Pitt the Elder and later Lord Chatham); UK prime-minister 1766-1768) delivered a speech in the House of Commons.
Aware his country’s government was conducting a policy as inept as that the US would 200 years on enact in Indochina, his words were prescient but ignored.  Included was his assertion the idea of “…virtual representation of America in this house is the most contemptible idea that ever entered into the head of man and it does not deserve serious refutation.”  However, refute quite seriously just about everything his government was doing he did.  Pitt’s use of the word in this adjectival sense was no outlier, the meaning “being something in essence or effect, though not actually or in fact” dating from the mid-fifteenth century, an evolution of the sense of a few decades earlier when it was used to mean “capable of producing a certain effect”.  The adverb virtually was also an early fifteenth century form in the sense of “as far as essential qualities or facts are concerned” while the meaning “in effect, as good as” emerged by the early seventeenth.

Lindsay Lohan's 2021 predictions of the US$ value of Bitcoin (BTC) & Ethereum (ETH).  By April 2024 the trend was still upward so the US$100,000 BTC may happen.  

In general use, the terms “cybercurrency”, “cryptocurrency” & “virtual currency” tend to be used interchangeably and probably that has no practical consequences, all describing electronic (digital) “currencies” which typically are decentralized, the main point of differentiation being that cryptocurrencies claim to be based on cryptographic principles and usually are limited in the volume of their issue (although the decimal point makes this latter point of little practical significance).  Whether they should be regarded as currencies is a sterile argument because simultaneously they are more and less, being essentially a form of gambling, but for certain transactions (such as illicit drugs traded on various platforms) they are the preferred currency and in many jurisdictions they remain fully convertible; it’s telling the values are expressed almost always in US$, “cross-rates” (ie against other cryptocurrencies) rarely quoted.  However, to be pedantic, a “virtual currency” is really any not issued by a central government or authority (in the last one or two centuries usually a national or central bank) and they can include in-game currencies, reward points and, of course, cybercurrencies.  The distinguishing feature of a cryptocurrency is the cryptography.

Although the term is not widely used, in Christianity, "virtuality" was the view that, contrary to the Roman Catholic doctrine of transubstantiation, the bread & wine central to Holy Communion do not literally transform into flesh and blood but are the medium or mechanism through which the spiritual or immaterial essence of the flesh and blood of Jesus Christ are received.  Within the Church, those who espoused or adhered to the heresy of virtuality were condemned as "virtualists".  In philosophy, the concept of virtuality probably sounds something simple to students but of course academic philosophy has a “marginal propensity to confuse”, the important distinction being “virtual” is not opposed to “real” but instead to “actual”, “real” being opposed to “possible”.

Monday, September 2, 2024

Malachite

Malachite (pronounced mal-uh-kahyt)

(1) In mineralogy, a bright-green monoclinic mineral, occurring as a mass of crystals (an aggregate).  It manifests typically with a smooth or botryoidal (grape-shaped) surface and, after cutting & polishing, is used in ornamental articles and jewelry.  It’s often concentrically banded in different shades of green, the contrast sometimes lending the substance the appearance of being a variegated green & black.  Malachite is found usually in veins in proximity to the mineral azurite in copper deposits.  The composition is hydrated copper carbonate; the chemical formula is Cu2CO3(OH)2 and the crystal structure is monoclinic.

(2) A ceramic ware made in imitation of this (in jewelry use, “malachite” is used often as a modifier).

(3) In mineralogy, as pseudomalachite, a mineral containing copper, hydrogen, oxygen, and phosphorus.

(4) In mineralogy, as azurite-malachite, a naturally-occurring mixture of azurite and malachite

(5) In organic chemistry, as malachite green, a toxic chemical used as a dye, as a treatment for infections in fish (when diluted) and as a bacteriological stain.

(6) Of a colour spectrum, ranging from olive-taupe to a mild to deeply-rich (at times tending to the translucent) green, resembling instances in the range in which the mineral is found.  In commercial use, the interpretation is sometimes loose and some hues are also listed as “malachite green”.

1350-1400: From the Middle French malachite, from the Old French, from the Latin molochītēs, from the Ancient Greek malachitis (lithos) (mallow (stone)) & molochîtis (derivative of molóchē, a variant of maláchē), from μολόχη (molókhē) (mallow; leaf of the mallow plant).  It replaced the Middle English melochites, from the Middle French melochite, from the Latin molochītis.  Malachite is a noun & adjective; the noun plural is malachites.

A pair of Malachite & Onyx inlay cufflinks in 925 Sterling Silver (ie 92.5% pure silver & 7.5% other metals), Mexico, circa 1970.

Although in wide use as a gemstone, technically malachite is copper ore and thus a “secondary mineral” of copper, the stone forming when copper minerals interact with different chemicals (carbonated water, limestone et al).  For this reason, geologists engaged in mineral exploration use malachite as a “marker” (a guide to the likelihood of the nearby presence of copper deposits in commercial quantities).  It’s rare for malachite to develop in isolation and it’s often found in aggregate with azurite, a mineral of similar composition & properties.  Visually, malachite & azurite are similar in their patterning and distinguished by color; azurite a deep blue, malachite a deep green.  Because the slight chemical difference between the two makes azurite less stable, malachite does sometimes replace it, resulting in a “pseudomorph”.  Although there is a range, unlike some minerals, malachite is always green and the lustrous, smooth surface with the varied patterning when cut & polished has for millennia made it a popular platform for carving, the products including art work, jewelry and decorative pieces.  For sculptors, the properties of malachite make it an easy and compliant material with which to work and it’s valued by jewelers for its color-retention properties, the stone (like many gemstones) unaffected by even prolonged exposure to harsh sunlight.  Despite the modern association of green with the emerald, the relationship between mankind & malachite is much more ancient, evidence of malachite mining dating from as early as 4000 BC having been found near the Isthmus of Suez and the Sinai whereas there’s nothing to suggest the emerald would be discovered until Biblical times, some two millennia later.

Lindsay Lohan in malachite green, this piece including both the darker and lighter ends of the spectrum.

That malachite is relatively soft meant it was easy to grind into a powder even with pre-modern equipment; it was thus used to create what is thought to be the world’s oldest green pigment (described often as chrysocolla or copper green).  In Antiquity, the dye was so adaptable it was used in paint, for clothing and the Egyptians (men & women) even found it was the ideal eye makeup.  Use persisted until oil-based preparations became available in quantity; these were much cheaper because of the labor-intensive grinding processes and the increasing price of malachite, which was in greater demand for other purposes.  This had the side-effect of creating a secondary market for malachite jewelry and other small trinkets because the fragments and wastage from the carving industry (once absorbed by the grinders for the dye market) became available.  The use in makeup wasn’t without danger because, as a copper derivative, raw malachite is toxic; like many minerals, the human body needs a small amount of copper to survive but in high doses it is a poison; in sufficient quantities, it can be fatal.  Among miners and process workers working with the ore, long-term exposure did cause severe adverse effects (from copper poisoning) so it shouldn’t be ingested or the dust inhaled.  Once polished, the material is harmless but toxicology specialists do caution it remains dangerous if ingested and any liquid with which it comes in contact should not be drunk.  Despite the dangers, the mineral has long been associated with protective properties, a belief not restricted to Antiquity or the medieval period; because the Enlightenment seems to have passed by New Agers and others, malachite pendants and other body-worn forms are still advertised with a variety of improbable claims of efficacy.

The Malachite Room of the Winter Palace, St Petersburg, Russia was, during the winter of 1838-1839, designed as a formal reception room (a sort of salon) for the Tsar & Tsarina by the artist Alexander Briullov (1798–1877), replacing the unfortunate Jasper Room, destroyed in the fire of 1837.  It’s not the only use of the stone in the palace but it’s in the Malachite Room where a “green theme” is displayed most dramatically, the columns and fireplace now Instagram favorites, as is the large urn, all sharing space with furniture from the workshops of Peter Gambs (1802-1871), those pieces having been rescued from the 1837 fire.  Between June & October 1917 it was in the Malachite Room that the Provisional Government conducted its business until the representatives were arrested by Bolsheviks while at dinner in the adjoining dining room.  The putsch was denounced by the Mensheviks, whom the Bolsheviks finally would suppress in 1921.

Polished malachite pieces from the Congo, offered on the Fossilera website.

Where there is demand for something real, a supply of an imitation version will usually emerge and the modern convention is that items erroneously claiming to be the real thing are tagged “fake malachite” while those advertised only as emulations are called “faux malachite”.  Although not infallible, one test is that most fake malachite stones are lighter than the real thing because, despite being graded as “relatively soft” by sculptors, the stone is high in density and deceptively heavy.  The patterning of natural malachite is infinitely varied while the synthetic product tends to some repetition and is usually somewhat brighter.  The density of malachite also lends the stone particular thermal properties; it’s inherently cold to the touch, something which endures even when a heat source is applied.  Fake malachite usually is manufactured using glass or an acrylic, both of which more rapidly absorb heat from the hand.
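The weight test can be made concrete.  Standard mineralogy references give malachite a specific gravity of roughly 3.6–4.0 g/cm³, against about 2.5 for common glass and about 1.2 for acrylic (those figures are general reference values, not from the text above), so a measured density well below 3.5 g/cm³ is a warning sign.  A toy screen along those lines:

```python
def density(mass_g: float, volume_cm3: float) -> float:
    """Density in g/cm3 from a kitchen-scale mass and a
    water-displacement volume measurement."""
    return mass_g / volume_cm3

def plausibly_malachite(mass_g: float, volume_cm3: float) -> bool:
    """Crude screen only: genuine malachite has a specific gravity of
    roughly 3.6-4.0 g/cm3; glass (~2.5) and acrylic (~1.2) fall short.
    Thresholds here are illustrative, not gemological standards."""
    return 3.5 <= density(mass_g, volume_cm3) <= 4.1

print(plausibly_malachite(39.0, 10.0))   # True  -- 3.9 g/cm3, plausible
print(plausibly_malachite(25.0, 10.0))   # False -- 2.5 g/cm3, likely glass
```

As the text notes, no single test is infallible; density is best combined with the pattern and cold-touch checks.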

Lindsay Lohan with Rolex Datejust in stainless steel with silver face (left) and Rolex's discontinued "malachite face" (centre & right).  Well known for its blue watch faces, during the more exuberant years of the 1970s & 1980s the company “splashed out” a bit and offered a malachite face.  The Datejust is now available with a choice of nine faces but the green one is now a more restrained hue the company calls “mint green”.

Thursday, June 29, 2023

Phlebotomy

Phlebotomy (pronounced fluh-bot-uh-mee)

(1) The act or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; the letting of blood, known in historic medicine as “a bleeding”.

(2) Any surgical incision into a vein (also known as venipuncture & (less commonly) venesection).  It shouldn’t be confused with a phlebectomy (the surgical removal of a vein).

1350–1400: From the earlier flebotomye & phlebothomy, from the Middle French flebotomie, from the thirteenth century Old French flebothomie (phlébotomie the Modern French), from the Late & Medieval Latin phlebotomia, from the Ancient Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein), the construct being φλέψ (phléps) (genitive phlebos) (vein), of uncertain origin + tomē (a cutting), from the primitive Indo-European root tem- (to cut).  The form replaced the Middle English fleobotomie.  The noun phlebotomist (one who practices phlebotomy, a blood-letter) is documented only as late as the 1650s but may have been in use earlier and operated in conjunction with the verb phlebotomize.  The earlier noun and verb in English (in use by the early fifteenth century) were fleobotomier & fleobotomien.  The Latin noun phlebotomus (genitive phlebotomī) (a lancet or fleam (the instruments used for blood-letting)) was from the Ancient Greek φλεβότομος (phlebótomos) (opening veins), the construct being φλέψ (phléps) (blood vessel) + τέμνω (témnō) (to cut) + -ος (-os) (the adjectival suffix).  The alternative spelling was flebotomus.  The noun fleam (sharp instrument for opening veins in bloodletting (and this in the pre-anesthetic age)) was from the late Old English, from the Old French flieme (flamme in Modern French), from the Medieval Latin fletoma, from the Late Latin flebotomus, from the Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein).  The doublet was phlebotome and in barracks slang, a fleam was a sword or dagger.  Phlebotomy & phlebotomist are nouns, phlebotomize is a verb and phlebotomic & phlebotomical are adjectives; the noun plural is phlebotomies.

Phlebotomy describes the process of making a puncture in a vein with a cannula for the purpose of drawing blood.  In modern medicine the preferred term is venipuncture (used also for therapy) although the title phlebotomist continues to be used for those who specialize in the task.  One of the most frequently performed procedures in clinical practice, it’s commonly undertaken also by doctors, nurses and other medical staff.  Although the origins of phlebotomy lie in the ancient tradition of blood letting, it’s now most associated with (1) the taking of blood samples for testing by pathologists and (2) those carried out as “therapeutic phlebotomies” as part of the treatment regimen for certain disorders of the blood.  The inner elbow is the most often used site but in therapeutic medicine or in cases where the veins in the arms are not suitable, other locations can be used.

Bleeding the foot (circa 1840), oil on canvas following Honoré Daumier (1808-1879).

It’s an urban myth the Hippocratic Oath includes the clause “First, do no harm” but by any reading that is a theme of the document and while the Greek physician Hippocrates of Kos (circa 460-circa 375 BC) wouldn’t have been the first in his field to regard illness as something to be treated as a natural phenomenon rather than something supernatural, he’s remembered because of his document.  His doctrine was one which took a long time to prevail (indeed there are pockets where still it does not), holding that treatment of ailments needed to be based on science (“evidence-based” the current phrase) rather than devotion or appeals to the gods.  His influence thus endures but one of his most famous theories, which persisted for centuries, resulted in much lost blood for no known benefit and an unknown number of deaths.  Drawing from the notion of earlier philosophers that the basis of the universe was air, earth, water & fire, the theory was that there were four “humors” which had to be maintained in perfect balance to ensure health in body & mind, the four being flegmat (phlegm), sanguin (blood), coleric (yellow bile) & melanc (black bile), these the source of the four personality types: the phlegmatic, the sanguine, the choleric & the melancholic.  Had Hippocrates and his successors left the humors in the realm of the speculative, it would now be thought some amusing fragment from Antiquity but unfortunately surgical intervention was designed to ensure balance was maintained and the mechanism of choice was bloodletting to “remove excess liquids”.

George Washington in his last illness, attended by Doctors Craik and Brown (circa 1800) engraving by unknown artist, Collection of The New-York Historical Society.

Apparently, bloodletting was practiced by the ancient Egyptians some 3000 years ago and it’s not impossible it was among the medical (or even religious) practices of older cultures.  From there it’s known to have spread to the Middle East, Rome, Greece and West & South Asia, physicians and others spilling blood in the quest to heal, and the evidence suggests it was advocated for just about any symptom.  The very idea probably sounds medieval but in the West that really was the nature of so much medicine until the nineteenth century and even well into the twentieth, there were still some reasonably orthodox physicians advocating its efficacy.  Still, in fairness to Hippocrates, he was a pioneer in what would now be called “holistic health management” which involved taking exercise, eating a balanced diet and involving the mind in art & literature.  He was an influencer in his time.  All the humors were of course good but only in balance so there could be too much of a good thing.  When there was too much, what was in excess had to go and apart from bloodletting, there was purging, catharsis & diuresis, none of which sound like fun.  Bloodletting however was the one which really caught on and was for centuries a fixture in the surgeon’s bag.

Blood self-letting: Lindsay Lohan as Carrie from the eponymous film, Halloween party, Foxwoods Resort & Casino, Connecticut, October 2013.

Actually, as the profession evolved, the surgeons emerged from the barber shops, where they would pull teeth too.  The formal discipline of the physician did evolve but physicians restricted themselves to providing the diagnosis and writing the scripts from which the apothecary would mix his potions and pills, some of which proved more lethal than bloodletting.  The bloodletting technique involved draining blood from a large vein or artery (the most productive soon found to be the median cubital at the elbow) but if a certain part of the body was identified as being out-of-balance, there would be the cut.  The mechanisms to induce blood loss included cupping, leeching & scarification and with the leeches, they were actually onto something, the thirsty creatures still used today in aspects of wound repair and infection control, able often to achieve better results more quickly than any other method.  Leeches have demonstrated extraordinary success in handling the restoration of blood flow after microsurgery and reimplantation and it works because the little parasites generate substances like fibrinase, vasodilators, anticoagulants & hyaluronidase, releasing them into the wound area where they assist the healing process by providing an unrestricted blood flow.  Of course the leeches don't always effect a cure.  When in 1953 doctors were summoned to examine a barely conscious comrade Stalin (1878-1953; Soviet leader 1924-1953), after their tests they diagnosed a haemorrhagic stroke involving the left middle cerebral artery.  In an attempt to lower his blood pressure, two separate applications of eight leeches each were applied over 48 hours but it was to no avail.  Had he lived he might have had the leeches shot but they probably lived to be of further service.

A Surgeon Letting Blood from a Woman's Arm, and a Physician Examining a Urine-flask (in some descriptions named Barber-Surgeon Bleeding a Patient), eighteenth century oil on canvas, attributed to school of Jan Josef Horemans (Flemish; 1682-1752); previously attributed to Richard Brakenburg (Dutch; 1650-1702); previously attributed to the Flemish School.

Scarification was a scraping of the skin and if the circumstances demanded more, leeches could be added.  Cupping used dome-shaped cups placed on the skin to create blisters through suction and once in place, suction was achieved through the application of heat.  However it was done it could be a messy, bloody business and in the twelfth century the Church banned the practice, calling it “abhorrent”, and that had the effect of depriving priests and monks of a nice, regular source of income, which wasn’t popular.  However, especially in remote villages far from the bishop’s gaze, the friars continued to wield their blades and harvest their leeches, the business of bloodletting now underground.  In the big towns and cities though the barbers added bloodletting to their business model and it’s tempting to wonder whether package deals were offered, bundling a blooding with a tooth pulling or a haircut & shave.  From here it was a short step to getting into the amputations, a not uncommon feature of life before there were antibiotics and to advertise their services, the barber-surgeons would hang out white rags smeared in places with blood, the origin of the red and white striped poles some barbers still display.  To this day the distinction between surgeons and physicians remains and in England the Royal College of Physicians (the RCP, a kind of trade union) was founded by royal charter in 1518.  By the fourteenth century there were already demarcation disputes between the barber surgeons and the increasingly gentrified surgeons and a number of competing guilds and colleges were created, sometimes merging, sometimes breaking into factions until 1800 when the Royal College of Surgeons (RCS) was brought into existence.  It's said there was a time when fellows of the RCP & RCS, when speaking of each-other, would only ever make reference to "the other college", the name of the institution never passing their lips.

Bloodletting tools: Late eighteenth century brass and iron “5-fingered” fleam.

Unfortunately, while doubtlessly lobbying to ensure the fees of their members remained high, the colleges did little to advance science and the byword among the population remained: “One thing's for sure: if illness didn't kill you, doctors would”.  It was the researchers of the nineteenth century, who first suggested and then proved germ theory, who sounded the death knell for most bloodletting, what was visible through their microscopes rendering the paradigm of the four humors obsolete.  By the twentieth century it was but a superstition.

Saturday, September 21, 2024

Misocapnic

Misocapnic (pronounced miss-oh-kap-nick or migh-soh-kap-nick)

Hating tobacco smoke (the more recent extensions in meaning including “hating those who smoke tobacco” and “hating the tobacco industry”).

1855: A linguistic mongrel, misocapnic was borrowed from Greek and combined with English elements, modelled on a Latin lexical item, the construct being miso- (a combining form of the Ancient Greek μῑσέω (mīséō) (to hate), from μῖσος (mîsos) (hatred), used to create forms conveying the notion of a “hatred, dislike or aversion” of or to something) + the stem of the Ancient Greek καπνός (kapnós) (smoke) + ‑ic.  The -ic suffix was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and in the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  Misocapnic is an adjective and misocapnist & misocapnism are nouns; the noun plural is misocapnists.  A person who hates tobacco smoke or smoking (and often smokers) is a misocapnist and if it becomes a calling (noted in “reformed” smokers) they become practitioners of misocapnism.  Misocapnists range from the merely disapproving to the rabid activists, the comparative “more misocapnic”, the superlative “most misocapnic”.

The earliest known use of the related noun form was in the book A Paper, Of Tobacco: Treating Of The Rise, Progress, Pleasures, And Advantages Of Smoking, With Anecdotes Of Distinguished Smokers (1839) by Joseph Fume (a pseudonym of English writer William Andrew Chatto (1799–1864), who also published as Stephen Oliver (Junior)).  Noted by scholars as a work of genuine interest and now in the public domain (still available in re-print), “Of Tobacco” explored the history, chemistry and cultural significance of smoking, discussing the ceremonial use of tobacco by Native Americans and its introduction to Europe.  It includes also the word “mundungus” (used usually to mean “offal; waste animal product; organic matter unfit for consumption”, it came also to be slang for “poor-quality tobacco with a foul, rancid, or putrid smell”), which was from the Spanish mondongo (tripe, entrails).  The earliest known use of the adjectival form misocapnic was in an 1855 pamphlet by Church of England (broad faction) priest & historian Charles Kingsley (1819–1875), a notorious controversialist.

In the West, anti-smoking measures began seriously to be imposed in the 1980s, displeasing those accustomed to enjoying cigarettes at their desk or while flying on airliners.  That was consequent upon a legal and medical saga which dates from the mid-century, the US Surgeon-General first issuing warnings in the 1960s, triggering the campaign (fought tooth and nail by the tobacco industry) which saw multi-billion dollar settlements imposed.  Opposition to smoking however wasn’t something new, one of the most celebrated of the unimpressed being noted amateur theologian James I (1566–1625), King of Scotland as James VI (1567-1625) & King of England and Ireland as James I (1603-1625), who in 1604 issued his A Counterblaste to Tobacco, one of the earliest diatribes against the habit:

Have you not reason then to bee ashamed, and to forbeare this filthie noveltie, so basely grounded, so foolishly received and so grossely mistaken in the right use thereof? In your abuse thereof sinning against God, harming your selves both in persons and goods, and raking also thereby the markes and notes of vanitie upon you: by the custome thereof making your selves to be wondered at by all forraine civil Nations, and by all strangers that come among you, to be scorned and contemned. A custome lothsome to the eye, hatefull to the Nose, harmefull to the braine, dangerous to the Lungs, and in the blacke stinking fume thereof, neerest resembling the horrible Stigian smoke of the pit that is bottomelesse…

Such was the king’s disdain for "the noxious plante" he imposed a heavy excise tax on tobacco imported from the North American colonies (an approach now favoured by Western governments as a public health measure) but within two decades-odd, politics & economics had triumphed, the population’s ever-growing demand for tobacco compelling him instead to create a royal monopoly for the crop.  Over the ensuing centuries, the plant would prove a mainstay of the economy and, via the trade routes secured by the Royal Navy, Great Britain would emerge as tobacco merchant to the world.  The combination of the royal imprimatur and his subjects’ embrace of the addictive habit lent tobacco a respectability which would extend to all classes of society, including (until well into the twentieth century) much of the medical establishment, and the alleged medical efficacy had a long history, smoking a pipe at breakfast made compulsory for the schoolboys at Eton College during the Great Plague of 1665, something widely advocated as a defence against “bad air”.

Mid-century cigarette advertising.  Even in the 1950s the public's suspicion that tobacco was a dangerous product was rising and the industry's advertising switched from the traditional "lifestyle" model to one which relied on endorsements by celebrities and scientists and much quoting of research and statistics, much of which would later be wholly debunked.  The tactics and techniques were similar to those later adopted by the fossil fuel lobby in its long campaign to discredit the science of human-activity induced climate change.

One attempt at social engineering began in earnest in the 1980s: Pressure was applied on film & television studios, advertisers and publishers to stop depicting smoking as "attractive, sexy and cool".  Because cigarette smoke is known to be carcinogenic and sustained use typically reduces the human lifespan by about a decade, it was an admirable part of the public health programme but the difficult thing was that images of smoking undeniably could be sexy.  Lindsay Lohan demonstrates.

The industry learned early the value of celebrity endorsement & association, “Prince Albert” tobacco introduced by the RJ Reynolds Tobacco Company in 1907 and named after the prince who would become King Edward VII (1841–1910; King of the UK & Emperor of India 1901-1910) although the myth it was named after heavy smoker Prince Albert of Saxe-Coburg and Gotha (1819-1861; consort of Victoria (1819–1901; Queen of the UK 1837-1901)) persists.  Prince Albert tobacco is rated as “high quality” and Albert Speer (1905–1981; Nazi court architect 1934-1942; Nazi minister of armaments and war production 1942-1945), on 3 October 1947 (two years into the 20 year sentence he was lucky to receive for war crimes and crimes against humanity) noted with approval in his clandestine prison diary (Spandauer Tagebücher (Spandau: The Secret Diaries) (1975)): “After breakfast my first pipe.  No matter which nation is on duty we receive a tin of American Prince Albert as our weekly ration.”  High quality the Prince Albert may have been but some seven months later he observed: “I nearly made myself sick to my stomach breaking in my pipe.”  Still, he kept smoking although it’s not clear if he’d quit the habit when, aged 76, he died in a London hotel room in the company of a woman some decades younger and not his wife.

Although later the industry would use their sponsorship of sport to turn the sporting organizations into “tobacco industry lobbyists”, even before the political pressures appeared, the usefulness of sport as a promotional tool was understood, the Gallaher company (later best known for the “Benson & Hedges” brand) in 1966 gaining the “naming rights” to the annual 500 mile (805 km) endurance race for what then genuinely were “production cars”, run on the 3.9 mile (6.2 km) Mount Panorama Circuit at Bathurst in Australia.  It’s the race which in 1973 became the Bathurst 1000 (625 miles), the country that year switching to the metric system.  Gallaher took up the event sponsorship to promote their brand but the sales numbers hadn’t much improved after the well-publicized 1966 race so they decided to leverage their money, “suggesting” certain changes to the race rules.

Changing of the guard: Mini Coopers (1275 cm3), Bathurst, 1966 and Ford Falcon GTs (4482 cm3), Bathurst, 1967.

The Bathurst race then was unusual in that it was a true stand-alone event, neither part of any series nor governed by rules set by the Confederation of Australian Motor Sport (CAMS) or the Fédération Internationale de l'Automobile (the FIA; the International Automobile Federation (world sport’s dopiest regulatory body)) and in 1966 there was no rule requiring a minimum number of pit stops.  Taking advantage of this were the “giant-killing” 1.3 litre (78 cubic inch) Morris Mini Cooper 1275 Ss, able to run the 500 miles without needing tyre changes and, at most, only one stop for fuel.  Accordingly, although not the fastest machines in a straight line, the Minis filled the first nine places, the only other car in the top ten a 273 cubic inch (4.5 litre) Chrysler Valiant V8 which finished tenth, six laps down on the winner.  Timed at a then impressive 120 mph (193 km/h) down the long Conrod Straight, the Valiant posted competitive lap times but the frequent stops for tyres and fuel (more time-consuming tasks then than now) lent the Minis a significant advantage.

Clockwise from top left: The eight “Gallaher GT” Falcon GTs in corporate livery outside the corporation's Rydalmere facility in Sydney, September 1967; a packet of “Gallaher GTs 20s”; one of the surviving cars after restoration and an image from the 1967 advertising campaign (note the "driving glove" an affectation from the days of open roadsters, sweaty palms & teak-rim steering wheels).

No documents have ever been sighted which prove it was Gallaher which “suggested” mandating a minimum number of pit-stops but few have doubts and once implemented for the 1967 event, the advantage enjoyed by the small, light, economical cars was negated and not for another 20 years would a four-cylinder car win the race and the Mini remains the only front wheel drive (FWD) vehicle to enjoy a victory.  With a little nudge, the planets were thus aligned for Gallaher and their “Gallaher GT” cigarette brand.  As a promotional tie-in, eight of the new 289 cubic inch (4.7 litre) XR Ford Falcon GTs were painted silver to match the cigarette’s packaging and, adorned with corporate livery, issued to the travelling salesmen (and they were then all men) who went forth and promoted.  Other than the paint, the cars were standard except for an alarm system fitted to the boot (trunk) lid; even at 50c a packet, the Falcon could be holding over Aus$3000 in stock (as late as the early 1980s, the agents would visit places like sports grounds or shopping centres, handing out free samples of cigarettes).  So the plan was to use the Falcon GT’s victory at Bathurst to promote sales of Gallaher GT cigarettes and part of the plan worked in that the Fords finished first and second but the success didn’t rub off on the fags, the Gallaher GT quietly withdrawn in March 1968, some six months after the chequered flag had been waved at Bathurst, Gallaher leaving to others (like Benson & Hedges, Gallaher holding the UK but not Australian rights to the trademark) the task of getting Australians addicted.  Tobacco advertising finally vanished from Australian race-tracks in 1996 when the federal government imposed a ban.

Sydney Morning Herald “souvenir” front page, 14 March 1983 (left), Benson & Hedges packet with royal warrant (1877-1999, centre) and packet with “B&H coat of arms”, used after the warrant was withdrawn (right).

Gallaher took advantage of the 1983 royal tour of Australia to promote its Benson & Hedges brand, a packet embossed with the royal warrant (indicated by a coat of arms and the title “By appointment to…”) appearing on a “souvenir” front page, Sydney Morning Herald, 14 March 1983.  In 1999, the UK papers reported it was the advocacy of the most misocapnic Prince of Wales (now Charles III (b 1948; King of the United Kingdom since 2022)) which persuaded Elizabeth II (1926-2022; queen of the UK and other places, 1952-2022) to withdraw the royal warrant.