Saturday, January 13, 2024

Diaspora

Diaspora (pronounced dahy-as-per-uh or dee-as-per-uh)

(1) The scattering of the Jews among the Gentiles living beyond Palestine after the sixth century BC Babylonian captivity and the later Roman conquests of Palestine (the historic origin; usually capitalized).

(2) The body of Jews living in countries outside Israel.

(3) In the New Testament, those Christians living outside Palestine.

(4) Any group which involuntarily has been dispersed outside its traditional homeland.

(5) Any group migration to or from a country or region.

(6) Any religious group living as a minority among people of the prevailing religion (not a definition accepted by all).

(7) By extension, the spread or dissemination of something originally confined to a local, homogeneous group (language, cuisine, an economic system etc).

(8) A collective of niche social media communities, run under the auspices of diasporafoundation.org.

1690-1700: From the Ancient Greek διασπορά (diasporá) (scattering; dispersion), from διασπείρω (diaspeírō) (I spread about; I scatter), derived from διά (diá) (between, through, across) + σπείρω (speírō) (I sow), the modern construct being diaspeirein (dia + speirein) (to scatter about, disperse) and διασπορά (diaspora) was thus understood as “a scattering”.  Diaspora & diasporite are nouns, diasporan & diasporal are nouns & adjectives, diasporic is an adjective; the noun plural is diasporae, diasporai or diasporas.

The word diaspora must be lexicographically sexy because it has over many years attracted much interest from historians and etymologists, the conclusion of many being that there may be “missing links” (ie, lost texts), this accounting for the murkiness of the transition from the verb of Antiquity to the idea of “diaspora” as it came to be understood.  There is confusion over the exact process of derivation from these old verbs to the contemporary concept(s) and although the Athenian historian and general Thucydides (circa 460–circa 400 BC) was for a long time cited as the first to use the word, this later was found to be a medieval misunderstanding (something not unusual) of his use of the verb σπείρω (speíro) (to sow).  The Greek word does appear in the Septuagint (the earliest extant Greek translation of the Old Testament from the original Hebrew):

ἔσῃ ἐν διασπορᾷ ἐν πάσαις ταῖς βασιλείαις τῆς γῆς, esē en diaspora en pasais tais basileiais tēs gēs ("thou shalt be a dispersion in all kingdoms of the earth"). (Deuteronomy 28:25).  The word in the Hebrew was galuth (exile) although the translation in the King James edition of the Bible (KJV 1611) read: “The Lord shall cause thee to be smitten before thine enemies: thou shalt go out one way against them, and flee seven ways before them: and shalt be removed into all the kingdoms of the earth”.

οἰκοδομῶν Ἰερουσαλὴμ ὁ Κύριος καὶ τὰς διασπορὰς τοῦ Ἰσραὴλ ἐπισυνάξει, oikodomōn Ierousalēm ho Kyrios kai tas diasporas tou Israēl episynaxei ("The Lord doth build up Jerusalem: he gathereth together the outcasts of Israel"). (Psalms 147:2)

When the Bible was translated into Greek, the word was used of (1) the Kingdom of Samaria, exiled from Israel by the Assyrians between 740-722 BC, (2) Jews, Benjaminites and Levites exiled from the Kingdom of Judah by the Babylonians (587 BC) and (3) Jews exiled by the empire from Roman Judea (72 AD).  From that use can be traced the development of the word to its modern form when it can be used not only of populations of one land living in another but of linguistic novelties such as the “diasporic capitalism” which found its natural home under the auspices of the Chinese Communist Party (CCP) and “diasporic cuisine” (such as the ubiquitous sushi which has colonized takeaway outlets east & west).  In the English-speaking world, the convention is that when capitalized, Diaspora refers specifically to the Jews (no longer does there seem to be a faction which insists it can be used only of the event in 72 AD) while the word is un-capitalized for all other purposes.  Even then, controversy remains.  Because of the origins in which exile and expulsion were central to the experience, it is held by some that, properly to be thought diasporic, one must have been forced from one’s homeland but that seems now a minority position, someone in self-imposed exile, an economic migrant or a “mail order” bride all able to be included.  The foreign element does though remain essential; a refugee can be part of a diaspora whereas an IDP (internally displaced person) cannot: even if geographically, religiously or ethnically segregated, if in their homeland, they remain (an unfortunate) part of that community.  The first known instance of “diaspora” in an English text is thought to appear in 1594 in John Stockwood's (circa 1545-1610) translation of Commentarius in XII prophetas minores (Commentary on the Twelve Minor Prophets (1594)) by French theologian Lambert Daneau (circa 1530-1595): “This scattering abrode of the Iewes, as it were an heauenly sowing, fell out after their returne from the captiuitie of Babylon. Wherevpon both Acts. 2. and also 1. Pet. 1. and 1. Iam. ver. 1. [sic] they are called Diaspora, that is, a scattering or sowing abrode.”  The word was used in 1825 in reference to Moravian protestants and in 1869 in reference to the dispersion of the Jews although in English, the word earlier used to convey the concept was the late fourteenth century Latinate dispersion.

Google Ngram for diaspora.  Google’s Ngrams are not wholly reliable as a record of the trend-line of a word’s use because (1) the sub-set of texts Google uses is slanted towards the scientific & academic and (2) of the technical limitations imposed by the use of OCR (optical character recognition) when handling older texts of sometimes dubious legibility (a process AI might improve).  Despite that, the trend of diaspora’s increasing use in the post-war years seems solid.  In the nineteenth and into the twentieth century the word was used in theological and academic writing and there doesn’t appear to have been a great volume of argument about whether it exclusively should be of Jews; nor was that aspect of the history controversial in the post-war years when use of the word spiked, a product presumably of (1) the vast increase in migration from European nations, both within the area and to countries beyond and (2) the rapid expansion of the university sector in the West, a new cohort of academics suddenly available (and anxious) to study these populations and the effects, both on their homelands and the places in which they became resident.

The classical etymology and the idea of something leaving its original site and travelling to other places meant “diaspora” appealed to scientists coining technical terms.   In geology, the noun diaspore describes a natural hydrate of aluminium (also as diasporite, tanatarite, empholite or kayserite) which in addition to its other properties is famed for its stalactites and in crystal form, it exists as a gemstone.  Diaspore is a major component in the ore bauxite which is smelted into aluminium and the name was chosen to suggest “scatter”, an allusion to its decrepitation when heated.  In petrology (the study of certain rocks and their transformative processes), the related noun diasporite refers to the metamorphic rock containing diaspore.  In botany, diaspore is used to refer to seeds and fruit which operate in unison as a dispersal unit.

A very modern diasporic: Living in the United Arab Emirates, island designer Lindsay Lohan (pictured here in an empire line dress), is part of the western diaspora in Dubai.

A diasporite is a member of a diaspora (although the adjective diasporic has been used as a (non-standard) noun); the usual plural in English is diasporas, the alternatives diasporae & diasporai.  It’s certainly a loaded word, something perhaps based in the idea of exile in some form, a particular form of migration, displacement, scattering, exodus or dispersal although one also associated with the “escape” of the refugee.  In use, the connotation seems to be different from “expatriate” (often clipped to “expat”), another example of someone living in a foreign land and it’s hard to escape the impression the modern “diaspora” has become a Western construct and one which applies (almost) exclusively to religious, cultural or ethnic minorities; although diasporites increasingly are where they are by choice rather than because of an act of expulsion, the distinction remains and sometimes there are ethnic-specific adaptations such as Afrodiaspora (those of African extraction (and not necessarily birth) living in other places).

Although the irregular immigration northward from South & Central America is trending up, the Indian diaspora remains the largest. 

By implication too, a diaspora, sharing a common origin, culture or ethnicity, tends to be thought a group which maintains a strong connection to the “homeland”, its culture and heritage.  They may engage in cultural, social, or economic activities that tie them back to their original community.  By contrast, an expat seems almost always to be white and in some well-paid job, there perhaps for the long-term but probably still temporarily; the British lawyers and accountants in Hong Kong before the handover (1997) were “expats” whereas the workers from the Philippines employed as domestic help were a “diaspora”.  It’s a distinction which would have seemed both understandable and unremarkable under the Raj and it's hard to see its origin as based in anything but racialism.

Friday, January 12, 2024

Gore

Gore (pronounced gawr or gohr)

(1) Blood when shed, especially in volume or when coagulated.

(2) Murder, bloodshed, violence etc, often in the context of visual depictions (film, television etc) and frequently an element in the “pornography of violence”.

(3) Dirt; mud; filth (obsolete except in some regional dialects and obviously something of which to be aware when reading historic texts).

(4) In cartography, the curved surface that lies between two close lines of longitude on a globe (or as represented in the segmented two-dimensional depictions in certain maps or charts).

(5) In nautical design, a triangular piece of material inserted in a sail to produce a greater surface area or a desired shape.

(6) In apparel, one of the panels, usually tapering or triangular in shape, making up a garment (most often used with skirts) or for other purposes such as umbrellas, hot-air balloons etc.

(7) In a bra (sometimes (tautologically) as “centre gore”), the panel connecting the cups and housing the centre ends of the underwires (if fitted).

(8) In cobbling, an elastic gusset for providing a snug fit in a shoe.

(9) A triangular tract of land, especially one lying between larger divisions; in the jargon of surveying, a small patch of land left unincorporated due to unresolved competing surveys or a surveying error (also known in the US as “neutral area” and in the UK as “ghost island”).

(10) In road-traffic management, a designated “no-go” area at a point where roads intersect.

(11) In heraldry, a charge delineated by two inwardly curved lines, meeting in the fess point and considered an abatement.

(12) To create, mark or cut (something) in a triangular shape.

(13) Of an animal, such as a bull, to pierce or stab (a person or another animal) with a horn or tusk.

(14) To pierce something or someone (with a spear or similar weapon), as if with a horn or tusk.

(15) To make or furnish with a gore or gores; to add a gore.

Pre 900: From the Middle English gorre & gore (filth, moral filth), from the Old English gor (dung, bull dung, filth, dirt), from the Proto-Germanic gurą (half-digested stomach contents; faeces; manure) and the ultimate source may have been the primitive Indo-European gher- (hot; warm).  It was cognate with the Dutch goor, the Old High German gor (filth), the Middle Low German göre and the Old Norse gor (cud; half-digested food).  The idea of gore being “clotted blood” dates from the 1560s and was applied especially on battlefields; the term gore-blood has been documented since the 1550s.

The noun gore in the sense “patch of land or cloth of triangular shape” dates also from before 900 and was from the Middle English gor, gore, gar & gare (triangular piece of land, triangular piece of cloth), from the Old English gāra (triangular piece of land, corner, point of land, cape, promontory), the ultimate source thought to be the Proto-Germanic gaizon- or gaizô.  It was cognate with the German Gehre (gusset) and akin to the Old English gār (spear).  The seemingly strange relationship between spears, pieces of fabric and patches of land is explained by the common sense of triangularity, the allusion being to the word gore used in the sense of “a projecting point”, the tip of a spear visualized as the acute angle at which two sides of a triangle meet.  From this developed in the mid-thirteenth century the use to describe the panel at the front of a skirt, extended by the early 1300s to just about any “triangular piece of fabric”.

Al Gore (b 1948; US vice president 1993-2001) with crooked Hillary Clinton (b 1947; US secretary of state 2009-2013).  Al Gore used to be “the next President of the United States” and when this photo was taken at Miami Dade College, Florida during October 2016, crooked Hillary was also TNPOTUS.  They have much in common.

Al Gore's oft-repeated (and much derided) "quote" that he "invented the internet" is a misrepresentation of his actual statement, made on 9 March 1999 during an interview with CNN reporter Wolf Blitzer (b 1948): "I took the initiative in creating the Internet."  By this, Gore meant that while a member of the Senate during the 1980s, he was an advocate of the roll-out of high-speed telecommunications and network infrastructure.  He introduced legislation that led to increased funding for, and expansion of, the ARPANET (the US Advanced Research Projects Agency Network, the first public packet-switched computer network which operated between 1969-1989; the precursor to the modern internet, it was used mostly by academic institutions and the military).  The High Performance Computing and Communication Act (1991) was known as the "Gore Bill" and it provided the framework for the national infrastructure.  However one looks at things, he achieved more than crooked Hillary.

Gore entered the jargon of surveying in the 1640s, adopted in the New England region of the American colonies to describe “a strip of land left out of any property by an error when tracts are surveyed”.  Such errors and disputes were not uncommon (there and elsewhere), the most famous resolved by the Mason-Dixon Line, the official demarcation defining the borders of what would become the US states of Pennsylvania, Maryland, Delaware, and West Virginia (which was until 1863 attached to Virginia).  The line was determined by a survey undertaken between 1763-1767 by two English astronomers Charles Mason (1728–1786) & Jeremiah Dixon (1733–1779), commissioned because the original land grants issued by Charles I (1600–1649; King of England, Scotland & Ireland 1625-1649) and Charles II (1630–1685; King of Scotland 1649-1651, King of Scotland, England and Ireland 1660-1685) were contradictory, something not untypical given the often outdated and sometimes dubious maps then in use.  Later, "Mason-Dixon Line" would enter the popular imagination as the border between "the North" and "the South" (and thus "free" & "slave" states) because the line, west of Delaware, marked the northern limit of slavery in the United States.  Even though the later abolition of slavery in some areas rendered the line less of a strict delineation for this purpose, both phrase and implied meaning endured.

Arizona Department of Transport’s conceptual illustration of a gore used in traffic management.  The gore area is the (almost always at least vaguely triangular) space at a point where roads in some way intersect and depending on the environment and available space, a gore may be simply a designated space (often painted with identifying lines of various colors) or a raised structure, sometimes large and grassed.  The purpose of a gore is to ensure (1) the visibility of drivers is not restricted by other vehicles (most important with merging traffic) and (2) vehicle flow is in a safe direction and for this reason gores are designated “no go” areas through which vehicles should neither pass nor stop; something often enforced by statute.

The verb (in the sense of “to pierce, to stab”) emerged in the late fourteenth century (although use seems to have been spasmodic until the sixteenth) and was from the Middle English gorren & goren (to pierce, stab) which was derived from gōre (spear, javelin, dart), from the Old English gār (spear, shaft, arrow).  The adjective gory (covered with clotted blood) dates from the late fifteenth century and developed from the noun and the derived noun goriness is now a favorite measure by which productions in the horror movie genre are judged, some sites offering a “goriness index” or “goriness rating” for those who find such metrics helpful (the noun gorinessness is non-standard but horror movie buffs get the idea).  “To gore” also meant “to add a gore (to a skirt, sail etc)” but surprisingly, given the profligate ways of English, degore or de-gore (removing a gore from a skirt, sail etc) seems never to have evolved.  Gore is a noun & verb, gory is an adjective, gored is a verb & adjective, goriness is a noun and goring is a verb; the noun plural is gores.

Shyaway’s diagram detailing how even mainstream bras can have as many as 16 separate components (although more individual parts are used in the construction; some (obviously) at least duplicated).  Who knew?

The gore (sometimes (tautologically) as “centre gore”) fits in the space between breasts, the panel connecting the cups and providing locating points for the centre ends of the underwires (if fitted).  Because there are so many types of design, the height of the gore varies greatly, one fitted to a full support bra rising higher than that used by a plunge bra, but the general principle is the panel should lie flat between the breasts, aligned with the skin, the gore's purpose as a piece of structural engineering being to provide separation; it's also the critical mounting point for the underwires.

HerRoom's deconstruction of the art and science of the gore.

According to HerRoom.com, the significance of the gore sitting firmly against the sternum is it provides an indication of fit.  If a gap appears between skin and gore, that suggests the cups lack sufficient depth and the user should proceed up the alphabet until snugness is achieved.  Where the gap is especially obvious (some fitters recommending a standard HB pencil as a guide while others prefer fingers, the advantage with the pencil being that globally it's a uniform size), it may be necessary to both go up more than one cup letter and decrease the band-size although there are exceptions to the gore-sternum rule and that includes “minimizers” (which achieve their visual trick by a combination of reducing forward projection and redistributing mass laterally) and most “wireless” (or “wire-free”) units (except for the smaller sizes).  The design of the gore also helps in accommodating variations in the human shape; although almost all gores are triangular and the difference in their height is obvious (and as a general principle: the greater the height, the greater the support) a difference in width will make different garments suitable for different body-types.

Gory: Lindsay Lohan was photographed in 2011 & 2013 by Tyler Shields (b 1982) in sessions which involved knives and the depiction of blood & gore.  The shoot attracted some attention and while the technical achievement was noted (it being quite challenging to work with blood, fake or real, and realize something which looks realistic), it was also criticized as adding little to the discussion about the pornography of violence against women.

Thursday, January 11, 2024

Callosity

Callosity (pronounced kuh-los-i-tee)

(1) In pathology, an alternative name for a callus.

(2) In botany, a hardened or thickened part of a plant.

(3) In zoology, as ischial callosity, a large callus on the butts of certain animals.

(4) In the human condition, being of a callous demeanor; insensitivity or hard-heartedness.

1375–1425: From the late Middle English calosite, from the Late Latin callōsitās, the construct being callōs(us) (callous) (from callum (hardened skin) + -ōsus (the suffix added to a noun to form an adjective indicating an abundance of that noun)) + -itās which in English was rendered as callus + -ity, the substitute “o” a familiar device.  The –ity suffix was from the French -ité, from the Middle French -ité, from the Old French –ete & -eteit (-ity), from the Latin -itātem, from -itās, from the primitive Indo-European suffix –it.  It was cognate with the Gothic –iþa (-th), the Old High German -ida (-th) and the Old English -þo, -þu & (-th).  It was used to form nouns from adjectives (especially abstract nouns), thus most often associated with nouns referring to the state, property, or quality of conforming to the adjective's description.  Callosity is a noun; the noun plural is callosities.

Essentially a thickening of the skin which forms in response to damage, a callus is one of the body’s protective mechanisms and an example of how human skin has evolved to respond to a “fragile” area by replacing it with something “anti-fragile”.  The skin is a good barrier to much which would be dangerous if able to penetrate the surface but easily it can be cut and it’s prone to delamination if exposed to repeated friction, something well known to gardeners digging holes, the skin on the palms of the hands soon “wearing off” at the points where the handle of the shovel repeatedly rubs.  That will be painful but the body will respond, replacing the dead skin with new skin which is thicker and harder, thus enabling the gardener to soon again pick up their shovel and return to their excavations.  This is an example of the general principle of healthy human physiology which responds to damage not by replacing things with something just as strong but something stronger, able to resist whatever force it was which caused the injury and it is the same with a bone fracture; when the bone knits back together, it will not be merely as strong as it was but a little stronger.  The new skin on the gardener’s hands will also be stronger and as the holes continue to be dug, the skin will become more robust still but the difference should not be thought of as fragile vs robust but as fragile vs anti-fragile, the point being that as pressure is applied, the material responds by becoming less-fragile.

Fragile and robust, although often used as antonyms (and in general use usefully so because the meanings are so well conveyed and understood) are really not opposites but simply degrees of the same thing.  In the narrow technical sense an expression of robustness or fragility is a measure of the same thing; a degree of strength.  The traditions of language obscure this but it becomes clear if measures of fragility or robustness are reduced to mathematics and expressed as comparative values in numbers.  It's true that on such a continuum a point could be set at which point something is regarded as no longer robust and becomes defined as fragile (indeed this is the essence of stress-testing) but this doesn't mean one is the antonym of the other.  The opposite of fragile is actually anti-fragile (the anti- prefix was from the Ancient Greek ἀντι- (anti-) (against, hostile to, contrasting with the norm, opposite of, reverse (also "like, reminiscent of"))).  The concept is well known in physiology and part of the object in some forms of strength training is to exploit the propensity of muscles to tear at stress points, relying on the body to repair these tears in a way that doesn’t restore them to their original form but makes them stronger so that if subjected again to the same stress, a tear won’t happen.  It’s thus an act of anti-fragility, the process illustrated also by the calluses which form on the hands after the skin blisters in response to work.  Fragile and robust merely express points on a spectrum and are used to emphasize the extent of strength; anti-fragile is the true opposite.

The idea of anti-fragile was introduced by Lebanese-born, US-based mathematician and trader Nassim Nicholas Taleb (b 1960) in the book Antifragile: Things That Gain From Disorder (2012), the fourth of five works which explore his ideas relating to uncertainty, randomness & probability, the best-known and most influential being The Black Swan: The Impact of the Highly Improbable (2007).  His work was thoughtful, intriguing and practical and was well received although the more accessible writing he adopted for the later volumes attracted criticism from some who felt an academic style more suited to the complex nature of his material; probably few who read the texts agreed with that.  Apart from the ideas and the use to which they can be put, his deconstruction of many suppositions was also an exploration of the rigidities of thought we allow our use of language to create.

Anti-callus devices (gloves the most common type) are used when the aim is to avoid the growth of a callus, the use of an “artificial callus” sometimes preferable to the natural.  A carpet layer in knee pads (left) and bra strap “cushions” (right).

The new areas of skin are called calluses (calli the alternative plural), callus from the Latin callum (hard skin).  Most often used to describe the hardened areas of skin (typically on hands & feet) induced as a response to repeated friction, wear or use; in anatomy, the same word is applied to the initially soft or cartilaginous substance exuded at the site of a bone fracture which converts ultimately into bone, knitting the fragments into one piece.  Once the process fully is complete, if again exposed to the same stress, the bone will not break.  In botany, it’s used of the new formation over the end of a cutting.  Callus is a noun & verb; the noun plural is calluses, the present participle callusing and the past tense callused.

In some professions, the callus can be close to essential; those whose life involves supporting weights on their shoulders form them on the pressure points, enabling them to ply their trade without undue pain or further damage.  However, not all whose shoulders might suffer welcome calluses, however beneficial they might be:  Women who wish to avoid what manufacturers term the “shoulder grooving” caused by the pressure of their bra’s shoulder straps (the physics of this a product of (1) the weight supported and (2) its surface distribution which is dictated essentially by the width of the strap) can buy inserts for the straps which increase the surface area, thereby reducing the specific loading by re-distributing the downward pressure.  A variation on this idea is the “knee pad” worn by those who lay carpets, floor tiles and such.  These folk are compelled to work “on their knees” for hours at a time, often upon hard and sometimes rough surfaces and although, given time, calluses would form were the work to be performed unprotected, it would not be a pleasant experience and the degree of hardening needed would likely adversely affect normal mobility.  In zoology, calluses are a noted environmental adaptation among some species, (Old World) gibbons, monkeys and some chimpanzees having evolved notably large calluses on their butts (described as ischial callosities, the seventeenth century ischium (from the Latin ischium, from the Ancient Greek ἰσχίον (iskhíon) (hip joint)) describing the lowest of the three bones that make up each side of the pelvis).  On the animals so endowed, the advantage is the ability to sleep while sitting upright on thin branches, safe from both predators and the risk of falling.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

In figurative use callosity came to be used to refer to one with a lack of feeling or capacity for emotion but the use when documented comes usually with the caveat that those so described are not “psychopaths” but merely the “hard-hearted”.  So it’s there to be used and if it seems not to suit, English offers quite an array of choice when speaking of those lacking emotional range.  There is “heartless” & “hard-hearted”, both of which allude to the ancient idea of the special significance of the heart as the source of all that human feeling and character; even now it’s known to be “just a pump”, the romantic notions persist in many cultures and variations of the symbol are among the most frequently used emojis.  “Cold-blooded” is different in that although it’s blood the heart pumps, the operative word really is “cold”, implying decisions made or actions taken without emotion intruding and in idiomatic use, a “cold-blooded murder” (such as a contract killing done for payment) is viewed with less sympathy than a crime of passion (such murderers said to have been “seeing the red mist” of “hot” blood at the time of their crime).  “Stolid” and “impassive” differ in that they can often be virtues and anyway suggest not an absence of capacity for feeling but its repression and one who wrote on how essential that was to civilization yet simultaneously damaging to individuals was Sigmund Freud (1856-1939), his ideas later taken up by German-American philosopher Herbert Marcuse (1898–1979).  Mankind probably didn’t surprise Freud but doubtless we disappointed Marcuse.  Finally, there is “stoic” which traces back to the Hellenic school of stoicism, a philosophy with a great following in Antiquity which was intended always to be practical, a way to help citizens live good lives rather than anything concerned with abstractions.  
In its pure form stoicism survives but the modern re-purposing of the word means “stoic” is now used to mean something like “suffering in silence”.  “Callosity” then is one of many ways to refer to the “unfeeling” and its use in this context is based on the use in medicine, a callosity (ie a callus) being “skin of abnormal hardness & thickness” which can be poked or pricked with the subject barely feeling the intrusion.  In that it’s subtly different from “thick skinned” which usually means “not easily offended”.

Wednesday, January 10, 2024

Asymmetric

Asymmetric (pronounced a-sim-et-rick)

(1) Not identical on both sides of a central line; unsymmetrical; lacking symmetry.

(2) An asymmetric shape.

(3) In logic or mathematics, holding true of members of a class in one order but not in the opposite order, as in the relation “being an ancestor of”.

(4) In chemistry, having an unsymmetrical arrangement of atoms in a molecule.

(5) In chemistry, noting a carbon atom bonded to four different atoms or groups.

(6) In chemistry (of a polymer), noting an atom or group that is within a polymer chain and is bonded to two different atoms or groups that are external to the chain.

(7) In electrical engineering, of conductors having different conductivities depending on the direction of current flow, as of diodes.

(8) In aeronautics, having unequal thrust, as caused by an inoperative engine in a twin-engined aircraft.

(9) In military theory, a conflict where the parties are vastly different in terms of military capacity.  This situation is not in all circumstances disadvantageous to the nominally inferior party.

(10) In gameplay, where different players have different experiences.

(11) In cryptography, not involving a mutual exchange of keys between sender & receiver.

(12) In set theory, of a relation R on a set S: having the property that for any two elements of S (not necessarily distinct), at least one is not related to the other via R.
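The set-theory sense (definition 12) can be made concrete with a few lines of Python (is_asymmetric is a hypothetical helper written for illustration, not from any library): a relation R on a set S is asymmetric when no pair (a, b) appears in R together with its reverse (b, a), which also rules out any element being related to itself.

```python
def is_asymmetric(relation):
    """Return True if, for every pair (a, b) in the relation,
    the reverse pair (b, a) is absent (this also forbids (a, a))."""
    pairs = set(relation)
    return all((b, a) not in pairs for (a, b) in pairs)

# "strictly less than" on {1, 2, 3} is asymmetric
print(is_asymmetric({(1, 2), (1, 3), (2, 3)}))   # True

# "not equal" relates elements in both orders, so it is not
print(is_asymmetric({(1, 2), (2, 1)}))           # False
```

The classic example of an asymmetric relation is "strictly less than"; "being an ancestor of" (mentioned in definition 3) is another, since one cannot be one's own ancestor nor the ancestor of one's own ancestor.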

1870–1875: The construct was a- + symmetric.  The a- prefix was from the Ancient Greek ἀ- (a-) (ἀν- (an-) if immediately preceding a vowel) and was added to stems to create the sense of "not, without, opposite of".  The prefix is referred to as an alpha privative and is used with stems beginning with consonants (except sometimes “h”); “an-“ is synonymous and is used in front of words that start with vowels and sometimes “h”.  Symmetric was from the Latin symmetria, from the Ancient Greek συμμετρία (summetría).  Symmetry was from the 1560s in the sense of "relation of parts, proportion", from the sixteenth century French symmétrie and directly from the Latin symmetria, from the Greek symmetria (agreement in dimensions, due proportion, arrangement), from symmetros (having a common measure, even, proportionate), an assimilated form of syn- (together) + metron (measure), from the primitive Indo-European me- (to measure).  The meaning "harmonic arrangement of parts" dates from the 1590s.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and in the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically.  In English it's always been used to form adjectives from nouns with the meaning “of or pertaining to”.  A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  
Asymmetric & asymmetrical are adjectives, asymmetricity, asymmetricality, asymmetricalness & asymmetry are nouns and asymmetrically is an adverb; the noun plural is asymmetries.

The usually symmetrically attired Lindsay Lohan demonstrates the possibilities of asymmetry.

1975 Kawasaki 750 H2 Mach IV.

Manufacturers of triple-cylinder motorcycles traditionally used single (3 into 1) or symmetrical (3 into 2) exhaust systems (although, during the 1970s, Suzuki offered some of their "Ram-Air" models with a bizarre 3 into 4 setup, the centre cylinder’s header bifurcated) but in 1969 Kawasaki adopted an asymmetric arrangement for one of the memorable machines of the time.  The Kawasaki 500 H1 Mach III had two outlets to the right, one to the left and was a fast, lethally unstable thing which was soon dubbed the "widow maker".  Improvements to the Mach III made it a little more manageable and its successor, the 750 H2 Mach IV, was claimed to be better behaved but was faster still and best enjoyed by experts, preferably in a straight line although, with a narrow power band which peaked with a sudden rush, even that could be a challenge.  The Kawasaki triples remain the most charismatic of the Japanese motorcycles but the prototype had been even more intriguing, using an "asymmetric V" with the two outer cylinders upright while the central barrel sat at 90° so, viewed in profile, it appeared a conventional 90° vee engine, the novelty obvious only from other aspects.  There were genuine advantages in cooling and weight distribution but ultimately the complexity was thought unjustified and the project was cancelled when doubts were expressed about market-acceptance. 

1973 Triumph X-75 Hurricane.

Available only during 1972-1973 and produced in small numbers, the Triumph X-75 Hurricane was typical of the motorcycles being produced by the British manufacturers which had neglected development and re-investment and consequently were unable adequately to respond to the offerings of the Japanese industry, which had done both aplenty.  Whatever their charms, models like the X-75 were being rendered obsolescent, some of the underlying technology dating back decades yet, without the capital to invest, this was as good as it got and some of the fudges of the era were worse.  The X-75 was however ahead of its time in one way: it was a “factory special”, a design influenced by what custom shops in the US had been doing as one-offs for customers and in the years ahead, many manufacturers would be attracted by the concept and its healthy profit margins.  The X-75 is remembered also for the distinctive asymmetric stack of three exhaust pipes on the right-hand side, a look adopted in the twenty-first century by MV Agusta and others.

1985 Ferrari Testarossa (1984-1991) monospecchio-monodado.

Some of Ferrari's early-production Testarossas were fitted with a single high-mounted external mirror, on the left or right depending on the market into which the car was sold and although the preferred term was the Italian “monospecchio” (one mirror), in the English-speaking world it was quickly dubbed the “flying mirror" (rendered sometimes in Italian as “specchio volante” (an ordinary wing mirror being a “specchietto laterale esterno”, proving everything sounds better in Italian)).  The unusual placement and blatant asymmetry troubled some and delighted others, the unhappy more disgruntled still if they noticed the vent on the right of the front spoiler not being matched by one to the left.  It was there to feed the air-conditioning’s radiator and while such offset singularities are not unusual in cars, many manufacturers create a matching fake as an aesthetic device: Ferrari did not.  The mirror’s curious placement was an unintended consequence of a European Union regulation (and it’s doubtful many institutions have in a relatively short time created as many regulations of such collective length as the EU) regarding the devices, this interpreted by the designers as requiring 100% rearward visibility.  Because of the sheer size of the rear bodywork necessitated by the twin radiators which sat behind the side-strakes (another distinctive Testarossa feature), the elevation was the only way this could be done but it later transpired the interpretation of the law was wrong, a perhaps forgivable mistake given the turgidity of EU legalese.

The Blohm & Voss BV 141

Focke-Wulf Fw 189 Uhu (Owl).

In aircraft, designs have for very good reason (aerodynamics, weight distribution, flying characteristics, ease of manufacture etc) tended to be symmetrical, sometimes as an engineering necessity such as the use of contra-rotating propellers on some twin-engined airframes, a trick to offset the destabilizing effects of the torque when very potent power-plants are fitted.  There has though been the odd bizarre venture into structural asymmetry, one of the most intriguing being the Blohm & Voss BV 141, the most distinctive feature of which was an offset crew-capsule.  The BV 141 was a tactical reconnaissance aircraft built in small numbers and used in a desultory manner by the Luftwaffe (the German air force) during World War II (1939-1945) and although it was studied by engineers from many countries with some prototypes built, the layout never entered mainstream use. The origin of the curious craft lay in a specification issued in 1937 by the Reichsluftfahrtministerium (RLM; the German Air Ministry) which called for a single-engine reconnaissance aircraft, optimized for visual observation and, in response, Focke-Wulf offered their Fw 189 Uhu (Owl) which, because of the then still novel twin-boomed layout, encountered some resistance from the RLM bureaucrats but found much favor with the Luftwaffe; over the course of the war, some nine hundred entered service and it was used almost exclusively as the Germans' standard battlefield reconnaissance aircraft.  Indeed, so successful did it prove in this role that the other configurations it was designed to accommodate (liaison and close-support ground-attack) were never pursued.  Although its performance was modest, it was a fine airframe with superb flying qualities and an ability to absorb punishment which became famous on the Russian front, where extensively it was deployed, and captured examples provided Russian aeronautical engineers with ideas which would for years influence their designs.

1982 Ford (Australia) XE Fairmont Ghia 5.8 (351) ESP.  Finding asymmetry in unexpected places brings joy to some and annoys others.

The RLM had also invited Arado to tender but their Ar 198, although featuring an unusual under-slung and elongated cupola which afforded the observer a uniquely panoramic view, proved unsatisfactory in test-flights and development ceased.  Blohm & Voss hadn't been included in the RLM's invitation but anyway chose to offer a design which was radically different even by the standards of the innovative Fw 189.  The asymmetric BV 141 design was eye-catching, the crew housed in an extensively glazed capsule offset to starboard of the centre-line with a boom offset to port which housed the single engine in front and the tail to the rear.  Prototypes were built as early as 1938 and the Luftwaffe conducted operational trials over both the UK and USSR between 1939-1941 but, despite being satisfactory in most respects, the BV 141 was hampered by poor performance, a consequence of using an under-powered engine.  A re-design of the structure to accommodate more powerful units was begun but delays in development and the urgent need for the up-rated engines for machines already in production doomed the project and the BV 141 was in 1943 abandoned.

Blohm & Voss BV 141 prototype with full-width rear elevators & stabilizers.

Production Blohm & Voss BV 141 with port-only rear elevator & stabilizer.

Despite the ungainly appearance, test-pilots reported the BV 141 was a nicely balanced airframe, the seemingly strange weight distribution well compensated by (1) component placement, (2) the specific lift characteristics of the wing design and (3) the choice of opposite rotational directions for crankshaft and propeller, the torque generated used as a counter-balance.  Nor, despite the expectations of some, were there difficulties in handling caused by the thrust versus drag asymmetry, pilots indicating some intuitive trimming was all that was needed to compensate for any induced yaw.  The asymmetry extended even to the tail-plane, the starboard elevator and horizontal stabilizer removed (to afford the tail-gunner a wider field of fire) after the first three prototypes were built; surprisingly, this was said barely to affect the flying characteristics.  Blohm & Voss pursued the concept, a number of design-studies (including a piston & turbojet-engine hybrid) initiated but none progressed beyond the drawing-board.

Asymmetric warfare

In the twenty-first century, the term “asymmetric warfare” became widely used.  The concept describes conflicts in which there are significant disparities in power, capability and strategy between opposing forces and although the phrase has become fashionable only recently, the idea is ancient, based often on the successes which could be enjoyed by small, mobile and agile (often irregular) forces against larger, conventionally assembled formations.  Reports of such tactics are found in accounts of conflicts in Asia, Africa, the Middle East and Europe from as early as reliable written records exist.  The classic example is what came later to be called “guerrilla warfare”, hit-and-run tactics which probe and attack weak spots as they are detected, the ancestor of insurgencies, “conventional” modern terrorism and cyber-attacks.  However, even between conventional national militaries there have long been examples of the asymmetric, such as the use of small, cheap weapons like torpedo boats and mines which early in the twentieth century proved effective against the big, ruinously expensive Dreadnoughts.  To some extent, the spike in use of the phrase in the post-Cold War era happened because of the contrast it captured: the nuclear weapon states, although having a capacity to destroy entire countries without one soldier setting foot on their territory, found themselves vulnerable to low-tech, cleverly planned attacks.

Although the term “asymmetric warfare” encompasses a wide vista, one increasingly consistent thread is that it can be difficult for "conventional" military formations to counter insurgencies conducted by irregular combatants who, in many places and for much of the time, are visually indistinguishable from the civilian population.  The difficulty lies not in achieving the desired result (destruction of the enemy) but in managing to do so without causing an “excessive” number of civilian casualties; although public disapproval has meant the awful phrase “collateral damage” is now rarely heard, civilians (many of them women & children) continue greatly to suffer in such conflicts, the death toll high.  Thus the critique of the retaliatory strategy of the Israel Defense Forces (IDF) in response to the attack by Hamas on 7 October 2023, Palestinian deaths now claimed to exceed 20,000; that number is unverified and will include an unknown number of Hamas combatants but there is no doubt the percentage of civilian deaths will be high, the total casualty count estimated early in January 2024 at some 60,000.  What the IDF appear to have done is settle on the strategy adopted by Ulysses S Grant (1822–1885; US president 1869-1877) in 1864 when appointed head of the Union armies: the total destruction of the opposing forces.  That decision was a reaction to the realization the previous approach (skirmishes and the temporary taking of enemy territory which was soon re-taken) was ineffectual and war would continue as long as the other side retained even a defensive military capacity.  Grant’s strategy was, in effect: destroy the secessionist army and the secessionist cause dies with it.

In the US Civil War (1861-1865) that approach worked though at an appalling cost, the 1860s a period when ballistics had advanced to the point horrific injuries could be inflicted at scale but battlefield medical tools and techniques were barely advanced from Napoleonic times.  The bodies were piled high.  Grant’s success was influential on the development of the US military, which eventually evolved into an organization that came to see problems as something not to be solved but overwhelmed by the massive application of force, an attitude which, although now refined, permeates from the Pentagon down to platoon level.  As the US proved more than once, the strategy works as long as there’s little concern about “collateral damage”, an example of this approach being when the Sri Lankan military rejected the argument there was “no military solution” to the long-running civil war (1983-2009) waged by the Tamil Tigers (the Liberation Tigers of Tamil Eelam (LTTE)).  What “no military solution” means is that a war cannot be won if the rules of war are followed, so the government took the decision that if war crimes and crimes against humanity were what was required to win, they would be committed.

In the 1990s, a number of political and military theorists actually advanced the doctrine “give war a chance”, the rationale being that however awful conflicts may be, if allowed to continue to the point where one side gains an unambiguous victory, the dispute is at least resolved and peace can ensue, sometimes for generations.  For most of human history, such was the usual path of war but after the formation of the United Nations (UN) in 1945 things changed, the Security Council the tool of the great powers, all of which (despite their publicity) viewed wars as a part of whatever agenda they were at the time pursuing and, depending on this and that, their interests sometimes lay in ending conflicts and sometimes in prolonging them.  In isolation, such an arrangement probably could have worked (albeit with much “collateral damage”) but over the years, a roll-call of nations run by politicians appalled by the consequences of war began to become involved, intervening with peace plans, offering mediation and urging the UN to deploy “peacekeeping” forces, something which became an international growth industry.  Added to that, for a number of reasons, a proliferation of non-government organizations (NGOs) were formed, many of which concerned themselves with relief programmes in conflict zones and while these benefited many civilians, they also had the effect of allowing combatant forces to re-group and re-arm, meaning wars could drag on for a decade or more.

In the dreadful events in Gaza, war is certainly being given a chance and the public position of both the IDF and the Israeli government is that the strategy being pursued is one designed “totally to destroy” not merely the military capacity of Hamas but the organization itself.  Such an idea worked for Grant in the 1860s and, as the Sri Lankan military predicted it would, end-game there was achieved in 2009 on the basis of “total destruction”.  However, Gaza (and the wider Middle East) is a different time & place and even if the IDF succeeds in “neutralizing” the opposing fighters and destroying the now famous network of tunnels and ad-hoc weapons manufacturing centres, it can’t be assumed Hamas in some form won’t survive and in that case, what seems most likely is that while the asymmetry of nominal capacity between the two sides will be more extreme than before, Hamas is more likely to hone its tactics than shift its objective.  The IDF high command are of course realists and understand there is nothing to suggest “the Hamas problem” can be solved and, being practical military types, they know that if a problem can’t be solved it must be managed.  In the awful calculations of asymmetric conflict, this means the IDF calculate that while future attacks will happen, the more destructive the response now, the longer the interval before the next event.