
Saturday, February 26, 2022

Aggression

Aggression (pronounced uh-gresh-uhn)

(1) The action of a state in violating by force the rights of another state, particularly its territorial rights; an unprovoked offensive, attack, invasion, or the like.

(2) Any offensive action, attack, or procedure; an inroad or encroachment.

(3) The practice of making assaults or attacks; offensive action in general.

(4) In clinical psychiatry, overt or suppressed hostility, either innate or resulting from continued frustration and directed outward or against oneself.

(5) In the study of animal behavior and zoology, behavior intended to intimidate or injure an animal of the same species or of a competing species but which is not predatory.  Aggression may be displayed during mating rituals or to defend territory, as by the erection of fins by fish and feathers by birds.

1605–1615: English borrowed the word directly from the French aggression, derived from the Latin aggressionem (nominative aggressio (a going to, an attack)), a noun of action from the past participle stem of aggredi (to approach; attack), a construct of ad (to) + gradi (past participle gressus (to step)) from gradus (a step).  The Classical Latin aggressiōn (stem of aggressiō) was equivalent to aggress(us) + -iōn, derived from aggrēdi (to attack).  The psychological sense of "hostile or destructive behavior" had its origin in early psychiatry, first noted in English in 1912 in a translation of Freud.  Related forms are antiaggression (adjective), counteraggression and preaggression (nouns); the most frequently used derived form is aggressor (noun).

Aggression and International Jurisprudence, Locarno, Kellogg–Briand and the Nuremberg Trial

For centuries, philosophers, moral theologians and other peripheral players had written of the ways and means of outlawing wars of aggression but in the twentieth century, in the aftermath of the carnage of World War I (1914-1918), serious attempts were made to achieve exactly that, the first of which was the Locarno Pact.

Gustav Stresemann, Austen Chamberlain & Aristide Briand, Locarno, 1926.

Although usually referred to as the Locarno Pact, technically the pact consisted of seven treaties, the name derived from the Swiss city of Locarno at which the agreements were negotiated between 5-16 October 1925, although the documents were formally signed in London on 1 December.  Cynically, it can be said the Locarno Pact was a device by the western European powers to ensure they’d not again be the victims of German aggression which, if and when it were to happen, would be directed against those countries on its eastern border.  Of the seven treaties, it was the first which mattered most, a guarantee of the existing frontiers of Belgium, France, and Germany, underwritten by the UK and Italy.  Of the other agreements, two were intended to reassure the recently created Czechoslovakia and the recreated Poland, both of which, presciently as it turned out, felt some threat from Germany.

Whatever the implications, the intent was clear and about as pure as anything in politics can be: an attempt to ensure European states would never again need to resort to war.  Although the structural imbalances appear, in retrospect, obvious, at the time there were expectations of continued peaceful settlements and there arose, for a while, what was called the "spirit of Locarno": Germany was admitted to the League of Nations in September 1926, with a permanent seat on its council and Nobel Peace Prizes were awarded to the lead negotiators of the treaty, Sir Austen Chamberlain (1863-1937; UK foreign secretary 1924-1929), Aristide Briand (1862-1932; French foreign minister 1926-1932) and Gustav Stresemann (1878-1929; German foreign minister 1923-1929).

Members of the Cabinet, Senate, and House are seen gathered in the East Room of the White House, after President Coolidge and Secretary of State Kellogg signed the Kellogg-Briand Pact.

The spirit of Locarno proved infectious and inspired the noble notion it might be possible for men to gather around tables and sign papers which for all time would outlaw war and the Kellogg–Briand Pact (known also as the Pact of Paris and technically the General Treaty for Renunciation of War as an Instrument of National Policy) was a product of this optimism.  Signed in 1928 and named after the two main authors, Briand and Frank Kellogg (1856-1937; US Secretary of State 1925-1929), it was soon ratified by dozens of countries, all the signatory states promising not to use war to resolve "disputes or conflicts of whatever nature or of whatever origin they may be, which may arise among them".  It gained Kellogg his Nobel Peace Prize but peace proved elusive and in little more than a decade, the world was at war.  Another point cynics note is that the real consequence of the pact was not the prevention of war but the unfashionability of declaring war; wars continuing with a thin veneer of legal high-gloss.  Anthony Eden (1897-1977; UK prime-minister 1955-1957) during the Suez Crisis (1956), noting no declaration had been made, distinguished between being “at war” and being in “a state of armed conflict” although those on the battlefield doubtless noticed no difference.  Because the pact was concluded outside the League of Nations, it remains in force and its influence lingers; although hardly militarily inactive since 1945, the last declaration of war by the United States was in 1942.

Defendants at the International Military Tribunal for the Far East (IMTFE), popularly known as the Tokyo War Crimes Tribunal.

Kellogg–Briand thus failed but was a vitally important twentieth century instrument.  It was from Kellogg-Briand the prosecutors at the Nuremberg Trial in 1945-1946 were able to find the concept of a crime against peace as pre-existing law, something of such importance in establishing the legal validity of the indictments, both there and at the subsequent Tokyo Tribunal.  Without that legal framework from the 1920s, the construction of the legal basis for the concept of crimes against peace (the first two of the four articles of indictment at Nuremberg) may not have been possible.

At Nuremberg, the four counts of the indictment before the International Military Tribunal (IMT) were:

(1) Conspiracy to plan the waging of wars of aggression.

(2) Planning, initiating and waging wars of aggression.

(3) War crimes.

(4) Crimes against humanity.

It’s always been the fourth which has attracted most attention because the crimes committed were of such enormity and on such a scale, the word genocide had to be invented.  However, the greater effect on international law was the creation of the notion that those who plan wars of aggression can be punished for that very act, punishments wholly unrelated to the mechanics or consequences of how the wars may be fought.  From this point can be traced the end of the centuries-old legal doctrine of sovereign immunity for those waging wars of aggression.

So, after Nuremberg, the long tradition of the preemptive and preventative war as an instrument of political policy was no longer the convenient option it had for thousands of years been.  With Article 2(4) of the United Nations (UN) Charter prohibiting all members from exercising "the threat or use of force against the territorial integrity or political independence of any state", there was obvious interest in the charter's phrase of exculpation: "armed attack", which effectively limited the parameters of the circumstances in which the use of military force might be legitimate under international law.  Stretching things as far as even the most accommodating of impartial lawyers were prepared to reach, if no armed attack has been suffered, for an act of preemptive self-defense to be lawful, (1) a threat must be demonstrably real and not merely a perception of the possible and (2), the force applied in self-defense must be proportional to the harm threatened.  All this is why General Colin Powell's (1937–2021; US Secretary of State 2001-2005) statement of justification to the Security Council seeking authority to invade Iraq in 2003 took the tortured form it did.

Mr Putin.

The state of international law is why President Vladimir Putin (b 1952; prime-minister or president of Russia since 1999) has resorted to some unusual terminology and some impressive, if not entirely convincing, intellectual gymnastics in his explanations of geography and history.  While hardly the direct and unambiguous speech used by some of his predecessors in the Kremlin, it's certainly kept the Kremlinologists and their readers interested.  As early as December 2020, Mr Putin was already using the phrase "military-technical measures" to describe what would follow should NATO (again) approach Russia's borders and the charm of that presumably was that, having no precise meaning, it could at any time mean what Mr Putin wanted it to mean at that moment.  Mr Putin also claimed the government in the Ukraine is committing genocide against ethnic Russians within the territory and, in an echo of similar claims from the troubled 1930s, "seemed to believe his own atrocity stories", later doubling-down, calling the Ukrainian government a "Nazi regime" and saying he was seeking a process of "de-Nazification" (an actual structured and large-scale programme run in post-war Germany by the occupying forces aimed at removing the worst elements of the Third Reich from public life).

Most interestingly, Mr Putin said Ukraine wasn’t a real country, a significant point if true because it's only foreign countries which can be invaded.  If a government moves troops into parts of its own territory, it's not an invasion; it might be a police action, a counter-insurgency or a military exercise or any number of things but it can't be an invasion.  Technically of course, that applies also to renegade provinces.  It seemed an adventurous argument to run given Ukraine has for decades been a member of the UN and recognized by just about every country (including Russia) as a sovereign state.  To clarify, Mr Putin added the odd nuance, claiming Ukraine was "...not a real country..." and had "...never had its own authentic statehood", adding: "There has never been a sustainable statehood in Ukraine.”  The basis of that was his assertion that Ukraine was created by the Soviet Union's first leader, Vladimir Lenin (1870–1924; Leader of Soviet Russia 1917-1924 & the USSR 1922-1924) as either a sort of administrative zone or just as a mistake depending on interpretation.  Ignoring the wealth of historical material documenting the pre-Soviet history of the Ukraine, Mr Putin insisted it was part of Russia, an "...integral part of our own history, culture, spiritual space.”

Having established his case the Ukraine was no foreign country but just another piece of Russia, Mr Putin turned his thoughts to the nature of the threat the obviously renegade province posed.  Although after the collapse of the USSR, the Ukraine voluntarily (and gratefully) gave up the nuclear weapons in its territory in exchange for security guarantees issued by the US, UK, and Russia, Mr Putin expressed concern the neo-Nazi regime there had both the knowledge and the desire to obtain nuclear weapons and delivery systems, adding: “If Ukraine acquires weapons of mass destruction, the situation in the world and in Europe will drastically change, especially for us, for Russia... we cannot but react to this real danger, all the more so since, let me repeat, Ukraine’s Western patrons may help it acquire these weapons to create yet another threat to our country.”

The internal logic of this was perfect to satisfy international law: (1) The territory which on maps is called Ukraine is not a country and just a part of Russia and (2), the illegal administration running the renegade province of Ukraine is plotting to acquire weapons of mass destruction.  Under those conditions, military action by Moscow would be valid under international law but just to make sure, Mr Putin recognized the independence of Donetsk and Luhansk (two separatist regions in the Donbas) and deployed Russian troops as "peacekeepers".  Around the world, just about everybody except the usual suspects called it an invasion.

Many also discussed the legal position, perhaps not a great consolation to the citizens of Ukraine, and the limitations of international law had anyway long been understood by those who were most hopeful of its civilizing power.  In his report to President Truman (1884–1972; US president 1945-1953) at the conclusion of the Nuremberg trial (1945-1946), Justice Robert Jackson (1892–1954; sometime justice of the US Supreme Court, US solicitor general & attorney general and chief US prosecutor at the Nuremberg trials), noted the judgment had "...for the first time made explicit and unambiguous what was theretofore, as the Tribunal has declared, implicit in International Law, namely, that to prepare, incite, or wage a war of aggression, or to conspire with others to do so, is a crime against international society, and that to persecute, oppress, or do violence to individuals or minorities on political, racial, or religious grounds in connection with such a war, or to exterminate, enslave, or deport civilian populations, is an international crime, and that for the commission of such crimes individuals are responsible. This agreement also won the adherence of nineteen additional nations and represents the combined judgments of the overwhelming majority of civilized people. It is a basic charter in the International Law of the future."  However, his idealism tempered by what he knew to be the nature of men, he conceded it would be "...extravagant to claim that agreements or trials of this character can make aggressive war or persecution of minorities impossible", although he did add that there was no doubt "they strengthen the bulwarks of peace and tolerance."  One of the US judges at Nuremberg had, whatever the theoretical legal position, reached an even more gloomy conclusion, Francis Biddle (1886–1968; US solicitor general 1940-1941 & attorney general 1941-1945 and primary US judge at the Nuremberg Trials) writing to the president that the judgements he'd helped deliver couldn't prevent war but might help men to "...learn a little better to detest it", adding: "Aggressive war was once romantic, now it is criminal."

Biddle was a realist who understood the forces which operated within legal systems and nation states.  Even the long-serving liberal judge William O Douglas (1898–1980; associate justice of the US Supreme Court 1939-1975) could not accept that the aggression which led to World War II (1939-1945), in which as many as sixty million died, was reason enough to overcome his aversion to ex post facto law (the construct being the Latin ex (from) + post (after) + facto, ablative of factum (deed); that which retrospectively changes the legal consequences of actions from what would have applied prior to the application of the law).  Douglas deplored the way the IMT had not only convicted but imposed capital sentences on those indicted for conduct which had at the time been legal under metropolitan and international law:

No matter how many books are written or briefs filed, no matter how finely the lawyers analyzed it, the crime for which the Nazis were tried had never been formalized as a crime with the definiteness required by our legal standards, nor outlawed with a death penalty by the international community. By our standards that crime arose under ex post facto law. Goering et al. deserved severe punishment. But their guilt did not justify us in substituting power for principle.

International law has since seen progress.  The United Nations Charter, adopted in 1945, prohibits the use of force by one state against another, except in cases of self-defense or when authorized by the UN Security Council for the purpose of maintaining or restoring international peace and security, Article 2(4) of the UN Charter stating “all Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state."  That works in conjunction with the Nuremberg Principles which declared the planning, preparation, initiation, or execution of a war of aggression is a crime against peace and a violation of international law, a more concrete underpinning of customary international law than the Kellogg-Briand Pact which was in the same vein but always was of limited practical application because there existed no mechanism of enforcement or codification of penalties.  Despite that, the core concept of just what does constitute the crime of “aggressive war” has never been generally agreed and although the UN’s 1974 statement: “Aggression is the use of armed force by a State against the sovereignty, territorial integrity or political independence of another State, or in any other manner inconsistent with the Charter of the United Nations.” seems compelling, the debate continues.

Friday, March 25, 2022

Microaggression

Microaggression (pronounced mahy-kroh-uh-gresh-uhn)

(1) A casual comment or action directed at a marginalized, minority or other non-dominant group which (often) unintentionally or unconsciously reinforces a stereotype and can be construed as offensive.

(2) The act of discriminating against a non-dominant group by means of such comments or actions.

1970: A construct of micro- + aggression coined by Chester Middlebrook Pierce (1927-2016), former Professor of Education and Psychiatry at Harvard Medical School.  Micro (small, microscopic; magnifying; one millionth) is a word-forming element from the New Latin micro- (small), from the Ancient Greek μικρός (mikrós) (small).  The origin is disputed among etymologists, the traditional view being it was derived from the primitive Indo-European (s)meyg- & (s)mēyg- (small, thin, delicate) and was cognate with the Old English smicor (beauteous, beautiful, elegant, fair, fine, tasteful), source also of the Modern English smicker and related to the German mickrig.   However, there’s a highly technical discussion within the profession, hinged around the unexplained “k” in the Greek and there’s the suggestion of a pre-Greek origin on the basis of variation between initial /m/ and /sm/, as well as the variant forms μικός (mikós) and μικκός (mikkós).  Aggression, dating from 1605–1615, is from the French aggression, from the Latin aggressionem (nominative aggressio (a going to, an attack)), a noun of action from the past participle stem of aggredi (to approach; attack), the construct being ad (to) + gradi (past participle gressus (to step)) from gradus (a step).  The Classical Latin aggressiōn (stem of aggressiō) was equivalent to aggress(us) + -iōn, derived from aggrēdi (to attack).  The psychological sense of "hostile or destructive behavior" had its origin in early psychiatry, first noted in English in 1912 in a translation of Freud.

Chester Middlebrook Pierce (1927-2016)

Microaggression is an adaptable and possibly infinitely variable concept which probably most belongs in sociology and is typically defined as any of the small-scale verbal or physical interactions between those of different races, cultures, beliefs, or genders that are presumed to have no malicious intent but which can be interpreted as aggressions.  The criteria can be both objective and subjective and it’s noted compliments or positive comments can be microaggressions; the standard psychology texts suggest the behavior manifests in three forms:

Microassault: An explicit racial derogation, verbal or nonverbal, which can include labelling, avoidant behavior and purposeful discriminatory actions.

Microinsult: Communications that convey rudeness or insensitivity and demean a person's racial heritage or identity; subtle snubs which may be unknown to the perpetrator; hidden insulting messages to recipients of color.

Microinvalidation: Communications that exclude, negate, or nullify the psychological thoughts, feelings, or experiential reality of a person belonging to a particular group.

The concept emerged to address the underlying racism which endured even after overt, deliberate expressions of racism had become socially unacceptable.  It held that microaggressions generally happened below the level of awareness of well-intentioned members of the dominant culture and were different from overt, deliberate acts of bigotry, such as the use of racist epithets because the people perpetrating microaggressions often intend no offense and are unaware they are causing harm.  In the abstract, this positions the dominant culture as normal and the minority one as aberrant or pathological.

Although the word’s origin is in the politics of race and ethnicity, it proved readily adaptable to other areas such as gender, sexual orientation, mental illness, disability and age.  Within the discipline, there’s a (typically) highly technical debate about the nature of microaggression and the intersectionality at the cross-cutting cleavages of non-dominant groups.  As regards the media, the discipline had a well-refined model to describe how microaggressions were either reinforced or encouraged by a news and entertainment media which reflected the hegemony of the dominant culture.  The sudden shock of the emergence of social media has changed that in both diversity of source and content and its substantially unmediated distribution.  To date, much work in exploring this area has been impressionistic and it’s not clear if the analytical metrics, where they exist, are sufficiently robust for theories in this area to be coherent.



Friday, July 29, 2022

Prevent & preempt or pre-empt

Prevent (pronounced pri-vent)

(1) To keep from occurring; avert; hinder, especially by the taking of some precautionary action.

(2) To hinder or stop from doing something.

(3) To act ahead of; to forestall (archaic).

(4) To precede or anticipate (archaic).

(5) To interpose a hindrance.

(6) To outdo or surpass (obsolete).

1375–1425: From the late Middle English preventen (anticipate), from the Latin praeventus, past participle of (1) praevenīre (to anticipate; come or go before, anticipate), the construct being prae- (pre; before) + ven- (stem of venīre (come)) + -tus (the past participle suffix) and (2) praeveniō (I anticipate), the construct being prae- (pre; before) + veniō (I come).  In Classical Latin the meaning was literal but in Late Latin, by the 1540s the sense of “to prevent” had emerged, the evolution explained by the idea of “anticipate to hinder; hinder from action by opposition of obstacles”.  That meaning seems not to have entered English until the 1630s.

The adjective preventable (that can be prevented or hindered) dates from the 1630s, the related preventability a decade-odd later.  The adjective preventative (serving to prevent or hinder) is noted from the 1650s and for centuries, dictionaries have listed it as an irregular formation though use seems still prevalent; preventive is better credentialed but now appears relegated to be merely an alternative form.  The adjective preventive (serving to prevent or hinder; guarding against or warding off) has the longer pedigree (used since the 1630s) and was from the Latin praevent-, past-participle stem of praevenīre (to anticipate; come or go before, anticipate).  It was used as a noun in the sense of "something taken or done beforehand” since the 1630s and had entered the jargon of medicine by the 1670s, and under the influence of the physicians came the noun preventiveness (the quality of being preventive).  The noun prevention came from the mid-fifteenth century prevencioun (action of stopping an event or practice), from the Medieval Latin preventionem (nominative preventio) (action of anticipating; a going before), the noun of action from the past-participle stem of the Classical Latin praevenīre.  The original sense in English has been obsolete since at least the late seventeenth century although it was used poetically well into the 1700s.  Prevent is a verb, preventable (or preventible), preventive & preventative are adjectives, preventability (or preventibility) is a noun and preventably (preventibly) is an adverb.  The archaic spelling is prævent.

Many words are associated with prevent including obstruct, obviate, prohibit, rule out, thwart, forbid, restrict, hamper, halt, forestall, avoid, restrain, hinder, avert, stop, impede, inhibit, bar, preclude, counter, limit & block.  Prevent, hamper, hinder & impede refer to some degree of stoppage of action or progress.  “To prevent” is to stop something by forestalling action and rendering it impossible.  “To hamper” or “to hinder” is to clog or entangle or put an embarrassing restraint upon; not necessarily preventing but certainly making more difficult and both refer to a process or act intended to prevent as opposed to the prevention.  “To impede” is to make difficult the movement or progress of anything by interfering with its proper functioning; it implies some physical or figurative impediment designed to prevent something.

Preempt or pre-empt (pronounced pree-empt)

(1) To occupy (usually public) land in order to establish a prior right to buy.

(2) To acquire or appropriate before someone else; take for oneself; arrogate.

(3) To take the place of because of priorities, reconsideration, rescheduling, etc; supplant.

(4) In bridge, to make a preemptive bid (a high opening bid, made often a bluff by a player holding a weak hand, in an attempt to shut out opposition bidding).

(5) To forestall or prevent (something anticipated) by acting first; preclude; head off.

(6) In computer operating systems, the class of actions used by the OS to determine how long a task should be executed before allowing another task to interact with OS services (as opposed to cooperative multitasking where the OS never initiates a context switch from one running process to another); a toy illustration of the difference appears at the end of this entry.

(7) In the jargon of broadcasting, a euphemism for "cancel” (technical use only).

1830: An invention of US English, a back formation from preemption which was from the Medieval Latin praeēmptiō (previous purchase), from praeemō (buy before), the construct being prae- (pre; before) + emō (buy).  The creation related to the law of real property (land law), to preempt (or pre-empt) being “to occupy public land so as to establish a pre-emptive title to it".  In broadcasting, by 1965 it gained the technical meaning of "set aside a programme and replace it with another" which was actually a euphemism for "cancel”.  Preempt is a verb (and can be a noun in the jargon of broadcasting and computer coding), preemptor is a noun and preempted, preemptory, preemptive & preemptible are adjectives.  The alternative spelling is pre-empt and the (rare) noun plural is preempts.

In law, broadcasting and computer operating system architecture, preempt has precise technical meanings but when used casually, it can either overlap or be synonymous with words like claim, usurp, confiscate, acquire, expropriate, seize, assume, arrogate, anticipate, commandeer, appropriate, obtain, bump, sequester, take, annex & accroach.  The spelling in the forms præemption, præ-emption etc is archaic.
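To illustrate the distinction in sense (6), below is a minimal, hypothetical sketch in Python (the names task, cooperative and preemptive are invented for the example; a real operating system preempts via hardware timer interrupts, not via anything resembling a generator protocol).  Two toy tasks are written as generators: the cooperative scheduler switches only when the running task offers to give up the processor, while the preemptive scheduler forces a switch after a fixed time slice, whatever the running task might prefer.

def task(name, steps, polite=True):
    # A toy "task" which performs a few steps of work.  After each step it
    # yields True if it is willing to give up the CPU, False if it is not.
    for i in range(steps):
        print(f"{name}: step {i}")
        yield polite

def cooperative(tasks):
    # Cooperative multitasking: a switch happens only when the running task
    # volunteers; a task which never offers to yield monopolises the CPU.
    queue = list(tasks)
    while queue:
        t = queue.pop(0)
        try:
            while True:
                if next(t):          # task volunteered to yield: requeue it
                    queue.append(t)
                    break
        except StopIteration:
            pass                     # task finished

def preemptive(tasks, time_slice=2):
    # Preemptive multitasking (round-robin): each task gets at most
    # time_slice steps before being forcibly switched out and requeued.
    queue = list(tasks)
    while queue:
        t = queue.pop(0)
        try:
            for _ in range(time_slice):
                next(t)
            queue.append(t)          # time slice expired: preempt and requeue
        except StopIteration:
            pass                     # task finished

if __name__ == "__main__":
    print("-- cooperative, task A never offers to yield --")
    cooperative([task("A", 4, polite=False), task("B", 2)])
    print("-- preemptive, time slice of 2 --")
    preemptive([task("A", 4, polite=False), task("B", 2)])

Run as a script, the cooperative version lets the impolite task A finish all four of its steps before B gets a turn, whereas the preemptive version forces a switch every two steps and the output interleaves, which is the behaviour sense (6) describes.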

Preemptive and Preventive War

A preemptive war is a military action by one state against another which is begun with the intent of defeating what is perceived to be an imminent attack or at least gaining a strategic advantage in the impending (and allegedly unavoidable) war before that attack begins. The “preemptive war” is sometimes confused with the “preventive war”, the difference being that the latter is intended to destroy a potential rather than imminent threat; a preventative war may be staged in the absence of enemy aggression or even the suspicion of military planning.  In international law, preventive wars are now generally regarded as aggressive and therefore unlawful whereas a preemptive war can be lawful if authorized by the UN Security Council as an enforcement action.  Such authorizations are not easily gained because the initiation of armed conflict except in self-defense against “armed attack” is not permitted by the United Nations (UN) Charter and only the Security Council can endorse an action as a lawful “action of enforcement”.  Legal theorists suggest that if it can be established that preparations for a future attack have been confirmed, even if the attack has not been commenced, under international law the attack has actually “begun” but the UN has never upheld this opinion.  Militarily, the position does make sense, especially if the first two indictments of the International Military Tribunal (IMT) assembled at Nuremberg (1945-1946) to try the surviving Nazi leadership ((1) planning aggressive war & (2) waging aggressive war) are considered as a practical reality rather than in the abstract.

Legal (as opposed to moral or ethical) objections to preemptive or preventive wars were not unknown but until the nineteenth century, lawyers and statesmen gave wide latitude to the “right of self-defense” which really was a notion from natural law writ large and a matter determined ultimately on the battlefield, victory proof of the ends justifying the means.  Certainly, there was a general recognition of the right forcibly to forestall an attack and the first legal precedent of note wasn’t codified until 1842 in the matter of the Caroline affair (1837).  Then, some Canadian citizens sailed from Canada to the US in the Caroline as part of a planned offensive against the British in Canada.  The British crossed the border and attacked, killing both Canadians and a US citizen which led to a diplomatic crisis and several years of low-level clashes.  Ultimately however, the incident led to the formulation of the legal principle of the "Caroline test" which demands that for self-defense to be invoked, an incident must be "…instant, overwhelming, and leaving no choice of means, and no moment for deliberation".  Really, that’s an expression little different in meaning from the criteria which in many jurisdictions must exist for a claim of self-defense to succeed in criminal assault cases (including murder).  The "Caroline test" remains an accepted part of international law today, although obviously one which must be read in conjunction with an understanding of the events of the last 250-odd years.

The "Caroline test" however was a legal principle and such things need to be enforced and that requires both political will and a military mechanism.  In the aftermath of the Great War (1914-1918), that was the primary purpose of the League of Nations (LON), an international organization (the predecessor of the UN) of states, all of which agreed to desist from the initiation of all wars (preemptive or otherwise).  Despite the reputation the LON now has as an entirely ineffectual talking shop, in the 1920s it did enjoy some success in settling international disputes and was perceived as effective.  It was an optimistic age, the Locarno Treaties (1925) and the Kellogg-Briand Pact (1928) appeared to outlaw war but the LON (or more correctly its member states) proved incapable of halting the aggression in Europe, Asia and Africa which so marked the 1930s.  Japan and Italy had been little punished for their invasions and Nazi Germany, noting Japan’s construction of China as a “technical aggressor”, claimed its 1939 invasion of Poland was a “defensive war” and it had no option but to preemptively invade Poland, thereby halting the alleged Polish plans to invade Germany.  Berlin's claims were wholly fabricated.  The design of the UN was undertaken during the war and structurally was different; an attempt to create something which could prevent aggression.

There has been no lack of examples since 1939.  Both the British and Germans staged preemptive invasions of Norway in 1940 though the IMT at Nuremberg was no more anxious to discuss this Allied transgression than it was war crimes or crimes against humanity by anyone except the Nazis.  The Anglo-Soviet invasion of Iran in 1941 proceeded without undue difficulty but that couldn’t be said of the Suez Crisis of 1956 when the British, French and Israelis staged a war of aggression which not even London was hypocritical enough to claim was pre-emption or prevention; they called it a peace-keeping operation, a claim again wholly fabricated.  The Six-Day War (1967) which began when Israel attacked Egypt is regarded by most in the West as preemptive rather than preventive because of the wealth of evidence suggesting Egypt was preparing to attack, although the term “interceptive self-defense” has also been coined and, except as admirable sophistry, it’s not clear if this is either descriptive or helpful.  However, whatever the view, Israel’s actions in 1967 would seem not to satisfy the Caroline test but whether “…leaving no choice of means, and no moment for deliberation”, written in the age of sail and musketry, could reasonably be held in 1967 to convey quite the same meaning was obviously questionable.

Interest in the doctrine of preemption was renewed following the US invasion of Iraq (2003).  The US claimed the action was a necessity to intervene to prevent Iraq from deploying weapons of mass destruction (WMD) prior to launching an armed attack.  Subsequently, it was found no WMDs existed but the more interesting legal point is whether the US invasion would have been lawful had WMDs been found.  Presumably, Iraq’s resistance to the attack was lawful regardless of the status of the US attack.  The relevant provisions (Article 2(4)) of the UN Charter are considered jus cogens (literally "compelling law" (ie a peremptory norm of international law)).  They prohibit all UN members from exercising "the threat or use of force against the territorial integrity or political independence of any state".  However, this apparently absolute prohibition must be read in conjunction with the phrase "armed attack occurs" (Article 51, in Chapter VII) which differentiates between legitimate and illegitimate military force.  It states that if no armed attack has occurred, no automatic justification for preemptive self-defense has yet been made lawful under the Charter and in order to be justified, two conditions must be fulfilled: (1) that the state must have believed that the threat is real and not a mere perception and (2) that the force used must be proportional to the harm threatened.  As history has illustrated, those words permit much scope for those sufficiently imaginative.

Mr Putin (Vladimir Putin (b 1952; prime-minister or president of Russia since 1999)), although avoiding distasteful words like "aggression", “war” or “invasion”, did use the language associated with preemptive and preventive wars in his formal justification for Russia’s “special military operation” against Ukraine.  Firstly, he claimed Russia is using force in self-defense, pursuant to Article 51 of the Charter, to protect itself from a threat emanating from Ukraine.  This threat, if real, could justify preemptive self-defense because, even if an attack was not “imminent”, there was still an existential threat so grave that it was necessary immediately to act (essentially the same argument the US used in 2003).  This view met with little support, most holding any such theory of preemption is incompatible with Article 51 which really is restricted to permitting anticipatory self-defense in response to imminent attacks.  Secondly, he cited the right of collective self-defense of the Donetsk and Luhansk “republics” although neither are states and even if one accepts they’ve been subject to a Ukrainian attack, the extent of Russia’s military intervention and the goal of regime change in Kyiv appear far to exceed the customary criteria of necessity and proportionality.  Finally, the Kremlin claimed the special military action was undertaken as a humanitarian intervention, the need to stop or prevent a genocide of Russians in Eastern Ukraine.  Few commented on this last point.

Monday, March 28, 2022

Tyrannicide

Tyrannicide (pronounced ti-ran-uh-sahyd or tahy-ran-uh-sahyd)

(1) The act of killing a tyrant.

(2) A person who kills a tyrant.

1640-1650: From the French tyrannicide, from the Latin tyrannicīdium & tyrannicīda, the construct being tyrant + -cide.  Tyrant was from the Middle English tyraun, tiraunt, tyrant & tyrante, from the Old French tyrant, constructed with the addition of a terminal -t to tiran (from the Middle French tyran (a tyrant or bully), from the Latin tyrannus (despot (source also of the Spanish tirano and the Italian tiranno)), from the Ancient Greek τύραννος (túrannos) (usurper, monarch, despot) of uncertain origin but which some have speculated may be a loan-word from a language of Asia Minor (perhaps Lydian); some etymologists compare it to the Etruscan Turan (mistress, lady (and the surname of Venus)).  The evolutionary process was via a back-formation related to the development of French present participles out of the Latin -ans form, thus the unetymological spelling with -t arose in Old French by analogy with present-participle endings in -ant.  The feminine form tyranness seems first to have been documented in 1590, perhaps derived from the Medieval Latin tyrannissa, although whether this emerged from courtiers in palaces or husbands in more humble abodes isn’t recorded.  The plural was tyrants.

In Archaic Greece, tyrant was a technical rather than a casually descriptive term, applied to a usurper (one who gains power and rules extra-legally, distinguished from kings elevated by election or natural succession), something discussed by Jean-Jacques Rousseau (1712–1778) in his landmark The Social Contract (1762) in which he noted “they applied it indifferently to good and bad princes whose authority was not legitimate”.  It’s now used to describe a despot; a ruler who governs unjustly, cruelly, or harshly and, by extension, any person in a position of authority who abuses the power of their position or office to treat others unjustly, cruelly, or harshly.  In Greece, a ruler (tyrannical or otherwise) was variously the archon, basileus or aisymnetes; an unjust ruler or superior is typically now called autocrat, dictator, despot or martinet.  What Rousseau didn’t dwell on was that while in the Greek tradition, the word was not applied to old hereditary sovereignties (basileiai) and despotic kings, it was used of usurpers, even when popular, moderate, and just (the most celebrated in the surviving histories being Cypselus of Corinth in the seventh century BC) but, presumably by unfortunate association, it soon became a word of reproach in the modern sense.  A hint of this may be found in the way Greek theatre of the fourth century BC cherished pathos in regard to tyrannicide.  The noun plural was tyrannicides.

The suffix –cide was from the Middle French -cide, from the Latin -cīda (cutter, killer), from -cīdium (killing), from caedō (to cut, hew, kill) and was a noun-forming suffix denoting “an act of killing or a slaughter”, “one who kills” or “one who cuts”, appended to the appropriate noun stems.  In English, the alternative form was –icide.

Tyrannicide is a noun.  The adjective tyrannous (of tyrannical character) was from the late fifteenth century whereas the now more common adjective tyrannical dates from the 1530s from the Classical Latin tyrannicus (arbitrary, despotic), from the Ancient Greek tyrannikos (befitting a despot) from tyrannos.  The adjectival variation tyrannic was used in this sense from the late fifteenth century and the companion adverb was tyrannically.  The adjective tyrannicidal was a creation of the mid-1800s which gained a new popularity in the next century when examples abounded.  The late fourteenth century noun tyranny (cruel or unjust use of power; the government of a tyrant) was from the thirteenth century Old French tyranie, from the Late Latin tyrannia (tyranny), from the Ancient Greek tyrannia (rule of a tyrant, absolute power) from tyrannos (master).

The tyrannosaurus (carnivorous Cretaceous bipedal dinosaur) was named in 1905 and came to public attention the following year when US paleontologist, geologist (and enthusiastic eugenicist) Henry Fairfield Osborn (1857–1935), who coined the term, published his research in the Bulletin of the American Museum of Natural History, the construct being the Ancient Greek tyrannos + -saurus (from the Ancient Greek σαῦρος (saûros) (lizard, reptile)).  The now familiar abbreviation T-Rex appears not to have been used before 1970 when it was adopted as the name of a pop-group.  In the avian branch of zoology, tyrant birds are members of the family Tyrannidae, which often fight or drive off other birds which approach their nests which seems a bit of a slur.

In the early days of Antiquity, tyrannicide was a part of the political process and rather than being thought of as what would now be called a “criminal” act, it was just another method of transferring power.  As societies evolved and recognizable civilizations emerged from competing cultures, attitudes did change and tyrannicide began to be regarded as a form of murder which might be self-justifying depending on the context and the degree of tyranny eradicated although Aristotle did distinguish between those who committed tyrannicide for personal gain and those (rare) disinterested souls who did it for the good of the community.

However intricately philosophers and legal theorists added the layers of nuance, tyrannicide (many instances of which were of course also acts of regicide ("the killing of a king" (used also for assassinated queens, ruling princes etc) or "one who does the killing", from the Latin rēgis (king (genitive singular of rēx)) + -cide (killer), patterned after suicide, tyrannicide etc)) remained a popular and expedient way to hasten dynastic or political change.  It could be said the Peace of Westphalia (1648), which ended the Thirty Years' War (1618–1648) and Eighty Years' War (1568–1648) and established the principle that the religion a ruler chose to adopt for himself and his nation was a purely internal matter and not one to be changed by foreign intervention, represented the beginning of an international law which would come to outlaw the assassinations of rulers, tyrants or not.  That however is a retrospective view and not one at the time discussed.

Nor would legal niceties have been likely much to influence those who would wish to kill a tyrant, some of whom have even claimed some justification under natural law.  Whether Brutus (85-42 BC) ever uttered the phrase Sic semper tyrannis (thus always to tyrants) after stabbing Julius Caesar (100-44 BC) or not (the historian Plutarch (46-circa 122) maintained he did not), it resonated through history, John Wilkes Booth noting in his diary that he shouted "Sic semper tyrannis" after killing Abraham Lincoln (1809–1865; US president 1861-1865) in 1865.  History doesn’t record if the words were on the lips of those who either attempted or succeeded in dispatching Adolf Hitler (1944), Benito Mussolini (1945), Nicaraguan dictator Anastasio Somoza García (1956), the Dominican Republic’s dictator Rafael Trujillo (1961), South Korean dictator Park Chung-hee (1979), President Anwar Sadat of Egypt (1981), Afghan President Mohammad Najibullah (1996) & Colonel Muammar Gaddafi (2011), but it can be imagined they weren’t far from the assassins’ thoughts.

International law did however evolve to the point where the UN’s Convention on the Prevention and Punishment of Crimes against Internationally Protected Persons was presented in 1973, coming into force in 1977 and eventually ratified by 180 countries.  Although the convention was inspired by a spike in the assassination of diplomats in the early 1970s, the protection was extended to tyrants, the wording of the relevant clause being in Article 1a which declared that the ranks of “internationally protected persons” included:

A Head of State, including any member of a collegial body performing the functions of a Head of State under the constitution of the State concerned, a Head of Government or a Minister for Foreign Affairs, whenever any such person is in a foreign State, as well as members of his family who accompany him.

While it’s true Libya’s ratification of the convention didn’t save Colonel Gaddafi from becoming a victim of tyrannicide, he would at least have died knowing he was being assassinated in contravention of a UN convention.  Whether Joe Biden (b 1942; US president since 2021) was either explicitly calling for or hinting that an act of tyrannicide should be visited upon Vladimir Putin excited much interest recently when the US president labeled his Russian counterpart as a “butcher” who “cannot remain in power”.  It certainly could be construed as a call for Mr Putin’s “removal”, despite the White House in recent weeks having repeatedly emphasized that regime change in Russia is not US policy.  “For God’s sake, this man cannot remain in power” Mr Biden said at the end of his speech in front of the Royal Castle in Warsaw, an unscripted sentiment he apparently added in the heat of the moment.

Methods of tyrannicide vary: this is the kiss of death.

It took only minutes for the White House damage-control team to scramble, playing down the remarks with a Kafkaesque assertion that the president “was not discussing Putin’s power in Russia, or regime change” but was instead making the point that Putin “…cannot be allowed to exercise power over his neighbors or the region.”  Within Washington DC’s Capital Beltway the internal logic of the distinction makes complete sense, the White House insisting, a la the Barry Goldwater (1909–1998; Republican presidential candidate 1964) school of clarity of expression, that what matters is not what Mr Biden says but what he means and they’re here to explain that.  Perhaps the staff should give Mr Biden a list of helpful ways of advocating tyrannicide.  Arthur Calwell (1896–1973; Leader of the Australian Labor Party 1960-1967) didn’t escape controversy when he called for “the visitation of the angel of death” upon the tyrannical Archbishop Daniel Mannix (1864–1963; Roman Catholic Archbishop of Melbourne 1917-1963) but it was more poetic than Mr Biden’s efforts and Calwell, if accused of advocating tyrannicide, could point out he was calling merely for episcopicide (the killing of a bishop, the construct being the Latin episcopus (bishop in a Christian church who governs a diocese), from the Ancient Greek ἐπίσκοπος (epískopos) (overseer), the construct being ἐπί (epí) (over) + σκοπός (skopós) (watcher, lookout, guardian) + -cide), something with a long if not always noble tradition.

US Secretary of State Antony Blinken (b 1962; US secretary of state since 2021), noted for his precision of oral expression, followed up by saying it wasn’t the intention of Mr Biden to topple Mr Putin.  “The president made the point last night that, quite simply, President Putin cannot be empowered to wage war or engage in aggression against Ukraine or anyone else” Mr Blinken said while speaking in Jerusalem on Sunday, adding that “the US did not have a strategy of regime change in Russia or anywhere else”.  It’s “…up to the people of the country in question… the Russian people”.

Given the context of Mr Biden’s speech, it wasn’t difficult to understand why it aroused such interest.  Earlier, he’d called the invasion of Ukraine an act of aggression “…nothing less than a direct challenge to the rule-based international order established since the end of World War II” and said that the valiant resistance of the Ukrainian people was a “battle for freedom” and the world must prepare for a “long fight ahead”.  “We stand with you,” he told Ukrainians in the speech which had begun with the famous words of the Polish Pope Saint John Paul II (1920–2005; pope 1978-2005): “Be not afraid”, a phrase associated with an earlier call for regime change within the countries of what was then the Warsaw Pact.  In remarks addressed directly to citizens of Russia, he added: “This war is not worthy of you, the Russian people”.

The Kremlin’s displeasure at the remarks was soon expressed, prompting the White House cleaners to explain that what Mr Biden said was not what he meant and by Sunday the president appeared to be back on-message.  When asked by a reporter if he was calling for regime change in the Kremlin, he answered: “No”.

Forms in English constructed with the suffix –cide.

Monday, January 8, 2024

Solemncholy

Solemncholy (pronounced sol-uhm-kol-ee)

(1) Solemn; serious.

(2) Solemn and melancholic.

1772: The construct was solemn +‎ (melan)choly.  The element –choly was never a standard suffix and was a Middle English variant of –colie used in French.  The Middle English adjective solemn dated from the late thirteenth century and was from solemne & solempne, from either Old French or directly from the Late Latin sōlennis & sōlempnis or the Classical Latin sōlemnis, a variant of sollemnis (consecrated, holy; performed or celebrated according to correct religious forms) which has always been of obscure origin although Roman scholars thought it could have come only from sollus (whole; complete), the derivative adjective formed by appending the noun annus (year), thus the idea of sollemnis meaning “taking place every year”.  Not all modern etymologists are convinced by that but acknowledge “some assimilation via folk-etymology is possible”.  In English, the extension of meaning from “annual events; sacred rites, ceremonies, holy days” to “a grave and serious demeanor; mirthless” was associative describing the behaviour expected of individuals attending such events.  Over time, the later sense became dissociated from the actual events and the original meaning became obsolete, surviving only in a handful of formal ecclesiastical calendars.  The word, without any reference to religious ceremonies meaning “marked by seriousness or earnestness” was common by the late fourteenth century, the sense of “fitted to inspire devout reflection” noted within decades.    Solemncholy is an adjective and no sources list the noun solemncholic or the adverb solemncholically as standard forms although, by implication, the need would seem to exist.  Emos presumably apply the adjectival comparative (more solemncholy) & superlative (most solemncholy) and perhaps too (during emo get-togethers) the plural forms solemncholics & solemncholies.

Melancholy was from the Middle English melancolie & malencolie (mental disorder characterized by sullenness, gloom, irritability, and propensity to causeless and violent anger), from the thirteenth century Old French melancolie (black bile; ill disposition, anger, annoyance), from the Late Latin melancholia, from the Ancient Greek μελαγχολία (melancholia) (atrabiliousness; sadness, (literally “excess of black bile”)), the construct being μέλας (mélas) or μελαν- (melan-) (black, dark, murky) + χολή (kholḗ) (bile).  It appeared in Latin as ātra bīlis (black bile) and was for centuries part of orthodox medical diagnosis; the adjectival use was a genuine invention of Middle English although whether the use of the –ly as a component of the suffix was an influence or a product isn’t known.  Pre-modern medicine attributed what would now be called “depression” to excess “black bile”, a secretion of the spleen and one of the body's four “humors” which needed to be “in balance” to ensure physical & mental well-being.  The adjectival use in Middle English to describe “sorrow, gloom” was most associated with unrequited love or doomed affairs but this is likely more the influence of poets than doctors.  As the medical profession’s belief in the four humors declined during the eighteenth century with improved understanding of human physiology, the word was in the mid-1800s picked up by the newly (almost) respectable branch of psychiatry where it remained a defined “condition” until well into the twentieth century.

In antiquity the theory of the humors was a concept rather than a standardized systemization and there existed competing models with more or fewer components but it is because the description with four was that endorsed by the Greek physician Hippocrates (circa 460–circa 370 BC) that it became famous in the West and was absorbed into medical practice.  The four humors of Hippocratic medicine were (1) black bile (μέλαινα χολή (melaina chole)), (2) yellow bile (ξανθη χολή (xanthe chole)), (3) phlegm (φλέγμα (phlegma)) & (4) blood (αἷμα (haima)), each corresponding with one of the four temperaments of man and linked also to the four seasons: yellow bile=summer, black bile=autumn, phlegm=winter & blood=spring.  Since antiquity, doctors and scholars wrote both theoretical and clinical works, the words melancholia and melancholy used interchangeably until the nineteenth century when the former came to refer to a pathological condition, the latter to a temperament.  Depression was derived from the Latin verb deprimere (to press down); from the fourteenth century, "to depress" meant to subjugate or to bring down in spirits and by 1665 it was applied to someone having "a great depression of spirit", Dr Johnson (Samuel Johnson, 1709-1784) using the word in a similar sense in 1753.  Later, the term came into use in physiology and economics.

What was for over two-thousand years known as melancholia came gradually to be called depression, a reclassification formalized in the mid-twentieth century when mental illness was subject to codification.  The first edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM (1952)) included depressive reaction and the DSM-II (1968) added depressive neurosis, defined as an excessive reaction to internal conflict or an identifiable event, and also included a depressive type of manic-depressive psychosis within the category of Major Affective Disorders.  The term Major Depressive Disorder was introduced by a group of US clinicians in the mid-1970s and was incorporated into the DSM-III (1980).  Interestingly, the ancient idea of melancholia survives in modern medical literature in the notion of the melancholic subtype but, from the 1950s, the newly codified definitions of depression were widely accepted (although not without some dissent) and the nomenclature, with enhancements, continued in the DSM-IV (1994) and DSM-5 (2013).

According to the Oxford English Dictionary (OED), the earliest known instance of solemncholy in text dates from 1772 in the writings of Philip Vickers Fithian (1747–1776), peripatetic tutor, missionary & lay-preacher of the Presbyterian denomination of Christianity, now best remembered for his extensive diaries and letters which continue to provide historians with source material relating to the pre-revolutionary north-eastern colonies which would later form the United States of America.  His observations on slavery and the appalling treatment of those of African origin working the plantations in Virginia remain a revealing counterpoint to the rationalizations and justifications (not infrequently on a theological or scriptural basis) offered by many other contemporary Christians.  Those dictionaries which include an entry for solemncholy often note it as one of the humorous constructions in English, based usually on words from other languages or an adaptation of a standard English form.  That’s certainly how it has come to be used but Fithian was a Presbyterian who aspired to the ministry, not a breed noted for jocularity, and in his journal entries it's clear he intended the word to mean only that he was pursuing serious matters, in 1773 writing: “Being very solemncholy and somewhat tired, I concluded to stay there all night.”

So it was an imaginative rather than a fanciful coining.  In contemporary culture, with mental health conditions increasingly fashionable, solemncholy (although still sometimes, if rarely, used in its original sense) found a new niche among those who wished to intellectualize their troubled state of mind and distinguish their affliction from mere depression which had become a bit common.  In a roundabout way, this meant it found a role too in humor, a joke about someone’s solemncholy still acceptable whereas to poke fun at their depression would be at least a micro-aggression:

Q: Victoria says she suffers from solemncholy.  Do you think that's a real condition?

A: Victoria is an emo; for her solemncholy is a calling.

Lindsay Lohan and her lawyer in court, Los Angeles, December 2011.

The companion term to solemncholy is the rarely encountered leucocholy (a state of feeling that accompanies preoccupation with trivial and insipid diversions).  The construct of leucocholy was leuco- + (melan)choly.  The leuco- prefix (which had appeared also as leuko-, leuc- & leuk-) was from the Proto-Hellenic λευκός (leukós) (white; colourless; leucocyte), from the primitive Indo-European lewk- (white; light; bright), the cognates including the Latin lūx, the Sanskrit रोचते (rocate), the Old Armenian լոյս (loys) and the Old English lēoht (light, noun) from which English gained “light”.  In the Ancient Greek, the word evolved to enjoy a range of meanings, just as would happen in English, including (1) bright, shining, gleaming, (2) light in color; white, (3) pale-skinned, weakly, cowardly & (4) fair, happy, joyful.  Leucocholy is said to have been coined by the English poet and classical scholar Thomas Gray (1716–1771) whose oeuvre was highly regarded despite being wholly compiled into one slim volume and he’s remembered also for declining appointment as England’s Poet Laureate, thereby forgoing both the tick of approval from the establishment and the annual cask of “strong wine” which came with the job.  What he meant by a “white melancholy” seems to have been a state of existence in which there may not be joy or enchantment but which is pleasant: unfulfilling yet undemanding.  In such a state of mind, as he put it:  ça ne laisse que de s’amuser (which translates most elegantly as something like “all that is left for us is to have some fun”).