
Tuesday, January 11, 2022

Modernism

Modernism (pronounced mod-er-niz-uhm)

(1) A modern usage or characteristic; modern character, tendencies, or values; adherence to or sympathy with what is modern.

(2) In theology, a movement in Roman Catholic thought that sought to interpret the teachings of the Church in the light of philosophic and scientific conceptions prevalent in the late nineteenth and early twentieth centuries.  The movement was condemned in 1907 by Pope Pius X (always with initial capital).

(3) In theology, the liberal theological tendency in Protestantism which emerged in the twentieth century and became the intellectual core of the moderate wings in many denominations, most obviously displayed in the interplay between the liberal and evangelical factions at recent Lambeth Conferences.

(4) A deliberate philosophical and practical estrangement or divergence from the techniques and practices of the past in the arts and literature, occurring from late in the nineteenth century and (a little arbitrarily) thought to have concluded in the post-war years.  Taking the form of various innovative movements and styles, it was an aesthetic descriptor used mostly in architecture, literature and art to label work thought modern in character, quality of thought, expression or technique (sometimes with initial capital).

1730-1740: The construct is modern + ism.  Modern was from the Middle French moderne, from the Late Latin modernus (modern), from the Classical Latin adverb modo (just now; in a (certain) manner; to the measure), originally an ablative of modus (manner; measure) and hence, by measure, "just now".  Root was the primitive Indo-European med- (take appropriate measures).  The adjective modern entered English circa 1500 in the sense of "now existing", the meaning "of or pertaining to present or recent times" attested from the 1580s.  The verb modernize (give a modern character or appearance to, cause to conform to modern ideas, adapt to modern persons) has existed since the 1680s, thought probably a borrowing from the French moderniser rather than a native coining.

In the narrow, technical sense, modern began in English enjoying a broad meaning, simply a description of that which was not ancient or medieval, but it quickly gained niches, Shakespeare using it to indicate something "every-day, ordinary, commonplace", and the meaning "not antiquated or obsolete, in harmony with present ways" was formalized by the early nineteenth century.  The scientific linguistic division of historical languages into old, middle, and modern began only in the nineteenth century; the first university department of modern languages (ie those still living (unlike Latin and Ancient Greek) and of some literary or historical importance) was created in 1821, although the use of Modern English can be traced from at least circa 1607 when jurist John Cowell (1554–1611) published The Interpreter, a dictionary of law which, inter alia, examined and explained the words of Old and Middle English.  The extended form modern-day was noted from 1872; modern dance is attested by 1912; modern conveniences (increasingly electrically-powered labour-saving devices) dates from 1926; modern jazz was used in 1954 and the slang abbreviation mod (a tidy, sophisticated teen-ager, one contrasted usually with a disreputable rocker or some other delinquent) dates from a surprisingly early 1960.

There have been references to modern art since 1807 but it then was simply a chronological acknowledgement in the sense of something being recent rather than a recognizable style or school; that sense, “work representing a departure from or repudiation of traditional or accepted styles”, is not attested until 1895, the idea by 1897 extended to the individual, a modern person being one thought “thoroughly up to date”.  Although it would take decades to assume its (now) modern meaning, as an adjective, post–modern appears first to have been used in 1919 and by the late 1940s had become common in the language of architects (presumably the first among the modernists to feel no longer modern); use extended to the arts in the 1960s, infecting politics, literature and just about every form of criticism.

The –ism suffix is from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).

Tatra T77, Czechoslovakia, 1934.

Confusion is sometimes attached to the words “modernity” and “modernism”.  The noun modernity, dating from the 1620s, meant simply the "quality or state of being modern", an elaborated way of saying “recent”, the sense "something that is modern" noted since 1733.  As a word to describe specifically the epoch which began in the early twentieth century, it was in use before the First World War.  The sense of that modernity was the particular combination of circumstances and forces which had coalesced to produce in Europe and North America a society which had in technical and other ways progressed with greater rapidity than any in history: (1) a long period of general peace, (2) a level of prosperity which, although uneven and often erratic, was unprecedented, (3) an array of technical innovations (electricity, radio, flight, telegraphs, internal combustion engines etc), (4) increasing literacy and a proliferation of mass-media, (5) an on-rush of new and transformative theories in psychology, philosophy & science (Freud, Darwin, Einstein et al), and (6) a relaxation of historically censorious attitudes which allowed innovations in art and literature to flourish.

Woman with a Hat (1905) by Henri Matisse (1869-1954).

Modernism, a word Dr Johnson (1709-1784) claimed was invented by the novelist Jonathan Swift (1667–1745) to suggest any "deviation from the ancient and classical manner", had by 1830 come to refer generally to "modern ways and styles" but since the mid-1920s it’s been used exclusively to describe the movements in art, music, architecture and literature which depart in some radical way from classical and other traditional antecedents.  By 1925, the noun modernist was applied to a producer or follower of the work of the movement.  In the 1580s it had meant "one who admires or prefers the modern"; something similar but without the political and other connotations modernity imposed.

Monument House of the Bulgarian Communist Party (Buzludzha Monument), Buzludzha Peak, Bulgaria (1974-1981).  Now a ruin.

Modernism, even if considered separately within the genres where the label was adopted, had no coherence of style, intent or result, its significance being what it was not, in that it deliberately aimed to depart from classical and traditional forms.  Although it’s still contested, much of what began as modernist art is attributed to the modernists’ growing alienation from the strictures and conventions which had characterized their formative years during the Victorian age.  It defined an epoch and, somewhere in the post-war period, became part of the history of art or architecture; for those for whom modernism is now just another product, such as art dealers, there’s really no distinction between modernist and postmodernist phases, both grouped under the rubric of Modern Art.

Portrait of Wilhelm Uhde (1910) by Pablo Picasso (1881-1973), analytical cubism from his cubist period.

Modernist literature was influenced by the bleak surroundings created by urbanization and industrialization but it was the blast of the Great War which left writers grasping for a response to a much-changed world.  The roots of modernist literature can be found in pre-war works but the war had undermined faith in the foundations of Western society and culture, which early in the century had been viewed with an optimism it’s now hard to convey.  An inescapable element in modernism was a rejection of beauty, as if humanity had proved itself somehow unworthy of loveliness, and modernist literature reflected disillusionment and betrayal.  TS Eliot’s (1888-1965) epic-length poem The Waste Land (1922) is emblematic, reflecting the search for redemption in a spiritually empty landscape; technically, its discordant images and deliberate obscurity typify modernism and preview the techniques developed in post-modernism which demand the reader be involved in interpreting and deconstructing the text.

Marilyn Monroe reading Ulysses, 1955.

Modernist fiction however, in content or technique, wasn’t monolithic and even to label it a genre requires that word to adopt a nuance of meaning; the works could explore the life of one, of a group or of all humanity and ranged from the nihilistic to the utopian.  The great landmark was James Joyce’s (1882-1941) Ulysses (1922), a work of sometimes impenetrable density which adopted the even then familiar device of stream of consciousness to document the events of one day in the lives of three Dubliners, the idea being the abandonment of structural order, conventional sentence construction sacrificed in an attempt to capture the fragmentary nature of human thought.  Not all readers were seduced by the approach but Ulysses picked up a cult following which endures to this day, some adherents compelled even to praise Joyce’s Finnegans Wake (1939), which a few praise as rare genius and others as merely modernism pursued to its inevitably unintelligible conclusion.  Reading it is hard work; Anthony Burgess (1917–1993), a fair critic of modernity, claimed to find a laugh on every page of Finnegans Wake, but for most, it's no fun.

Paysage coloré aux oiseaux aquatiques (Colored Landscape with Aquatic Birds, circa 1907) by Jean Metzinger (1883-1956).

In painting, the roots really lie in mannerism although elements can be traced through thousands of years of what is, strictly speaking, non-representational art, exemplified in many aspects by the sometimes (and not deliberately) almost abstract Italian primitives created between the late eleventh and mid-thirteenth centuries.  However, historians regard the first of the modernists as Édouard Manet (1832-1883) who, from his early thirties, began to distort perspective and devalue representation, using techniques of brushstroke to emphasize the very nature of the flat canvas on which he worked, something artists had for centuries crafted to disguise.  In his wake came the rush of -isms that would characterize modernism: Impressionism, Cubism, Constructivism, Post-Impressionism, Futurism & Expressionism, an array of visions which would see the techniques and physical construction of the works vie for attention (at least among fellow artists and critics) with what was once thought the art itself.

TWA Flight Center (1959-1962) by Eero Saarinen and Associates, John F Kennedy International Airport, New York.

In architecture, while not literally technologically deterministic, the advances in the technologies available made possible the abandonment of earlier necessities.  However, while the existence of steel frames and increasingly complex shapes rendered in reinforced concrete may have allowed what came to be known as the “new international style”, neither precluded the ongoing use of decoration and embellishment, the discarding of which was a hallmark of modernist architecture.  It was the choice of the architects to draw simple geometric shapes with unadorned facades, a starkness associated especially with the mid-to-late twentieth century steel and glass skyscrapers.  Architects are less keen to be associated with some other high-rise buildings of the era, especially the grim housing projects which blighted so much of the urban redevelopment of the post-war years; in many of these there was none of the compelling grandeur of the best neo-brutalism.

Sunday, September 24, 2023

Futurism

Futurism (pronounced fyoo-chuh-riz-uhm)

(1) A movement in avant-garde art, developed originally by a group of Italian artists in 1909, in which forms (derived often from the then novel cubism) were used to represent rapid movement and dynamic motion (sometimes with initial capital letter).

(2) A style of art, literature, music, etc and a theory of art and life in which violence, power, speed, mechanization or machines, and hostility to the past or to traditional forms of expression were advocated or portrayed (often with initial capital letter).

(3) As futurology, a quasi-discipline practiced by (often self-described) futurologists who attempt to predict future events, movements, technologies etc.

(4) In the theology of Judaism, the Jewish expectation of the messiah in the future rather than recognizing him in the person of Christ.

(5) In the theology of Christianity, eschatological interpretations associating some Biblical prophecies with future events yet to be fulfilled, including the Second Coming.

1909: From the Italian futurismo, the construct being futur(e) + -ism.  Future was from the Middle English future & futur, from the Old French futur (that which is to come; the time ahead), from the Latin futūrus (going to be; yet to be) which (as a noun) was the irregular suppletive future participle of esse (to be), from the primitive Indo-European bheue- (to be, exist; grow).  It was cognate with the Old English bēo (I become, I will be, I am) and displaced the native Old English tōweard and the Middle English afterhede (future (literally “afterhood”)) in the given sense.  The technical use in grammar (of tense) dates from the 1520s.  The –ism suffix was from the Ancient Greek ισμός (ismós) & -isma noun suffixes, often directly, sometimes through the Latin –ismus & -isma (from where English picked up -ize) and sometimes through the French –isme or the German –ismus, all ultimately from the Ancient Greek (where it tended more specifically to express a finished act or thing done).  It appeared in loanwords from Greek, where it was used to form abstract nouns of action, state, condition or doctrine from verbs and on this model, was used as a productive suffix in the formation of nouns denoting action or practice, state or condition, principles, doctrines, a usage or characteristic, devotion or adherence (criticism; barbarism; Darwinism; despotism; plagiarism; realism; witticism etc).  Futurism and futurology are nouns, futurist is a noun & adjective and futuristic is an adjective; the noun plural is futurisms.

Lindsay Lohan in Maison Martin Margiela Futuristic Eyewear.

As a descriptor of the movement in art and literature, futurism (as the Italian futurismo) was adopted in 1909 by the Italian poet Filippo Tommaso Marinetti (1876-1944) and the first reference to futurist (a practitioner in the field of futurism) dates from 1911 although the word had been used as early as 1842 in Protestant theology in the sense of “one who holds that nearly the whole of the Book of Revelations refers principally to events yet to come”.  The secular world did begin to use futurist to describe "one who has (positive) feelings about the future" in 1846 but for the remainder of the century, use was apparently rare.  The (now probably extinct) noun futurity dates from the early seventeenth century.  The noun futurology was introduced by Aldous Huxley (1894-1963) in his book Science, Liberty and Peace (1946) and has (for better or worse) created a minor industry of (often self-described) futurologists.  Futures, financial instruments used in the trade of currencies and commodities, appeared first in 1880; they allow (1) speculators to bet on price movements and (2) producers and sellers to hedge against price movements.  In theology, the adjective futuristic came into use in 1856 with reference to prophecy but use soon faded.  In concert with futurism, by 1915 it referred in art to “avant-garde; ultra-modern” while by 1921 it had separated from the exclusive attachment to art and meant also “pertaining to the future, predicted to be in the future”, the use in this context spiking rapidly after World War II when technological developments in fields such as ballistics, jet aircraft, space exploration, electronics and nuclear physics stimulated interest in such progress.

The Arrival (1913), oil on canvas by Christopher Richard Wynne Nevinson (1889-1946), Tate Gallery.

Given what would unfold during the twentieth century, it’s probably difficult to appreciate quite how optimistic the Western world was in the years leading up to World War I.  Such had been the rapidity of the discovery of novelties and of progress in so many fields that expectations of the future were high and, beginning in Italy, futurism was a movement devoted to displaying the energy, dynamism and power of machines and the vitality and change they were bringing to society.  It’s also often forgotten that when the first futurist exhibition was staged in Paris in 1912, the critical establishment was unimpressed, the elaborate imagery with its opulence of color offending their sense of refinement, by then so attuned to the sparseness of the cubists.

The Hospital Train (1915), oil on canvas by Gino Severini (1883-1966), Stedelijk Museum.

Futurism debuted with some impact, the Paris newspaper Le Figaro in 1909 publishing the manifesto by the Italian poet Filippo Tommaso Marinetti, which dismissed all that was old and celebrated change, originality and innovation in culture and society, something which should be depicted in art, music and literature.  Marinetti exulted in the speed and power of the new technologies which were disrupting society: automobiles, aeroplanes and other clattering machines.  Whether he found beauty in the machines or in the violence and conflict they delivered was something he left his readers to decide, and there were those seduced by both, but his stated goal was the repudiation of traditional values and the destruction of cultural institutions such as museums and libraries.  Whether this was intended as a revolutionary roadmap or just a provocation to inspire anger and controversy is something historians have debated.

Uomo Nuovo (New Man, 1918), drawing by Mario Sironi (1885-1961).

As a technique, the futurist artists borrowed much from the cubists, deploying the same fragmented and intersecting plane surfaces and outlines to render a number of simultaneous, overlaid views of an object but whereas the cubists tended to still life, portraiture and other, usually static, studies of the human form, the futurists worshiped movement, their overlays a device to depict rhythmic spatial repetitions of an object’s outlines during movement.  People did appear in futurist works but usually they weren’t the focal point, instead appearing only in relation to some speeding or noisy machine.  Some of the most prolific of the futurist artists were killed in World War I and as a political movement it didn’t survive the conflict, the industrial war dulling the public appetite for the cult of the machine.  However, the influence of the compositional techniques continued in the 1920s and contributed to art deco which, in more elegant form, would integrate the new world of machines and mass-production into motifs still in use today.

Jockey Club Innovation Tower, Hong Kong (2013) by Zaha Hadid (1950-2016).

If the characteristics of futurism in art were identifiable (though not always admired), in architecture it can be hard to tell where modernism ends and futurism begins.  Aesthetics aside, the core purpose of modernism was of course its utilitarian value and that did tend to dictate the austerity, straight lines and crisp geometry that evolved into mid-century minimalism, so modernism, in its pure form, should probably be thought of as a style without an ulterior motive.  Futurist architecture however carried an agenda which in its earliest days borrowed from the futurist artists in that it was an assault on the past, but it later moved on and in the twenty-first century the futurist architects seem interested above all in the possibilities offered by advances in structural engineering, functionality sacrificed if need be just to demonstrate that something new can be done.  That's doubtless of great interest at awards dinners where architects give prizes to each other for this and that but has produced an international consensus that it's better to draw something new than something elegant.  The critique is that while modernism once offered “less is more”, with neo-futurist architecture it's now “less is a bore”.  Art deco and mid-century modernism have aged well and it will be interesting to see how history judges the neo-futurists.

Wednesday, January 25, 2023

Schizophrenia

Schizophrenia (pronounced skit-suh-free-nee-uh or skit-suh-freen-yuh)

(1) In psychiatry (also called dementia praecox), a severe mental disorder characterized by some, but not necessarily all, of the following features: withdrawal from reality, illogical patterns of thinking, delusions and hallucinations, emotional blunting, intellectual deterioration, social isolation, and disorganized speech and behavior.

(2) A state characterized by the coexistence of contradictory or incompatible elements; informal behavior that appears to be motivated by contradictory or conflicting principles.

(3) In informal use, used to suggest a split personality, identity or other specific forms of dualism.  In popular usage, the term is often confused with dissociative identity disorder (also known as multiple personality disorder).

1908: From the German Schizophrenie, from the New Latin schizophrenia, coined by Swiss psychiatrist Eugen Bleuler (1857-1939) as an umbrella term covering a range of more or less severe mental disorders involving a breakdown of the relation between thought, emotion, and action; literally "a splitting of the mind", the construct being the Ancient Greek σχίζω (skhízō) (to split), from the primitive Indo-European root skei- (to cut, split apart), + φρήν (phrḗn) (genitive phrenos) (mind, heart, diaphragm) + -ia (the suffix from the Latin -ia and Ancient Greek -ία (-ía) & -εια (-eia), which forms abstract nouns of feminine gender).  It's from phrḗn that English gained phrenes (wits, sanity) and hence phreno-.

The adjective schizophrenic (characteristic of or suffering from schizophrenia) dates in the medical literature from 1912 (in English translations of Bleuler's publications) and was immediately adopted also as a noun (schizophrenic patient).  That survived but another noun formation in English was schizophrene, which emerged in 1925, the construct presumably a tribute to Dr Bleuler's original work having been written in German.  As such things became more publicized during the post-war years (and picked up in popular culture including film and novels), the transferred adjectival sense of "contradictory, inconsistent" emerged in the mid 1950s, applied to anything from the behavior of race horses and motor-cycles to the nature of musical composition.  The jargon of psychology also produced schizophrenogenic (tending to spark or inspire schizophrenia).  The adjective schizoid (resembling schizophrenia; tending sometimes to less severe forms of schizophrenia) dates from 1925, from the German coining schizoid (1921), the construct being schiz(ophrenia) + -oid.  The suffix -oid was from a Latinized form of the Ancient Greek -ειδής (-eidḗs) & -οειδής (-oeidḗs) (the “ο” being the last vowel of the stem to which the suffix is attached), from εἶδος (eîdos) (form, shape, likeness).  It was used (1) to denote resembling; having the likeness of (usually including the concept of not being the same despite the likeness, but counter-examples exist), (2) to mean of, pertaining to, or related to and (3) when added to nouns to create derogatory terms, typically referring to a particular ideology or group of people (by means of analogy to psychological classifications such as schizoid).  Schizophrenia is a noun, schizophrenic & schizoid are nouns & adjectives and schizophrenically is an adverb; the noun plural is schizophrenics.

Madness

Within the profession of psychiatry, schizophrenia has a long (and technical) definitional history although, in essence, it’s always been thought a severe and chronic mental disorder characterized by disturbances in thought, perception and behavior.  Lacking any physical or laboratory test, it can be difficult to diagnose as schizophrenia involves a range of cognitive, behavioral, and emotional symptoms.  In the fifth edition of the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-5 (2013)), the lifetime prevalence of schizophrenia is noted as 0.3%-0.7%, the psychotic manifestations typically emerging between the mid-teens and mid-thirties, with the peak age of onset of the first psychotic episode in the early to mid-twenties for males and late twenties for females.  The DSM-5 editors also made changes to the criteria to make a diagnosis of schizophrenia, the most significant amendments since DSM-III (1980).

Madness and Modernism: Insanity in the Light of Modern Art, Literature, and Thought (1992), by US clinical psychologist Louis A Sass (b 1949), was an exploration of why mystery continues to shroud schizophrenia, which, despite advances in biological psychiatry and neuroscience, appears little changed in the quarter-century since.  Sass quoted approvingly a description of schizophrenia as "a condition of obscure origins and no established etiology, pathogenesis and pathology…" without "…even any clear disease marker or laboratory test by which it can readily be identified."

However, in a departure from most writings on mental illness, Sass explored the "striking similarities" between the seemingly bizarre universe of schizophrenic experiences and the sensibilities and structures of consciousness revealed in the works of modernist artists and writers such as Kafka, Valery, Beckett, Robbe-Grillet, de Chirico and Dali.  Applying the techniques of psychology to modernism, he traced similar cognitive configurations reflected in schizophrenia and modern art & literature, finding both artist and schizophrenic characterized by a pronounced thrust to deconstruct the world and subjectively to reconstruct human experience without reference to objective reality.  Layers of reality, real and constructed, co-exist and interact, frequently fusing into each other, producing an acute self-awareness Sass called "hyperreflexivity", as well as a profound sense of alienation from the empirical world.  Sass allowed his analysis to reach its logical conclusion, that there is a tenuous, though clearly discernible, connection between modern culture and madness, speculating that insanity might be “…a disease of certain highly advanced forms of cultural organization, perhaps a part of the price we pay for civilization?"  His thesis wasn’t without critics although most acknowledged Madness and Modernism was a literary classic.

Duncan's Ritual Of Freemasonry (2021 edition) by Malcolm C Duncan, Lushena Books, 288 pp, ISBN-10-1631829904.

As the DSM makes clear, not all schizophrenics are the same.  In 2011, Lindsay Lohan was granted a two-year restraining order against alleged stalker David Cocordan.  The order was issued some days after she filed a complaint with police who, after investigation by their Threat Management Department, advised the court Mr Cocordan (who at the time had been using at least five aliases) “suffered from schizophrenia”, was “off his medication” and had a “significant psychiatric history of acting on his delusional beliefs”.  That was worrying enough but Ms Lohan may have revealed her real concerns in an earlier post on Twitter in which she included a picture of David Cocordan, claiming he was "the freemason stalker that has been threatening to kill me- while he is TRESPASSING!"  Being stalked by a schizophrenic is bad enough but the thought of being hunted by a schizophrenic Freemason is truly frightening.  Apparently an unexplored matter in the annals of psychiatry, it seems the question of just how schizophrenia might particularly manifest in Freemasons awaits research, so there may be a PhD there for someone.

Thursday, September 8, 2022

Terpsichore

Terpsichore (pronounced turp-sik-uh-ree)

(1) In Classical Mythology, the goddess of dancing and choral song and one of the nine Muses who were daughters of Zeus (god of sky and thunder) & Mnemosyne (the goddess of memory).

(2) In choreography, the art of dancing (should always be lowercase).

(3) In astronomy (as 81 Terpsichore), a main belt asteroid.

Circa 1760: From the Classical Latin Terpsichorē, from the Ancient Greek Τερψιχόρη (Terpsikhórē) (literally “enjoyment of dance”), noun use of the feminine of terpsíchoros (delighting in the dance), the construct being τέρψις (térpsis; térpein) (to delight; enjoyment) + χορός (khorós) (dance; chorus); it’s from khorós that English gained chorus.  The Greek térpein was from the primitive Indo-European root terp- (to satisfy), source also of the Sanskrit trpyati (takes one's fill) and the Lithuanian tarpstu & tarpti (to thrive, prosper).

The adjective terpsichorean (pertaining to dancing (literally “of Terpsichore”)) dates from 1869, and was from the Latinized form of the Greek noun terpsikhore (Muse of dancing and dramatic chorus).  From this came the theatrical slang terp (stage dancer, chorus girl) noted since 1937.  The adjectival form terpsichorean often appears with an initial capital letter because of its etymology from a proper noun.  Either is acceptable but the conventions of Modern English tend eventually to prevail which suggests use of the capital T will reduce with time but, given the rarity of the word except in a few technical and historical disciplines, the classic form is likely to endure among those few who enjoy its use.

In the mythology of Ancient Greece, nine goddesses ruled over art and literature.  The Greeks called them Muses and the Muse of dance and choral music was Terpsichore.  Of late she’s been of interest to astronomers who adopted her as a metaphor for the rhythm and ordered movement in the universe, such as mechanical oscillations.  In Archaic Greece, there were but three Muses, all associated with song and dance; it was only in the classical period they became nine and assigned to distinct spheres: Calliope of epic poetry, Clio of history, Euterpe of lyric poetry, Erato of love poetry, Melpomene of tragedy, Polyhymnia of hymns, Thalia of comedy, Urania of astronomy, and Terpsichore of dance and choral music.  The daughters of Zeus and Mnemosyne, they dwelt in the land of Pieria in the foothills of Mount Olympus.  In later mythology, writers tended to compare the Muses to the sirens but while both were young nymphs famous for their beautiful songs, the Muses sang to enrich men’s souls while the sirens were chthonic and sang to lure them to their deaths.

The Greek historian Herodotus (circa 484-425 BC) wrote his Ἱστορίαι (Historíai; in the West styled variously as The History or The Histories of Herodotus) as an account of the Greco-Persian Wars (499-449 BC).  Although there was once some doubt about its veracity as a historical document (reflecting more the scepticism about medieval editors and translators than the original texts), modern research has concluded the work is one of the most reliable histories from antiquity.  Herodotus was called “The Father of History” by the Roman orator Cicero (106-43 BC) because of the quality of his writing but he's also now acknowledged as the father of historiography's modern structural form.  Written histories had existed prior to Herodotus but he was the first to adopt a recognizably modern thematic form.  At some unknown time, one or more editors reorganized The History into nine chapters, each named after a Muse, one of which was Terpsichore: Book I (Clio), Book II (Euterpe), Book III (Thalia), Book IV (Melpomene), Book V (Terpsichore), Book VI (Erato), Book VII (Polymnia), Book VIII (Urania) & Book IX (Calliope).

The Muse Terpsichore In Ancient & Modern Greece.  Allegory of the Muse Terpsichore playing a harp, from the Florentine School of the eighteenth century, oil on canvas by an unknown artist (left) and Lindsay Lohan dancing The Lilo, Lohan Beach House, Mykonos, Greece, 2018 (right).

Despite the similarity, there’s no verified connection between khorós (dance; chorus) and χώρας (khṓras), an inflection of χώρα (khṓra) (location, place, spot; the proper place; one's place in life; piece of land: tract, land, field; country (as opposed to a city or town), countryside; country, nation).  The origin of khôra is unknown and it may be from a Pre-Greek substrate or other regional language; speculative links have been suggested, including χάος (kháos) (empty space, abyss, chasm) and χατέω (khatéō) (to lack, miss, need, desire), but few etymologists have supported either and the lack of cognates beyond the Greek rendered research a dead end.  Khôra also appears in the ancient name for the land lying between the Tigris and Euphrates rivers north of Babylon (modern-day Iraq): the Greek Mesopotamia khōra (literally “a country between two rivers”), from the feminine of mesopotamos, the construct being mesos (middle), from the primitive Indo-European root medhyo- (middle) + potamos (river).  That use was borrowed directly from the discussions of Antiquity.

However, khôra did attract the interest of those scourges of late twentieth century linguistics: the French deconstructionists.  Their attention seems to have been excited by the concept of khôra in the sense of “the territory of the Ancient Greek polis which lay beyond the city proper”.  The philosophers of Antiquity understood khôra simultaneously as (1) the physical space between city & wilderness, (2) the time it takes to traverse that space and (3) one’s state of mind while in the space.  It was very much a concept of the indeterminate, a triton genos (third kind), being neither civilization nor the state of nature, neither city nor wilderness, and few things so appealed to the deconstructionists as the indeterminate.

Of course, one attraction of deconstruction was that it was in itself a layer of indeterminacy and Jacques Derrida (1930-2004), post-modernism’s most famous explorer of khôra in the context of apophatism or negative theology, was interested not in how the word had been understood in the traditions of metaphysics & metaphorics but in what meaning could be constructed for something which is neither present nor absent, neither passive nor active.  The findings from khôra’s time on post-modernism’s autopsy table did illustrate why deconstruction gained a special role in language because there was surely no other way that Plato’s entirely cosmological concept could become psycho-linguistic and produce, in all seriousness, the idea of khôra as “container of the uncontainable”.  Plato (circa 425-circa 347 BC) had imagined khôra as that space through which something could pass but in which nothing could remain, thus to a French deconstructionist the very essence of tout autre (wholly other), and from there it wasn’t very far to the idea of khôra also as time and space interacting.  At that point, in the tradition of post-modernism, khôra meant whatever the observer decided it meant.



Sunday, February 12, 2023

Oikophobia

Oikophobia (pronounced oick-oh-foh-bee-uh)

(1) In political science, an aversion to or rejection of one’s own culture and traditions; a dislike of one's own compatriots.

(2) In psychiatry, one of a number of phobias related to (1) one’s home (either as a building or as place of abode), (2) returning to one’s home or (3) some or all of the contents of one’s home.

From the Ancient Greek οἶκος (oîkos) (house; household; a basic societal unit in Ancient Greece; a household or family line) + -phobia from phóbos (fear).  The suffix -phobia (fear of a specific thing; hate, dislike, or repression of a specific thing) was from the New Latin, from the Classical Latin, from the Ancient Greek -φοβία (-phobía) and was used to form nouns meaning fear of a specific thing (the idea of a hatred came later).  Oikophobia & oikophobe are nouns, oikophobic is a noun and adjective and oikophobically is an adverb; the noun plural is oikophobes.

Roger Scruton in his study.  Although a staunch conservative tied to earlier traditions, even The Guardian published a deservedly generous obituary.

The political sense of oikophobia (literally the antonym of xenophobia (hatred, fear or strong antipathy towards strangers or foreigners)) dates only from 2004 when it was used by English philosopher Roger Scruton (1944-2020) as part of the culture wars which swirl still around the critiques and defenses of Western civilization, the Enlightenment and the implications of post-modernism.  Scruton’s slim volume England and the Need for Nations (2004, Civitas, 64 pp, ISBN-10 1903386497) argued that empirically, based on the last two hundred-odd years, it was the nation state which best created the conditions necessary for peace, prosperity, and the defense of human rights.  There are obviously not a few examples of nation states which have proven not to be exemplars of the values Scruton espoused but his argument was essentially structural: where there have been attempts to replace the nation-state with some kind of transnational political order, such things have tended to descend into totalitarian dictatorships like the old Soviet Union or evolve into bloated, unaccountable bureaucracies like the post-Maastricht European Union (EU).  It surprised nobody that he enthusiastically supported the UK’s exit (Brexit) from the EU:

“I believe we are on the brink of decisions that could prove disastrous for Europe and for the world and that we have only a few years in which to take stock of our inheritance and to reassume it.  Now more than ever do those lines from Goethe’s Faust ring true for us: "Was du ererbt von deinen Vätern hast, Erwirb es, um es zu besitzen" (What you have inherited from your forefathers, earn it, that you might own it).  We in the nation states of Europe need to earn again the sovereignty that previous generations so laboriously shaped from the inheritance of Christianity, imperial government and Roman law. Earning it, we will own it, and owning it, we will be at peace within our borders.”

Portrait of Theodor Herzl (circa 1890), oil on canvas by Leopold Pilichowski (1869-1933), Ben Uri Gallery and Museum, London.

Scruton of course rejected the notion he was in any way xenophobic but did reference that as oikophobia’s antonym when he described the latter as a “…need to denigrate the customs, culture and institutions that are identifiably ours” and ominously implicit in his critique was the observation it was a cultural malaise which tended to befall civilizations in the days of decline before their fall.  Plenty have documented the mechanisms by which the faith in Western civilization was undermined, their phrases famous landmarks in the development of post-modernism, including “cultural relativism”, “march through the institutions” & “deconstructionism” et al.  However, in a political context the idea of oikophobia wasn’t then entirely new, the idea of the “self-hating Jew” having been documented by the Austro-Hungarian Jewish lawyer Theodor Herzl (1860–1904) in his 1896 book The Jewish State.  Regarded as “the father of modern political Zionism”, Herzl denounced those who opposed his model of a Jewish state in Palestine, calling them “disguised anti-Semites of Jewish origin”.  Essentially, Herzl saw being Jewish as not only compulsory for Jews but defined the only “true” Judaism as his Zionist vision; despite that, among European Jews, especially the educated and assimilated, Zionism was by no means universally supported and both sides weaponized their vocabularies.  In 1930, the German Jewish philosopher Theodor Lessing (1872–1933) published Juedischer Selbsthass (Jewish Self-Hatred) and from then onwards “self-hating Jew” came to be slung at those (often intellectuals) opposed to Zionism.  In 1933, Lessing (who had fled to Czechoslovakia) was murdered at the instigation of the Nazis.  In the post-war years “self-hating Jew” has come to be used by Israeli politicians against any Jew who opposes their policies, often with as little basis as “fascist” came to be deployed in post-Franco Spain.

Before it was picked up in political science and purloined for the culture wars, oikophobia had been a technical term in psychiatry to refer to a patient’s aversion to a home environment, or an abnormal fear (phobia) of being in their own home, the companion terms being (1) ecophobia (fear of a home environment), the construct being eco- (from the French éco-, from the Latin oeco-, from the Ancient Greek οἶκος (oîkos) (house, household)) + -phobia & (2) nostophobia (a fear of, or aversion to, returning to one's home), the construct being the Ancient Greek νόστος (nóstos) (a return home) + -phobia.  It was the idea of the “unwillingness to return home” that was later absorbed by the deconstructionists and other post-modernists in the sense of “an aversion to the past, the antithesis of nostalgia” because in their assault on Western society, it was the political and social relics they attacked, condemning them as symbols (indeed tools) of oppression and mechanisms by which the power elite maintained their hegemony.  Thus, Western legal & theological traditions and the artistic & literary canon were just constructs among many and, because of their oppressive history, needed to be overthrown.

In psychiatry, oikophobia, ecophobia & nostophobia could also be used of patients exhibiting the symptoms of phobia relating to all or some of the contents of a house: electrical appliances, the plumbing, the cupboards, the furniture, the light fittings etc.  So specific were some of these cases (and some were not unjustified, such as a fear of certain allergy-inducing substances or chemicals) that the profession added domatophobia (a specific fear of a house as opposed to its contents), the construct being domato- (from the Ancient Greek δῶμα (dôma) (house, housetop)) + -phobia.  In the years after World War II (1939-1945), the word domatophobia came to be used by journalists to describe what was emerging as a mass phenomenon: women attracted to careers outside the home, this explained by (usually male) journalists as “a fear of or aversion to housework”, presumably their proper role.

Thursday, August 11, 2022

Surreal

Surreal (pronounced suh-ree-uhl (U) or sur-reel (non-U))

(1) Of, relating to, or characteristic of surrealism, an artistic and literary style; surrealistic.

(2) Having the disorienting, hallucinatory quality of a dream; unreal, fantastic & incongruous.

(3) As surrealism, an artistic movement and an aesthetic philosophy that, inter alia, explored the “liberation of the mind” by emphasizing the critical and imaginative powers of the subconscious.

(4) In mathematics as surreal numbers, a collection of numbers which includes both the real numbers and the infinite ordinal numbers, each real number surrounded by surreals, which are closer to it than any real number.

1936: A back formation from surrealism, the construct being sur- + realism, from the French surréalisme, the construct being sur- (beyond) + réalisme (realism).  Sur- (over in the sense of “on top of” & over- in the sense of “excessive; excessively; too much”) was from the Old French sur-, sour-, sor- & soure-, from a syncopation of the Latin super- (above, on top, over; upwards; moreover, in addition, besides), from the Proto-Italic super, from the primitive Indo-European upér (over, above) (and cognate with the Ancient Greek ὑπέρ (hupér) (above) and the Proto-Germanic uber (which in English became “over”)).  The English sur- was from the Middle English sur-, from the Old French sur-, sour-, sor- & soure-, a syncopic form of the Latin super.  Sur- is a doublet of super-, over- and hyper-.  Real was from the Middle English real, from the Old French reel, from the Late Latin reālis (actual), from the Latin rēs (matter, thing), from the primitive Indo-European rehís (wealth, goods).  Surreal is a noun & adjective, surreally is an adverb, surrealism & surreality are nouns and surrealistic is an adjective; the noun plural is surreals.

Lobster Telephone (1936) by Salvador Dali, one of a dozen-odd originals (in colors and shades of cream created by the artist).

In French, the noun surréalisme appeared first in the preface to Guillaume Apollinaire's (1880-1918) play Les Mamelles de Tirésias (1916-1917 and first performed in 1917).  The word was taken up in the 1920s by French intellectuals who created a number of (competing) Manifeste de Surréalisme (Surrealist manifesto) which were documents exploring the nature of human psychology and the way the radical imagination could produce transformative art.  Such was the nature of their texts, inspiration was offered to groups as diverse as landscape painters and anarchists and anyone else attracted to the idea (if not the business) of revolution.  The English form of the word appeared first in 1931, the French spelling having been in use since 1927.  Surrealist as an adjective and noun (from the 1917 French surréaliste) has been in use since 1925 while the adjective surrealistic dates from 1930.

La Trahison des Images (The Treachery of Images) (1929), oil on canvas, by Rene Magritte (1898-1967), Los Angeles County Museum of Art.

The French text Ceci n’est pas une pipe (This is not a pipe) is an act of deconstruction, a statement that a painting is a representation of something, not the object itself.  It’s a statement of the obvious but is both in the artistic tradition of opposition to oppressive rationalism and an influential strand in the history of Surrealism and Pop Art.

Mama, papa is wounded! (1927) by Yves Tanguy (1900-1955), oil on canvas, Museum of Modern Art, New York.

One of the motifs of surrealist painters was a deliberate disconnection between the title of a work and any immediately obvious meaning.  Tanguy’s Mama, papa is wounded! was a painting in one of the recognized surrealist styles: a landscape of wide vista littered with abstract shapes, the title taken from a case-study in a psychiatry textbook.  Beyond mentioning he’d imagined the whole canvas before lifting a brush, Tanguy gave no clue about the meaning but, coming so soon after the Great War, many focused on a link with the many French casualties of the conflict, the depiction of their horrific injuries also part of an artistic movement in the post-war years.

Swans Reflecting Elephants (1937), oil on canvas by Salvador Dali (1904-1989)

Salvador Dali remains the best-known surrealist painter and Swans Reflecting Elephants is an example of his paranoiac-critical method, which attempted to use art to represent how subconscious thought might summon irrational imagery when in a state of psychosis or paranoia.  The work is interesting too in that it’s the most perfect example of a double image, the trees and swans reflected in the mirror-like surface of the lake as elephants.  Dali himself would sometimes discuss the usefulness of the mirror as a device to explore the divergence between conscious reality and the world of the subconscious.

Jean-Martin Charcot, documentary photographs of hysteria patients at La Salpêtrière Asylum 1878, printed in Le Cinquantenaire de L’hystérie (La Révolution Surréaliste (1928)) by André Breton (1896–1966) & Louis Aragon (1897–1982).  Breton & Aragon lamented that hysteria (which they called "the greatest poetic discovery of the late nineteenth century") was being redefined by the new discipline of psychiatry as merely a symptom of mental illness which could be eliminated by suggestion alone.

The link between surrealist art and madness long intrigued the medical community and the interest later extended to the relationship with modernism in general.  Remarkably, it wasn’t until 1980, with the publication of the third edition (DSM-III), that the diagnosis “hysteria” was removed from the Diagnostic and Statistical Manual of Mental Disorders.  Hysteria had for centuries been a kind of omnibus diagnosis, applied to those (almost always women) displaying an extraordinary array of mental and physical symptoms, the gendered hysteria derived from the Ancient Greek word for uterus.  To many Surrealists, hysteria was the state in which a poetic expression of the mind’s wilder impulses could be unleashed, meaning that instead of being silenced, this fundamental condition of being female could usefully be objectified.  History and art met in the decade of the surrealists because the 1930s was a time to be hysterical, less about what was happening than the fear of what was to come.  Yet the reaction to the Exposition Internationale du Surréalisme, an exhibition by surrealist artists held in Paris in January-February 1938, was not despair or shock but indifference, the novelty of the form having passed and the claim that the exhibition needed to be understood as a single installation convincing few.  In the history of the movement, the peak had actually passed and although surrealist works would continue to be produced (and indeed mass-produced as wildly popular prints) in the post-war world, the output was repetitive.  The avant-garde, having plundered from surrealism what could be carried off, explored other directions.

Woman’s Dinner Dress (February 1937) by Elsa Schiaparelli (1890-1973), printed silk organza and synthetic horsehair, Philadelphia Museum of Art.

The fragments however endure.  Elsa Schiaparelli was an Italian fashion designer who took the objects made famous by the Futurists, Dadaists, & Surrealists and integrated them into clothing, her most memorable piece a white evening gown adorned with a large Daliesque lobster.  A design which would now attract little attention, at the time it was a sensation, its audacity a contrast with the solid pastels and other subdued hues with which Coco Chanel (1883-1971) had defined Parisian sophistication.  The playful designs she adopted (a telephone-shaped handbag, buttons in the shape of lollipops, fingernail gloves and hats in shapes borrowed from industry and agriculture) were not always original but she lent them a respectability in the world of high-fashion. 

In the surreal style: Salvador Dali (2021) by Javier Peña and Lindsay Lohan by Mohamad Helmi on Displate.