
Saturday, February 17, 2024

Algorithm

Algorithm (pronounced al-guh-rith-um)

(1) A set of rules for solving a problem in a finite number of steps.

(2) In computing, a finite set of unambiguous instructions performed in a prescribed sequence to achieve a goal, especially a mathematical rule or procedure used to compute a desired result.

(3) In mathematics and formal logic, a recursive procedure whereby an infinite sequence of terms can be generated.

1690s: From the Middle English algorisme & augrym, from the Anglo-Norman algorisme & augrim, from the French algorithme, re-fashioned (under a mistaken connection with the Greek αριθμός (arithmos) (number)) from the Old French algorisme (the Arabic numeral system), from the Medieval Latin algorismus, a (not untypical) mangled transliteration of the Arabic الخَوَارِزْمِيّ (al-Khwārizmiyy), the nisba (the part of an Arabic name consisting of a derivational adjective) of the ninth century Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī, a toponymic name meaning “person from Chorasmia” (a native of Khwarazm, modern Khiva in Uzbekistan).  It was al-Khwārizmī's works which introduced to the West some sophisticated mathematics (including algebra).  The earlier form in Middle English was the thirteenth century algorism, from the Old French; in English it was first used in about 1230 and then by the English poet Geoffrey Chaucer (circa 1344-1400) in 1391.  English adopted the French term, but it wasn't until the late nineteenth century that algorithm began to assume its modern sense.  Before that, by 1799, the adjective algorithmic (the construct being algorithm + -ic) was in use and the first use in reference to symbolic rules or language dates from 1881.  The suffix -ic was from the Middle English -ik, from the Old French -ique, from the Latin -icus, from the primitive Indo-European -kos & -os, formed with the i-stem suffix -i- and the adjectival suffix -kos & -os.  The form existed also in the Ancient Greek as -ικός (-ikós), in Sanskrit as -इक (-ika) and in the Old Church Slavonic as -ъкъ (-ŭkŭ); a doublet of -y.  In European languages, adding -kos to noun stems carried the meaning "characteristic of, like, typical, pertaining to" while on adjectival stems it acted emphatically; in English it has always been used to form adjectives from nouns with the meaning “of or pertaining to”.  
A precise technical use exists in physical chemistry where it's used to denote certain chemical compounds in which a specified chemical element has a higher oxidation number than in the equivalent compound whose name ends in the suffix -ous (eg sulphuric acid (H₂SO₄) has more oxygen atoms per molecule than sulphurous acid (H₂SO₃)).  The noun algorism, from the Old French algorisme, was an early alternative form of algorithm; algorismic was a related form.  The meaning broadened to any method of computation and from the mid twentieth century became especially associated with computer programming, to the point where, in general use, this link is often thought exclusive.  The spelling algorism has been obsolete since the 1920s.  Algorithm, algorithmist, algorithmizability, algorithmocracy, algorithmization & algorithmics are nouns, algorithmize is a verb, algorithmic & algorithmizable are adjectives and algorithmically is an adverb; the noun plural is algorithms.

Babylonian and later algorithms

An early Babylonian algorithm in clay.

Although there is evidence multiplication algorithms existed in Egypt (circa 1700-2000 BC), a handful of Babylonian clay tablets dating from circa 1800-1600 BC are the oldest yet found and thus record the world's first known algorithms.  The calculations described on the tablets are not solutions to specific individual problems but a collection of general procedures for solving whole classes of problems.  Translators consider them best understood as an early form of instruction manual.  When translated, one tablet was found to include the still familiar “This is the procedure”, a phrase which is the essence of every algorithm.  There must have been many such tablets but there's a low survival rate for stuff from 40 centuries ago not regarded as valuable.

So associated with computer code has the word "algorithm" become that it's likely a goodly number of those hearing it assume this was its origin and that every instance of its use relates to software.  The use in this context, while frequent, is not exclusive but the general perception might be it's just that.  It remains technically correct that almost any set of procedural instructions can be dubbed an algorithm but, given the pattern of use from the mid-twentieth century, to do so would likely mislead or confuse many who might assume they were being asked to write the source code for software.  Of course, the sudden arrival of mass-market generative AI (artificial intelligence) has meant anyone can, in conversational (though hopefully unambiguous) text, ask their tame AI bot to produce an algorithm in the syntax of the desired coding language.  That is passing an algorithm (using the structures of one language) to a machine which interprets the text and converts it to language in another structure, something programmers have for decades been doing for their clients.

A much-distributed general purpose algorithm (really more of a flow-chart) which seems so universal it can be used by mechanics, programmers, lawyers, physicians, plumbers, carpet layers, concreting contractors and just about anyone whose profession is object or task-oriented.   

The AI bots have proved especially adept at such tasks.  While a question such as: "What were the immediate implications for Spain of the formation of the Holy Alliance?" produces varied results from generative AI which seem to range from the workmanlike to the inventive, when asked to produce computer code the results seem usually to be in accord with a literal interpretation of the request.  That shouldn't be unexpected; a discussion of early nineteenth century politics in the Iberian Peninsula is by its nature going to be discursive while the response to a request for code to locate instances of split infinitives in a text file is likely to vary little between AI models.  Computer languages of course impose a structure where syntax must conform exactly to defined parameters (even the most basic of the breed, such as that PC/MS-DOS used for batch files, was intolerant of a single missing or mis-placed character) whereas something like the instructions to make a cup of tea (which is an algorithm even if not commonly thought of as one) can vary greatly in form even though the steps and end results can be the same.
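The split-infinitive request mentioned above can be sketched in a few lines of Python; this is a naive illustration (the pattern name and function are invented for the example), since a regex can only guess that a word ending in "-ly" is an adverb and genuine detection would need part-of-speech tagging:

```python
import re

# Naive split-infinitive detector: flags "to <word ending in -ly> <word>",
# e.g. "to boldly go".  A toy illustration only; "to fly home" would be a
# false positive, which is why real tools use grammatical analysis instead.
SPLIT_INFINITIVE = re.compile(r"\bto\s+(\w+ly)\s+(\w+)", re.IGNORECASE)

def find_split_infinitives(text):
    """Return a list of (suspected adverb, verb) pairs found in text."""
    return SPLIT_INFINITIVE.findall(text)
```

A call such as `find_split_infinitives("to boldly go where no man has gone before")` returns `[("boldly", "go")]`.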

An example of a "how to make a cup of tea" algorithm.  This is written for a human and thus contains many assumptions of knowledge; one written for a humanoid robot would be much longer and include steps such as "turn cold tap clockwise" and "open refrigerator door".
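The tea example can also be rendered as executable code, which makes the definitional point: explicit, unambiguous steps which halt.  A minimal sketch (the step wording is illustrative, not canonical):

```python
# A "make a cup of tea" algorithm written for a machine rather than a human:
# every step is explicit and the procedure terminates, which is what makes
# it an algorithm in the strict sense.
def make_tea(milk=True, sugars=0):
    """Return the ordered steps for one cup of tea."""
    steps = [
        "fill kettle with cold water",
        "switch kettle on",
        "wait until water boils",
        "place tea bag in cup",
        "pour boiling water into cup",
        "steep for three minutes",
        "remove tea bag",
    ]
    if milk:
        steps.append("add milk")
    for _ in range(sugars):
        steps.append("add one teaspoon of sugar")
    steps.append("stir")
    return steps
```

Written for a robot rather than a human reader, each string would itself decompose into finer steps ("turn cold tap clockwise" and so on), but the beginning-middle-end structure would be unchanged.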

The so-called “rise of the algorithm” is something that has attracted much comment since social media gained critical mass; prior to that algorithms had been used increasingly in all sorts of places but it was the particular intimacy social media engenders which meant awareness increased and perceptions changed.  The new popularity of the word encouraged the coining of derived forms, some of which were originally (at least to some degree) humorous but beneath the jocularity, many discovered the odd truth.  An algorithmocracy describes a “rule by algorithms”, a critique in political science which discusses the implications of political decisions being made by algorithms, something which in theory would make representative and responsible government not so much obsolete as unnecessary.  Elements of this have been identified in the machinery of government such as the “Robodebt” scandal in Australia in which one or more algorithms were used to raise and pursue what were alleged to be debts incurred by recipients of government transfer payments.  Despite those in charge of the scheme and relevant cabinet ministers being informed the algorithm was flawed and there had been suicides among those wrongly accused, the politicians did nothing to intervene until forced by various legal actions.  While defending Robodebt, the politicians found it very handy essentially to disavow connection with the processes which were attributed to the algorithm.

The feeds generated by Instagram, Facebook, X (formerly known as Twitter) and such are also sometimes described as algorithmocracies in that it’s the algorithm which determines what content is directed to which user.  Activists have raised concerns about the way the social media algorithms operate, creating “feedback loops” whereby feeds become increasingly narrow and one-sided in focus, acting only to reinforce opinions rather than inform.  In fairness, that wasn’t the purpose of the design, which was simply to keep the user engaged, thereby allowing the platform to harvest more of the product (the user’s attention) they sell to their consumers (the advertisers).  Everything else is an unintended consequence and an industry joke was that the word “algorithm” was used by tech company CEOs when they didn’t wish to admit the truth.  A general awareness of that now exists but the filter bubbles won’t be going away; what the phenomenon did produce were the words algorithmophobe (someone unhappy or resentful about the impact of algorithms in their life) and algorithmophile (which technically should mean “a devotee or admirer of algorithms” but is usually applied in the sense of “someone indifferent to or uninterested in the operations of algorithms”), the latter represented by the great mass of consumers digitally bludgeoned into a state of acquiescent insensibility.

Some of the products are fighting back: The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now (2024) by Hilke Schellmann, pp 336, Hachette Books (ISBN-13: 978-1805260981).

Among nerds, there are also fine distinctions.  There is the subalgorithm (sub-algorithm seems not a thing), a (potentially stand-alone) algorithm within a larger one, a concept familiar in many programming languages as a “sub-routine” although distinct from a remote procedure call (RPC), which is a subroutine executed in a different address space.  The polyalgorithm (again, hyphens just not cool) is a set of two or more algorithms (or subalgorithms), together with instructions for choosing between them, the whole in some way integrated.  A very nerdy dispute does exist within mathematics and computer science around whether an algorithm, at the definitional level, really does need to be restricted to a finite number of steps.  The argument can eventually extend to the very possibility of infinity (or types of infinity, according to some) so it really is the preserve of nerds.  In real-world application, a program is an algorithm only if (even eventually) it stops; it need not have a middle but must have a beginning and an end.
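The subalgorithm idea is familiar from everyday code: merge sort is an algorithm whose core is a stand-alone subalgorithm (the merge step).  A minimal sketch in Python:

```python
def merge(left, right):
    """Subalgorithm: merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(xs):
    """The enclosing algorithm: splits the input, recurses, then calls
    the merge subalgorithm to combine the halves."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))
```

A polyalgorithm would add a selection rule, such as "switch to insertion sort when the sub-list falls below some size", integrating two algorithms behind one interface.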

There is also the mysterious pseudoalgorithm, something less suspicious than it may first appear.  Pseudoalgorithms exist usually for didactic purposes and will usually interpolate (sometimes large) fragments of a real algorithm but it may be in a syntax not specific to a particular (or any) programming language, the purpose being illustrative and explanatory.  Intended to be read by humans rather than a machine, all a pseudoalgorithm has to achieve is clarity in imparting information, the algorithmic component there only to illustrate something conceptual rather than be literally executable.  The pseudoalgorithm model is common in universities and textbooks and can be simplified because millions of years of evolution mean humans can do their own error correction on the fly.

Of the algorithmic

The Netflix algorithm in action: Lindsay Lohan (with body-double) during filming of Irish Wish (2024).  The car is a Triumph TR4 (1961-1967), one of the early versions with a live rear axle, a detail probably of no significance in the plot-line.

The adjective algorithmic has also emerged as an encapsulated criticism, applied to everything from restaurant menus and coffee shop décor to choices of typeface and background music.  An entire ecosystem (Instagram, TikTok etc) has been suggested as the reason for this multi-culture standardization in which a certain “look, sound or feel” becomes “commoditised by acclamation” as the “standard model” of whatever is being discussed.  That critique has by some been dismissed as something reflective of the exclusivity of the pattern of consumption by those who form theories about what seem not very important matters; it’s just they only go to the best coffee shops in the nicest parts of town.  In popular culture though the effect of the algorithmic is widespread, entrenched and well-understood and already the AI bots are using algorithms to write music which will be popular, needing (for now) only human performers.  Some algorithms have become well-known such as the “Netflix algorithm” which presumably doesn’t exist as a conventional algorithm might but is understood as the sets of conventions, plotlines, casts and themes which producers know will have the greatest appeal to the platform.  The idea is nothing new; for decades hopeful authors who sent manuscripts to Mills & Boon would receive one of the more gentle rejection slips, telling them their work was very good but “not a Mills & Boon book”.  To help, the letter would include a brochure which was essentially a “how to write a Mills & Boon book” guide and it included a summary of the acceptable plot lines of which there were at one point reputedly some two dozen.  The “Netflix algorithm” was referenced when Falling for Christmas, the first fruits of Lindsay Lohan’s three film deal with the platform, was released in 2022.  
It was an example of a blending of several genres (redemption, Christmas movie, happy ending etc) and the upcoming second film (Irish Wish) is of the “…always a bridesmaid, never a bride — unless, of course, your best friend gets engaged to the love of your life, you make a spontaneous wish for true love, and then magically wake up as the bride-to-be.” school; plenty of familiar elements there so it’ll be interesting to see if the algorithm was well-tuned.

Math of the elliptic curve: the Cox–Zucker machine can help.

Some algorithms have become famous and others can be said even to have attained a degree of infamy, notably those used by the search engines, social media platforms and such, the Google and TikTok algorithms much debated by those concerned by their consequences.  There is though an algorithm remembered as a footnote in the history of linguistic oddities and that is the Cox–Zucker machine, published in 1979 by Dr David Cox (b 1948) and Dr Steven Zucker (1949–2019).  The Cox–Zucker machine (which may be called the CZM in polite company) is used in arithmetic geometry and provides a solution to one of the many arcane questions which only those in the field understand but the title of the paper in which it first appeared (Intersection numbers of sections of elliptic surfaces) gives something of a hint.  Apparently it wasn’t formally dubbed the Cox–Zucker machine until 1984 but, impressed by the phonetic possibilities, the pair had been planning a joint publication as long ago as 1970 and undergraduate humor can’t be blamed because they met as graduate students at Princeton University.  The convention in academic publishing is for authors’ surnames to appear in alphabetical order and the temptation proved irresistible.

Friday, September 10, 2021

Random

Random (pronounced ran-duhm)

(1) Proceeding, made, or occurring without definite aim, reason, or pattern; lacking any definite plan or prearranged order; haphazard.

(2) In statistics, of or characterizing a process of selection in which each item of a set has an equal probability of being chosen (the random sample); having a value which cannot be determined but only described probabilistically.

(3) Of materials used in building and related constructions, lacking uniformity in size or shape.

(4) Of ashlar (stonework), laid without continuous courses and applied without regularity.

(5) In slang (also clipped to “rando” and some on-line sources insist “randy” is also used), something or someone unknown, unidentified, unexpected or out of place; anything odd or unpredictable (not necessarily a pejorative term and used as both noun & adjective).

(6) In slang, someone unimportant; a person of no consequence (always a pejorative).

(7) In printing, the sloping work surface at the top of a compositor's workbench on which type is composed (also called a bank and use now almost exclusive to the UK).

(8) In mining, the direction of a rake-vein.

(9) Speed, full speed; impetuosity, force (obsolete).

(10) In ballistics, the full range of a bullet or other projectile and thus the angle at which a weapon is tilted to gain maximum range (obsolete).

(11) In computing (as pseudorandom), mimicking the result of random selection.

1650s: From the earlier randon, from the Middle English randoun & raundon, from the Old French randon, a derivative of randir (to run; to gallop), of Germanic origin and related to the Old High German rinnan (to run), from which Modern French gained randonnée (long walk, hike); from either the Frankish rant (a running) & randiju (a run, race) or the Old Norse rend (a run, race), both from the Proto-Germanic randijō, from rinnaną (run), from the primitive Indo-European r̥-nw- (to flow, move, run).  It was cognate with the Middle Low German uprinden (to jump up) and the Danish rende (to run).  The development of the adjective to mean “having no definite aim or purpose, haphazard, not sent in a special direction” evolved in the 1650s from the mid-sixteenth century phrase “at random” (at great speed) which picked up the fourteenth century sense from the Middle English noun randon & randoun (impetuosity; speed).  In English, the meaning closely mirrored that in the Old French randon (rush, disorder, force, impetuosity), gained from Frankish or other Germanic sources.  The spelling shift in Modern English from -n to –m was not unusual (seldom, ransom etc).  Random is a noun & adjective, randomness & randomosity are nouns, randomize is a verb and randomly is an adverb; the noun plural is randoms.

A “random person” is one variously unknown, unidentified, unexpected or out of place.

In general use, the meanings related to speed (full speed; force, trajectory of delivery etc) faded from use between the fourteenth & seventeenth centuries but persisted in the field of ballistics where “random” described the limit of the range of a bullet or other projectile (thus the angle at which a weapon was tilted to gain the maximum range).  Even that was largely obsolete by the early twentieth century but the idea of the angle being “a random” persists still in pockets in the UK to describe a sloping work surface on which printers compose pages (although few now use physical metal type).  The now familiar twenty-first century slang use can be either pejorative (someone unimportant; a person of no consequence) or neutral tending to the amused (something or someone unknown, unidentified, unexpected or out of place; anything odd or unpredictable).  The modern adoption appears to have its origin in 1980s US college student slang when “a person who does not belong on our dormitory floor” was so described; from this the hint of “inferior, undesirable” was perhaps inevitable.  “Rando” seems to be the standard abbreviation but some on-line sources also list “randy” which would seem to risk confusion or worse.

School lunch social engineering: Some sources recommend parents cut their children’s sandwiches in random ways.  The theory is it helps train their minds to accept change and helps them learn to adapt.

In computing, random access memory (RAM) has since the 1980s been familiar as one of a handful of the critical specifications of a computer (CPU, RAM, drive space) and the origin of the term dates from IBM’s labs in the early 1950s when it was used to describe a new form of memory which could be read non-sequentially.  The modern RAM used by personal computers, servers, smart phones etc is an evolution from the original memory model; in the world of the early mainframes there was simply storage, which could fulfil the functions now performed by both RAM and media like hard disks & solid state drives.  RAM is now a well-known commodity but the companion ROM (Read-Only Memory) is understood only by nerds and only an obsessional few of them give it much thought.  RAM is volatile in that its contents are inherently temporary, lost when the device is powered-down or re-started; it can thus loosely be thought of as storing data as transient electrical charge.  That characteristic means it’s fast, affording the most rapid access by the CPU (Central Processing Unit), so is used to hold whatever data is at the time most in demand and that can be parts of the operating system, applications or documents.  ROM is non-volatile and whatever is written to ROM remains even if a device is switched-off; it’s thus used for essential information like firmware and hardware information.

In mathematics and statistics, random does have precise definitions but in general use it’s used also as a vague synonym for “typical or average”.  To a statistician, the word implies “having unpredictable outcomes to the extent all outcomes are equally probable and if any statistical correlation is found to exist it will be wholly coincidental”.  Thus, although all dictionaries list the comparative as more random and the superlative as most random, a statistician will insist these are as absurd as “very unique” although even among mathematicians phrases like “increasingly random” or “tending to randomness” are probably not unknown.  For others, the forms are useful and the colloquial use to mean “apropos of nothing; lacking context; unexpected; having apparent lack of plan, cause or reason” is widely applied to events, even those which to a specialist may not be at all random and may even be predictable.  For most of us, any sub-set of numbers which appears to have no pattern will appear random but mathematicians need to be more precise.  In the strict, technical sense, a true random number set exists only when two conditions are satisfied: (1) the values are uniformly distributed over a defined interval or set and (2) it is impossible to predict future values based on past or present ones.  In the pre-computer age, creating random number lists was challenging and subsequent analysis has found some of the sets created by manual or mechanical means were not truly random although those which were sufficiently large probably were functional for the purposes to which they were put.
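Condition (1) can be checked empirically with nothing more than a frequency count; a sketch using Python's standard library (this is a plausibility check for the example, not a formal statistical test such as chi-squared, and the function name is invented for the illustration):

```python
import random
from collections import Counter

def worst_uniformity_deviation(draws=60_000, faces=6, seed=42):
    """Count how often each value 0..faces-1 appears over many draws;
    a uniform source should hit each face about draws/faces times.
    Returns the worst relative deviation from that expectation."""
    rng = random.Random(seed)  # seeded so the demonstration is repeatable
    counts = Counter(rng.randrange(faces) for _ in range(draws))
    expected = draws / faces
    return max(abs(counts[f] - expected) / expected for f in range(faces))
```

With 60,000 draws the deviation for a healthy generator is typically under one or two percent; condition (2), unpredictability, cannot be checked this way at all, which is precisely what separates pseudorandomness from true randomness.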

“Random news” is something strange, unexpected and often amusing.    

Now, random number generators (RNG) are used; they can exist either in hardware or software and there are two types: (1) pseudorandom number generators (PRNG) and (2) true random number generators (TRNG).  A software algorithm, a PRNG emulates a TRNG by mimicking the selection of a value to approximate true randomness, the limitation being that the algorithm is based on a distribution (the origin of the term pseudorandom) which can only produce something ultimately deterministic and predictable (although determining the pattern can demand much computational power).  A PRNG relies on a seed number and, if that can be isolated, other numbers can be predicted although, if the subset is large, what PRNGs generate is for many purposes functional.  TRNGs don’t use an algorithm (although their processes can be represented by one) but are instead based on an unpredictable physical variable such as the radioactive decay of isotopes, airwave static or the behaviour of subatomic particles, the latter now favoured for their utterly unpredictable movements, now called “pure randomness”.  So random is the behaviour of subatomic particles that their observation appears to be immune to the measurement biases which can (at least in theory) afflict other methods.
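The deterministic character of a PRNG is easy to demonstrate: feed it the same seed and it yields the same sequence every time.  A sketch using Python's standard library (the random module is a PRNG, a Mersenne Twister; the secrets module draws instead from the operating system's entropy pool, the nearest a stock machine gets to a TRNG-style source):

```python
import random
import secrets

def prng_sequence(seed, n=5):
    """Pseudorandom: the output is fully determined by the seed."""
    rng = random.Random(seed)
    return [rng.randrange(100) for _ in range(n)]

# Identical seeds reproduce the sequence exactly - the "pseudo" in PRNG.
assert prng_sequence(1234) == prng_sequence(1234)

# By contrast, a value drawn from the OS entropy pool; there is no seed
# to isolate, so past values give no purchase on future ones.
unpredictable = secrets.randbelow(100)
```

The reproducibility is a feature as much as a flaw: simulations and tests often need the same "random" run twice, which is exactly what a TRNG cannot provide.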

Random numbers are important in a number of fields including (1) statistical sampling and experimentation where it’s essential to select a random sample to ensure that the results are representative of the entire population, (2) cryptography where random numbers are used to generate the encryption keys which ensure the security of data and communications, (3) simulation and modelling where there’s a need to replicate real-world scenarios, (4) gaming & gambling where the need exists to create unpredictable outcomes and (5) randomized controlled trials (RCT), notably in medical and scientific research where true randomness is needed to assist in the assessment of the effectiveness of treatments, interventions, or policies.
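For use (2), cryptographic keys, Python's standard library provides the secrets module, which draws from the strongest randomness source the operating system offers.  A minimal sketch (the function names are invented for the illustration):

```python
import secrets

def make_key(n_bytes=32):
    """Generate n_bytes of cryptographically strong random material,
    e.g. 32 bytes for a 256-bit symmetric cipher key."""
    return secrets.token_bytes(n_bytes)

def make_session_token():
    """A URL-safe random token of the kind used for session identifiers
    or password-reset links."""
    return secrets.token_urlsafe(16)
```

Unlike the random module, secrets cannot be seeded, which is the point: reproducibility is a virtue in simulation (use 3) and a fatal defect in cryptography (use 2).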

Sunday, April 25, 2021

Ziggurat

Ziggurat (pronounced zik-kur-at, zik-u-rat or zig-oo-rat)

(1) In the architecture of the ancient Babylonians and Assyrians, a temple of Sumerian origin in the form of a pyramidal tower, consisting of a number of stories and having about the outside a broad ascent winding round the structure, presenting the appearance of a series of terraces.

(2) In architecture, any structure similar in appearance.

(3) In statistics and mathematical modeling, as ziggurat algorithm, an algorithm for pseudorandom number sampling, relying on an underlying source of uniformly-distributed random numbers as well as computed tables.

1875–1880: Variously cited as from the Akkadian word ziqquratu, from the Assyrian ziqqurati (summit, height) or from an extinct Semitic language, derived from a verb meaning "to build on a flat space."  The various spellings were zikkurrat, ziqqurrat, ziqqurat (rare) and ziggurat.  Ziggurat is a noun and zigguratic & zigguratical are adjectives; the noun plural is ziggurate or ziggurats.

The Chogha Zanbil ziggurat was built circa 1250 BC by Untash-Napirisha, King of Elam, probably to honour the Elamite god Inshushinak.  Destroyed in 640 BC by Ashurbanipal, King of Assyria, part of it was excavated between 1951-1961 by Roman Ghirshman (1895-1979), a Ukrainian-born French archeologist who specialized in ancient Persia.  It was the first Iranian site to be added to UNESCO’s World Heritage List.

Ziggurats were massive structures with particular architectural characteristics.  They served as part of a temple complex in the various local religions of Mesopotamia and the flat highlands of what is now western Iran.  Sumer, Babylonia, and Assyria were home to about twenty-five ziggurats.  The shape of a ziggurat makes it clearly identifiable.  It has a platform base which is close to square with sides that recede inward as the structure rises and a flat top presumed to have supported some form of a shrine.  Sun-baked bricks form the core of a ziggurat, with fire-baked bricks used for the outer faces and, unlike the Egyptian pyramids, a ziggurat was a solid structure with no internal chambers; an external staircase or spiral ramp provided access to the top platform.  The handful of ziggurats still visible are ruins, but, based on the dimensions of their bases, it’s estimated they may have been as much as 150 feet (46m) high.  It’s possible the terraced sides were planted with shrubs and flowering plants, and some scholars have suggested the legendary Hanging Gardens of Babylon (one of the Seven Wonders of the Ancient World) was a ziggurat.  Ziggurats were some of the oldest structures of ancient religions, the first examples dating from circa 2200 BC and the last circa 500 BC; only a few of the Egyptian pyramids predate the oldest ziggurats.  The Tower of Babel is thought to have been a ziggurat.

Depiction of Lindsay Lohan in ziggurat dress, part of the Autumn-Winter 1994-1995 "Staircase Pleats" collection by Japanese designer Issey Miyake (1938-2022).  Miyake San was noted for his technology-focused clothing designs.

Saturday, November 6, 2021

Granular

Granular (pronounced gran-yuh-ler)

(1) Of the nature of granules; grainy.

(2) Composed of or bearing granules or grains.

(3) Showing a granulated structure.

(4) In computing, an object existing as a singular form at the level of the file system but which exists at the application level in multiple parts.

(5) Relating to or containing particles having a strong affinity for nuclear stains, as in certain bacteria.

1762 (although use not widespread until 1794): From the Late Latin granulum (granule, a little grain), diminutive of the Latin granum (grain, seed), from the primitive Indo-European gre-no- (grain) + -ar (from the Latin -āris (of, near, pertaining to), the suffix appended to various words, often nouns, to make the adjectival form; added most often, but not exclusively, to words of Latin origin).  The word seems rather suddenly to have replaced the late fourteenth century granulous.  Granularity, granule & granulation are nouns, granulate is a verb & adjective and granular & granulatory are adjectives.

Terminology describing degrees of granularity

As granular has become a more widely used word, fastidious types have noted the increasing frequency of things being described as "more granular" or "less granular" and this elicits disapproval because it’s imprecise.  Something granular is composed of (usually small) discrete entities as opposed to being continuous and that’s a binary distinction, not a matter of degree, so it’s inherently unclear whether "more granular" and "less granular" indicate finer or coarser granularity.  For clarity, one should speak only of finer or coarser granularity.

Lindsay Lohan represented in granular art, an artificial intelligence (AI) generated artwork created by Wout from AI Fountain as part of the Curated Community Art initiative (CCAI) and finished in Adobe Photoshop.  Each digital artwork created by this algorithm is unique and made from a set of parameters; process and output are thus both inherently granular.

In computing, the concept of granularity exists in many forks and layers.  Users deal frequently with granular data, most typically when handling what appears to exist in many parts but which is, to the system (at one layer at least), a single object.  For system administrators, it’s an especially handy attribute when it’s necessary to recover one small piece of data which has been copied or backed-up as something really huge, and there are big-machine operators which now routinely handle data sets of a size which only a few years ago was unimaginably large.  For them, the ability to look at the whole and be able to extract pieces, drilling down if need be to individual bytes, makes easily possible what would otherwise require much time and hardware; hence the metaphor of granularity, a mechanism to find a particular grain in a silo of many trillions.

That’s useful but really is just brute force, the massive up-scaling of something which has existed since the earliest forms of digital storage.  More intriguing is the recent emergence of granular computing (GrC), a fork in information processing, the focus of which is information granules, entities created from the processes of data abstraction and derivations from data.  The source and structure of this data is not the imperative; what matters are the relationships (of which there may be many) which can, for example, simultaneously be both the extent of difference and a dependence on indistinguishability.  GrC, as it now exists, is more of a conceptual direction than a coherent process or even a theoretical perspective.  Its most promising implication is perhaps the granules which might form as relationships between previously disparate data sets are explored.  This may allow previously unrealized correlates to be identified, perhaps enabling humanity to mine the accumulated data sets for what Donald Rumsfeld (1932–2021; US Secretary of Defense 1975-1977 & 2001-2006) called the unknown knowns.  Rumsfeld may have been evil but his mind could sparkle and many unknown knowns may await.  

Sunday, July 26, 2020

Trope

Trope (pronounced trohp)

(1) In art and literature, any literary or rhetorical device, as metaphor, metonymy, synecdoche, and irony, that consists in the use of words in other than their literal sense and which tends to become a motif.

(2) In rhetoric, a figure of speech in which words or phrases are used with a non-literal or figurative meaning, such as a metaphor.

(3) In geometry, a tangent space meeting a quartic surface in a conic or the reciprocal of a node on a surface (archaic).

(4) In music, a short cadence at the end of the melody in some early music; a pair of complementary hexachords in twelve-tone technique.

(5) In the rituals of Judaism, a chanting (cantillation) pattern, or one of the marks that represents it.

(6) In medieval Christianity (and preserved in the rituals of certain factions in Roman Catholicism), either a phrase or verse added to the Mass when sung by a choir or a phrase, sentence, or verse formerly interpolated in a liturgical text to amplify or embellish.

(7) In Athenian philosophy, any of the ten arguments used in scepticism to refute dogmatism.

(8) In Santayanian philosophy, the principle of organization according to which matter moves to form an object during the various stages of its existence.

(9) In metaphysics, a particular instance of a property, as contrasted with a universal.

1525–1535: From the Latin tropus (a figure of speech (in rhetoric)) from the Ancient Greek τρόπος (trópos) (a turn, direction, course, way; manner, fashion; a mode in music; a mode or mood in logic; in rhetoric, “a turn or figure of speech”) and related to τροπή (tropḗ) (solstice; trope; turn) and τρέπειν (trépein) (to turn).  Root was the primitive Indo-European trep- (to turn), related also to the Sanskrit trapate (is ashamed, confused, literally "turns away in shame"), from which Latin picked up trepit (he turns), the Latin adoption being figurative.  The meaning is now understood as something more diffuse but technically, in rhetoric, a trope was "a figure of speech in which a word or phrase is used in a sense other than the usual definition".  In English, the word is found often in combined form (such as heliotrope) and occurs also in concrete nouns that correspond to abstract nouns ending in -tropy or -tropism.  Trope is a noun & verb, troper, tropist, tropology & tropism are nouns and tropey is an adjective; the noun plural is tropes.

When younger, Lindsay Lohan's signature trope was playing dual roles (The Parent Trap (1998), Freaky Friday (2003) and I Know Who Killed Me (2007)).  During her “troubled starlet” phase, she became emblematic of the “downward spiral” trope.  In 2022, she appeared in Falling for Christmas, Netflix's latest take on the "Christmas movie trope".  Although the scripts for tropes have long followed an algorithm, the studios are said now to be using a predictive form of artificial intelligence (AI) to hone the generation of whatever should have the most audience appeal.  The screen-writers (most of whom drive cars and use other products manufactured using processes in which machines substantially have displaced the human labor content) are unlikely ultimately to succeed in keeping AI out of their profession and, in the medium term, their future may lie in the creation of the quirky and bizarre but, in the economy, that's a niche.  For the formulaic stuff (most commercial cinema), the studios are likely to find the AI path "better, cheaper, faster" and the history of US industrial relations suggests these imperatives will prove irresistible.

The Stage Five Clinger Trope

Most sources cite the origin of the Stage 5 Clinger trope as the movie Wedding Crashers (2005) although there are claims it merely popularized the use; without earlier citations however, the trope’s origin appears to be the movie.  As a technical point, a stage one clinger isn’t initially labelled as such, the term applied retrospectively after the syndrome is diagnosed.  If men are smart or lucky, they’ll recognize this by stage two but some men are so stupid they don’t realize until stage four.  While in the movie there was no discussion of stages other than “5”, by implication five was the most extreme and memes soon fleshed out 1-4:

Stage 1 Clinger: She seems fine

First date goes well, she’s attentive, interested, even gets the drinks sometimes and she makes breakfast.  Afterwards, text messages are fun and flirtatious.

Stage 2 Clinger: Hunter and game

The text messages become frequent, the first hint of the lure / engage / trap strategy of the lone hunter.  SMSs start out OK which lulls you into a false sense of security.  Before long, a few messages have been exchanged, most of which have required you to agree with her about innocuous stuff like the weather or today’s traffic.  Then, she’ll suggest a second date and extract a commitment to a specific time/date/place.  That will be soon.       

Stage 3 Clinger: Manoeuvres

Second date not something you’ll wish to repeat.  Bit creepy, how much she knew about you, clearly adept at mining the web.  To escape, you agree to third date while finding pretext to avoid confirming time.  Within hours, text messages become frequent to the point of nuisance.  Check Facebook and you’ll see she’s friended everyone you know.  Ignore SMS and eventually it goes quiet… for about an hour.  Then she phones.  Third date will not be possible to avoid, the illusion you’ll use it to end things still something you convince yourself to believe.  The S3C stage can frequently be the point of no return.  Acquaint yourself with the tale of Julius Caesar (100-44 BC; Roman general and dictator of Rome 49-44 BC) crossing the Rubicon and ponder.      

Stage 4 Clinger: The circling vulture

By stage four, clinging has blurred effortlessly into stalking; an S4C is likely to send your mother flowers on her birthday and any attempt at avoidance will prompt texting and calling from other phones.  Those who drive are even more of a threat because, where you go, she can follow, so you’ll run into her in the most improbable places, and usually she’ll suggest taking advantage of the coincidence by going to lunch, dinner or whatever else might be close.  No matter how studiously you watch the rear-vision mirror, she’ll hunt you down and find you.

"I've got a stage five clinger", Wedding Crashers (2005).  

Stage 5 Clinger: Thrill of the kill

At this point, her life is scheduled around your own, even to the point where she may now work in the same building, expects to have lunch together every day and a drink after work whenever possible.  When you try to avoid these, emotional meltdowns ensue, the only way to avoid a scene being to agree.  Many of your friends start asking you out as a couple and tell you you’re lucky because she’s wonderful.  She’s been to their dinner parties where she talks about your plans together.  A stage five clinger can also be an APC ("actual psycho-chick"; the two are not synonymous but there’s frequent overlap).  Pursuing another relationship in an attempt to dissuade her brings its own problems: the S5C-APC will spray-paint CHEATER on either her car or yours (in red; unless the car is red, in which case she’ll use black).  At this point, faking your own death begins to look like a good tactic.

Cling on and no matter what, never let go: Crooked Hillary (b 1947) and Bill Clinton (b 1946) in the rain at the formal dedication of the William J Clinton Presidential Center, Little Rock, Arkansas, November 2004.  

The significance of dividing the path of the clinger into stages is that it’s vital to extricate yourself from her clutches during the earliest stage possible; it needs to be remembered that progression can be rapid, some clingers so adept at the art they're able to skip one or even two stages.  The longer the excision is delayed, the harder it becomes and, if things are allowed to reach the later stages, you may be stuck with her forever; for that, you can’t blame her: you're trapped and you have only yourself to blame.

Friday, September 23, 2022

Emoji

Emoji (pronounced ih-moh-jee)

In digital technology, a small digital picture or pictorial symbol that represents a thing, feeling, concept etc, used in text messages and other electronic communications, now usually as part of a standardized set.  Technically an emoji is a digital graphic icon with a unique code point.
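The “unique code point” mentioned above can be made concrete in a few lines of Python, a sketch using only the built-in ord() and chr() functions; the particular emojis chosen here are arbitrary examples, not drawn from the post:

```python
# Every emoji is, at bottom, one or more Unicode code points.
emoji = "\U0001F600"             # 😀 GRINNING FACE

print(f"U+{ord(emoji):04X}")     # U+1F600 (the integer code point, in hex)

# The mapping is reversible: the code point alone recovers the glyph.
assert chr(0x1F600) == emoji

# Some emojis are sequences of several code points (flags, skin tones);
# len() counts code points, not "pictures".
flag = "\U0001F1EF\U0001F1F5"    # 🇯🇵 two regional-indicator characters
print(len(flag))                 # 2
```

The multi-code-point case is why an emoji is better thought of as a graphic icon identified by one *or more* code points once sequences are involved.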

1999: From a creation in Japanese translating literally as “pictograph”, the construct being e- (picture, drawing) + moji (written character or letter).  In the original Japanese it’s 絵文字 (えもじ, emoji), the construct being 絵 (え, e (picture)) + 文字 (もじ, moji (character)).

Proto emojis: Puck Magazine 1881.

Because of a cross-lingual phonetic coincidence, emoji is often thought related to the word emotion, a natural connection because it’s emotions that emojis are now used to convey.  That was the connection with the emoji’s predecessor, the emoticon, the concept of text-based symbols being used to replace certain instances of formal language.  The first codified form of the emoticon set was released in 1982 and used the standard ASCII (American Standard Code for Information Interchange) character set assembled to represent ideas as images ((*_*) being a face, :( sadness, :(( extreme sadness etc).  The idea wasn’t new, various punctuation marks having been used for hundreds of years in a similar manner, including in newspapers and books, but there had never been any standardization except that which existed by agreement between regular correspondents although, in 1881, the American magazine Puck published four symbols which could be used to convey joy, melancholy, indifference, and astonishment.  Assembled using standard shapes from mechanical type-setting, Puck probably either created or at least legitimized what came to be called typographical art.



The idea of localised conventions would later appeal to a community using a common means of communication with a closed character set: Morse Code operators who devised their own convenient shorthand, a set of numbers transmitted by a short series of dots and dashes, which all understood represented longer strings of text, commonly used messages including:

1- Wait a moment

4- Where shall I go ahead?

6- I am ready

7- Are you ready?

8- Close your key; circuit is busy

12- Do you understand?

13- I understand

24- Repeat this back

27- Priority, very important

29- Private, deliver in sealed envelope.

73- Best regards

88- Love and kisses

92- Deliver promptly

The concept is exactly the same as that used by data compression programs (ZIP and others) whereby small values are used to represent (and replace) larger ones, hence the ability to compress file-size.  The pragmatic Morse operators' list was mostly business-like, focused on transmitting the most information with the fewest taps, but there were a couple of more romantic entries: 73 meant “best regards” and 88 “love and kisses”, both of which would become stalwarts in the world of emoticons and emojis.
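The operators' shorthand amounts to a substitution table, and the compression analogy can be sketched in a few lines of Python (the table below is a subset of the list above, chosen for illustration; real compressors such as ZIP build their dictionaries automatically rather than by convention):

```python
# The Morse operators' shorthand as a substitution table: a short code
# stands in for a longer, commonly-sent phrase -- the same dictionary-
# coding idea used, in far more refined form, by compressors like ZIP.
CODES = {
    "1": "Wait a moment",
    "6": "I am ready",
    "13": "I understand",
    "73": "Best regards",
    "88": "Love and kisses",
}

def expand(message: str) -> str:
    """Replace each numeric code with the phrase it stands for;
    anything not in the table passes through unchanged."""
    return " ".join(CODES.get(token, token) for token in message.split())

print(expand("6 73"))  # I am ready Best regards
```

Transmitting "6 73" costs a handful of dots and dashes; the receiver, holding the same table, reconstructs the full text, which is exactly the sense in which small values "represent (and replace) larger ones".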

Lindsay Lohan Emojis.

The idea of the emoticon, still a disparate thing without standards, began to coalesce in the 1990s, Microsoft bundling the Wingdings TrueType font with Windows and, by the middle of the decade, the first SMS (short message service) products, the protocols for which had evolved as part of the GSM (Global System for Mobile Communications) standards, were released.  Strange as it may sound in an age when SMS messages annually number in the trillions, the take-up rate was initially slow but growth was soon exponential.  Screen-focused, emoticons were always integral to SMS.

Shigetaka Kurita’s 1999 DoCoMo emoji set.

While not the first emoji set, that being a plain black collection included with the Japanese J-Phone in 1997, it’s Shigetaka Kurita’s (b 1972) release in 1999 which is the first notable landmark.  Interestingly, reflecting the intention to make communication more efficient on NTT DoCoMo's business-oriented cellular platform, apart from some hearts (intact and broken), the 176 in the set didn’t include many to convey emotion, although in the abstract, the one representing a beer glass was often used to suggest “I need a drink”.  The beginnings were modest, reflecting both the hardware and the mobile networks of the time; although bright, each was rendered in a single color and the bitmapped shape was blocky but the range and definition constantly improved to the point where, unlike emoticons, emojis really are pictures rather than typographic approximations and this has influenced the use of the word, "emoji" now sometimes applied to just about any small picture in any digital context.

A splash of vomit emojis.

In the English-speaking world, critical mass in terms of adoption was reached in 2012, the year after Apple added an official set to the iOS keyboard, Android following in 2014 when KitKat was released.  Apple had included emojis in the Japanese releases of iOS since 2008 and may have been tempted to extend availability when it became apparent how many hacks existed to gain the feature on devices using other languages.  What made that viable was emoji, in 2010, being standardized by Unicode (the non-profit consortium which maintains text standards on digital devices globally) which meant emojis could be sent and received by any device, regardless of operating system or platform.  By then, the standard set had grown to almost a thousand.  The Unicode Consortium has been busy ever since, creating an emoji subcommittee which has so much business to transact it meets at least weekly and its output has been prodigious: by September 2021, over 3,600 emojis had been approved, 112 in the last release alone.
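What standardization by Unicode buys in practice can be sketched in Python: once a character has a fixed code point, any two devices agree on it no matter how the bytes travel between them (the heart symbol here is an arbitrary example, not one cited in the post):

```python
# One fixed code point, many possible wire encodings.
heart = "\u2764"                         # U+2764 HEAVY BLACK HEART

utf8_bytes = heart.encode("utf-8")       # three bytes on the wire
utf16_bytes = heart.encode("utf-16-be")  # two bytes on the wire

# Different serializations, identical character on arrival:
assert utf8_bytes.decode("utf-8") == heart
assert utf16_bytes.decode("utf-16-be") == heart
print(utf8_bytes.hex(), utf16_bytes.hex())  # e29da4 2764
```

Before 2010, a sender's emoji byte sequence could map to a different picture (or nothing) on the receiver's carrier or platform; a shared code point is what removed that ambiguity.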

Crooked Hillary Clinton emoji.

A character set in the thousands and growing has however changed the nature of the emoji as a language supplement, it once being possible to know them all and rely on many others also knowing most.  With so many, it’s become just another language, a system in which every user has their own sub-set; analysis of traffic suggests for most this can be just a handful and even among devotees it’s rare for them regularly to display a vocabulary of more than a few dozen.  While, as a medium of meaning, the emoji does depend on an intuitive understanding of appearance, if some are too weird or mysterious there is Emojipedia, an on-line emoji reference which documents changes and definitions, and EmojiTranslate, a website which handles the translation of text to emoji (and vice versa).  Even that isn’t enough to satisfy the evidentiary standards of courts in some jurisdictions, accredited translators now sometimes used to translate the meaning of emojis where material using them is tendered in evidence.  Emoji is just another language and something in one cultural context can mean something else in another, the meaning the sender implied perhaps the opposite of what the receiver inferred.  On the basis of established principles such as “reasonable doubt” or “balance of probabilities”, courts must decide.

The New Yorker, 30 March 2015.

Out in the world of the emoji freaks, books have been written using nothing but emojis, a concept not new.  In the 1990s, one pop-music journalist, displeased at the quality of an interview with a singer he was about to publish, rendered the whole thing in the Zapf Dingbats font (which in professional typesetting had existed since the 1970s), turning it into an illegible cryptogram for all except those who had memorized the mapping of the font.  Such people do exist but they’re rare and it’s not clear whether the writer succeeded in his aim of making a boring interview more interesting.  One magazine to find a novel use was the fine New Yorker which, in 2015, ran a cover featuring Crooked Hillary Clinton emojis when discussing the mail server affair, one of the many scandals attached to her, although they unfortunately resisted the temptation to integrate a delete key into one.  Perhaps inspired, in her presidential campaign, Crooked Hillary tried to weaponize the emoji in a tweet aimed at a younger demographic but received quite a backlash for doing something so obviously cynical; “inauthentic” being the modern term.

The work of the consortium has also been cognizant of forces operating more widely.  In 2014, it began to address the lack of racial and gender DEI (diversity, equity and inclusion) in the little images, the population disproportionately male and white, a distortion of reality hardly appropriate in what was to some degree one of the world’s global languages.  In this it was later than some; in 2012, the ever-woke Apple included in iOS 6 several emojis of same-sex couples.  Although all were shown holding hands, they didn’t look any happier than their more traditional predecessors but there are limitations to what can be achieved on such a tiny digital canvas.  In another sign of the times, over the years, guns morphed into less threatening water-pistols.  Perhaps strangely, the pandemic didn’t produce a flood of corona-themed images, Apple’s set still the only one of the majors to include something recognizably SARS-CoV-2-ish.  Still, there's plenty of time: world emoji day is 17 July and COVID-19, unlike some of us, is expected to be alive and well for many Julys to come.