Saturday, July 1, 2023

Dynamometer

Dynamometer (pronounced dahy-nuh-mom-i-ter)

(1) A device for measuring mechanical force or muscular power (ergometer).

(2) A device for measuring mechanical power, especially one that measures the output or driving torque of a rotating machine.

1800–1810: A compound word, the construct being dynamo + meter.  Dynamo was ultimately from the Ancient Greek δύναμις (dúnamis) (power) and meter has always been an expression of measure in some form and in English was borrowed from the French mètre, from the Ancient Greek μέτρον (métron) (measure).  What meter (also metre) originally measured was the structure of poetry (poetic measure) which in the Old English was meter (measure of versification) from the Latin metrum, from the Ancient Greek metron (meter, a verse; that by which anything is measured; measure, length, size, limit, proportion), ultimately from the primitive Indo-European root me- (measure).  Although the evidence is sketchy, it appears to have been re-borrowed in the early fourteenth century (after a three hundred-year lapse in recorded use) from the Old French mètre, with the specific sense of "metrical scheme in verse", again from the Latin metrum.  Meter (and metre) was later adopted as the base unit of length of the metric system.  Dynamometer is a noun; the noun plural is dynamometers.

The modern meaning of dynamometer (measuring the power of engines) dates from 1882, dynamo being short for dynamo-machine, from the German Dynamoelektrische Maschine (dynamo-electric machine), coined in 1867 by its inventor, the German electrical engineer Werner von Siemens (1816-1892).  Dynamometers, almost universally referred to as dynos, are machines which simultaneously measure the torque and rotational speed (RPM) of an engine or other rotating prime-mover so specific power outputs may be calculated.  On modern dynamometers, measures are displayed either as kilowatts (kW) or brake-horsepower (bhp).
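The arithmetic a dyno performs from its two measurements can be sketched as follows (a minimal illustration of the physics, not any particular instrument's firmware): power is torque multiplied by angular velocity, and bhp is derived from kW using the conversion 1 bhp ≈ 0.7457 kW.

```python
import math

def power_kw(torque_nm: float, rpm: float) -> float:
    """Power (kW) from torque (newton-metres) and rotational speed (RPM).

    P = torque x angular velocity, where angular velocity = 2 x pi x RPM / 60.
    """
    return torque_nm * (2 * math.pi * rpm / 60) / 1000

def kw_to_bhp(kw: float) -> float:
    """Convert kilowatts to brake horsepower (1 bhp is approximately 0.7457 kW)."""
    return kw / 0.7457
```

For example, an engine producing 600 N·m of torque at 8000 rpm is delivering roughly 503 kW, or about 674 bhp.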

Evolution of the Turbo-Panzer

Porsche 917 Flat 12 being run on factory dynamometer, Stuttgart, 1969.

During the last hundred-odd years, the rules of motor sport have been written by an alphabet soup of regulatory bodies including the AIACR, the CSI, the FISA and the FIA and these bureaucrats have made many bad decisions, tending often to make things worse but every now and then, as an unintended consequence of their dopiness, something really good emerges.  The large displacement cars of the mid-1960s contested sports car racing in one of the classic eras in motorsport.  Everyone enjoyed the competition except the rule-making body (the CSI, the Commission Sportive Internationale) which, on flimsy pretexts which at the time fooled nobody, changed the rules for the International Championship of Makes for the racing seasons 1968-1971, restricting the production cars (of which 50 identical units had to have been made) to 5.0 litre (305 cubic inch) engines with a 3.0 litre (183 cubic inch) limit for prototypes (which could be one-offs).  Bizarrely, the CSI even claimed this good idea would be attractive for manufacturers already building three litre engines for Formula One because they would be able to sell them (with a few adaptations) for use in endurance racing.  There’s no evidence the CSI ever asked the engine producers whether their highly-strung, bespoke Formula One power-plants, designed for 200 mile sprints, could be modified for endurance racing lasting sometimes 24 hours.  Soon aware there were unlikely to be many entries to support their latest bright idea, the CSI relented somewhat and allowed the participation of 5.0 litre sports cars as long as the homologation threshold of 50 units had been reached.  A production run of 50 made sense in the parallel universe of the CSI but made no economic sense to the manufacturers and, by 1968, entries were sparse and interest waning so the CSI grudgingly again relented, announcing the homologation number for the 5.0 litre cars would be reduced to 25.

The famous photograph of the 25 917s assembled for the CSI’s inspection outside the Porsche factory, Stuttgart, FRG (Federal Republic of Germany, the old West Germany), 20 April 1969.

This attracted Porsche, a long-time contestant in small-displacement racing which, funded by profits from their increasingly successful road-cars, sought to contest for outright victories in major events rather than just class trophies.  Porsche believed they had the basis for a five litre car in their three litre 908 which, although still in the early stages of development, had shown promise.  In a remarkable ten months, the parts for twenty-five cars were produced, three of which were assembled and presented to the CSI’s homologation inspectors.  Pettifogging though they were, the inspectors had a point when refusing certification, having before been tricked into believing Ferrari’s assurance of intent actually to build cars which never appeared.  They demanded to see twenty-five assembled, functional vehicles and Porsche did exactly that, in April 1969 parking the twenty-five in the factory forecourt, even offering the inspectors the chance to drive however many they wished.  The offer was declined and, honor apparently satisfied on both sides, the CSI granted homologation.  Actually, it was just as well the offer to take the 25 for a run was declined because so hurriedly had many of the 917s been assembled (it was such a rush secretaries, accountants and such were press-ganged to help) that many could only be started, put in first gear and driven a few metres.  Thus, almost accidentally, began the career of the Porsche 917, a machine which would come to dominate whatever series it contested and set records which would stand for decades, its retirement induced not by un-competitiveness but, predictably, by rule changes which rendered it illegal.

917LH (Langheck (long tail)), Le Mans, 1969.

The ten month gestation was impressive but there were teething problems.  The fundamentals, the 908-based space-frame and the 4.5 litre (275 cubic inch) air-cooled flat-12 engine (essentially two of Porsche’s 2.25 litre (137 cubic inch) flat-sixes joined together), were robust and reliable from the start but the sudden jump in horsepower (HP) meant much higher speeds and it took some time to tame the problems of the car’s behaviour at high speed.  Aerodynamics was then still an inexact science and the maximum speed the 917 was able to attain on Porsche’s test track was around 180 mph (290 km/h) but when unleashed on the circuits with long straights where over 210 mph (338 km/h) was possible, the early cars could be lethally unstable.  The first breakthrough in aerodynamics was serendipitous.  After one high speed run during which the driver had noted (with alarm) the tendency of the rear end of the car to “wander from side to side”, it was noticed that while the front and central sections of the bodywork were plastered with snarge (squashed insects), the fibreglass of the rear sections was a pristine white, the obvious conclusion drawn being that while the airflow was inducing the desired degree of down-force on the front wheels, it was passing over the rear of the body, hence the lift which induced the wandering.  Some improvisation with pieces of aluminium and much duct tape to create an ad-hoc, shorter, upswept tail transformed the behaviour and was the basis for what emerged from more extensive wind-tunnel testing by the factory as the 917K (Kurzheck; short-tail).

Porsche 917Ks, the original (rear) and the updated version with twin tail-fins, Le Mans, 1971.

The 917K proved a great success but the work in the wind tunnel continued, in 1971 producing a variant with a less upswept tail and vertical fins which bore some resemblance to those used by General Motors and Chrysler a decade earlier.  Then, the critics had derided the fins as “typical American excess” and “pointlessly decorative” but perhaps Detroit was onto something because Porsche found the 917’s fins optimized things by “cleaning” the air-flow over the tail section, the reduction in “buffeting” meaning the severity of the angles on the deck could be lessened, reducing the drag while maintaining down-force, allowing most of the top-speed earlier sacrificed in the quest for stability to be regained.

The Can-Am: A red Porsche 917/10 ahead of an orange McLaren M8F Chevrolet, Laguna Seca, 17 October 1971.  Two years to the day after this shot was taken, the first oil shock hit, dooming the series.

The engine however had been more-or-less right from day one and was enlarged first to 4.9 litres (300 cubic inch) before eventually reaching the 5.0 litre limit, at which point power was rated at 632 HP, a useful increase from the original 520.  Thus configured, the 917 dominated sports car racing until banned by regulators.  However, the factory had an alternative development path to pursue, one mercifully almost untouched by the pettifoggers and that was the Canadian-American Challenge Cup (the Can-Am), run on North American circuits under Group 7 rules for unlimited displacement sports cars.  Actually, Group 7 rules consisted of little more than demanding four wheels, enveloping bodywork and two seats, the last of these rules interpreted liberally.  Not for nothing did the Can-Am come to be known as the “horsepower challenge cup” and it had for years been dominated by the McLarens, running big-block Chevrolet V8s of increasing displacement and decreasing mass as aluminium replaced cast iron for the heaviest components.

The abortive Porsche flat-16.

In 1969, the Porsche factory dynamometer could handle an output of around 750 bhp, then thought ample, but even 632 bhp wouldn’t be enough to take on the big V8s.  For technical reasons it wasn't feasible further to enlarge the flat-12 so Porsche built a flat-16 which worked well enough to take the factory's dynamometer beyond its limit; the new engine was allocated a notional rating of 750 bhp because that was the point at which the machine's graduations ended.  Such a thing had happened before, resulting in an anomaly which wasn’t for some years explained.  In 1959 Daimler released their outstanding 4.5 litre (278 cubic inch) V8 but their dynamometer was more antiquated still, a pre-war device unable to produce a reading beyond 220 bhp so that was the rating used, causing much surprise to those testing the only production model in which it was installed, the rather dowdy Majestic Major (DQ450 saloon & DR450 limousine, 1959-1968).  In either form the Majestic Major was quite hefty and reckoned to enjoy the aerodynamic properties of a small cottage yet it delivered performance which 220 bhp should not have been able to provide, something confirmed when one of the V8s was fitted to a Jaguar Mark X (1961-1970 and badged 420G from 1967) for evaluation after Jaguar absorbed Daimler.  The V8 Mark X effortlessly out-performed the six cylinder version (rated at a perhaps optimistic 265 bhp).  Unfortunately, Jaguar chose not to use the Daimler V8 in the Mark X, instead enlarging the XK-six, dooming the car in the US market where a V8 version would likely have proved a great success.

The Can-Am: Porsche 917/10, Riverside, 1972.

Estimates at the time suggested the Porsche flat-16 delivered something like 785 bhp which in the Can-Am would have been competitive but the bulk rendered it unsuitable, the longer wheelbase necessitated for installation in a modified 917 chassis having such an adverse effect on the balance that Porsche instead resorted to forced aspiration, the turbocharged 917s becoming known as the turbopanzers.  Porsche bought a new dynamometer which revealed the turbocharged engines generated around 1100 bhp in racing trim and 1580 when tuned for a qualifying sprint.  Thus, even when detuned for racing, the Can-Am 917s typically took to the tracks generating about the same HP as the early Spitfires, Hurricanes and Messerschmitts which in 1940 fought the Battle of Britain.  Unsurprisingly, the 917 won the Can-Am title in 1972 and 1973, the reward for which was the same as that earlier delivered in Europe: a rule change effectively banning the thing.  Still, when interviewed, one Porsche engineer admitted the new dynamometer "cost a boatload of money" but he was reported as seeming "pleased with the purchase", so there was that.

The widow-maker: 1975 Porsche 930 with the surprisingly desirable (for some) “sunroof delete” option.

The experience gained in developing turbocharging was however put to good use, the 911 Turbo (930 the internal designation) introduced in 1975, originally as a homologation exercise (à la the earlier 911 RS Carrera) but so popular did it prove it was added to the list as a regular production model and one has been part of the catalogue almost continuously since.  The additional power and its sometimes sudden arrival meant early versions were famously twitchy at the limit (and such was the power those limits were easily found), gaining the machine the nickname “widow-maker”.  There was plenty of advice available for drivers, the most useful probably the instruction not to use the same technique when cornering as one might in a front-engined car and a caution that even if one had had a Volkswagen Beetle while a student, that experience might not be enough to prepare one for a Porsche Turbo.  When stresses are extreme, the physics mean the location of even small amounts of weight becomes subject to a multiplier-effect and the advice was that those wishing to explore a 930's limits of adhesion should get one with the rare “sunroof delete” option, the lack of the additional weight up there slightly lowering the centre of gravity.  However, even that precaution may only have delayed the inevitable and possibly made the consequences worse, one travelling a little faster before the tail-heavy beast misbehaved.

In what may have been a consequence of the instability induced by a higher centre of gravity, in 2012 Lindsay Lohan crashed a sunroof-equipped Porsche 911 Carrera S on the Pacific Coast Highway in Santa Monica, Los Angeles.

The interaction of the weight of a 911’s roof (and thus the centre of gravity) and the rearward bias of the weight distribution was not a thing of urban myth or computer simulations.  In the February 1972 edition of the US magazine Car and Driver, a comparison test was run of the three flavours of the revised 911 with the 2.3 litre (143 cubic inch) engine (911T, 911E & 911S) and the three were supplied with each of the available bodies: coupé, targa & sunroof coupé, the latter two with additional weight in the roof.  What the testers noted in the targa & sunroof-equipped 911s was a greater tendency to twitchiness in corners, something no doubt exacerbated in the latter because the sliding panel’s electric motor was installed in the engine bay.  Car and Driver’s conclusion was: “If handling is your goal, it's best to stick with the plain coupe.”  Ms Lohan anyway had some bad luck when driving black German cars but clearly she should avoid Porsches with sunroofs.

Friday, June 30, 2023

Antichrist

Antichrist (pronounced an-ti-krahyst)

(1) In Christian theology, a particular personage or power, variously identified or explained, who is conceived of as appearing in the world as the principal antagonist of Christ.

(2) An opponent of Christ; a person or power antagonistic to Christ (sometimes lowercase).

(3) A disbeliever in Christ (often initial lowercase)

(4) A false Christ (often initial lowercase).

1400s: From the Middle English, from the (pre 1150) Late Old English antecrist (an opponent of Christ, an opponent of the Church, especially the last and greatest persecutor of the faith at the end of the world), from the Late Latin Antichrīstus, from the Late Greek ἀντίχριστος (antíkhristos & antíchrīstos (I John ii.18)), the construct being ἀντί- (anti-) (against) + khristos (Christ); the Greek Χριστός meaning "anointed one".   This was the earliest appearance of anti- in English and one of the few before circa 1600.  In contemporary English, it’s often (but not always) preceded by the definite article: the Antichrist.  Antichrist is a noun, antichristian is a noun & adjective, antichristianism is a proper noun, antichristianly is an adverb and antichristic is an adjective; the noun plural is antichrists.

The Antichrist and the End of Days

The Antichrist is mentioned in three passages in The New Testament, all in the First and Second Epistles of John (I John 2.18-27, I John 4.1-6, 2 John 7).  Common to all is the theme of Christian eschatology, that the Antichrist is the one prophesied by the Bible who will substitute themselves in Christ's place before the Second Coming.  Biblical scholars note also the term pseudokhristos (false Christ) in the books of Matthew (chapter 24) and Mark (chapter 13), Jesus warning the disciples not to be deceived by false prophets claiming to be Christ and offering "great signs and wonders".  Other imagery which can be associated with an Antichrist is mentioned in the Apostle Paul's Second Epistle to the Thessalonians and, of course, the Beast in the Book of Revelation.  The scriptural language is redolent with drama, the Antichrist spoken of or alluded to as the “abomination of desolation”, the son of perdition, “the man of lawlessness” or “the beast” (from earth or sea).

For most of the Middle Ages, it was the scriptural construct of the Antichrist as an individual which dominated Christian thought; the Antichrist born of Satan but yet an earthly tyrant and trickster, perfectly evil in all he was and did because he was the diametric opposite of Jesus Christ, perfect in his goodness and deeds.  Jesus Christ, the son of God, was born of a virgin into earthly existence and the Antichrist, the son of Satan, would be born of the antivirgin, a whore who, like her evil offspring, would claim purity.  More than a fine theological point, it’s also quite deliberately a hurdle for Christ to cross in his Second Coming.  Where Christ was God in the flesh, the Antichrist was Satan in the flesh and the point was to beware of imitations.  This was the framework of the medieval narrative, well understood and hardly remarkable but writers fleshed it out to create essentially two threads.  For centuries there was the idea of the single Antichrist who would accrue his disciples, have his followers accept him as the Messiah and put to the sword those who did not.  He would then rule for seven years until his defeat and destruction by (depending on the author) the archangel Gabriel or Christ and his divine armies, all before the resurrection of the dead and the day of Final Judgement.

For two-thousand-odd years, there has been speculation about the identity of the Antichrist. 

By the late Middle Ages, another narrative thread evolved, this one with a modern, structuralist flavor and one more able to be harnessed to a political agenda.  Now the Antichrist was presented not as a force of evil outside the Church but the evil force within, the deceiver perhaps the Pope, the institution of the papacy or the very structure of the Church.  This was a marvellously adaptable theory, well suited to those seeking to attack the institutional church for it rendered the Antichrist as whatever the construct needed to be: the flesh incarnate of a pope, the sins and corruption of a dozen popes and their cardinals or the very wealth and power of the institution, with all that implied for its relationships with the secular world.  That was the position of the more uncompromising of those who fomented the Protestant Reformation of the sixteenth century.  The monk Martin Luther (1483-1546) saw about him venality, depravity and corruption and knew the end of days and the Final Judgement was close, the pope the true “end times Antichrist who has raised himself over and set himself against Christ”.  Unlike the long tradition of antipopes, this was true eschatology in action.  There have been many Antipopes (from the Middle French antipape, from the Medieval Latin antipāpa) although just how many isn't clear and they came and went often as part of the cut and thrust of the Church’s ever-shifting alliances and low skulduggery.  While some of the disputes were over theological or doctrinal differences, sometimes they were about little more than whose turn it was.

The Reverend Dr Ian Paisley, European Parliament, Strasbourg, France, 11 October 1988.

For centuries, Antichrist was a label often used, Nero, Caligula and the prophet Muhammad all victims, sometimes with some frequency and the epithet was often exchanged in the squabbles between Rome and Constantinople.  In the modern, mostly secular West, while the Antichrist has vanished from the consciousness of even most Christians, in the pockets of religiosity which the general godlessness has probably afforced, Antichrists appear to have multiplied.  Like “fascist” in political discourse, “Antichrist” has become a trigger word, a general category where disapprobation is not enough and there’s the need to demonise though even the hunter can be captured by the game.  In October 1988, Pope John Paul II (Karol Wojtyła 1920–2005; pope 1978-2005), who had often warned of the Antichrist waving his antigospel, was interrupted during a speech to the European Parliament by the Reverend Dr Ian Paisley (1926–2014; leader of the Democratic Unionist Party (DUP) 1971-2008 & First Minister of Northern Ireland 2007-2008), who loudly denounced him as ''the Antichrist.''  Standing and holding a large red placard displaying his message, Dr Paisley shouted out ''I renounce you as the Antichrist!''.  He was soon ejected, his holiness seemingly unperturbed.  The late Reverend had a long history of antipathy to popery in general and the “Bachelor bishop of Rome” in particular and, when later interviewed, told the press ''I don't believe he is infallible. He doesn't have the power to turn wine into the blood of Christ.''

Coming usually from the evangelical right, south of the Mason-Dixon Line the epithet seems to play well and it’s been aimed at the usual suspects including Barack Obama, Osama bin Laden, Saddam Hussein, Adolf Hitler, Joseph Stalin, Bill Gates, George Soros, at least two ayatollahs and, perhaps most plausibly, crooked Hillary Clinton.  Interestingly, although never denying practicing witchcraft or voodoo, crooked Hillary Clinton did feel the need to deny being the Antichrist.  In What Happened (Simon & Schuster, 2017, 512 pp ISBN: 978-1-5011-7556-5), a work of a few dozen pages somehow padded out to over five-hundred using the “how to write an Amazon best-seller” template, a recounting of the denial is there and the exchange does have a rare ring of truth.  It’s a shame that didn’t extend to the rest of the book; claimed to be a review of the 2016 presidential election, it might have been an interesting apologia rather than a two-inch thick wad of blame-shifting.

Never despair.  In the Christian tradition, the Antichrist will finally be defeated by the armies of God under the leadership of Christ with the Kingdom of God on earth or in heaven to follow.  Good finally will prevail over evil.

Thursday, June 29, 2023

Phlebotomy

Phlebotomy (pronounced fluh-bot-uh-mee)

(1) The act or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; the letting of blood and known in historic medicine as "a bleeding".

(2) Any surgical incision into a vein (also known as venipuncture & (less commonly) venesection).  It shouldn’t be confused with a phlebectomy (the surgical removal of a vein).

1350–1400: From the earlier flebotomye & phlebothomy, from the Middle French flebotomie, from the thirteenth century Old French flebothomie, (phlébotomie the Modern French) from the Late & Medieval Latin phlebotomia, from the Ancient Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein), the construct being φλέψ (phléps) (genitive phlebos) (vein), of uncertain origin + tomē (a cutting), from the primitive Indo-European root tem- (to cut).  The form replaced the Middle English fleobotomie.  The noun phlebotomist (one who practices phlebotomy, a blood-letter) is documented only as late as the 1650s but may have been in use earlier and operated in conjunction with the verb phlebotomize.  The earlier noun and verb in English (in use by the early fifteenth century) were fleobotomier & fleobotomien.  The Latin noun phlebotomus (genitive phlebotomī) (a lancet or fleam (the instruments used for blood-letting)) was from the Ancient Greek φλεβότομος (phlebótomos) (opening veins), the construct being φλέψ (phléps) (blood vessel) + τέμνω (témnō) (to cut) + -ος (-os) (the adjectival suffix).  The alternative spelling was flebotomus.  The noun fleam (sharp instrument for opening veins in bloodletting (and this in the pre-anesthetic age)) was from the late Old English, from the Old French flieme (flamme in Modern French), from the Medieval Latin fletoma, from the Late Latin flebotomus, from the Greek φλεβοτόμος (phlebotómos) (a lancet used to open a vein).  The doublet was phlebotome and in barracks slang, a fleam was a sword or dagger.  Phlebotomy & phlebotomist are nouns, phlebotomize is a verb and phlebotomic & phlebotomical are adjectives; the noun plural is phlebotomies.

Phlebotomy describes the process of puncturing a vein, typically with a needle or cannula, for the purpose of drawing blood.  In modern medicine the preferred term is venipuncture (used also for therapy) although the title phlebotomist continues to be used for those who specialize in the task.  One of the most frequently performed procedures in clinical practice, it’s commonly undertaken also by doctors, nurses and other medical staff.  Although the origins of phlebotomy lie in the ancient tradition of blood letting, it’s now most associated with (1) the taking of blood samples for testing by pathologists and (2) those carried out as “therapeutic phlebotomies” as part of the treatment regimen for certain disorders of the blood.  The inner elbow is the most often used site but in therapeutic medicine or in cases where the veins in the arms are not suitable, other locations can be used.

Bleeding the foot (circa 1840), oil on canvas following Honoré Daumier (1808-1879).

It’s an urban myth the Hippocratic Oath includes the clause: “First, do no harm” but by any reading that is a theme of the document and while the Greek physician Hippocrates of Kos (circa 460-circa 375 BC) wouldn’t have been the first in his field to regard illness as something to be treated as a natural phenomenon rather than something supernatural, he’s remembered because of his document.  His doctrine was one which took a long time to prevail (indeed there are pockets where still it does not), holding that treatment of ailments needed to be based on science (“evidence-based” the current phrase) rather than devotion or appeals to the gods.  His influence thus endures but one of his most famous theories, which persisted for centuries, resulted in much lost blood for no known benefit and an unknown number of deaths.  Drawing from the notion of earlier philosophers that the basis of the universe was air, earth, water & fire, the theory was that there were four “humors” which had to be maintained in perfect balance to ensure health in body & mind, the four being flegmat (phlegm), sanguin (blood), coleric (yellow bile) & melanc (black bile) which were the source of the four personality types, the phlegmatic, the sanguine, the choleric & the melancholic.  Had Hippocrates and his successors left the humors in the realm of the speculative, it would now be thought some amusing fragment from Antiquity but unfortunately surgical intervention was designed to ensure balance was maintained and the mechanism of choice was bloodletting to “remove excess liquids”.

George Washington in his last illness, attended by Doctors Craik and Brown (circa 1800) engraving by unknown artist, Collection of The New-York Historical Society.

Apparently, bloodletting was practiced by the ancient Egyptians some 3000 years ago and it’s not impossible it was among the medical (or even religious) practices of older cultures.  From there it’s known to have spread to the Middle East, Rome, Greece and West & South Asia, physicians and others spilling blood in the quest to heal and the evidence suggests it was advocated for just about any symptom.  The very idea probably sounds medieval but in the West that really was the nature of so much medicine until the nineteenth century and even well into the twentieth, there were still some reasonably orthodox physicians advocating its efficacy.  Still, in fairness to Hippocrates, he was a pioneer in what would now be called “holistic health management” which involved taking exercise, eating a balanced diet and involving the mind in art & literature.  He was an influencer in his time.  All the humors were of course good but only in balance so there could be too much of a good thing.  When there was too much, what was in excess had to go and apart from bloodletting, there was purging, catharsis & diuresis, none of which sound like fun.  Bloodletting however was the one which really caught on and was for centuries a fixture in the surgeon’s bag.

Blood self-letting: Lindsay Lohan as Carrie from the eponymous film, Halloween party, Foxwoods Resort & Casino, Connecticut, October 2013.

Actually, as the profession evolved, the surgeons emerged from the barber shops where they would pull teeth too.  The formal discipline of the physician did evolve but they restricted themselves to providing the diagnosis and writing scripts from which the apothecary would mix his potions and pills, some of which proved more lethal than bloodletting.  The bloodletting technique involved draining blood from a large vein or artery (the most productive soon found to be the median cubital at the elbow) but if a certain part of the body was identified as being out-of-balance, there would be the cut.  The mechanisms to induce blood loss included cupping, leeching & scarification and with the leeches, they were actually onto something, the thirsty creatures still used today in aspects of wound repair and infection control, able often to achieve better results more quickly than any other method.  Leeches have demonstrated extraordinary success in handling the restoration of blood flow after microsurgery and reimplantation and the method works because the little parasites generate substances like fibrinase, vasodilators, anticoagulants & hyaluronidase, releasing them into the wound area where they assist the healing process by providing an unrestricted blood flow.  Of course the leeches don't always effect a cure.  When in 1953 doctors were summoned to examine a barely conscious comrade Stalin (1878-1953; Soviet leader 1924-1953), after their tests they diagnosed a haemorrhagic stroke involving the left middle cerebral artery.  In an attempt to lower his blood pressure, two separate applications of eight leeches each were applied over 48 hours but it was to no avail.  Had he lived he might have had both leeches and physicians shot but all survived to be of further service.

A Surgeon Letting Blood from a Woman's Arm, and a Physician Examining a Urine-flask (in some descriptions named Barber-Surgeon Bleeding a Patient), eighteenth century oil on canvas, attributed to school of Jan Josef Horemans (Flemish; 1682-1752); Previously attributed to Flemish School, artist Richard Brakenburg (Dutch; 1650-1702).

Scarification was a scraping of the skin and if the circumstances demanded more, leeches could be added.  Cupping used dome-shaped cups placed on the skin to create blisters through suction and once in place, suction was achieved through the application of heat.  However it was done, it could be a messy, bloody business and in the twelfth century the Church banned the practice, calling it “abhorrent” and that had the effect of depriving priests and monks of a nice, regular source of income which wasn’t popular.  However, especially in remote villages far from the bishop’s gaze, the friars continued to wield their blades and harvest their leeches, the business of bloodletting now underground.  In the big towns and cities though the barbers added bloodletting to their business model and it’s tempting to wonder whether package deals were offered, bundling a blooding with a tooth pulling or a haircut & shave.  From here it was a short step to getting into the amputations, a not uncommon feature of life before there were antibiotics and to advertise their services, the barber-surgeons would hang out white rags smeared in places with blood, the origin of the red and white striped poles some barbers still display.  To this day the distinction between surgeons and physicians remains and in England the Royal College of Physicians (the RCP, a kind of trade union) was founded by royal charter in 1518.  By the fourteenth century there were already demarcation disputes between the barber surgeons and the increasingly gentrified surgeons and a number of competing guilds and colleges were created, sometimes merging, sometimes breaking into factions until 1800 when the Royal College of Surgeons (RCS) was brought into existence.  It's said there was a time when fellows of the RCP & RCS, when speaking of each-other, would only ever make reference to "the other college", the name of the institution never passing their lips.

Bloodletting tools: Late eighteenth century brass and iron “5-fingered” fleam.

Unfortunately, while doubtlessly lobbying to ensure the fees of their members remained high, the colleges did little to advance science and the byword among the population remained: “One thing's for sure: if illness didn't kill you, doctors would”.  It was the researchers of the nineteenth century, who first suggested and then proved germ theory, who sounded the death knell for most bloodletting, what was visible through their microscopes rendering the paradigm of the four humors obsolete.  By the twentieth century it was but a superstition.

Wednesday, June 28, 2023

Corrupt

Corrupt (pronounced kuh-ruhpt)

(1) Guilty of dishonest practices, as bribery; lacking integrity; crooked; willing to act dishonestly for personal gain; willing to make or take bribes; morally degenerate.

(2) Debased in character; depraved; perverted; wicked; evil.

(3) Of a text, made inferior by errors or alterations.

(4) Something infected or tainted; decayed; putrid; contaminated.

(5) In digital storage, stored data which contains errors related to the format or file integrity; a storage device with such errors.

(6) To destroy the integrity of; cause to be dishonest, disloyal, etc, especially by coercion, bribery or other forms of inducement.

(7) Morally to lower in standard; to debase or pervert.

(8) To alter a language, text, etc for the worse (depending on context either by the tone of the content or to render it non-original); to debase.

(9) To mar or spoil something; to infect, contaminate or taint.

(10) To make putrid or putrescent (technically an archaic use but there’s much overlap of meaning in the way terms are used).

(11) In digital storage, to introduce errors in stored data when saving, transmitting, or retrieving (technically possible also in dynamic data such as memory).

(12) In English Law, to subject (an attainted person) to corruption of blood (historic use only).

(13) In law (in some jurisdictions) a finding which courts or tribunals can hand down describing certain conduct.

1300–1350: From the Middle English verb corrupten (debased in character), from the Middle French corrupt, from the Old French corropt (unhealthy, corrupt; uncouth (of language)), from the Latin corruptus (rotten, spoiled, decayed, corrupted; the past participle of corrumpō & corrumpere (to destroy, ruin, injure, spoil; figuratively “corrupt, seduce, bribe” and literally “break to pieces”)), the construct being cor- (assimilated here as an intensive prefix) + rup- (a variant stem of rumpere (to break into pieces), from a nasalized form of the primitive Indo-European runp- (to break), source also of the Sanskrit rupya- (to suffer from a stomach-ache) and the Old English reofan (to break, tear)) + -tus (the past participle suffix).  The alternative spellings corrumpt, corrump & corroupt are effectively all extinct although dictionaries sometimes list them variously as obsolete, archaic or rare.  Corrupt and corrupted are verbs & adjectives (both used informally by IT nerds as a noun, sometimes with a choice adjective), corruptedness, corruption, corruptible, corruptness, corrupter & corruptor are nouns, corruptest is an adjective, corruptive is an adjective, corrupting is a verb and corruptedly, corruptively & corruptly are adverbs; the most common noun plural is corruptions.  Forms (hyphenated and not) such as incorruptible, non-corrupt, over-corrupt, pre-corrupt & un-corrupt etc are created as needed.

The verb corrupt in the mid-fourteenth century existed in the sense of “deprave morally, pervert from good to bad”, which later in the 1300s extended to “contaminate, impair the purity of; seduce or violate (a woman); debase or render impure (a language) by alterations or innovations; influence by a bribe or other wrong motive”, reflecting generally the senses of the Latin corruptus.  The meanings “decomposing, putrid, spoiled”, “changed for the worse, debased by admixture or alteration” (of texts, language etc) and “guilty of dishonesty involving bribery” all emerged in the late fourteenth century.  The noun corruption was from the mid-fourteenth century corrupcioun which was used of material things, especially dead bodies (human & animal), to convey “act of becoming putrid, dissolution; decay”.  It was applied also to matters of the soul and morality, it being an era when the Church was much concerned with “spiritual contamination, depravity & wickedness”.  The form was from the Latin corruptionem (nominative corruptio) (a corruption, spoiling, seducing; a corrupt condition), the noun of action from the past-participle stem of corrumpere (to destroy; spoil and figuratively “corrupt, seduce, bribe”).  The use as a synonym for “putrid matter” dates from the late 1300s while as applied to those holding public office being tainted by “bribery or other depraving influence” it was first noted in the early 1400s.  The specific technical definition of “a corrupt form of a word” came into use in the 1690s.  The adjective corruptible (subject to decay or putrefaction, perishable) was from either the Old French corroptible or directly from the Late Latin corruptibilis (liable to decay, corruptible), from the past-participle stem of corrumpere.  In fourteenth century English it applied first to objects and by the mid-fifteenth to those “susceptible of being changed for the worse, tending to moral corruption”.  The more blatant sense of “open to bribery” appears in the 1670s.

Boris Johnson, hair by Ms Kelly Jo Dodge MBE.

Corruption is probably a permanent part of politics although it does ebb and flow and exists in different forms in different places.  In the UK, the honors system with its intricate hierarchy and consequent determination of one’s place in the pecking order under the Order of Precedence has real world consequences such as determining whether one sits at dinners with the eldest son of a duke or finds oneself relegated to a table with the surviving wife of a deceased baronet.  Under some prime-ministers the system was famously corrupt and while things improved in the nineteenth century, under David Lloyd George (1863–1945; UK prime-minister 1916-1922) honors were effectively for sale in a truly scandalous way.  None of his successors were anywhere near as bad although Harold Wilson’s (1916–1995; UK prime minister 1964-1970 & 1974-1976) resignation honors list attracted much comment and did his reputation no good but in recent years it’s been relatively quiet on the honors front.  That was until the resignation list of Boris Johnson (b 1964; UK prime-minister 2019-2022) was published.  It included some names which were unknown to all but a handful of political insiders and many others which were controversial for their own reasons but at the bottom of the list was one entry which all agreed was well deserved: Ms Kelly Jo Dodge, for 27 years the parliamentary hairdresser, was created a Member of the Most Excellent Order of the British Empire (MBE) for parliamentary service.  Over those decades, she can have faced few challenges more onerous than Mr Johnson’s hair yet never once failed to make it an extraordinary example in the (actually technically difficult) “not one hair in place” style known colloquially in her profession as the JBF.  Few honors have been so well deserved and more illustrious decorations have been pinned on many who have done less for the nation.

In being granted a gong Ms Dodge fared better than another parliamentary hairdresser.  Between 1950 and 1956, the speaker of the Australian House of Representatives (the lower house) was Archie Cameron (1895–1956) and in some aspects his ways seemed almost un-Australian: he didn’t drink, smoke, swear or gamble.  Not approving of anything to do with the turf, he ordered the removal from the wall of the Parliament House barber’s salon of a print of the racehorse Phar Lap (1926–1932, the thoroughbred which won the 1930 Melbourne Cup) and later served notice on the barber to quit the building, Cameron suspecting (on hard & fast grounds) he was an SP (starting price) bookie.  Before state-run T.A.B.s (Totalisator Agency Boards) were established in the 1960s to regulate such activities, SP bookies were a popular (and convenient) way to undertake off-course betting and, like Phar Lap, the T.A.B. was born in New Zealand, the first operating there in 1949.

While in some ways not stereotypically Australian, other parts of his character made Cameron a quintessential example of the type.  Once, when displeased by one member’s conduct on the floor of the house, he demanded he bow to the chair and apologize.  Not satisfied with the response, he told the transgressor he needed to bow lower and when asked how low was required, replied: “How low can you go?”  As speaker he exercised great power over what went on in the building and insisted on dress standards being maintained although he didn’t adhere to his own rules, on hot days often wandering the corridors in shorts and a singlet; the parliamentary cleaning staff were said to resent the habit, fearing that visitors might mistake him for a cleaner and “damage their prestige”.

Official portrait of Speaker Cameron in the traditional horsehair wig and robes of office.  The wig was the one Dr HV Evatt (1894–1965; leader of the opposition 1951-1960) had worn while a judge (1930-1940) of the High Court of Australia (HCA) and Cameron wasn’t best pleased about that but it had been presented to the parliament and no other was available, so Cameron “contented himself by reflecting that ‘it was time some straight thinking was done under this wig’.”

Upon election in 1949, the prime-minister, Sir Robert Menzies (1894–1978; prime-minister of Australia 1939-1941 & 1949-1966), apparently shuddered at the thought of a “loose cannon” like Cameron in cabinet or on the backbench so appointed him speaker, despite being warned by the respected Frank Clifton Green (1890–1974; clerk of the House of Representatives (Australia) 1937-1955) that Cameron’s habit of being “…so consistently wrong with such complete conviction that he was right” made him “the worst possible choice” for the role.  On hearing of his nomination, old Ben Chifley (1885–1951; prime minister of Australia 1945-1949) predicted “He’ll either be the best speaker ever or the worst”, concluding a few months later: “I think he’s turned out to be the bloody worst.”  Once installed, he made himself a fixture and one not easily dislodged.  Although in the Westminster system it was common for a speaker to resign if the house voted a dissent from one of their rulings, Cameron suffered five successful motions of dissent against his rulings, one of them moved by the prime-minister himself.  As one member later recounted: “He just shrugged his shoulders and carried on.  He couldn’t care less whether the house supported him or not.  Archie liked being speaker and intended to keep the job.”  Keep it he did, dying in office in 1956.  Green summed him up as “…a queer mixture of generosity, prejudice and irresponsibility” and many noted the parliament became a more placid place after he quit the world.

A corrupted fattie

Corrupt, a drug addict and a failure: The Führer and the Reichsmarschall at Carinhall, next to a statue of a beast of the field.  Hitler once told a visitor: “You should visit Göring at Carinhall, a sight worth seeing.”

Hermann Göring (1893–1946; leading Nazi 1922-1945 and Reichsmarschall 1940-1945) was under few illusions about the sentence he would receive from the International Military Tribunal (IMT) at the first Nuremberg Trial (1945-1946) and resented only that the prescribed method of execution was to be "hanged by the neck until dead".  Göring thought that fit only for common criminals and that as Germany's highest ranked soldier, he deserved the honor of a firing squad; the death of a gentleman.  In the end, he found his own way to elude the noose but history has anyway judged him harshly as richly deserving the gallows.  He heard many bad things said of him at the trial, most of it true and much of it said by his fellow defendants, but the statement which most disappointed him was that Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) had condemned him as “corrupt, a drug addict and a failure”.  Once that was publicized, he knew there would be no romantic legend to grow after his execution and his hope that in fifty years there would be statues of him all over Germany was futile.  In fairness, even in that he’d been a realist, telling the prison psychologist the statues might be “…small ones maybe, but one in every home”.  Hitler had of course been right; Göring was corrupt, a drug addict and a failure but that could have been said of many of his paladins and countless others in the lower layers of what was essentially a corrupted, gangster-run state.

Corruption is of course something bad and corrosive to the state, but other people's corruption in other states can be helpful.  In 1940, after the fall of France, the British were genuinely alarmed Spain might enter the war on the side of the Axis, tempted by the return of the Rock of Gibraltar and the acquisition of colonial territory in North Africa.  London was right to be concerned because the loss of Gibraltar would have threatened not only the Royal Navy's ability to operate in the Mediterranean but also the very presence of the British in North Africa and even the supply of oil from the Middle East, vital to the conduct of the war.  Indeed, the "Mediterranean strategy" was supported strongly by German naval strategists and had it successfully been executed, it would have become much more difficult for the British to continue the war.  Contrary to the assertions of some, Adolf Hitler (1889-1945; Führer (leader) and German head of government 1933-1945 & head of state 1934-1945) did understand the enormous strategic advantage which would be achieved by the taking of Gibraltar, which would have been a relatively simple undertaking, but to do so was possible only with Spanish cooperation, the Germans lacking the naval forces to effect a seaborne invasion.  Hitler did in 1940 meet with the Spanish leader Generalissimo Francisco Franco (1892-1975; Caudillo of Spain 1939-1975) in an attempt to entice his entry into the conflict and even after the Battle of Britain, Hitler would still have preferred peace with the British rather than their defeat, the ongoing existence of the British Empire better suited to his post-war (ie after victory over the USSR) visions.

The Führer and the Caudillo at the French railway station in Hendaye, near the Spanish–French border, 23 October 1940.

Franco however was a professional soldier and knew Britain remained an undefeated, dangerous foe and one able to draw on the resources both of her empire and (increasingly) assistance from the US, and he regarded a victory by the Axis as by no means guaranteed.  Additionally, after a bloody civil war which had raged for the best part of three years, the Spanish economy was in no state to wage war and, better than most, Franco knew his military was antiquated and unable to sustain operations against a well equipped enemy for even days.  Like many with combat experience, the generalissimo also thought war a ghastly, hateful business best avoided and Hitler left the long meeting after being unable to meet the extraordinary list of conditions demanded to secure Spanish support, declaring he'd "sooner have three teeth pulled than go through that again".  Franco was a practical man who had kept his options open and probably, like the Duce (Benito Mussolini (1883-1945; Duce (leader) & prime-minister of Italy 1922-1943)), would have committed Spain to the cause had a German victory seemed assured.  British spies in Madrid and Lisbon soon understood that and, to be sure, the diplomatic arsenal of the UK's ambassador to Madrid, Sir Samuel Hoare (1880-1959), was strengthened with money, the exchequer's investment applied to bribing Spanish generals, admirals and other notables to ensure the forces of peace prevailed.  Surprising neither his friends nor his enemies, "slippery Sam" proved adept at the dark arts of disinformation, bribery and back-channel deals required to corrupt and although his engaging (if unreliable) memoirs were vague about the details, documents provided by his staff suggest he made payments in the millions at a time when a million sterling was a lot of money.  By 1944, the state of the war made it obvious any threat of Spanish belligerency was gone and he returned to London.

The dreaded corrupted FAT

Dating from the mid-1970s, the file allocation table (FAT) is a data structure used by a number of file systems to index and manage the files on storage devices.  First associated with 8 inch (200 mm) floppy diskettes, it became familiar to users when introduced by Microsoft in the early days of PC (personal computer) operating systems (OS) and was used on the precursors to, and then throughout, the PC-DOS & MS-DOS OSs which dominated the market during the 1980s.  Over the years there have been a number of implementations, the best known of which are FAT12, FAT16 & FAT32, the evolution essentially to handle the increasing storage capacity of media and the need to interact with enhancements to OSs to accommodate increasing complexities such as longer file names, additional file attributes and special files like sub-directories (now familiar as folders, technically files which list other files).
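
The practical difference between the implementations is the width of each table entry, which caps the number of addressable clusters and so, for a given cluster size, the maximum volume size.  A rough sketch in Python (the cluster-count ceilings used here are the commonly cited ones; exact limits vary a little between implementations):

```python
# Back-of-envelope capacity arithmetic for the FAT variants.  The entry
# width caps the number of addressable clusters (a few values being
# reserved for end-of-chain and bad-cluster markers) and FAT32, despite
# the name, actually uses only 28 of its 32 bits.
MAX_CLUSTERS = {
    "FAT12": 2**12 - 11,   # 4,085 usable clusters (the commonly cited figure)
    "FAT16": 2**16 - 11,   # 65,525
    "FAT32": 2**28 - 11,   # ~268 million
}

def max_volume_bytes(variant: str, cluster_size: int) -> int:
    """Upper bound on the data area for a given cluster size (in bytes)."""
    return MAX_CLUSTERS[variant] * cluster_size

for variant in ("FAT12", "FAT16", "FAT32"):
    size = max_volume_bytes(variant, 32 * 1024)   # 32 KiB clusters
    print(f"{variant}: ~{size / 2**30:.1f} GiB")
```

Even with a generous 32 KiB cluster size, FAT16 tops out around 2 GiB, which is why ever-larger hard disks forced the move to FAT32.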

A FAT is almost always stored on the host device itself and is an index in the form of a database which consists of a table with records of information about each file and directory in the file system.  What a FAT does is provide a mapping between the logical file system and the physical location of data on the storage medium so it can be thought of as an address book.  Technically, the FAT keeps track of which clusters (the mechanism by which the data is stored) on the device are linked to each file and directory and this includes unused clusters so a user can determine what free space remains available.  Ultimately, it’s the FAT which maintains the record of the links between the clusters forming a file's data chain, while the associated directory entries hold each file’s metadata, such as its attributes, creation & modification timestamps, file size etc.  In the same way that when reading a database a user is actually interacting primarily with the index, it’s the FAT which locates the clusters associated with a request to load (or view, delete etc) a file and determines their sequence, enabling efficient read and write operations.  The size, structure and complexity of FATs grew as the capacity of floppy diskettes and then hard disks expanded but the limitations of the approach were well understood and modern operating systems have increasingly adopted more advanced file systems like HPFS (High Performance File System, developed by IBM for OS/2), NTFS (New Technology File System, developed by Microsoft for Windows NT) or exFAT (Extended File Allocation Table, developed by Microsoft as a way of providing simple cross-platform, large capacity storage without the overhead of NTFS) although FAT remains widely used, especially on lower capacity and removable devices (USB drives, memory cards etc), the main attraction being the wide cross-platform compatibility.
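
The address-book role can be illustrated by following a cluster chain: a file’s directory entry records only its first cluster and each FAT entry then names the next cluster in the chain (or an end-of-chain marker).  A toy sketch, assuming a FAT16-style table held as a plain Python list (0x0000 meaning free, values of 0xFFF8 and above meaning end of chain):

```python
# Toy FAT16-style table as a plain list, indexed by cluster number.
# The value of each entry is the next cluster in the chain;
# anything >= 0xFFF8 marks the end of the chain.
FREE, EOC = 0x0000, 0xFFF8

fat = [FREE] * 16
# A file occupying clusters 2 -> 5 -> 6, then terminated:
fat[2], fat[5], fat[6] = 5, 6, EOC

def cluster_chain(fat, first_cluster):
    """Follow the linked list in the FAT, collecting the file's clusters."""
    chain, cluster, seen = [], first_cluster, set()
    while cluster < EOC:
        if cluster in seen:            # a loop would mean a corrupted FAT
            raise ValueError("circular cluster chain")
        seen.add(cluster)
        chain.append(cluster)
        cluster = fat[cluster]
    return chain

print(cluster_chain(fat, 2))   # [2, 5, 6]
```

Reading the file then means fetching those clusters from the data area in that order; the FAT itself stores no file content, only the map.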

A corrupted image (JPEG) of Lindsay Lohan.  Files can be corrupted yet appear as correct entries in the FAT and conversely, a corrupted FAT will usually contain many uncorrupted files; the files are content and the FAT an index.

The ominous sounding corrupted FAT is a generalized term which references errors in a FAT’s data structure.  There are DBAs (database administrators) who insist all databases are in a constant state of corruption to some degree and when a FAT becomes corrupted, it means that the data has become inconsistent or damaged and this can be induced by system crashes, improper shutdowns, power failures, malware or physical damage to the media.  The consequences can be minor and quickly rectified with no loss of data or varying degrees of the catastrophic (a highly nuanced word among IT nerds) which may result in the loss of one or more files or folders or be indicative of the unrecoverable failure of the storage media.  Modern OSs include tools which can be used to attempt to fix corrupted FATs and when these prove ineffective, there are more intricate third-party products which can operate at a lower level but where the reported corruption is a symptom of hardware failure, such errors often prove terminal, thus the importance of data (and system) backups.
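
Two of the classic inconsistencies repair tools hunt for can be sketched with the same sort of toy table: cross-linked clusters (two chains claiming the same cluster) and lost chains (allocated clusters no directory entry reaches).  A minimal sketch, the first_clusters argument standing in for what would be read from the directory entries:

```python
# Toy FAT16-style table: entry value = next cluster; >= 0xFFF8 = end of chain.
FREE, EOC = 0x0000, 0xFFF8

def scan_for_corruption(fat, first_clusters):
    """Report cross-linked and orphaned clusters in a toy FAT16-style table."""
    claimed = {}                        # cluster -> first file that claimed it
    cross_linked = set()
    for start in first_clusters:
        cluster = start
        while cluster < EOC:
            if cluster in claimed and claimed[cluster] != start:
                cross_linked.add(cluster)   # two chains share this cluster
                break
            claimed[cluster] = start
            cluster = fat[cluster]
    # Allocated clusters reachable from no directory entry form "lost chains"
    orphaned = {c for c, v in enumerate(fat) if v != FREE and c not in claimed}
    return cross_linked, orphaned

fat = [FREE] * 16
fat[2], fat[3] = 3, EOC        # file A: clusters 2 -> 3
fat[4] = 3                     # file B: 4 -> 3 (cross-linked on cluster 3)
fat[9] = EOC                   # allocated but unreferenced: a lost chain
print(scan_for_corruption(fat, [2, 4]))   # → ({3}, {9})
```

Real repair tools then have to choose a remedy (duplicating the shared cluster, or salvaging lost chains into recovery files), which is where data loss can creep in.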

The grey area between corruption and "just politics"

As an adjective, corrupt is used somewhat casually to refer to individuals or institutions thought to have engaged in practices leading to personal gain of some sort (not necessarily financial) which are either morally dubious or actually unlawful and a corrupt politician is the usual example, a corrupted politician presumably one who was once honest but tempted.  The synonyms of corrupt are notoriously difficult to isolate within set parameters, perhaps because politicians have been so involved in framing the definitions in a way which seems rarely to encompass anything they do, however corrupt it may to many appear.  The word dishonest for example obviously includes those who steal stuff but is also used of those who merely lie and there are circumstances in which both might be unlawful but wouldn’t generally be thought corrupt conduct except by the most morally fastidious.  The way politicians have structured the boundaries of acceptable conduct is that it’s possible to be venal in the sense of selling patronage as long as the consideration doesn’t literally end up as the equivalent of cash in the pocket although such benefits can be gained as long as there’s some degree of abstraction between the steps.

Once were happy: Gladys Berejiklian and Daryl Maguire, smiling.

In Australia, news the New South Wales (NSW) Independent Commission against Corruption (ICAC) had handed down a finding that former premier Gladys Berejiklian (b 1970; NSW Premier (Liberal) 2017-2021) had acted corruptly was of course interesting but mystifying to many was that despite that, the commission made no recommendation that criminal charges be considered.  It transpired that was because the evidence Ms Berejiklian was required to provide to the ICAC wouldn’t be admissible in a court because there, the rules of evidence are different and a defendant can’t be compelled to provide an answer which might be self-incriminating.  In other words a politician can be forced to tell the truth when before the ICAC but not before a court when charged.  That’s an aspect of the common law’s adversarial system which has been much criticized but it’s one of the doctrines which underpins Western law where there is a presumption of innocence and the onus of proof of guilt beyond reasonable doubt lies with the prosecution.  Still, what unfolded before the ICAC revealed that Ms Berejiklian seems at the least to have engaged in acts of Billigung (looking the other way to establish a defense of “plausible deniability”).  How corrupt that will be regarded by people will depend on this and that and the reaction of many politicians was to focus on the ICAC’s statement that criminal charges would not be pursued because of a lack of admissible evidence as proof that if there’s no conviction, then there’s no corruption.  Politicians have little interest in the bar being raised.  They were less forgiving of her former boyfriend (with whom she may or may not have been in a "relationship" and if one did exist it may or may not have been "serious"), former fellow parliamentarian Daryl Maguire (b 1959, MLA (Liberal) for Wagga Wagga 1999-2018).
Despite legal proceedings against Mr Maguire being afoot, none of his former colleagues seemed reluctant to suggest he was guilty as sin, so for those who note such things the comparative is “more corrupt” and the superlative “most corrupt”, both preferable to the clumsy alternatives “corrupter” & “corruptest”.

The release of the ICAC’s findings came a couple of days before the newly created federal equivalent (the National Anti-Corruption Commission (NACC)) commenced operation.  Although the need for such a body had been discussed for decades, it was during the time the government was headed by Scott Morrison (b 1968; Australian prime-minister 2018-2022) that even many doubters were persuaded one would be a good idea.  Mr Morrison’s background was in marketing, three word slogans and other vulgarities so it surprised few that a vulgarian government emerged but what was so shocking was that the pork-barreling and partisan allocation of resources became so blatant with only the most perfunctory attempts to hide the trail.  Such conduct was of course not new but it’s doubtful if before it had been attempted at such scale and within Mr Morrison’s world-view the internal logic was perfect.  His intellectual horizons defined by fundamentalist Christianity and mercantilism, his view appeared to be that only those who voted (or might be induced to vote) for the Liberal & National Parties deserved to be part of the customer loyalty scheme that was government spending.  This tied in nicely with the idea that those who accept Jesus Christ as the savior get to go to Heaven, all others being condemned to an eternity in Hell.  Not all simplicities are elegant.

As things stand, such an attitude to public finance (ie treating as much spending as possible as party re-election funds) is not unlawful and to most politicians (at least any with some reasonable prospect of sitting on the treasury benches) should not be thought “corrupt”; it’s just “politics” and in NSW, in 1992 it was confirmed that what is “just politics” has quite a vista.  Then the ICAC handed down findings against then premier Nick Greiner (b 1947; NSW (Liberal) premier 1988-1992) over the matter of him using the offer of a taxpayer funded position to an independent member of parliament as an inducement to resign, the advantage being the seat might be won by the Liberal party in the consequent by-election.  As the ICAC noted, Mr Greiner had not acted unlawfully nor considered himself to be acting corruptly but that had been the result.  Indeed, none doubted it would never have occurred to Mr Greiner that doing something that was “just politics” and had been thus for centuries could be considered corrupt although remarkably, he did subsequently concede he was “technically corrupt” (not an admission which seems to have appealed to Ms Berejiklian).  The ICAC’s finding against Mr Greiner was subsequently overturned by the NSW Court of Appeal.

So the essence of the problem is just what corruption is.  What the public see as corrupt, politicians regard as “just politics” which, in a practical sense, can be reduced to “what you can get away with” and was rationalized by Ms Berejiklian in an answer to a question by the ICAC about pork-barrelling: "Everybody does it".  Of course that's correct and the differences between politicians are of extent and the ability to conceal but her tu quoque (translated literally as "thou also" and latterly as "you also"; translation in the vernacular is something like "you did it too") defense could be cited by all.  The mechanism of a NACC has potential and already both sides of politics are indicating they intend to use it against their political enemies so it should be amusing for those who enjoy politics as theatre although, unfortunately, the politicians who framed the legislation made sure public hearings would be rare.  One might suspect they want it to be successful but not too successful.  Still, the revelations of the last ten years have provided some scope for the NACC to try to make the accepted understanding of corruption something more aligned with the public’s perception.  Anomalies like a minister’s “partner” being a “partner” for purposes of qualifying for free overseas travel (business class air travel, luxury hotels, lavish dinners etc) yet not be defined a “partner” for purposes of disclosing things which might give rise to a possible conflict of interest for the minister is an example of the sort of thing where standardization might improve confidence.  It probably should be conceded that corruption can’t be codified in the way the speed limits for a nation’s highways can but it’s one of those things that one knows when one sees it and if the NACC can nudge the politicians’ behavior a bit in the direction of public expectation, it’ll be a worthy institution.  
On a happier note, Mr Greiner went on to enjoy a lucrative corporate career and Ms Berejiklian (currently with telco Optus) is predicted to follow in his tracks although suggestions posted on social media she'd been offered a partnership at PwC (PricewaterhouseCoopers International Limited) on the basis of her experience making her a "perfect fit for the company" are thought mischievous rather than malicious.