The Tragic Tale of the Typhoid Marys

I’ve noticed, with the recent Rona situation, how the broader sense of health and wellbeing (freedom, mental wellbeing, human connection, socialising, enjoying our families, experiencing happiness and hope and purpose) has been cast aside for a very restricted definition of ‘health’ (the presence or absence of a certain pathogen, or of a certain set of symptoms) – and how destructive that really is.

It wasn’t all that long ago that testing positive for a certain pathogen (whether you displayed symptoms or not) was enough to have your freedoms removed, and to spend the rest of your days in misery…all in the name of public health.

The most famous, of course, was ‘Typhoid Mary’.

Mary Mallon (1869 – 1938) arrived in America as a penniless 15-year-old Irish immigrant. She loved to cook, and she was good at it, too – that’s how she came to be working for affluent families.

Typhoid typically struck the poor and dirty parts of town, so when it struck the well-to-do family that Mary had been working for, they hired a sleuth to investigate – one George Soper.

When Mr Soper turned up at Mary’s door, demanding she give him samples of her urine and feces, Mary did what any respectable lady would do, under the circumstances – she chased him away with a meat fork! [1].

Soper, however, was undeterred. He went to the city health board with his suspicions that Mary was an ‘asymptomatic carrier’ of typhoid. When Mr Soper returned with the police, Mary hid for five hours before they finally discovered her hiding place and hauled her off to the hospital to be tested. When her test returned positive, she was sent to Riverside Hospital on North Brother Island. She was there for two years before being allowed back into society, on the promise that she would not work as a cook.

During that time, Mary was forced to provide 163 samples of various bodily substances, in order to be tested. One hundred and twenty of those tested positive. Doctors pressured her to have her gallbladder surgically removed – Mary refused [2].

Mary went to work as a laundry maid. But the pay was poor and she missed cooking…and she didn’t really believe she was carrying disease when she seemed perfectly healthy. So she changed her name, and got a job in the kitchen at Sloane Maternity Hospital. When a typhoid outbreak occurred there, she was discovered, and sent back to North Brother Island, where she stayed for the next 23 years, until her death in 1938.

Mary became infamous – the butt of jokes and cartoons, and an object of fear in the media. At the time, approximately 1000 people per year in New York were diagnosed with typhoid – but they were mostly poor. Mary’s alleged victims were all rich, and perhaps that, along with the fact that she was an immigrant woman, is why Mary got the treatment she did? [3]

Mary was blamed for the deaths of three people, and sickness in dozens more, although by the time she died, hundreds of other ‘asymptomatic carriers’ had been discovered in the US – yet none were quarantined.

Numerous US newspapers ran stories in 1954, stating that “known carriers are kept under strict surveillance by the Public Health Officials and are visited at least twice yearly.  

None, under any circumstances, are permitted to work commercially with milk or other foods. Members of the carrier’s household are advised to be vaccinated, and annual booster shots are given (to the carrier) for additional protection.

All known typhoid carriers are listed in the State Registry so that, among other things, occupation and residence can be frequently checked upon by investigators. Owing to the instruction and supervision given, carriers usually present no menace to the community or household.

No drug yet found will rid the carrier’s body of the germs. However, since they frequently localize in the gallbladder or kidney, surgical removal of these organs frequently clears up the infection. Where both kidneys are infected, such an operation is, of course, impossible” [4].

Meanwhile, in the UK, it turns out that many ‘Typhoid Marys’ had their lives shattered because they tested positive for a particular germ…

In 2008, BBC News broke the story that at least 43 women ‘typhoid carriers’ had been locked up in Long Grove Asylum, Epsom, between 1907 and 1992, when it finally closed.

All were from the London area, and none displayed symptoms of typhoid. By all accounts, these women were mentally stable when admitted to the asylum, but years of living in isolation had affected them mentally (hardly surprising), and so their continued confinement was considered justified, even after the advent of antibiotic treatment for typhoid in the 1950s.

The Isolation Unit closed in 1972, and all but two of the women were moved into open wards in the asylum. The remaining two women were ‘incurable’ typhoid carriers, and were confined to two separate small rooms, where they lived out their days with just the daily paper and a small TV for company.

This information came to light only because historians uncovered two volumes of records in the ruins. Most of the records from the asylum were (conveniently) destroyed after it shut down. [5]

Two women were still alive when the asylum closed in 1992. They were transferred to other institutions. One woman, Rosina Bryans, had spent 60 years of her life in confinement.

Staff don’t recall any of the women ever having visitors, despite many of them having been married with children, before being admitted [6].

In memory of Mary Allouis, A Brice, Mary Brooks, Rosina Bryans, Johannah Buckland, Lillian Buzzi, Martha Caunt, Lilian Clark, Marguerite Cross, R Cross, Mrs Davies, Elizabeth Driver, Ella Eves, Jane Caroline Finn alias Jackson, Charlotte Forward, Jennie French, Henrietta Victoria, Florence Fortune Greenhalf, Mabel Hardwick, Ellen Jones, Nellie Keylock, Maud Powell, Rebecca Restall, A Redson, Sarah Reynolds, Edith Rhodes, Charlotte Rock, Elsie Stacey, Bridget Tallott, Rose Thacker, Maud Louise Thomas, Ada Elizabeth Thompson, Emily Titcombe, Florence Elizabeth Truman, Margaret Vanderpant, Lily Wade, Margaret Warren, Ada Caroline Wellington, Marie Westlake, Sarah Whall, Ivy Whitmey-Smith, Emma Munnings, Florence Pell [7].

References:

[1] Latson J. Refusing Quarantine: Why Mary Did It, TIME, 11th November, 2014.

[2] Inglis-Arkell E. What The City of New York Did to Typhoid Mary Was Pretty Horrific, Gizmodo, 25th December, 2014.

[3] Brockell G. Yes, There Really Was a Typhoid Mary, an Asymptomatic Carrier Who Infected Her Patrons, The Washington Post, 18th March, 2020.

[4] Gilbert R.O, Your Health, South Pasadena Review, 10th August, 1954, page 4.

[5] Typhoid Women Were Kept in Asylum, BBC News, 28th July, 2008.

[6] Hale B. The British Women Typhoid Carriers Who Were Locked Up For Life in a Mental Asylum, Until the 1990s, Daily Mail, 29th July, 2008.

[7] Life Sentence, BBC Radio, 28th July, 2008.


Vaccine Hall of Shame: Dr. Saul Krugman

Dr Saul Krugman (1911 – 1995) was a pediatrician and medical researcher. In his quest to study hepatitis and develop a vaccine, he experimented on mentally-disabled children at Willowbrook State School, on Staten Island.

His method of study was to collect feces from sick children, add it to chocolate milkshakes, and feed it to other disabled children, in order to study how and when they got sick.

Yes, chocolate poo milkshakes.

Krugman reasoned that nearly every child would get hepatitis, anyway (the reason for this will become abundantly clear as you keep reading).

His work was funded by the US military, who were apparently keen to find a cure for hepatitis, after it had wreaked havoc on American troops, during World War II.

Willowbrook State School was an institution for severely mentally disabled children and adults. There were very few options for severely disabled children at the time, and desperate parents signed consent forms for their children to take part in ‘vaccine research’ in return for a place at the school. One mother remembers asking why the experiments could not be conducted on animals instead, and being told that would be ‘too expensive’ [1].

The school opened in 1947 and was designed to hold 4000 residents, but for many years, 6000 residents were crammed into its close quarters.

In 1965, Robert F Kennedy, then a US Senator for New York, paid an unannounced visit to the school, and was appalled by what he found – squalor, neglect and stench. He later testified before Congress that the place was a ‘snake pit’ [2].

Seven years later, a local television news reporter, Geraldo Rivera, snuck into the grounds and filmed the appalling conditions that residents lived in. Some were naked, some were smeared in their own excrement, or banging their heads against the walls. Staffing levels were woefully inadequate to provide proper care. The footage can still be viewed online (it is very disturbing) [3]. Rivera later said: “It’s almost 50 years and speaking about it still makes me cry”.

Rather than using his influence to clean up the filthy conditions that fostered ill health, Krugman chose to deliberately sicken children, so as to study the progression of disease.

Not only was Krugman lauded for his work, receiving the Robert Koch Medal and other prestigious awards – he later became President of the American Pediatric Society.

Krugman is credited as having paved the way for the rubella, hepatitis and measles vaccines that are currently on the childhood schedule.

Strangely enough, his son, Richard Krugman, would later become head of the US Advisory Board on Child Abuse and Neglect, and still publicly defends his father’s work…

[1] Rosenbaum L. The Hideous Truths of Testing Vaccines on Humans, Forbes Magazine, 12th June, 2020. Available at: https://www.forbes.com/sites/leahrosenbaum/2020/06/12/willowbrook-scandal-hepatitis-experiments-hideous-truths-of-testing-vaccines-on-humans/

[2] Excerpts From Statement by Kennedy, The New York Times, 10th September, 1965.

[3] Geraldo Rivera: Willowbrook

Polio & The Poisoning of America

Before DDT was introduced in the 1940s, lead arsenate was widely used as a pesticide. It was originally introduced in the United States in 1892, to control the gypsy moth.

Massachusetts was the first state to manufacture and begin spraying lead arsenate.

The first small outbreak of polio (26 cases) occurred the following year, in 1893 [1].

Also in Massachusetts.

Boston, Massachusetts, actually – which happened to be downstream of where the lead arsenate was being manufactured by Merrimack Chemical Corporation (later bought by Monsanto) in Woburn.

(By the end of the century, Merrimack Chemical Corporation was the biggest manufacturer of lead arsenate in the US. It is estimated that 13 tonnes of arsenic made its way into the Aberjona River and Mystic Lakes – then public water supplies – during the decades that lead arsenate was being manufactured in Woburn) [2].

Arsenic still contaminates the watershed area downstream of Woburn today, and even decades after production had ceased, rates of childhood leukemia in Woburn were 4-fold higher than the national average [3].

The following year, 1894, an even bigger outbreak occurred just over the state border in Rutland County, Vermont – now officially recognised as America’s first polio epidemic.

Nearly all cases occurred in the Otter Creek Valley, a fertile valley nestled between two mountain ranges and surrounded by agricultural industries. It was noted that numerous horses, dogs and fowl had also died with symptoms of paralysis (yet polio is said to affect only humans), and some victims died suffering strange rashes and convulsions – symptoms of poisoning [4].

Strangely, convulsions were often reported in relation to poliomyelitis, with many doctors confirming that the disease did not seem to be contagious, often affecting only one child in a household; some put it down to “abnormal atmospheric conditions” [5].

There had been numerous isolated cases of ‘infantile paralysis’ for decades; the condition was originally blamed on teething, because of its propensity to strike suddenly while a child was teething. It may seem odd to us now to blame teething for paralysis and convulsions, but teething used to be a dangerous business! For example, in 18th century France, one half of all infant deaths were attributed to teething. In 19th century England, 12% of all deaths under 4 years of age were recorded as teething [6].

What made teething dangerous? The teething powders used to soothe fussing and fractious infants contained mercury [7]. Unfortunately, the connection between mercury and ‘teething deaths’ wasn’t made until the 1940s, and only after some observant doctors noted the similarities between arsenic and mercury poisoning [8].

It wasn’t just mercurial teething powders that made teething fraught with danger. Opiates such as laudanum were commonly used to sedate irritable babies and dull the pain of teething, and many cases of lethal intoxication occurred [9].

With the advent of arsenical pesticides, however, clusters of young paralysis victims became more and more common, especially during the summer and autumn months (when the fruits sprayed with arsenic were being eaten, and children were swimming in creeks and waterholes contaminated by pesticide run-off).

As the codling and gypsy moths developed resistance, heavier applications of lead arsenate were required – up to 5 or 6 applications per season. It was not until the 1920s that researchers realized pesticide residues were not removed by washing or rubbing – about two-thirds of the residue remained on the fruit [10].

By 1929, almost 30 million pounds of calcium or lead arsenate were being sprayed every year, onto the fields and orchards of America [11].

Arsenic was also being widely used in medicine, during the early 20th century, especially as a treatment for syphilis – some patients were given more than 100 injections of arsenic-containing Tryparsamide (which was developed by the Rockefeller Foundation, and manufactured by Merck) to ‘treat’ advanced syphilis [12].

There were several reports of polio following arsenical injections – one of those occurred in a children’s home in Germany in 1913, where 5 children were diagnosed with polio.  All were being treated for syphilis at the time, via arsenical injections [13].

Arsenical injections were also employed to treat yaws – a tropical skin disease. In 1936, a campaign to eradicate yaws in Western Samoa preceded a large polio epidemic. Thirty-six thousand locals were given two or more injections in the buttock. A week after the second injection, the first cases of paralysis appeared – all were in the lower limbs, and all had received arsenical injections. In total, 138 locals suffered paralysis after receiving injections. [14].

When widespread vaccination campaigns began for diphtheria and pertussis, paralysis cases diagnosed as ‘polio’ also followed [15]. At the time, those vaccines contained mercury as a preservative.

It was not just arsenic or mercury that produced polio symptoms – phosphorus, lead, carbon monoxide and cyanide poisoning were also reported in cases diagnosed as poliomyelitis.

Lead arsenate finally fell out of favour in the 1940s and was largely replaced by another poison – DDT. Arsenic-based pesticides weren’t banned until the 1980s, however, and some ‘modified arsenates’ are still in use today, on cotton crops. Ground contamination from those decades of spraying is thought to be the cause of elevated levels of arsenic in rice today.

The use of arsenical pesticides in China continued beyond the year 2000, and it is suspected they are still used illegally – hence the presence of arsenic in apple juice imported from China [16].

[1] Putnam JJ, Taylor EW. Is Acute Poliomyelitis Unusually Prevalent This Season? Boston Med Surg J, 1893, 129:509-510.

[2] Aurilio, A.C., Durant, J.L., Hemond, H.F. et al. Sources and distribution of arsenic in the aberjona watershed, eastern Massachusetts. Water Air Soil Pollut, 1995, 81, 265–282.

[3] Durant, John L., et al. Elevated Incidence of Childhood Leukemia in Woburn, Massachusetts: NIEHS Superfund Basic Research Program Searches for Causes. Environmental Health Perspectives, 1995, 103:93–98.

[4] Caverley CS. Infantile Paralysis in Vermont, 1894 – 1922, Published 1924. Available at: https://trove.nla.gov.au/work/15562978/version/209712793.

[5] Queer Epidemic Prevailing Among New York Children, The Meriden Weekly Republican, 17th August 1899.

[6] Malkiel S, Eisenstadt M, Pollak U. Say a Prayer for the Safe Cutting of a Child’s Teeth: The Folklore of Teething, J Paediatrics & Child Health, 2017, 53(12): 1145-1148.

[7] Dally A, The Rise and Fall of Pink Disease, Social History of Medicine, 1997, 10(2):291–304.

[8] Shandley K, Austin DW. Ancestry of pink disease (infantile acrodynia) identified as a risk factor for autism spectrum disorders. J Toxicol Environ Health A. 2011;74(18):1185-1194.

[9] Obladen M. Lethal Lullabies: A History of Opium Use in Infants. Journal of Human Lactation. 2016, 32(1):75-85.

[10] Schooley T, Weaver MJ, Mullins D, et al. The History of Lead Arsenate Use in Apple Production: Comparison of its Impact in Virginia with other States, J Pest Safety Ed, 2008, 10:21-53.

[11] Whorton J. Before Silent Spring: Pesticides and public health in pre-DDT America, Princeton University Press, 1974.

[12] Cormia FE. Tryparsamide in the treatment of syphilis of the central nervous system, Brit J Ven Dis, 1934, 10:99-116.

[13] Wyatt HV. Provocation Poliomyelitis: Neglected Clinical Observations from 1914 to 1950, Bull Hist Med, 1981, 55(4):543-557.

[14] Ibid.

[15] Martin K. Local paralysis in children after injections, Arch Dis Child, 1950, 25:1-14.

[16] Blum D. A is for Arsenic (Pesticides if you please), WIRED Magazine, 19th June, 2012.

Compulsory Vaccination: History Repeating? (Part 1: England)

Regardless of your own personal beliefs regarding vaccination, the idea that a government can mandate a medical procedure without your consent should be cause for concern to everyone (in addition, it contravenes basic human rights principles regarding informed consent, which must be freely given, “without coercion, undue influence or misrepresentation”) [1].

As we see governments around the world moving ever closer to forced vaccination, it behoves us to take a lesson from history, and remember what happens when the State assumes ownership of a person’s physical body.

The truth is that compulsory vaccination is not a new concept. It’s been tried before! In Part 1, we will take a closer look at how it worked out for England, with compulsory smallpox vaccination.

It began innocently enough, with the British Vaccination Act (1840).

Under this law, free vaccination was provided to the poor, to be administered by the Poor Law Guardians (while the original practice of ‘inoculation’ was outlawed). Many ‘poor and uneducated’, though, shunned the offer of free vaccination [2].

Thirteen years later, compulsory vaccination was introduced – despite evidence that smallpox mortality had been declining for many decades [3].

Compulsory Vaccination Act (1853)

This law required all babies up to 3 months old (or 4 months, in the case of orphans) to be vaccinated. Parents who refused to comply faced fines of £1 (the equivalent of approximately one week’s wages for a skilled tradesman – roughly £80 today), or imprisonment.

Vaccination during those years was not the procedure we know today. It was painful and inconvenient – for parents and children alike. The vaccinator used a sharp surgical knife (known as a lancet) to make incisions into the flesh in a scored pattern, usually in several different places on the arm. Vaccine lymph was then smeared into the cuts. Infants were to be brought back to the vaccination stations eight days later, to have the pus harvested from their blisters for use on other waiting infants [2].

In an era when doctors were incensed at the suggestion that postnatal infections were caused by their failure to wash their hands after handling dead bodies, and when drinking and bathing water was often contaminated with raw sewage, it is hardly surprising that deaths from skin infections such as erysipelas increased as vaccination was increasingly enforced [4].

The routine treatment of smallpox involved mercury or phenol (otherwise known as carbolic acid, which is highly corrosive and causes blistering of the skin on its own) applied topically to sores. Mercury gargles for the throat were also employed. If the patient became delirious (hardly surprising, given the frequent use of mercury), they were given morphine or bromides – which also cause pustular eruptions of the skin [5].

Vaccination Act amendment (1867)

The law was extended to include all children up to 14 years of age (in order to capture all the children who had slipped through the cracks during the previous 14 years of compulsory vaccination). This law introduced continuous fines and cumulative penalties.

In other words, parents could be fined continuously, with increasing prison sentences for non-compliance. Hansard, the record of UK parliamentary debates, notes the case of a Mr. Pearce of Andover who, up until 1877, had been convicted some 40 times [6].

Also noted was the case of Mr. Joseph Abel, who was convicted 11 times over a 14-month period for refusing to have his child vaccinated [7].

Further amendment (1871)

Ironically, the law was further tightened in 1871, the same year a deadly smallpox epidemic raged through Europe and Britain – regarded by many as the most destructive epidemic of that entire century [8]. The UK suffered approximately 42,000 deaths over the course of two years.

The new law made it compulsory for all local authorities to hire Vaccination Officers, and introduced fines of 20 shillings (the equivalent of 4 days’ wages for a skilled tradesman) for parents who refused to allow pus to be collected from their children’s blisters for public vaccination.

The Leicester Mercury reported the case of a Mr. George Banford, who “had a child born in 1868. It was vaccinated, and after the operation the child was covered with sores, and it was some considerable time before it was able to leave the house. Again Mr Banford complied with the law in 1870. This child was vaccinated by Dr. Sloane in the belief that by going to him they would get pure matter. In that case erysipelas set in, and the child was on a bed of sickness for some time. In the third case the child was born in 1872, and soon after vaccination, erysipelas set in, and it took such a bad course that at the expiration of 14 days the child died.”

It will come as no surprise that Mr. Banford refused to have his next child vaccinated…and was fined 10 shillings, with the option of seven days’ imprisonment [9].

Meanwhile, resistance raged on, especially in the town of Leicester, where rallies attracted crowds of up to 100,000 [10]. The resistance was such that some local magistrates and politicians declared their support for a parent’s right to choose, and a Parliamentary Inquiry was eventually held, which sat for seven years before finally agreeing that the laws should be amended.

It should be noted here that compulsory vaccination proved to be the ‘thin edge of the wedge’ for governmental incursions into bodily autonomy and personal liberty.

The Contagious Diseases Acts of 1864, 1866, and 1869 were passed very quietly and suddenly, with little fanfare (it was considered unseemly to discuss such matters). The laws were aimed at preventing sexually-transmitted diseases in the Armed Forces, where one in three sick cases was caused by venereal disease. Instead of targeting members of the Armed Forces, though, the law targeted women who were suspected of prostitution [11].

These women were apprehended by police and forced to have their genitals inspected by a doctor (no doubt male), and, if found to be infected, were confined in a lock hospital for treatment for up to 3 months. Refusal to co-operate resulted in imprisonment, with the possibility of hard labour [12].

Once registered under the Act, a woman was expected to present herself at a designated inspection station every two weeks, to be inspected [13].

During the 1860s, there were approximately 26,000 prostitutes known to police, while other estimates put the number as high as 368,000. The vast majority of these women were poor and uneducated, and had resorted to prostitution to survive [13].

After the 1866 amendment, a woman could be confined to hospital for treatment for up to 12 months.

The typical treatment for syphilis during that era would most likely have been mercury rubs. Later, the severe side effects of mercury became too obvious to ignore, and it was replaced by injections of arsenic.

Ironically, there were numerous instances reported, whereby syphilis was transmitted via smallpox vaccination [14-15].

The burgeoning feminist movement fiercely opposed the Contagious Diseases Acts, on the basis that they unfairly discriminated against women, and that the inspections were undertaken in a most humiliating fashion. There was a lot of common ground between the early feminists fighting the Contagious Diseases Acts and the anti-vaccinationists. Indeed, the feminist leader Josephine Butler, who spearheaded the campaign to repeal the Contagious Diseases Acts, also served in the Mother’s Anti-Compulsory Vaccination League [16].

In addition to the Contagious Diseases Acts, the Notification of Infectious Diseases Acts of 1889 and 1899 required that all contagious diseases, except tuberculosis (which is curious, since it was a major killer at the time), be reported to the local medical officer, who could then forcibly remove the patient to hospital, whether they consented or not. Household contacts and doctors who failed to notify the local medical officer were liable for fines of up to 40 shillings [17].

Again, the accepted medical treatment of the time most likely involved mercury or arsenic.

Finally, after forty-five years of protests, fines and imprisonments, the Vaccination Act (1898) promised some respite to parents – it removed cumulative penalties and allowed for a conscientious clause, introducing the concept of ‘Conscientious Objection’ into English law. However, parents were still required to satisfy not one but two magistrates of their legitimate concerns and objections in order to gain an exemption. For a number of years (until further amendments were made in 1908), many magistrates simply refused to issue exemptions to parents, resulting in continuing fines.

Hansard reveals the case of one applicant who was told by his local magistrate that “such people as the applicant ought to be set on an island by themselves and die of smallpox” [18].

The 1898 law had also outlawed arm-to-arm vaccination, which was replaced by vaccination with calf lymph, deemed to be safer. With little government oversight, however, many entrepreneurial types saw it as a way to make easy money, supplying cheap vaccines which occasionally included dust, hair, and even animal dung [19]. Cases of tetanus and other infections following vaccination continued to be reported.

In 1908, when the government realized that magistrates were failing to carry out the 1898 law, it was amended further, to allow parents to make a statutory declaration of their objections to vaccination within four months of the birth.

By 1921, only 40% of English infants were being vaccinated [19].

[1] United Nations General Assembly, 64th Session, 10th August, 2009. Available at: https://www.refworld.org/pdfid/4aa762e30.pdf. Accessed September, 2019.

[2] Durbach N. They Might As Well Brand Us: Working Class Resistance to Compulsory Vaccination in Victorian England, Soc Social Hist Med, 2000, 13:45-62.

[3] McCulloch JR. A Descriptive and Statistical Account of the British Empire, Longman, Brown, Green and Longmans, London, 1854. Available online at: https://archive.org/details/adescriptiveand00mccugoog/page/n654. Accessed September, 2019.

[4] Deaths from Erysipelas After Vaccination, 1859-1880, Vaccination Inquirer, Vol 5, p.84.

[5] Blumgarten AS. A Textbook of Medicine – For Students in Schools of Nursing, Macmillan, 1937.

[6] Hansard, Deb 17 April 1877 vol 233 cc1267-8, Available at: https://api.parliament.uk/historic-hansard/commons/1877/apr/17/vaccination-acts-prosecutions-case-of-mr#S3V0233P0_18770417_HOC_12. Accessed September, 2019.

[7] Hansard, Deb 11 June 1877 vol 234 cc1569-71, Available at: https://api.parliament.uk/historic-hansard/commons/1877/jun/11/vaccination-act-prosecutions-case-of. Accessed September, 2019.

[8] Lankester E. The Smallpox Epidemic, Nature, 1871, 3:341-342.

[9] Leicester Mercury, 10th March, 1884.

[10] Porter D, Porter R. The politics of prevention: anti-vaccinationism and public health in nineteenth-century England. Med Hist. 1988;32(3):231–252.

[11] Walkowitz JR. Prostitution and Victorian Society: Women, Class and the State, Cambridge University Press, 1982.

[12] Hamilton M. Opposition to the Contagious Disease Acts, 1864 – 1886, Albion: A Quarterly Journal Concerned With British Studies, 1978, 10(1):14-27.

[13] Ibid. See #11.

[14] Syphilis conveyed by the vaccine lymph to 46 children, The Lancet, Nov 16. 1861.

[15] Lee H. Lectures on syphilitic inoculation in 1865,1866, The Lancet, 87(2224):391-394.

[16] Johnston RD. The Radical Middle Class: Populist Democracy And The Question of Capitalism, Princeton University Press, 2013, p185.

[17] Mooney G. Public Health versus Private Practice: The Contested Development of Compulsory Infectious Disease Notification in Late-Nineteenth Century Britain, Bulletin of the History of Medicine, 1999, 73(2):238-267.

[18] Hansard, HC Deb 06 March 1902 vol 104 c588 https://api.parliament.uk/historic-hansard/commons/1902/mar/06/bakewell-anti-vaccinationists#S4V0104P0_19020306_HOC_119. Accessed September, 2019.

[19] Ibid. See #16.

A Brief History of the ‘Antivax’ Movement

It is often assumed that the ‘anti-vax’ movement began with Andrew Wakefield, and ‘that autism study’, or former Playboy model Jenny McCarthy’s claims that her son’s autism was caused by vaccination.

But did these two events really cause tens of thousands of parents to begin questioning vaccines and getting embroiled in bitter skirmishes on social media? Personally, I had never heard of Andrew Wakefield, or Jenny McCarthy, when I first began to delve into the vaccine subject, in early 2010.

Opposition to vaccination is not a new phenomenon – for as long as there have been vaccines, there has been fierce opposition. Originally focused in England, that opposition really gained momentum when the Compulsory Vaccination Act was passed in Victorian England, in 1853.

The main pockets of opposition to compulsory vaccination were among the working class, and the clergy, who believed it was ‘un-Christian’ to inject people with animal products [1].

The original Vaccination Act in 1840 had provided free vaccination for the poor, to be administered by the Poor Law guardians. This law, however, was a failure, as the “lower and uneducated classes” did not take up the offer of free vaccination [1].

The Compulsory Vaccination Act of 1853 went a lot further – it ordered that all babies up to 3 months old be vaccinated (again administered by the Poor Law Guardians), and in 1867 this was extended to children up to 14 years of age, with penalties for non-compliance introduced.

Doctors were encouraged to report non-vaccinators to the authorities, through “financial inducements for compliance and penalties for failure”. While the 1853 Act had introduced one-off fines or imprisonment, the 1867 Act increased this to continuous and cumulative penalties, so that parents found guilty of default could be fined continuously, with increasing prison sentences, until their child reached 14 years of age [2].

(As an interesting side-note here, the vaccination laws were not the only incursions by the state into personal liberty and private bodily autonomy during this time. The Contagious Diseases Acts of 1864, 1866, and 1869 required that any woman suspected of prostitution be medically inspected for venereal disease. If deemed to be infectious, she was confined in hospital for treatment, with or without her consent. The Notification of Infectious Diseases Acts in 1889 and 1899 required that all contagious diseases – except tuberculosis, which is rather odd, since it was a major killer at the time – be reported to the local medical officer, who could then forcibly remove the patient to hospital, whether they consented or not [1].)

Meanwhile, the vaccination laws were tightened yet again in 1871 (ironically, the same year that a large smallpox epidemic raged across Europe and England – a testament to how ‘effective’ the compulsory laws had been?), making it compulsory for all local authorities to hire Vaccination Officers [2].

In response to these increasingly draconian measures, the Anti-Vaccination League was formed in England, and a number of anti-vaccine journals sprang up, which “included the Anti-Vaccinator (founded 1869), the National Anti-Compulsory Vaccination Reporter (1874), and the Vaccination Inquirer (1879)”.

A number of other writings and pamphlets were distributed widely – for example, 200,000 copies of an open letter titled ‘Current Fallacies About Vaccination’, written by Leicester Member of Parliament, P Taylor, were distributed in 1883 [2].

The vaccination process was painful and inconvenient, for both parents and children alike. The vaccinator used a lancet (a surgical knife with sharp, double-edged blade) to cut lines into the flesh in a scored pattern. This was usually done in several different places on the arm. Vaccine lymph was then smeared into the cuts. Infants then had to be brought back eight days later, to have the lymph (pus!) harvested from their blisters, which was then used on waiting infants [1].

Following the strict 1871 amendments to the law, parents could even be fined 20 shillings for refusing to allow the pus to be collected from their children’s blisters, to be used for public vaccination [1].

By this point, severe and sometimes fatal reactions to the vaccine were being reported, and doubts began to grow about how effective the vaccine really was [3].

The town of Leicester was a particular hot-bed of anti-vaccine activity, with many marches and rallies, demanding repeal of the law, and advocating other measures of containment, such as isolation of the infected. Up to 100,000 people attended these rallies [4].

The unrest and opposition continued for two decades, and an estimated 6000 prosecutions were carried out, in the town of Leicester alone [3].

The following excerpts from the Leicester Mercury bear witness to the deep convictions held by those who refused to submit to the mandatory measures:

‘George Banford had a child born in 1868. It was vaccinated and after the operation the child was covered with sores, and it was some considerable time before it was able to leave the house. Again Mr. Banford complied with the law in 1870. This child was vaccinated by Dr. Sloane in the belief that by going to him they would get pure matter. In that case erysipelas set in, and the child was on a bed of sickness for some time. In the third case the child was born in 1872, and soon after vaccination erysipelas set in and it took such a bad course that at the expiration of 14 days the child died’.

Mr Banford was fined 10 shillings, with the option of seven days imprisonment, for refusing to subject his fourth child to the vaccine [5].

And again…‘By about 7.30 a goodly number of anti-vaccinators were present, and an escort was formed, preceded by a banner, to accompany a young mother and two men, all of whom had resolved to give themselves up to the police and undergo imprisonment in preference to having their children vaccinated. The utmost sympathy was expressed for the poor woman, who bore up bravely, and although seeming to feel her position expressed her determination to go to prison again and again rather than give her child over to the “tender mercies” of a public vaccinator. The three were attended by a numerous crowd and in Gallowtreegate three hearty cheers were given for them, which were renewed with increased vigour as they entered the doors of the police cells’ [6].

Eventually, there were so many refusers in the town of Leicester, that some local magistrates and politicians declared their support for parental rights, and encouraged their peers to do the same [3].

The law was finally relaxed in 1898. New laws were passed, allowing for conscientious objection to vaccination [7]. By the end of that same year, more than 200,000 certificates of conscientious objection had been issued, most among the working class, and many to women [1].

Meanwhile in the United States, smallpox outbreaks in the late 1800’s led to vaccine campaigns, and subsequent opposition in the formation of The Anti-Vaccination Society of America in 1879, followed by the New England Anti Compulsory Vaccination League in 1882, and the Anti Vaccination League of New York City in 1885 [4].

The homeless and the itinerant were blamed for spreading smallpox, and in 1901, the Boston Board of Health ordered ‘virus squads’ to force-vaccinate men staying in cheap boarding rooms [8].

Following a smallpox outbreak in 1902, the Cambridge Board of Health in Massachusetts mandated vaccination for all city residents. This led to possibly the most important, and controversial, judicial decision regarding public health.

One man, Henning Jacobson, refused to comply with the mandate, on the grounds that it violated his right to care for his own body as he saw fit. The city filed criminal charges against him, which he fought, and lost, in court. He appealed to the US Supreme Court, which ruled in the State’s favour in 1905, prioritising public health over individual liberty [9].

The ‘anti-vaxxers’ have never gone away in the intervening years, though at some times they have been more vocal than at others, such as in the 1970’s, when there was controversy throughout Europe, North America and Britain about the safety and potential side effects of the diphtheria-tetanus-pertussis vaccine [10].

In 1998, the vaccination argument came to the public attention again, with Andrew Wakefield’s case series published in the Lancet. Although the report was looking at a link between autistic disorders and bowel dysfunction, it mentioned in its conclusion that a number of parents believed their child’s symptoms began after MMR vaccination [11]. The authors felt this potential link deserved more investigation…

The furore and the fall-out are still ongoing. Wakefield was found guilty of failing to get proper ethics approval for the study, and he and a fellow investigator were subsequently ‘struck off’. Wakefield’s fellow investigator later challenged the decision, and won [12]. And while a number of researchers later confirmed the original findings, of bowel dysfunction in autistic children [13-16], Wakefield’s reputation and career have been left in tatters – the subject of mockery and derision.

Anybody who confesses to having doubts about the safety or efficacy of vaccines gets, as a general rule, a taste of the same scorn and derision that Andrew Wakefield has received.

Even in the era of smallpox vaccination, the media tended to portray anti-vaxxers in a less-than-flattering light. At that time, the media referred to the debate as a “conflict between intelligence and ignorance, civilization and barbarism” [9].

So, are anti-vaxxers really anti-science?

Not according to science.

In 2007, Kim et al analysed vaccination records of 11,680 children from 19 to 35 months of age, to evaluate maternal characteristics that might influence whether the child was fully vaccinated, or not.

They discovered that mothers with tertiary degrees and high incomes were the least likely to fully vaccinate their children, while mothers in poor minority families without high school diplomas were the most likely to fully vaccinate their children [17].

Similarly, a study in 2008 that investigated the attitudes and beliefs of parents who decided to opt out of childhood vaccine mandates, found that they valued scientific knowledge, were adept at collecting and processing information on vaccines…and had little trust in the medical community [18].

In 2017, the Australian Institute of Health and Welfare released their latest figures on vaccination rates. The national average was 93% of children fully vaccinated, yet in Sydney’s upmarket (i.e. highly educated, high-income-earning professionals) inner suburbs and northern beaches, as few as 70% of children under 5 were fully vaccinated [19].

The same story was repeated in Melbourne, with the wealthiest – and, by association, better-educated – suburbs having the lowest vaccination rates. There was an ironic, and rather telling, opening paragraph in The Age, when reporting these figures: “Four of the wealthiest, healthiest suburbs of Melbourne have the worst child vaccination rates in the state” [20].

Statistics gathered from Canada tell a similar story – a higher percentage of anti-vaxxers hold university degrees, compared to the national average [21].

It appears that doctors and paediatric specialists are not always in agreement with current vaccine practice either – at least, not when it comes to their own children:  “Ten percent of paediatricians and 21% of paediatric specialists claim they would not follow [CDC] recommendations for future progeny. Despite their education, physicians in this study expressed concern over the safety of vaccines [22]”.

With the vaccine schedule becoming increasingly crowded, and governments moving towards compulsory vaccination, the anti-vaccination movement is again gathering momentum. Increasing numbers of parents are delaying, declining, or opting for alternative vaccine schedules [23-24].

Around the world, as vaccine scepticism rises, history looks set to repeat, with governments becoming increasingly forceful in trying to curb the sentiment. Time will tell how this round will play out…

References:

[1] Durbach, N. They might as well brand us: Working class resistance to compulsory vaccination in Victorian England. The Society for the Social History of Medicine, 2000, 13:45-62.

[2] Porter D, Porter R. The politics of prevention: anti-vaccinationism and public health in nineteenth-century England. Med Hist. 1988;32(3):231-52.

[3] Williamson S. Anti-vaccination leagues: One hundred years ago, Arch Dis Child, 1984, 59: 1195-1196.

[4] Wolfe, R.M., Sharpe, L.K. Anti-vaccinationists past and present. BMJ. 2002d;325:430-432.

[5] Leicester Mercury, 10th March, 1884.

[6] Leicester Mercury, 10th June, 1884.

[7] Wohl A. Endangered Lives: Public Health in Victorian Britain, 1984, Methuen, London, pp. 134-135.

[8] Albert, M., Ostheimer, K.G., Breman, J.G. The last smallpox epidemic in Boston and the vaccination controversy. N Engl J Med. 2001;344:375-379.

[9] Gostin, L. Jacobson v Massachusetts at 100 years: Police powers and civil liberties in tension. Am J Public Health. 2005;95:576-581.

[10] Baker, J. The pertussis vaccine controversy in Great Britain, 1974-1986. Vaccine. 2003;21:4003-4011.

[11] Wakefield AJ, Murch SH, Anthony A, et al. Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children, Lancet, 1998, 351(9103): 637-641.

[12] Professor John Walker Smith vs General Medical Council [2012] EWHC 503, http://www.eastwoodslaw.co.uk/wp-content/uploads/2013/03/Walker-Smith.pdf. Accessed September, 2017.

[13] Horvath K, Medeiros L, Rabszlyn A, et al. High prevalence of gastrointestinal symptoms in children with autistic spectrum disorder (ASD). J Pediatr Gastroenterol Nutr 2000, 31:S174.

[14] Horvath K and Perman JA. Autistic disorder and gastrointestinal disease, Current Opinion in Pediatrics 2002, 14:583-587

[15] Ashwood P, Anthony A, Torrente F, Wakefield AJ. Spontaneous mucosal lymphocyte cytokine profiles in children with regressive autism and gastrointestinal symptoms: Mucosal immune activation and reduced counter regulatory interleukin-10. Journal of Clinical Immunology. 2004:24:664-673.

[16] Torrente F, Anthony A, Heuschkel RB, et al. Focal-enhanced gastritis in regressive autism with features distinct from Crohn’s and helicobacter pylori gastritis. Am. J Gastroenterol. 2004;4:598-605.

[17] Kim SS, Frimpong JA, et al. Effects of maternal and provider characteristics on up-to-date immunization status of children aged 19-35 months. Am J Public Health, 2007, 97(2): 259-266.

[18] Gullion JS, Henry L, Gullion G. Deciding to opt out of childhood vaccination mandates. Public Health Nurs, 2008, 25(5): 401-408.

[19] Aubusson K, Butt C, Sydney postcode has Australia’s worst vaccination rate for five year old children, Sydney Morning Herald, 8th June, 2017.

[20] Butt C, Spooner R, Melbourne vaccination data: immunisation rates not improving in wealthy inner-city suburbs, The Age, 7th June, 2017.

[21] Chai C, Who are the anti-vaxxers in Canada? New poll profiles resistant group, Global News, 9th March, 2015.

[22] Martin, M. and Badalyan, V, Vaccination practices among physicians and their children. Open Journal of Pediatrics, 2012, 2:228-235.

[23] McCauley MM, Kennedy A, Basket M, Sheedy K. Exploring the choice to refuse or delay vaccines: a national survey of parents of 6- through 23-month olds, Acad Pediatr, 2012, 12(5): 375-383.

[24] Robison SG, Groom H, Young C. Frequency of alternative immunization schedule use in a metropolitan area, Pediatrics, 2012, 130(1): 31-38.

Stranger Than Fiction: Polio ‘Treatments’ in the 1900’s

There’s no doubt whatsoever that the polio epidemics of the early 20th century left a traumatic and lasting impression on the American psyche (and perhaps to a lesser extent, the Western psyche). Everybody seems to know somebody who was ‘crippled by polio’. The fear and devastation were very real indeed.

Others have written excellent, in-depth analyses on what caused sporadic cases to become widespread and disabling epidemics, but few have delved into the reality of medical care exacerbating the severity of poliomyelitis.

Below are some of the treatments you could expect, if stricken by paralysis in the early 1900’s:

  • Intramuscular injections of strychnine (which can cause paralysis and nerve damage – if it doesn’t kill you outright) [1].
  • Lumbar punctures, which can cause or exacerbate paralysis, and may also precede respiratory problems (which would have been blamed on ‘bulbar’ polio at the time) [1].
  • Intraspinal injections of adrenaline (almost half of the recipients died), human serum, or quinine and urea hydrochloride (3 of 6 children given this mixture orally and intramuscularly died). Even intraspinal injections of horse serum were tried [1].
  • Injections of tetanus antitoxin – the rationale being that “tetanus, rabies and poliomyelitis all attacked nerve cells, so perhaps giving the antitoxin would block access to absorption sites on the cells”. Even injections of diphtheria antitoxin were tried, with 3 out of 5 patients dying [1].
  • Tendon cutting and transplantation [2].
  • Painful electrical treatments [2].
  • Radium water (After radium was discovered in 1898, it quickly gained popularity, proclaimed as a ‘cure-all’ elixir that could make one young again, and cure all kinds of ills and ails) [3].
  • Surgical Straightening: Dr. John Pohl, in an interview circa 1940, said “We’d take the children to the operating room in those days, straighten them out under anaesthetic, and put them in plaster casts. When they woke up, they screamed. The next day they still cried from the pain. That was the accepted and universal treatment virtually all over the world. I saw it in Boston and New York City and London” [4].
  • Even laypeople had their ‘cures’ and remedies, and some couldn’t resist the opportunity to ‘make a quick buck’. During the deadly 1916 epidemic, the New York Times reported that one Joseph Frooks had been charged with selling ‘Infantile Disease Protector’, which, upon investigation, was found to contain “a mixture of wood shavings” that were saturated in a mixture smelling remarkably like naphthalene [5].

It behoves us to ask…how many people were disabled or killed by polio – and how many by the so-called ‘treatments’ for polio?

References:

[1] Wyatt HV, Before the Vaccines: Medical Treatments of Acute Paralysis in the 1916 New York Epidemic of Poliomyelitis, The Open Microbiology Journal. 2014, 8:144-147.

[2] Paul JR. A History of poliomyelitis, Yale University Press, New Haven, Connecticut, 1971.

[3] Gould T, A Summer Plague: Polio and its Survivors, Yale University Press, 1997.

[4] Cohn V. Sister Kenny: The Woman Who Challenged the Doctors, University of Minnesota Press, 1975.

[5] Ibid. See reference 3.

15 Reasons Why Millions of People Once Died From ‘Infectious’ Disease

1. OVERCROWDING

During the 19th century, the population of London swelled more than six-fold, from 1 million to more than 6 million inhabitants, making it the largest city in the world [1].

All across the western world, as the Industrial Revolution took hold, vast numbers of rural folk moved into towns and cities. For example, in 1750, only 15% of the population lived in towns, but by 1880, a massive 80% of the population were urban dwellers [2]. The Industrial Revolution, and city living, promised a better life but, for many, it became an unimaginable nightmare.

With housing in short supply, unscrupulous landlords turned buildings into tenements, and leased every spare inch to desperate families – dingy damp cellars, fire-trap attics and under-stair storage rooms, many without any ventilation or light. Just imagine the damp, mouldy air that these people were constantly breathing – it’s hardly a wonder that tuberculosis and pneumonia were the biggest killers, accounting for one-fifth of all deaths [3].

Disease and death were distressingly close in these crowded quarters: “…the report of a health officer for Darlington in the 1850’s found six children, aged between 2 and 17, suffering from smallpox in a one-roomed dwelling shared with their parents, an elder brother and an uncle. They all slept together on rags on the floor, with no bed. Millions of similar cases could be cited, with conditions getting even worse as disease victims died and their corpses remained rotting among families in single-roomed accommodations for days, as the family scraped together pennies to bury them” [5].

2. LACK OF PLUMBING

Entire streets had to share one outdoor toilet, which was usually in foul condition – cleaning supplies were expensive, and flies hung around in droves (and then made their way through open windows to nearby kitchens etc), and of course, diarrhoea was ever-present!

Sewerage drained into waterways via open channels in the streets and lanes, or simply lay stagnant in stinking cesspools of filth.

Henry Mayhew was an investigative journalist who, in 1849, described a London street with a ditch running down it that contained the only drinking water available to residents. He said it was ‘the colour of strong green tea’, and ‘more like watery mud than muddy water’.

‘As we gazed in horror at it, we saw drains and sewers emptying their filthy contents into it; we saw a whole tier of doorless privies (toilets) in the open road, common to men and women built over it; we heard bucket after bucket of filth splash into it’ [6].

3. CONTAMINATED DRINKING WATER

With no environmental laws in place, raw sewage poured into drinking water supplies, as did run-off and toxic waste from factories and animal slaughterhouses.

 “The spill-off from the slaughter-houses and the glue factories, the chemicals of the commercial manufacturers, and all of Chicago’s raw sewage had begun to contaminate the drinking water” [7].

In London, the River Thames, which was the source of drinking water for many Londoners, became a stinking flow of excrement and filth, as human, animal and industrial waste was dumped into it. “In the heatwave of 1858, the stagnating open sewer outside Westminster’s windows fermented and boiled under the scorching sun” [8].

During a cholera epidemic in London, in 1854, Dr John Snow realized that the only people who seemed to be completely unaffected were the workers at a local brewery – they were drinking beer instead of water [9]! The discovery that disease could be spread via water was revolutionary, and paved the way for massive sanitary reforms.

4. CONTAMINATED FOOD SUPPLY

With slow, unreliable transport, and no refrigeration, food was often past its use-by date. Diseased and rotting meat was made into sausages and ham. ‘Pigs are largely fed upon diseased meat which is too far gone, even for the sausage maker, and this is saying a great deal; and as a universal rule, diseased pigs are pickled and cured for ham, bacon etc’ [10].

Milking cows were often fed on ‘whisky slops’ and other rotting, cheap food, and therefore became diseased. ‘New York’s milk supply was also largely a by-product of the local distilleries, and the milk dealers were charged with the serious offense of murdering annually eight thousand children’ [11].

Before pasteurization, milk was treated with formaldehyde to prevent souring [12].

‘Fresh’ produce, when it was available, was not so fresh after all – often slimy, putrid and unfit for human consumption [13].

5. ABSENT MOTHERS

During the 19th century, countless mothers died during, or soon after, childbirth.

There were a number of reasons for this:

a) rickets, and malnutrition in general, was rife;

b) doctors, who had intruded into the female-only world of childbirth, took offense at the idea that they had dirty hands, and refused to wash them [14];

c) chloroform and forceps were used unnecessarily, even in uncomplicated labours [15].

If the baby survived past infancy, they could generally look forward to a life of malnutrition, hard labour and improper care, often provided by older siblings.

During the Industrial Revolution, many mothers worked long hours in factories, leaving their young children in the care of hired ‘nurse-girls’, who were little more than children themselves, aged between 8 and 12 years [16].

Many children ended up living on the streets, driven to stealing and pilfering in order to survive. ‘In 1848 Lord Ashley referred to more than thirty thousand ‘naked, filthy, roaming lawless and deserted children, in and around the metropolis‘ [17].

6. CHILD LABOUR & HARD LABOUR

With the Industrial Revolution in full swing, and labour in short supply, children as young as three and four years old were put to work in sweatshops and factories. Many of the jobs involved long hours, working in dangerous conditions, such as around heavy machinery or working near furnaces [18].

Children were forced to do back-breaking work in the most appalling conditions: ‘Children began their life in the coal-mines at five, six or seven years of age. Girls and women worked like boys; they were less than half-clothed, and worked alongside men who were stark naked. There were from twelve to fourteen working hours in the twenty-four, and these were often at night…A common form of labour consisted of drawing on hands and knees over the inequalities of a passageway not more than two feet, or twenty-eight inches high a car or tub filled with three or four hundred weight of coal, attached by a chain, and hooked to a leather band around the waist’ [19].

Children were sometimes crushed or ground to death, or had limbs severed, in some of the more dangerous industries, such as underground mining [20].

Basically, millions of children had no childhood to speak of, only a monotonous, depressing existence.

‘Children had not a moment free, save to snatch a hasty meal, or sleep as best they could. From earliest youth they worked to a point of extreme exhaustion, without open air exercise, or any enjoyment whatever, but grew up, if they survived at all, weak, bloodless, miserable, and in many cases deformed cripples, and victims of almost every disease’ [18].

And to make matters worse, many children were constantly exposed to poisons, such as arsenic, lead and mercury, which were being widely used in industries, such as silk and cotton spinning [21].

Adulthood didn’t bring much change – hard labour, often for 12-16 hours per day. The terrible conditions and over-work, along with poor diet, aged people quickly: “…from the 1830’s photographs show working people looking old by their thirties and forties, as poor nutrition, illness, bad living conditions and gross overwork took their toll” [22].

7. POLLUTED AIR

Factories spewed soot and waste into the air, unchecked and unregulated. Cities were covered in a layer of grease and grime [23].

It’s no surprise that lung and chest complaints were rife. And then there was the ever-present stench of open sewage, rubbish, animal dung etc.

Refuse, including the rotting corpses of dogs and horses, littered city streets. In 1858, the stench from sewage and other rot was so putrid that the British House of Commons was forced to suspend its sessions [23].

That episode became known as ‘The Great Stink’, and in 1952, atmospheric conditions coupled with coal-fire burning led to the event now known as ‘The Great Smog’ – which killed thousands within the space of weeks [24].

Even today, an estimated 9000 people die prematurely each year in London alone, due to air pollution [25]. Yet the levels of pollution in Victorian times were up to 50x worse than they are today [26] – how many lives must have been cut short because of the foul air poisoning their lungs?

8. LACK OF BREASTFEEDING

Infant formula was first patented and marketed in 1865, consisting of cow’s milk, wheat and malt flour, and potassium bicarbonate – and regarded as ‘perfect infant food’ [27].

Over the next 100 years, breastfeeding rates dropped to just 25% [28], as social attitudes disdained the practice as being only for the uneducated, and those who could not afford infant formula [29].

Not only did millions of babies miss out on the nurturing of their mother’s breast, but their formula was poor quality, and often made with contaminated water in unsterile bottles, and milk quickly spoiled during warm weather without refrigeration.

It’s hardly a wonder that so many babies succumbed to diarrheal infections, such as typhoid fever.

9. IMPROPER GARBAGE DISPOSAL

Without a proper disposal system in place, alleys, courtyards, and streets became littered with rubbish and waste – sometimes knee-high, which was not only offensive-smelling, but a great attraction for all kinds of scavengers – rats, pigs, dogs, cockroaches and swarms of flies [30].

10. ANIMALS

Because horses and donkeys were used to transport goods, they also had to be housed in overcrowded cities, often in close quarters to humans, since space was at a premium. Rotting carcases were left to decompose where they lay.

By the late 19th century, 300,000 horses were being used in London, creating 1000 tonnes of dung per day [31].

Pigs roamed freely in the streets, ferreting amongst the rubbish – some towns recorded more resident pigs than people.

Animal slaughterhouses were located amongst high-density tenement housing – animals were constantly slaughtered in full view of the surrounding residents, and the sounds and smell of death were constantly in the air [32].

11. LACK OF SUNLIGHT

Due to the burning of coal, and wood fires, cities were blanketed in a thick, black smog that covered everything in grime.

The murk was so dense that countless accidents occurred, including horses and carts running into shop-fronts, or over pedestrians, or into each other [33].

Vitamin D deficiency was widespread, and in the late 1800’s, studies concluded that up to 90% of children were suffering from rickets [34]. In young girls, this often led to deformed hips, and later on, problems in childbirth.

12. MALNUTRITION

Millions of families subsisted on the cheapest food possible, and many lived on the brink of starvation. Malnutrition was rife, with so few fresh fruits and vegetables in the diet.

Scurvy (Vitamin C deficiency) claimed an estimated 10,000 men during the California Gold Rush in the mid-1800’s [35]. Even in those who did not have overt signs of scurvy, a state of mild deficiency must have been prevalent, leading to weakened immunity to disease and infection.

13. BAD MEDICINE

If you thought blood-letting and leeches were bad, how about an injection of arsenic – proudly brought to you by Merck and Co [36]? Or a gargle with mercury – where’s the harm [37]?

And if you have smallpox, we’ll dab your sores with corrosives [38].

Treatment for syphilis included mercury rubs, bismuth injections, and arsenic injections – some patients endured more than 100 such injections [36].

It’s highly possible that the medical ‘treatments’ killed more people than the diseases they were intended to treat.

Hospitals were known to be breeding-grounds of disease, and over-run by rats that were so numerous and hungry, they ate patients [39].

14. LACK OF BASIC CLEANLINESS

With less than 2% of the urban population having running water in their homes [40], and soap and detergents viewed as luxuries, the washing of hands, clothes, plates and utensils had to be done with dirty, contaminated water – or not at all.

Note that items such as nappies and sanitary ‘rags’ also had to be washed – no ‘disposables’ in those days!

15. MENTAL AND EMOTIONAL STRESS

We now know that stress and fear take a huge toll on the body, resulting in immune system malfunction [41]. Can you imagine the mental anguish of being surrounded by abject poverty, and seeing no way of escape for yourself or your children? Or the panic of watching everybody you love succumb to a dreaded disease, and not having the knowledge or means to protect yourself?

Fear and hysteria ran high during disease outbreaks – during a cholera epidemic in the US in 1849 “thousands fled panic-stricken before the scourge…The streets were empty, except for the doctors rushing from victim to victim, and the coffin makers and undertakers following closely on their heels” [42].

Not to mention the stress of toiling for long hours at monotonous or dangerous work, with hardly a piece of dry bread to fill your hungry stomach.

Given the poor living conditions that millions suffered, it was hardly a wonder that average life expectancy was, tragically, just 15 or 16 years among the working class [43].

References:

[1] GB Historical GIS / University of Portsmouth, London GovOf through time | Population Statistics | Total Population, A Vision of Britain through Time.

[2] Porter R. The Greatest Benefit to Mankind, Harper Collins, New York, 1997.

[3] Publications of the American Statistical Association, Volume 9, Nos 65-72, 1904-1905, pp 260-261.

[4] Chesney K. The Victorian Underworld, Penguin Books, 1972.

[5] Porter D, Health, Civilization and the State – A History of Public Health From Ancient to Modern Times, Routledge, Oxfordshire, England, 1999.

[6] Mayhew H. A Visit To The Cholera Districts of Bermondsey, The Morning Chronicle, 24th September, 1849.

[7] Byrne J, My Chicago, Northwestern University Press, Evanston, Illinois, 1992.

[8] Mann E, Story of Cities #14: London’s Great Stink heralds a wonder of the modern world, The Guardian, 4th April, 2016, https://www.theguardian.com/cities/2016/apr/04/story-cities-14-london-great-stink-river-thames-joseph-bazalgette-sewage-system. Accessed January, 2019.

[9] Radeska T, The 1854 cholera outbreak of Broad Street, Everyone got sick except those who drank beer instead of water, Vintage News, 26th September, 2016, https://www.thevintagenews.com/2016/09/26/1854-cholera-outbreak-broad-street-everyone-got-sick-except-drank-beer-instead-water, Accessed January, 2019.

[10] The British and Foreign Medico-Chirurgical Review, Quarterly Journal of Practical Medicine and Surgery, Volume XXXV, John Churchill and Sons, London, Jan-Apr 1865, pp 32-33.

[11] Cole AC, The Irrepressible Conflict 1850-1865: A History of American Life, Volume VII, Macmillan, New York, 1934, p 81.

[12] Formaldehyde and Milk, JAMA. 1900; XXXIV(23):1496.

[13] Report of the Council of Hygiene and Public Health of the Citizen’s Association of New York, 1865, p 59.

[14] Wertz RW, Wertz DC, Lying In: A History of Childbirth in America, Yale University Press, 1989, p 122.

[15] Loudon I, Maternal Mortality in the Past and its Relevance to Developing Countries Today, American Journal of Clinical Nutrition, 2000, 72:241S-246S.

[16] Newman G, Infant Mortality: A Continuing Social Problem, Methueun and Co, London, 1906, p 95.

[17] Horn P. The Victorian Town Child, New York University Press, 1997.

[18] Willoughby WF, de Graffenried C, Child Labor, American Economic Association, Guggenheimer, Weil and Co, Baltimore, 1890, p 16.

[19] Cheyney EP. An Introduction to the Industrial and Social History of England, Macmillan, New York, 1920, pp 243-244.

[20] Lovejoy OR, Child Labor in the Coal Mines, Child Labor – A Menace to Industry, Education and Good Citizenship, Academy of Political and Social Science, 1906, p 38.

[21] The American Journal of Nursing, 1903, 3(8):664.

[22] Mearns A, Preston WC. The Bitter Cry of Outcast London: An Inquiry Into the Condition of the Abject Poor, James Clarke and Co, London, 1883.

[23] Noble TFX, Straus B, Osheim DJ, Neuschel KB, Accampo AE, Roberts DD, Cohen WB, Western Civilization: Beyond Boundaries, Volume II, 6th Edition, Wadsworth, Boston, Massachusetts.

[24] Carrington D, The truth about London’s air pollution, The Guardian, 5th February, 2016, https://www.theguardian.com/environment/2016/feb/05/the-truth-about-londons-air-pollution. Accessed January, 2019.

[25] Vaughan A, Nearly 9500 die every year in London because of air pollution, The Guardian, 15th July, 2015, https://www.theguardian.com/environment/2015/jul/15/nearly-9500-people-die-each-year-in-london-because-of-air-pollution-study. Accessed January, 2019.

[26] UK Air, What are the main trends in particulate matter in the UK? Chapter 7, https://uk-air.defra.gov.uk/assets/documents/reports/aqeg/ch7.pdf. Accessed January, 2019.

[27] Stevens EE, Patrick TE, Pickler R. A history of infant feeding. J Perinat Educ. 2009;18(2):32-9.

[28] Hirschman C, Butler M. Trends and differentials in breast feeding: an update, Demography, 1981, 18:39-54.

[29] Riordan J, Countryman BA, Basics of Breastfeeding. Part I: Infant Feeding Patterns Past and Present, JOGN Nurs, 1980, 9(4):207–210.

[30] Oatman-Stanford H, A Filthy History: When New Yorkers Lived Knee-Deep in Trash, Collectors Weekly, https://www.collectorsweekly.com/articles/when-new-yorkers-lived-knee-deep-in-trash/. Accessed January, 2019.

[31] Jackson L. Dirty Old London: The Victorian Fight Against Filth, Yale University Press, 2014.

[32] Annual Report of the Metropolitan Board of Health, 1866, Westcott and Co’s Printing House, New York, 1867.

[33] Heggie V, Over 200yrs of deadly London air: smogs, fogs and pea soupers, The Guardian, 9th December, 2016, https://www.theguardian.com/science/the-h-word/2016/dec/09/pollution-air-london-smogs-fogs-pea-soupers. Accessed January, 2019.

[34] Holick MF. Resurrection of vitamin D deficiency and rickets. J Clin Invest. 2006;116(8):2062-72.

[35] Lorenz AJ, Scurvy in the Gold Rush, Journal of the History of Medicine and Allied Sciences, 1957, 12(4):473–510.

[36] Cormia FE, Tryparsamide in the treatment of Syphilis of the central nervous system, British Journal of Venereal Diseases, 1934, 10:99-116.

[37] Swediaur F, Practical observations on the more obstinate and inveterate venereal complaints, J Johnson and C Elliott, London, 1784.

[38] Blumgarten AS. A Text Book of Medicine – For Students in Schools of Nursing, 1937.

[39] Vincent’s Semi-Annual United States Register, 1860, p 346.

[40] Greene VW, Personal Hygiene and Life Expectancy Improvements Since 1850: Historic and Epidemiologic Associations, American Journal of Infection Control, August 2001, p 205.

[41] Rosen J, The Effects of Chronic Fear on a Person’s Health, Neuroscience Education Institute (NEI), 2017 Conference, https://www.ajmc.com/conferences/nei-2017/the-effects-of-chronic-fear-on-a-persons-health, Accessed January, 2019.

[42] Cole AC, The Irrepressible Conflict 1850-1865: A History of American Life, Volume VII, Macmillan, New York, 1934, p 81.

[43] Greene VW, Personal Hygiene and Life Expectancy Improvements Since 1850: Historic and Epidemiologic Associations, American Journal of Infection Control, August 2001, p 205.