A Brief History of the ‘Antivax’ Movement

It is often assumed that the ‘anti-vax’ movement began with Andrew Wakefield, and ‘that autism study’, or former Playboy model Jenny McCarthy’s claims that her son’s autism was caused by vaccination.

But did these two events really cause tens of thousands of parents to begin questioning vaccines and getting embroiled in bitter skirmishes on social media? Personally, I had never heard of Andrew Wakefield, or Jenny McCarthy, when I first began to delve into the vaccine subject, in early 2010.

Opposition to vaccination is not a new phenomenon – for as long as there have been vaccines, there has been fierce opposition. Originally focused in England, that opposition really gained momentum when the Compulsory Vaccination Act was passed in Victorian England, in 1853.

The main pockets of opposition to compulsory vaccination were among the working class, and the clergy, who believed it was ‘un-Christian’ to inject people with animal products [1].

The original Vaccination Act in 1840 had provided free vaccination for the poor, to be administered by the Poor Law guardians. This law, however, was a failure, as the “lower and uneducated classes” did not take up the offer of free vaccination [1].

The Compulsory Vaccination Act of 1853 went much further – it ordered that all babies up to 3 months old be vaccinated (administered by Poor Law Guardians), and in 1867, this was extended up to 14 years of age, and penalties for non-compliance were introduced.

Doctors were encouraged to report non-vaccinators to the authorities, by “financial inducements for compliance and penalties for failure”. While the 1853 Act had introduced one-off fines or imprisonment, the 1867 Act strengthened this to continuous and cumulative penalties, so that parents found guilty of default could be fined continuously, with increasing prison sentences, until their child reached 14 years of age [2].

(As an interesting side-note here, the vaccination laws were not the only incursions of the state during this time, at the expense of personal liberty and private bodily autonomy. The Contagious Diseases Acts of 1864, 1866, and 1869 required that any woman suspected of prostitution be medically inspected for venereal disease. If found infected, she was to be confined in hospital for treatment, with or without her consent. The Notification of Infectious Diseases Acts in 1889 and 1899 required that all contagious diseases – except tuberculosis, which is curious, since it was a major killer at the time – be reported to the local medical officer, who could then forcibly remove the patient to hospital, whether they consented or not [1].)

Meanwhile, the vaccination laws were tightened yet further in 1871 (ironically, the same year that a large smallpox epidemic raged across Europe and England – no doubt testament to how ‘effective’ the compulsory laws had been!), making it compulsory for all local authorities to hire Vaccination Officers [2].

In response to these draconian measures, the Anti-Vaccination League was formed in England, and a number of anti-vaccine journals were started, which “included the Anti-Vaccinator (founded 1869), the National Anti-Compulsory Vaccination Reporter (1874), and the Vaccination Inquirer (1879)”.

Numerous other writings and pamphlets were distributed widely – for example, 200,000 copies of an open letter titled ‘Current Fallacies About Vaccination’, written by Leicester Member of Parliament, P Taylor, were distributed in 1883 [2].

The vaccination process itself was both painful and inconvenient, for parents and children alike. The vaccinator used a lancet (a surgical knife with sharp, double-edged blade) to cut lines into the flesh in a scored pattern. This was usually done in several different places on the arm. Vaccine lymph was then smeared into the cuts. Infants then had to be brought back eight days later, to have the lymph (pus!) harvested from their blisters, which was then used on waiting infants [1].

After the stricter 1871 amendments to the law, parents could also be fined 20 shillings for refusing to allow the pus to be collected from their children’s blisters, to be used for public vaccination [1].

By this time, severe and sometimes fatal reactions to the vaccine were being reported, and doubts began to grow about how effective the vaccine really was [3].

The town of Leicester was a particular hot-bed of anti-vaccine activity, with numerous marches and rallies demanding repeal of the law, and advocating other measures of containment, such as isolation of the infected. These rallies attracted up to 100,000 people [4].

The unrest and opposition continued for two decades, and an estimated 6,000 prosecutions were carried out in the town of Leicester alone [3].

The following excerpts from the Leicester Mercury bear witness to the deep convictions held by those who refused to submit to the mandatory measures:

‘George Banford had a child born in 1868. It was vaccinated and after the operation the child was covered with sores, and it was some considerable time before it was able to leave the house. Again Mr. Banford complied with the law in 1870. This child was vaccinated by Dr. Sloane in the belief that by going to him they would get pure matter. In that case erysipelas set in, and the child was on a bed of sickness for some time. In the third case the child was born in 1872, and soon after vaccination erysipelas set in and it took such a bad course that at the expiration of 14 days the child died’.

Mr Banford was fined 10 shillings, with the option of seven days imprisonment, for refusing to subject a fourth child to the vaccine [5].

And again…‘By about 7.30 a goodly number of anti-vaccinators were present, and an escort was formed, preceded by a banner, to accompany a young mother and two men, all of whom had resolved to give themselves up to the police and undergo imprisonment in preference to having their children vaccinated. The utmost sympathy was expressed for the poor woman, who bore up bravely, and although seeming to feel her position expressed her determination to go to prison again and again rather than give her child over to the “tender mercies” of a public vaccinator. The three were attended by a numerous crowd and in Gallowtreegate three hearty cheers were given for them, which were renewed with increased vigour as they entered the doors of the police cells’ [6].

Eventually, there were so many vaccine refusers in the town of Leicester, that some local magistrates and politicians declared their support for parental rights, and encouraged their peers to do the same [3].

The law was finally relaxed in 1898, when new laws were passed that allowed for conscientious objection to vaccination [7]. By the end of that same year, more than 200,000 certificates of conscientious objection had been issued, most among the working class, and many to women [1].

Meanwhile, in the United States, smallpox outbreaks in the late 1800’s led to vaccination campaigns, and to opposition in turn, with the formation of The Anti-Vaccination Society of America in 1879, followed by the New England Anti-Compulsory Vaccination League in 1882, and the Anti-Vaccination League of New York City in 1885 [4].

The homeless and the itinerant were blamed for spreading smallpox, and in 1901, the Boston Board of Health ordered ‘virus squads’ to vaccinate men staying in cheap boarding rooms – by force [8].

Following a smallpox outbreak in 1902, the Cambridge Board of Health in Massachusetts mandated vaccination for all city residents. This led to possibly the most important, and controversial, judicial decision regarding public health.

One man, Henning Jacobson, refused to comply with the mandate, on the grounds that it violated his right to care for his own body as he saw fit. The city filed criminal charges against him, which he fought, and lost, in court. He then appealed to the US Supreme Court, which ruled in the State’s favour in 1905, giving priority to public health over individual liberty [9].

The ‘anti-vaxxers’ have never really gone away in the intervening years, although at some times they have been more vocal than at others – such as in the 1970’s, when there was controversy throughout Europe, North America and Britain about the safety and possible side effects of the diphtheria-tetanus-pertussis vaccine [10].

In 1998, the vaccination argument again came to the forefront, with Andrew Wakefield’s case series published in the Lancet. Although the report was looking at a link between autistic disorders and bowel dysfunction, it mentioned in its conclusion that a number of parents believed their child’s symptoms began after MMR vaccination [11]. The authors felt this potential link deserved more investigation…

The furore and the fall-out are still ongoing. Wakefield was found guilty of failing to get proper ethics approval for the study, and he and a fellow investigator were subsequently ‘struck off’. Wakefield’s fellow investigator later challenged the decision, and won [12]. And while a number of researchers later confirmed the original findings, of bowel dysfunction in autistic children [13-16], Wakefield’s reputation and career have been left in tatters – the subject of mockery and derision.

Anybody who confesses to having doubts about the safety or efficacy of vaccines, as a general rule, gets a taste of the same scorn and derision that Andrew Wakefield has received.

Even in the era of smallpox vaccination, the media tended to portray anti-vaxxers in a less-than-flattering light. At the time, the press referred to the debate as a “conflict between intelligence and ignorance, civilization and barbarism” [9].

So, are anti-vaxxers really anti-science?

Not so, says…science.

In 2007, Kim et al analysed the vaccination records of 11,680 children aged 19–35 months, to evaluate maternal characteristics that might influence whether a child was fully vaccinated or not.

They found that mothers with tertiary degrees and high incomes were the least likely to fully vaccinate their children, while mothers in poor minority families without high school diplomas were the most likely to fully vaccinate their children [17].

Similarly, a study in 2008 that investigated the attitudes and beliefs of parents who decided to opt out of childhood vaccine mandates, found that they valued scientific knowledge, were adept at collecting and processing information on vaccines, and had little trust in the medical community [18].

In 2017, the Australian Institute of Health and Welfare released their latest figures on vaccination rates. The national average was 93% of children fully vaccinated, yet in Sydney’s upmarket (i.e. highly educated, high income-earning professional) inner suburbs and northern beaches, as few as 70% of children under 5 were fully vaccinated [19].

The same story was repeated in Melbourne, with the wealthiest – and by association, better educated – suburbs having the lowest vaccination rates. There was an ironic, and rather telling, opening paragraph in The Age when reporting these figures: “Four of the wealthiest, healthiest suburbs of Melbourne have the worst child vaccination rates in the state” [20].

(It does beg the question: Were they not vaccinated because they were healthy…or were they healthy because they were not vaccinated?)

Statistics gathered from Canada tell a similar story – a higher percentage of anti-vaxxers held a university degree, compared to the national average [21].

It seems that doctors and paediatric specialists are not always in agreement with current vaccine practice either – at least, not when it comes to their own children:  “Ten percent of paediatricians and 21% of paediatric specialists claim they would not follow [CDC] recommendations for future progeny. Despite their education, physicians in this study expressed concern over the safety of vaccines [22]”.

With the vaccine schedule becoming increasingly crowded, and governments moving ever towards compulsory vaccination, the anti-vaccination movement is again gathering momentum. Increasing numbers of parents are delaying, declining, or opting for alternative vaccine schedules [23-24].

Around the world, as vaccine scepticism is on the rise, history looks set to repeat, as governments are becoming increasingly more forceful in trying to curb the sentiment. Only time will tell how this round will play out…


[1] Durbach, N. They might as well brand us: Working class resistance to compulsory vaccination in Victorian England. The Society for the Social History of Medicine, 2000, 13:45-62.

[2] Porter D, Porter R. The politics of prevention: anti-vaccinationism and public health in nineteenth-century England. Med Hist. 1988;32(3):231-52.

[3] Williamson S. Anti-vaccination leagues: One hundred years ago, Arch Dis Child, 1984, 59: 1195-1196.

[4] Wolfe, R.M., Sharpe, L.K. Anti-vaccinationists past and present. BMJ. 2002;325:430-432.

[5] Leicester Mercury, 10th March, 1884.

[6] Leicester Mercury, 10th June, 1884.

[7] Wohl A. Endangered Lives: Public Health in Victorian Britain, 1984, Methuen, London, pp. 134-135.

[8] Albert, M., Ostheimer, K.G., Breman, J.G. The last smallpox epidemic in Boston and the vaccination controversy. N Engl J Med. 2001;344:375-379.

[9] Gostin, L. Jacobson v. Massachusetts at 100 years: Police powers and civil liberties in tension. Am J Public Health. 2005;95:576-581.

[10] Baker, J. The pertussis vaccine controversy in Great Britain, 1974-1986. Vaccine. 2003;21:4003-4011.

[11] Wakefield AJ, Murch SH, Anthony A, et al. Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children, Lancet, 1998, 351(9103): 637-641.

[12] Professor John Walker Smith vs General Medical Council [2012] EWHC 503, http://www.eastwoodslaw.co.uk/wp-content/uploads/2013/03/Walker-Smith.pdf. Accessed September, 2017.

[13] Horvath K, Medeiros L, Rabszlyn A, et al. High prevalence of gastrointestinal symptoms in children with autistic spectrum disorder (ASD). J Pediatr Gastroenterol Nutr 2000, 31:S174.

[14] Horvath K and Perman JA. Autistic disorder and gastrointestinal disease, Current Opinion in Pediatrics 2002, 14:583-587

[15] Ashwood P, Anthony A, Torrente F, Wakefield AJ. Spontaneous mucosal lymphocyte cytokine profiles in children with regressive autism and gastrointestinal symptoms: Mucosal immune activation and reduced counter regulatory interleukin-10. Journal of Clinical Immunology. 2004:24:664-673.

[16] Torrente F, Anthony A, Heuschkel RB, et al. Focal-enhanced gastritis in regressive autism with features distinct from Crohn’s and helicobacter pylori gastritis. Am. J Gastroenterol. 2004;4:598-605.

[17] Kim SS, Frimpong JA, et al. Effects of maternal and provider characteristics on up-to-date immunization status of children aged 19-35 months. Am J Public Health, 2007, 97(2): 259-266.

[18] Gullion JS, Henry L, Gullion G. Deciding to opt out of childhood vaccination mandates. Public Health Nurs, 2008, 25(5): 401-408.

[19] Aubusson K, Butt C, Sydney postcode has Australia’s worst vaccination rate for five year old children, Sydney Morning Herald, 8th June, 2017.

[20] Butt C, Spooner R, Melbourne vaccination data: immunisation rates not improving in wealthy inner-city suburbs, The Age, 7th June, 2017.

[21] Chai C, Who are the anti-vaxxers in Canada? New poll profiles resistant group, Global News, 9th March, 2015.

[22] Martin, M. and Badalyan, V, Vaccination practices among physicians and their children. Open Journal of Pediatrics, 2012, 2:228-235.

[23] McCauley MM, Kennedy A, Basket M, Sheedy K. Exploring the choice to refuse or delay vaccines: a national survey of parents of 6- through 23-month olds, Acad Pediatr, 2012, 12(5): 375-383.

[24] Robison SG, Groom H, Young C. Frequency of alternative immunization schedule use in a metropolitan area, Pediatrics, 2012, 130(1): 31-38.

Stranger Than Fiction: Polio ‘Treatments’ in the 1900’s

There’s no doubt whatsoever that the polio epidemics of the early 20th century left a traumatic and lasting impression on the American psyche (and perhaps to a lesser extent, the Western psyche). Everybody seems to know somebody who was ‘crippled by polio’. The fear and devastation were very real indeed.

Others have written excellent, in-depth analyses on what caused sporadic cases to become widespread and disabling epidemics, but few have delved into the reality of medical care exacerbating the severity of poliomyelitis.

Below are some of the treatments you could expect, if stricken by paralysis in the early 1900’s:

  • Intramuscular injections of strychnine (which can cause paralysis and nerve damage – if it doesn’t kill you outright) [1].
  • Lumbar punctures, which can cause or exacerbate paralysis, and also precede respiratory problems (which would have been blamed on ‘bulbar’ polio) [1].
  • Intraspinal injections of adrenaline (almost half of the recipients died), human serum, or quinine and urea hydrochloride (3 of 6 children given this mixture orally and intramuscularly died). Even intraspinal injections of horse serum were tried [1].
  • Injections of tetanus antitoxin – the rationale being that “tetanus, rabies and poliomyelitis all attacked nerve cells, so perhaps giving the antitoxin would block access to absorption sites on the cells”. Even injections of diphtheria antitoxin were tried, with 3 out of 5 patients dying [1].
  • Tendon cutting and transplantation [2].
  • Painful electrical treatments [2].
  • Radium water (After radium was discovered in 1898, it quickly gained popularity, proclaimed as a ‘cure-all’ elixir that could make one young again, and cure all kinds of ills and ails) [3].
  • Surgical Straightening: Dr. John Pohl, in an interview circa 1940, said “We’d take the children to the operating room in those days, straighten them out under anaesthetic, and put them in plaster casts. When they woke up, they screamed. The next day they still cried from the pain. That was the accepted and universal treatment virtually all over the world. I saw it in Boston and New York City and London” [4].
  • Even laypeople had their ‘cures’ and remedies, and some saw it as too good an opportunity to pass up. During the deadly 1916 epidemic, the New York Times reported that one Joseph Frooks had been charged with selling ‘Infantile Disease Protector’, which, upon investigation, was found to contain “a mixture of wood shavings” that were saturated in a mixture smelling remarkably like naphthalene [5].

It behoves us to ask…how many people were disabled or killed by polio – and how many by the so-called ‘treatments’ for polio?


[1] Wyatt HV, Before the Vaccines: Medical Treatments of Acute Paralysis in the 1916 New York Epidemic of Poliomyelitis, The Open Microbiology Journal. 2014, 8:144-147.

[2] Paul JR. A History of poliomyelitis, Yale University Press, New Haven, Connecticut, 1971.

[3] Gould T, A Summer Plague: Polio and its Survivors, Yale University Press, 1997.

[4] Cohn V. Sister Kenny: The Woman Who Challenged the Doctors, University of Minnesota Press, 1975.

[5] Ibid.; see reference [3].

15 Reasons Why Millions of People Once Died From ‘Infectious’ Disease


During the 19th century, the population of London swelled by more than six-fold, from 1 million to more than 6 million inhabitants, to become the largest city in the world [1].

All across the western world, as the Industrial Revolution took hold, vast numbers of rural folk moved into towns and cities. For example, in 1750, only 15% of the population lived in towns, but by 1880, a massive 80% of the population were urban dwellers [2]. The Industrial revolution, and city living, promised a better life but, for many, it became an unimaginable nightmare.

With housing in short supply, unscrupulous landlords turned buildings into tenements, and leased every spare inch to desperate families – dingy damp cellars, fire-trap attics and under-stair storage rooms, many without any ventilation or light. Just imagine the damp, mouldy air that these people were constantly breathing – it’s hardly a wonder that tuberculosis and pneumonia were the biggest killers, accounting for one-fifth of all deaths [3].

Disease and death were distressingly close in these crowded quarters: “…the report of a health officer for Darlington in the 1850’s found six children, aged between 2 and 17, suffering from smallpox in a one-roomed dwelling shared with their parents, an elder brother and an uncle. They all slept together on rags on the floor, with no bed. Millions of similar cases could be cited, with conditions getting even worse as disease victims died and their corpses remained rotting among families in single-roomed accommodations for days, as the family scraped together pennies to bury them” [5].


Entire streets had to share one outdoor toilet, which was usually in foul condition – cleaning supplies were expensive, and flies hung around in droves (and then made their way through open windows to nearby kitchens etc), and of course, diarrhoea was ever-present!

Sewage drained into waterways via open channels in the streets and lanes, or simply lay stagnant in stinking cesspools of filth.

Henry Mayhew was an investigative journalist who, in 1849, described a London street with a ditch running down it, that contained the only drinking water available to residents. He said it was ‘the colour of strong green tea’, and ‘more like watery mud than muddy water’.

‘As we gazed in horror at it, we saw drains and sewers emptying their filthy contents into it; we saw a whole tier of doorless privies (toilets) in the open road, common to men and women built over it; we heard bucket after bucket of filth splash into it’ [6].


With no environmental laws in place, raw sewage poured into drinking water supplies, as did run-off and toxic waste from factories and animal slaughterhouses.

 “The spill-off from the slaughter-houses and the glue factories, the chemicals of the commercial manufacturers, and all of Chicago’s raw sewage had begun to contaminate the drinking water” [7].

In London, the River Thames, which was the source of drinking water for many Londoners, became a stinking flow of excrement and filth, as human, animal and industrial waste was dumped into it. “In the heatwave of 1858, the stagnating open sewer outside Westminster’s windows fermented and boiled under the scorching sun” [8].

During a cholera epidemic in London, in 1854, Dr John Snow realized that the only people who seemed to be completely unaffected were the workers at a local brewery – they were drinking beer instead of water [9]! The discovery that disease could be spread via water was revolutionary, and paved the way for massive sanitary reforms.


With slow, unreliable transport, and no refrigeration, food was often past its use-by date. Diseased and rotting meat was made into sausages and ham. ‘Pigs are largely fed upon diseased meat which is too far gone, even for the sausage maker, and this is saying a great deal; and as a universal rule, diseased pigs are pickled and cured for ham, bacon etc’ [10].

Milking cows were often fed on ‘whisky slops’ and other rotting, cheap food, and therefore became diseased. ‘New York’s milk supply was also largely a by-product of the local distilleries, and the milk dealers were charged with the serious offense of murdering annually eight thousand children’ [11].

Before pasteurization, milk was treated with formaldehyde to prevent souring [12].

‘Fresh’ produce, when it was available, was not so fresh after all – often slimy, putrid and unfit for human consumption [13].


During the 19th century, countless mothers died during, or soon after, childbirth.

There were a number of reasons for this:

a) Rickets, and malnutrition in general, was rife,

b) Doctors, who had intruded into the female-only world of childbirth, took offense at the idea that they had dirty hands, and refused to wash them [14],

c) Chloroform and forceps were used unnecessarily, even in uncomplicated labours [15].

If the baby survived past infancy, they could generally look forward to a life of malnutrition, hard labor and improper care, often performed by older siblings.

During the Industrial Revolution, many mothers worked long hours in factories, leaving their young children in the care of hired ‘nurse-girls’, who were little more than children themselves, between 8 and 12 years of age [16].

Many children ended up living on the streets, driven to stealing and pilfering in order to survive. ‘In 1848 Lord Ashley referred to more than thirty thousand ‘naked, filthy, roaming lawless and deserted children’ in and around the metropolis’ [17].


With the Industrial Revolution in full swing, and labour in short supply, children as young as three and four years old were put to work in sweatshops and factories. Many of the jobs involved long hours, working in dangerous conditions, such as around heavy machinery or working near furnaces [18].

Children were forced to do back-breaking work in the most appalling conditions: ‘Children began their life in the coal-mines at five, six or seven years of age. Girls and women worked like boys; they were less than half-clothed, and worked alongside men who were stark naked. There were from twelve to fourteen working hours in the twenty-four, and these were often at night…A common form of labour consisted of drawing on hands and knees over the inequalities of a passageway not more than two feet, or twenty-eight inches high a car or tub filled with three or four hundred weight of coal, attached by a chain, and hooked to a leather band around the waist’ [19].

Children were sometimes crushed or ground to death, or had limbs severed, in some of the more dangerous industries, such as underground mining [20].

Basically, millions of children had no childhood, but a monotonous, depressing existence.

‘Children had not a moment free, save to snatch a hasty meal, or sleep as best they could. From earliest youth they worked to a point of extreme exhaustion, without open air exercise, or any enjoyment whatever, but grew up, if they survived at all, weak, bloodless, miserable, and in many cases deformed cripples, and victims of almost every disease’ [18].

And to make matters worse, many children were constantly exposed to poisons, such as arsenic, lead and mercury, which were being widely used in industries, such as silk and cotton spinning [21].

Adulthood didn’t bring much change – hard labour, often for 12-16 hours per day. The terrible conditions and over-work, along with poor diet, aged people quickly: ‘…from the 1830’s photographs show working people looking old by their thirties and forties, as poor nutrition, illness, bad living conditions and gross overwork took their toll’ [22].


Factories spewed soot and waste into the air, unchecked and unregulated. Cities were covered in a layer of grease and grime [23].

It’s no surprise that lung and chest complaints were rife. And then there was the ever-present stench of open sewage, rubbish, animal dung etc.

Refuse, including the rotting corpses of dogs and horses, littered city streets. In 1858, the stench from sewage and other rot was so putrid that the British House of Commons was forced to suspend its sessions [23].

That episode became known as ‘The Great Stink’, and in 1952, atmospheric conditions coupled with coal-fire burning led to the event now known as ‘The Great Smog’ – which killed thousands within the space of weeks [24].

Even today, an estimated 9000 people die prematurely each year in London alone, due to air pollution [25]. Yet the levels of pollution in Victorian times were up to 50 times worse than they are today [26] – how many lives must have been cut short because of the foul air poisoning their lungs?


Infant formula was first patented and marketed in 1865, consisting of cow’s milk, wheat and malt flour, and potassium bicarbonate – and regarded as ‘perfect infant food’ [27].

Over the next 100 years, breastfeeding rates dropped to just 25% [28], as social attitudes disdained the practice as being only for the uneducated, and those who could not afford infant formula [29].

Not only did millions of babies miss out on the nurturing of their mother’s breast, but their formula was poor quality, and often made with contaminated water in unsterile bottles, and milk quickly spoiled during warm weather without refrigeration.

It’s hardly a wonder that so many babies succumbed to diarrheal infections, such as typhoid fever.


Without a proper disposal system in place, alleys, courtyards, and streets became littered with rubbish and waste – sometimes knee-high, which was not only offensive-smelling, but a great attraction for all kinds of scavengers – rats, pigs, dogs, cockroaches and swarms of flies [30].


Because horses and donkeys were used to transport goods, they also had to be housed in overcrowded cities, often in close quarters to humans, since space was at a premium. Rotting carcases were left to decompose where they lay.

By the late 19th century, 300,000 horses were being used in London, creating 1000 tonnes of dung per day [31].

Pigs roamed freely in the streets, ferreting amongst the rubbish – some towns recorded more resident pigs than people.

Animal slaughterhouses were located amongst high-density tenement housing – animals were constantly slaughtered in full view of the surrounding residents, and the sounds and smell of death were constantly in the air [32].


Due to the burning of coal, and wood fires, cities were blanketed in a thick, black smog that covered everything in grime.

The murk was so dense that countless accidents occurred, including horses and carts running into shop-fronts, or over pedestrians, or into each other [33].

Vitamin D deficiency was widespread, and in the late 1800’s, studies concluded that up to 90% of children were suffering from rickets [34]. In young girls, this often led to deformed hips, and later on, problems in childbirth.


Millions of families subsisted on the cheapest food possible, and many lived on the brink of starvation. Malnutrition was rife, with so few fresh fruits and vegetables in the diet.

Scurvy (Vitamin C deficiency) claimed an estimated 10,000 men during the California Gold Rush in the mid-1800’s [35]. Even in those who did not have overt signs of scurvy, a state of mild deficiency must have been prevalent, leading to weakened immunity to disease and infection.


If you thought blood-letting and leeches were bad, how about an injection of arsenic – proudly brought to you by Merck and Co [36]? Or a gargle with mercury – where’s the harm [37]?

And if you have smallpox, we’ll dab your sores with corrosives [38].

Treatment for syphilis included mercury rubs, bismuth injections, and arsenic injections – some patients endured more than 100 such injections [36].

It’s highly possible that the medical ‘treatments’ killed more people than the diseases they were intended to treat.

Hospitals were known to be breeding-grounds of disease, and were overrun by rats so numerous and hungry that they ate patients [39].


With less than 2% of the urban population having running water in their homes [40], and soap and detergents viewed as luxuries, the washing of hands, clothes, plates and utensils had to be done with dirty, contaminated water – or not at all.

Note that items such as nappies and sanitary ‘rags’ also had to be washed – no ‘disposables’ in those days!


We now know that stress and fear take a huge toll on the body, resulting in immune system malfunction [41]. Can you imagine the mental anguish of being surrounded by abject poverty, and seeing no way of escape for yourself or your children? Or the panic of watching everybody you love succumb to a dreaded disease, and not having the knowledge or means to protect yourself?

Fear and hysteria ran high during disease outbreaks – during a cholera epidemic in the US in 1849 “thousands fled panic-stricken before the scourge…The streets were empty, except for the doctors rushing from victim to victim, and the coffin makers and undertakers following closely on their heels” [42].

Not to mention the stress of toiling for long hours at monotonous or dangerous work, with hardly a piece of dry bread to fill your hungry stomach.

Given the poor living conditions that millions suffered, it is hardly surprising that average life expectancy among the working class was, tragically, just 15 or 16 years [43].


[1] GB Historical GIS / University of Portsmouth, London through time | Population Statistics | Total Population, A Vision of Britain through Time.

[2] Porter R. The Greatest Benefit to Mankind, Harper Collins, New York, 1997.

[3] Publications of the American Statistical Association, Volume 9, Nos 65-72, 1904-1905, pp 260-261.

[4] Chesney K. The Victorian Underworld, Penguin Books, 1972.

[5] Porter D, Health, Civilization and the State – A History of Public Health From Ancient to Modern Times, Routledge, Oxfordshire, England, 1999.

[6] Mayhew H. A Visit To The Cholera Districts of Bermondsey, The Morning Chronicle, 24th September, 1849.

[7] Byrne J, My Chicago, Northwestern University Press, Evanston, Illinois, 1992.

[8] Mann E, Story of Cities #14: London’s Great Stink heralds a wonder of the modern world, The Guardian, 4th April, 2016, https://www.theguardian.com/cities/2016/apr/04/story-cities-14-london-great-stink-river-thames-joseph-bazalgette-sewage-system. Accessed January, 2019.

[9] Radeska T, The 1854 cholera outbreak of Broad Street, Everyone got sick except those who drank beer instead of water, Vintage News, 26th September, 2016, https://www.thevintagenews.com/2016/09/26/1854-cholera-outbreak-broad-street-everyone-got-sick-except-drank-beer-instead-water, Accessed January, 2019.

[10] The British and Foreign Medico-Chirurgical Review, Quarterly Journal of Practical Medicine and Surgery, Volume XXXV, John Churchill and Sons, London, Jan-Apr 1865, pp 32-33.

[11] Cole AC, The Irrepressible Conflict 1850-1865: A History of American Life, Volume VII, Macmillan, New York, 1934, p 81.

[12] Formaldehyde and Milk, JAMA. 1900; XXXIV(23):1496.

[13] Report of the Council of Hygiene and Public Health of the Citizen’s Association of New York, 1865, p 59.

[14] Wertz RW, Wertz DC, Lying In: A History of Childbirth in America, Yale University Press, 1989, p 122.

[15] Loudon I, Maternal Mortality in the Past and its Relevance to Developing Countries Today, American Journal of Clinical Nutrition, 2000, 72:241S-246S.

[16] Newman G, Infant Mortality: A Continuing Social Problem, Methueun and Co, London, 1906, p 95.

[17] Horn P. The Victorian Town Child, New York University Press, 1997.

[18] Willoughby WF, de Graffenried C, Child Labor, American Economic Association, Guggenheimer, Weil and Co, Baltimore, 1890, p 16.

[19] Cheyney EP. An Introduction to the Industrial and Social History of England, Macmillan, New York, 1920, pp 243-244.

[20] Lovejoy OR, Child Labor in the Coal Mines, Child Labor – A Menace to Industry, Education and Good Citizenship, Academy of Political and Social Science, 1906, p 38.

[21] The American Journal of Nursing, 1903, 3(8):664.

[22] Mearns A, Preston WC. The Bitter Cry of Outcast London: An Inquiry Into the Condition of the Abject Poor, James Clarke and Co, London, 1883.

[23] Noble TFX, Straus B, Osheim DJ, Neuschel KB, Accampo AE, Roberts DD, Cohen WB. Western Civilization: Beyond Boundaries, Volume II, 6th Edition, Wadsworth, Boston, Massachusetts.

[24] Carrington D, The truth about London’s air pollution, The Guardian, 5th February, 2016, https://www.theguardian.com/environment/2016/feb/05/the-truth-about-londons-air-pollution. Accessed January, 2019.

[25] Vaughan A, Nearly 9500 die every year in London because of air pollution, The Guardian, 15th July, 2015, https://www.theguardian.com/environment/2015/jul/15/nearly-9500-people-die-each-year-in-london-because-of-air-pollution-study. Accessed January, 2019.

[26] UK Air, What are the main trends in particulate matter in the UK? Chapter 7, https://uk-air.defra.gov.uk/assets/documents/reports/aqeg/ch7.pdf. Accessed January, 2019.

[27] Stevens EE, Patrick TE, Pickler R. A history of infant feeding. J Perinat Educ. 2009;18(2):32-9.

[28] Hirschman C, Butler M. Trends and differentials in breast feeding: an update, Demography, 1981, 18:39-54.

[29] Riordan J, Countryman BA. Basics of breastfeeding. Part I: Infant feeding patterns past and present, JOGN Nurs, 1980, 9(4):207–210.

[30] Oatman-Stanford H, A Filthy History: When New Yorkers Lived Knee-Deep in Trash, Collector's Weekly, https://www.collectorsweekly.com/articles/when-new-yorkers-lived-knee-deep-in-trash/. Accessed January, 2019.

[31] Jackson L. Dirty Old London: The Victorian Fight Against Filth, Yale University Press, 2014.

[32] Annual Report of the Metropolitan Board of Health, 1866, Westcott and Co's Printing House, New York, 1867.

[33] Heggie V, Over 200yrs of deadly London air: smogs, fogs and pea soupers, The Guardian, 9th December, 2016, https://www.theguardian.com/science/the-h-word/2016/dec/09/pollution-air-london-smogs-fogs-pea-soupers. Accessed January, 2019.

[34] Holick MF. Resurrection of vitamin D deficiency and rickets. J Clin Invest. 2006;116(8):2062-72.

[35] Lorenz AJ, Scurvy in the Gold Rush, Journal of the History of Medicine and Allied Sciences, 1957, 12(4):473–510.

[36] Cormia FE, Tryparsamide in the treatment of Syphilis of the central nervous system, British Journal of Venereal Diseases, 1934, 10:99-116.

[37] Swediaur F, Practical observations on the more obstinate and inveterate venereal complaints, J Johnson and C Elliott, London, 1784.

[38] Blumgarten AS. A Text Book of Medicine – For Students in Schools of Nursing, 1937.

[39] Vincent’s Semi-Annual United States Register, 1860, p 346.

[40] Greene VW, Personal Hygiene and Life Expectancy Improvements Since 1850: Historic and Epidemiologic Associations, American Journal of Infection Control, August 2001, p 205.

[41] Rosen J, The Effects of Chronic Fear on a Person’s Health, Neuroscience Education Institute (NEI), 2017 Conference, https://www.ajmc.com/conferences/nei-2017/the-effects-of-chronic-fear-on-a-persons-health, Accessed January, 2019.

[42] Cole AC, The Irrepressible Conflict 1850-1865: A History of American Life, Volume VII, Macmillan, New York, 1934, p 81.

[43] Greene VW, Personal Hygiene and Life Expectancy Improvements Since 1850: Historic and Epidemiologic Associations, American Journal of Infection Control, August 2001, p 205.