Big Geoff Norman

I have had a long and fruitful relationship with Advances in Health Sciences Education: Theory and Practice, and must express my deep admiration for, and gratitude to, Geoff Norman for his groundbreaking work in medical education, and for his dedication to nurturing new talent in the field. He is a formidable giant in both bodily presence and educational acumen.

In this article, I engage the reader with a slice of North American medical and cultural history to better understand the present. In particular, I ask why conservative heroic individualism in medicine still lingers in a global, connected age in which medical education's main political role is to democratize medicine. It is only through flattening hierarchies (a crude metaphor, but the point is made) that we can authentically realise the twin promises of patient-centredness and inter-professionalism (Bleakley 2014). These socially-responsible practices, seeded some decades ago, are often now termed "coordinated care" (Langberg, Dyhr and Davidsen 2019). I prefer the term 'collaborative care', progressing from a surface co-ordination of tasks to a deeper emotional and values-based commitment to collaboration as democratic "habits of the heart" (Bellah et al. 2008).

The Goldfarb and colleagues’ syndrome

At the point of acceleration of the novel coronavirus pandemic in America, with the death toll nearing 50,000 and rising (at the time of writing it is over 200,000), Dr Anthony Fauci, a distinguished public health expert and the primary medical and scientific advisor in the White House, said of his work: "You stay completely apolitical and non-ideological, and you stick to what you do. I'm a scientist and I'm a physician. And that's it" (McCarthy 2020). In contrast, in the UK, also at the height of the coronavirus crisis, Richard Horton (also a doctor, and longstanding editor of the prestigious medical journal The Lancet) vociferously challenged the UK Government's response to the pandemic as too little, too late. In an interview with the Financial Times, Horton (2020) said, contra Fauci: "The idea you can strip out politics from medicine or health is historically ignorant. The medical establishment should be much more politicised, not less, in attacking issues like health inequalities and poor access to care."

In an opinion piece in the Wall Street Journal – "Take Two Aspirin and Call Me by My Pronouns" – Stanley Goldfarb (2019), an experienced physician and retired associate dean of curriculum at the University of Pennsylvania's Perelman School of Medicine, complains about 'woke' medical schools where "curricula are increasingly focused on social justice rather than treating illness". 'Woke' is invoked here in a disparaging way; it is an African-American term referring to vigilance towards acts of discrimination (staying awake, or 'woke'). Goldfarb's nostalgia for a golden age of medical education supposedly lost to 'woke' interests is badly misguided, as the state of American medicine shows. There is a serious fault-line running through American medicine that medical education fails to address. This is the systematic refusal to address upstream community health issues, particularly structural health disparities (inequities and inequalities), focusing instead upon downstream hospitalism unhooked from political concerns. One glaring upstream political issue is that of health care insurance; another is structural racism.

Stanley Goldfarb is not a lone wolf – he represents a reactionary movement in Health Professions Education (HPE) that hankers after a supposed golden age. It is difficult to gauge just how pervasive this movement is, and it may be a generational effect, but my point here is that there is a wider phenomenon at work. Prejudice against medical education’s potential interest in social justice issues is grounded in a long-standing historical divide between health in the community setting (focused on prevention of illness) and hospital-based medicine (focused on treatment). The latter particularly privileges the doctor as authority figure.

William Hsiao (2020, pp. 96–97), a Harvard economist, notes: "There are many statistics that illustrate the flaws of the U.S. health-care system", where "The United States is the only advanced economy that does not offer universal health care coverage". While Americans spend nearly twice as much per capita on health care as other advanced post-industrial economies, in comparison with European, Japanese, Australian and Canadian societies they have "lower life expectancy, higher infant mortality rates, and a higher prevalence of heart disease, lung disease, and sexually transmitted diseases", where income inequality feeds health disparities grounded in poverty and structural racism.

This is fed by a dominant values system that resists conservation and celebrates consumption. Hsiao notes that around 30% of the capital spent on health care per annum in America is wasted, just as vast amounts of food and energy are wasted. This amounts to $1 trillion a year bleeding out from the health care budget. Fraud and abuse in insurance claims, high administrative expenses, and duplication of services all add to this waste; yet, again, many Americans still do not have any, or adequate, health insurance. Hsiao (ibid, p. 99) notes: "health care still remains primarily a private-sector activity driven by the profit motive". Here, I am reminded of an acid remark of the celebrated economist John Maynard Keynes (1930): "The love of money as a possession – as distinguished from the love of money as a means to the enjoyments and realities of life – will be recognised for what it is, a somewhat disgusting morbidity".

Daniel E. Dawes (2020) meticulously tracks the development of President Barack Obama's "comprehensive health care reform", initiated in March 2009, as a model for "the political determinants of health". 'Obamacare' was an attempt to address the dilemma of health insurance for all. By July 2009, under pressure from opponents of reform, the President referred simply to "health care reform", dropping the descriptor "comprehensive"; by August 2009, this had mutated to "health insurance reform". Dawes shows how political machinations practically wrecked the plan for health care for all, and how, after Donald J. Trump's election as President in 2016, efforts were made to sabotage and dismantle the legislation.

But Dawes does not get at the root reason why universal health care is so disliked in America, where it is perceived as Big Government meddling in an individual's business. Many individuals saw compulsory Obamacare as a 'Soviet-like' intrusion on choice. As we shall see, this rabid individualism – a key identity position for wider American culture (Bellah et al. 2008) – also permeates American medical education. In the late 1940s, President Harry Truman tried to introduce universal health care insurance, but this was vigorously blocked by the American Medical Association (AMA), which stirred up a bitter campaign to disparage Truman's plan, labelling it "socialized medicine", even "un-American", and claiming it followed "the Moscow party line" (Hsiao 2020). During the same postwar period, UK doctors responded in the same reactionary way to initial proposals for a National Health Service, fearing they would lose private work.

This stubborn resistance to government intrusion is part of a frankly hysterical element in the American psyche that so treasures individualism that it favours citizens carrying guns, despite the consequent high levels of gun crime. True to type, Stanley Goldfarb complains about the American College of Physicians making statements supporting gun control, suggesting that here medicine has "stepped out of its lane", or is meddling in politics. There are 40,000 deaths in the USA annually from gun use, and treating injuries from firearms costs $230 billion a year. Guns are the leading cause of death among children and adolescents, and among African American youth, and are implicated in 70% of over 8000 suicides per annum.

It was the powerful National Rifle Association (NRA) and the pro-gun lobby who recently advised doctors to "stay in their lane" when issuing statements on gun control and safety. In the USA, while physician Political Action Committees (PACs) are common, doctors seem reluctant to upset the NRA and the wider gun lobby. This echoes an earlier era when doctors were not outspoken enough about tobacco use because some remained financially supported by the tobacco industry. But change may be afoot. Rebecca Cunningham and colleagues (Cunningham et al. 2019) report the "beginning of the end of the medical community's silence on the issue of firearm research and safety".

Goldfarb also notes disapprovingly: “During my term as associate dean of curriculum at the University of Pennsylvania’s medical school, I was chastised by a faculty member for not including a program on climate change in the course of study”, where “such programs are spreading across medical schools nationwide” as if this were an unwelcome infection. Indeed, the AMA now welcomes such programmes. Out-of-touch sceptics such as Goldfarb, meanwhile, might take heed of analyses such as that of Alice Hill and Leonardo Martinez-Diaz (2020, p. 107) who suggest that “lost productivity due to climate-related illness are projected to consume an estimated $500 billion per year by the time a child born today has settled into retirement”.

Goldfarb chastises medical schools for "inculcating social policy" when they should be teaching students to "cure patients", and blames "A new wave of educational specialists" who are "increasingly influencing medical education" for emphasising "social justice" topics "at the expense of rigorous training in medical science". This in turn, claims Goldfarb (and here is the rub), comes of a mindset that "abhors hierarchy of any kind and the social elitism associated with the medical profession in particular". Further, "The prospect of this 'new,' politicised medical education should worry all Americans". Goldfarb then lays the blame at the doorstep of socio-cultural learning theorists supposedly infecting medical education: "Theories of learning with virtually no experimental basis for their impact on society and professions now prevail. Students are taught in the tradition of educational theorist Étienne Wenger, who emphasized 'communal learning' rather than individual mastery of crucial information".

What is wrong with "communal learning"? Surely this is in any case unavoidable, where no person is an island. Indeed, contemporary psychology points to 'self' as a delusion and a philosophical category error, in a networked world in which both skin-to-skin social connections and technology-enhanced virtual connections are the norm (Oliver 2020). Well before our virtual age, I know of no better description of collaboration than the celebrated passage in Herman Melville's Moby-Dick, where Ishmael and Queequeg are roped together for efficiency as they attend to a whale's carcass dangling over the edge of the ship. If one goes, then the other goes too. Melville describes this "as the precise situation of every mortal that breathes".

Goldfarb and others' collective prejudice in favour of individualism in any case runs counter to the evidence base for the effectiveness of collectivism advertised by inter-professional clinical teams. For example, changes in operating theatre practices that embrace the now-mandatory global Surgical Safety Checklist include collaborative briefing and debriefing of lists, and have been found to improve surgical outcomes (Gawande 2009; Haynes et al. 2009). Further, more democratic team structures in healthcare lead to better patient outcomes, also improving patient safety and worker satisfaction (Borrill & West 2002).

The source of prejudice against collectivism may rest with a values system grounded in an unexamined emphasis upon the moral status of the heroic individual, the heart of Protestant-Capitalist ideology (as argued by Max Weber (2002) in The Protestant Ethic and the Spirit of Capitalism, first published in German in 1905 and not translated into English until 1930). Subscription to such a values perspective leads to a prejudice against nationalised and centralised healthcare, and against collaborative rather than individualistic practices of learning. Democracy is translated back to the freedom of the individual (such as the right to bear arms) rather than progressed forward to collective responsibility (as in participatory and democratic clinical team processes), however tough that is to achieve, since compromise is always built in.

Goldfarb's doomsday polemic properly attracted a largely hostile online response. More than 150 alumni of the University of Pennsylvania's medical school took the trouble to refute his claims in the strongest terms. On Medscape (Zheng 2019), Penn Med graduate doctors said: "we are compassionate, socially responsible, and grounded in the deep-rooted belief that doctors are vehicles for social justice. We believe that social justice should not only have a place, but a central place, in the medical school curriculum". What then are the historical conditions of possibility for the emergence of such rabid individualism (expressed as 'self-help') and such demonizing of collectivity?

Self-help

Traditions of self-sufficiency permeate mainstream medical education. Self-help's centre of gravity for over a century has been North America, but its birthplace is Scotland. The Scottish doctor, author and reformer Samuel Smiles (1812–1904), who trained in medicine at Edinburgh, published Self-Help in 1859. Smiles was critical of excessive wealth and materialism, but he also thought that poverty was a product of irresponsibility, and avoidable. Ironically, Smiles, one of eleven children, was supported through medical school by finances provided by his mother, after his father had died from cholera. In the 1840s, Smiles engaged deeply with political reform, arguing for democratic principles including the rights of women. But by the 1850s, he had stopped campaigning for general political reform and instead vigorously promoted the idea of self-sufficiency. Self-Help sold 20,000 copies in its first year of publication, and by the time of Smiles' death in 1904 the book had sold over 250,000 copies.

Orison Swett Marden (1848–1924), an American physician and polymath, had degrees in law, science, and arts as well as medicine. Orphaned at the age of seven, as a teenager Marden fortuitously came across a copy of Self-Help and was smitten with Smiles’ ideas. Marden wrote his own self-help book Pushing to the Front. Published in 1894, it was the first and most influential popular psychology book in America. By 1925 it had run to 250 editions and became a global bestseller. Spurred on by the initial success of his book, in 1897 Marden founded a magazine called ‘Success’, with a circulation of 500,000. It was indeed a runaway success as the first motivational self-help journal.

By then, Marden had left medicine, turning his back on hospitals to enter the hospitality industry. He ran several hotels and a holiday resort, and eventually employed over 200 people to produce and deliver his periodical. In 1916, he became the first President of 'The League for the Larger Life' in New York, an organisation whose mission statement was "to spread a knowledge of the fundamental principles that underlie healthy and harmonious living" and "to assist the individual in the solution of personal problems". The 'pop' psychology culture of personal development was established, grounded in the wider values of the 'lone frontiersman' mentality of heroic individualism, self-sufficiency, a strong work ethic (Protestantism's main secular value) and opportunistic capitalism. Here, private feelings and emotional life become commodities for capitalist enterprises of self-help and therapy, as Arlie Russell Hochschild (1983) in particular details in The Managed Heart: Commercialization of Human Feeling.

The UK's Observer newspaper recently reported a boom in sales of self-help books, particularly pertaining to mindfulness, with sales of 3 million in 2019 (Walker 2019). In France, nearly 15 million self-help or health and wellbeing improvement books were sold in 2018, compared with 10 million cookery books and 3 million books on gardening, animals and nature (Statista 2019). The self-help or personal growth market in the USA is now turning to 'life coaching' and is worth $10 billion, predicted to rise to $12 billion by 2022 (LaRosa 2018). While more women than men read self-help books, more men than women write them (Zhou 2017).

Stressing the 'frontiersman' virtues of resilience and persistence (core to self-sufficiency) and the Protestant work ethic, Orison Marden recounted how his first manuscript copy of Pushing to the Front had been destroyed in a fire when one of his hotels burned down. He immediately wrote three new versions and sent them simultaneously to three different publishers; each wanted to publish the book. Marden, inspired by Samuel Smiles, had created the self-realisation movement that today we know by descriptors such as 'personal growth' and 'humanistic psychology', and that has exploded through YouTube and social media, affording a significant psychological public health intervention. Marden himself was almost certainly influenced by a 19th-century American movement called 'New Thought', formed initially by the ideas of Phineas Quimby. Drawing on ideas from a number of religious denominations, the New Thought movement's doctrine was that health is a product of 'right thinking' and, conversely, sickness a product of 'wrong thinking'. In short, the individual is in charge of his or her fate, and responsible for his or her actions, where psyche precedes and shapes soma.

The Scottish-American industrialist and philanthropist Andrew Carnegie (1835–1919) made a fortune from producing steel. He was a staunch believer in independence, the Protestant work ethic and self-help, and admired Marden's work in particular. Carnegie set up a charitable Foundation to re-distribute around 90% of his considerable fortune (around $65–70 billion in today's money). There was, however, a dark side to Carnegie's beliefs that also characterised Samuel Smiles' philosophy: those who could not help themselves were seen as either weak or lazy and should be allowed to perish. This twisted reading of Darwinian 'survival of the fittest' passed a cruel judgement on the physically or mentally challenged, and on those stuck in a poverty trap.

Marden's and Carnegie's shared value system would come to describe a cultural style and trait among North Americans that would shape educational systems and pedagogical practices focused on self-'improvement'. Its main proponent would be John Dewey, born in the year that Samuel Smiles' Self-Help was first published (1859) and a contemporary of both Marden and Carnegie. Dewey was a firm believer in democracy, but more so in autonomy. Democracy advertised individual freedom of expression rather than compromise for the common good. This has now become the credo for neoliberal (free-market and competitive) capitalism, and has proved to be illusory: 'self-help' readily becomes 'every man for himself'. Should Ishmael or Queequeg encounter trouble, one would cut the tie and clamber to safety, leaving the other to plunge onto the carcass of the whale and into a mess of trouble.

The darker, competitive side of Samuel Smiles' legacy has dominated North American pedagogy, and this has bled into medical education, including Western European versions. The individual, and the cult of individualism expressed competitively (prizes! awards! leadership! mastery!), has been a primary driver for medical pedagogies. (This is not to ignore the major influence of liberatory and explicitly politicised and activist educationalists such as Paulo Freire.) Where success is ascribed to self-reliance, so failure is ascribed to reliance on others. Recall Stanley Goldfarb's invective recounted earlier, where in medical education: "Theories of learning with virtually no experimental basis for their impact on society and professions now prevail. Students are taught 'communal learning' rather than individual mastery of crucial information". Such 'mastery' carries the sinister overtones of a Master/Slave relationship that refuses reciprocity and productive social bonds.

The self-help philosophy of Marden, the pedagogy of Dewey, and the philanthropy of Carnegie converge in the work of Abraham Flexner, an ambitious educationalist who ran his own innovative school. For the origins of modern medical pedagogy, we must go back over a century to the politics of Abraham Flexner and his hugely influential reports on medical education in the United States and Canada (Flexner 1910) and, two years later, in Europe. These have been picked over many times by contemporary medical educators. Importantly, Brett Schrewe (2013) argues that Flexner has now become mythologised through a "metanarrative" that surely clouds rather than illuminates our understanding of his contribution to HPE.

Flexner’s political worldview

The Flexnerian myth suggests that the shape of modern medical education should be ascribed to the work of one man – again, an heroic endeavour. A Classics graduate turned educationalist, Abraham Flexner was a contemporary of Orison Marden; his educational inspiration was John Dewey; and the funding that allowed his vision of medical education to be realised came originally from the legacy of Andrew Carnegie, in the form of the Carnegie Foundation for the Advancement of Teaching. This was not explicitly a cabal, but the interconnections between the members of this male group are surely as interesting as the individual figures.

Flexner's (1910) Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching is in fact a text of immense political and ethical interest. The Carnegie Foundation did not want a doctor to carry out the on-the-ground research and subsequent writing of the report, but somebody who would be dispassionate about medical education. Many considered Flexner (1866–1959) to be the most important educationalist of his era, even more so than his contemporary John Dewey (1859–1952), whose educational methods Flexner greatly admired. On his death, the New York Times front-page obituary said of Flexner: "No other American of his time contributed more to the welfare of his country and of humanity in general". Not everybody agreed. An American doctor, Lester King (1984), called Flexner's report "probably the most grossly overrated document in American medical history", pointing out that recent medical historical scholarship has placed Flexner among a network of many equally important figures and factors influencing medical education of his time. The celebrated medical historian Kenneth Ludmerer (1999) agrees, as we shall see.

Flexner was the sixth of nine children born to German Jewish immigrant parents in Louisville, Kentucky. His father Moritz was a hat seller and his mother Esther a seamstress. His eldest brother Jacob supported Flexner through his first degree at Johns Hopkins University in Baltimore, which Flexner completed in two years. Jacob, a pharmacist, later trained as a doctor, while Flexner's older brother Simon became a renowned pathologist and bacteriologist. Flexner taught Greek and Latin at high school and, on the back of private tuition fees from wealthy families, set up his own experimental school in which lessons were not compulsory and students did not enter for exams. Yet many went on to attend prestigious universities and colleges, and the school gained a reputation for educational innovation. Flexner married one of his former students, Anne Crawford, who had become a teacher and playwright. She had a Broadway success, and the profits allowed Flexner to close his school and study full time for a Master's degree in psychology at Harvard, and then to spend a year-long sabbatical in Germany at Berlin and Heidelberg Universities, something of a homecoming, where he decided that the German educational system was the finest in the world.

The wealthy Carnegie Foundation asked Flexner in 1908 to conduct its planned survey of the quality of medical education in the United States and Canada. The Foundation's president, Henry S. Pritchett, had read Flexner's recently published (1908) critique of higher education, The American College: A Criticism, in which Flexner attacked in particular large lecture-based teaching as a 'cheap' and 'wholesale' way of reducing education to the cruder management of learning. Flexner was taken by surprise at the invitation, as he knew nothing about medical education and had never set foot inside a medical school. When he did, what he saw shocked him. From January 1909 to April 1910, Flexner visited all 155 medical schools in the United States and Canada, some only briefly but some of them twice, clocking up 175 visits. Most visits revealed the available medical education to be a wholly unacceptable way to prepare doctors for practice.

Within the dominant Flexner narrative, admissions policies in the majority of schools were poor, haphazard or non-existent. Curricula had no formal shape. Pedagogies went unexamined. Resources were lacking: 140 of the 155 schools had no library; schools were poorly equipped and had no link with nearby hospitals. Yet the certificates received upon completion of studies licensed graduates to practice medicine. Most importantly, there was an over-production of doctors (Flexner claimed four to five times as many doctors were being trained in North America as in Germany per head of population). Only one school – Johns Hopkins in Baltimore, Flexner's alma mater – required entrants to possess a prior degree. Flexner took this institution as his standard for the future development of medical schools. He did not think along lines of equity and equality, such as how poorly performing schools could be better resourced or helped – a 'seeding' approach. Rather, he set a standard and then employed the scythe, and brutally.

As fee-paying private institutions, medical schools were more interested in profits than standards, often recruiting their students from industrial occupations. Flexner had apparently uncovered a scandal. Fifteen thousand copies of his report were printed and distributed free of charge, causing widespread alarm. By 1922, only 81 of the 155 schools remained; Flexner himself had called for a maximum of 31. He initiated four major changes: medical schools would be university based; faculty would be involved in research, both scientific and educational; students would be recruited only after obtaining an undergraduate degree in the sciences; and students would learn through a standardised curriculum of two years of anatomy (through cadaver dissection) and bench sciences, followed by two years of clinical study through the university's attachment to hospital settings.

In a comprehensive history of North American medical education, Kenneth Ludmerer (1999) points out that Flexner's extreme foregrounding, against a background of multiple medical educational activities, provides a distorted picture. Flexner did not suddenly initiate modern medical education singlehandedly, while his observations of medical schools often suffered from a lack of appreciation of how far many schools had come since their inauguration, especially those that catered for women and minority groups. Appalled by the laxity of home-based medical education, some – more thoughtful, inquisitive and morally sensitive – American doctors had gained experience in Germany, where medical students underwent a rigorous education, first socialised as anatomists (through cadaver dissection) and then as bench scientists before engaging in clinical medicine.

Ludmerer (ibid, p. xxii) shows that the Flexnerian revolution had antecedents from the mid-nineteenth century, "when a revolution occurred regarding how medicine should be taught". The revolution was not confined to medicine, but was one of ideas and adventures in pedagogy based on a social contract. Capitalist society, driven by entrepreneurs, would provide the climate for the generation of schools and universities, including medical schools. In turn, doctors educated in these schools would engage in a social contract in which they committed not only to serving communities, but also to developing the highest possible standards of research and professionalism within their field. As Ludmerer (ibid.) says, this was a "financial, political, and moral" exchange. Financial capital flowed into medical school development not in dribs and drabs but in huge quantities, reflecting Flexner's new and influential role at the Carnegie Foundation and then the Rockefeller Foundation. Ludmerer (ibid, p. xxiii) notes that in 1910 a leading medical school might have had a budget of around $100,000; by 1940 that had swelled tenfold, to $1 million.

More sinister is the question of whether Flexner consciously decided to come down heavily on those schools that catered for women and minorities (Hodges 2005). They were the most vulnerable, short of funding and thus of equipment and expertise. Nowadays, as noted above, we would see this as good reason to support and invest in them, to counter both inequity and inequality. It did not seem to matter to Flexner that women would be dispossessed of the opportunity to study medicine – a condition that persisted until relatively recently. However, there were certainly open motives for discouraging people of colour from studying medicine. Flexner suggested that black doctors should work only with black patients, using the spurious argument that such doctors might infect white patients with illnesses carried only by people of colour (Nevins 2010).

Flexner's views on race were complex. In a letter from 1930 concerning recruitment of staff at Princeton University, Flexner's belief in offering opportunities to all individuals is clear: "It is fundamental in our purpose, and our express desire, that in the appointments to the staff and faculty as well as in the admission of workers and students, no account shall be taken, directly or indirectly, of race, religion, or sex". Yet, while Flexner was Jewish, he never openly spoke out against anti-Semitism and was strangely quiet when Hitler came to power. In the 1930s, when high-profile Jewish intellectuals and scientists emigrated to America, Flexner was often involved in employing them through his role as founding director of the Institute for Advanced Study at Princeton, where his biggest 'catch' was Albert Einstein.

As Michael Nevins (ibid.) argues, Flexner's achievements in pedagogy and medical education have been lauded, and his infamous character flaws of irascibility and narcissism have been noted, yet his values and beliefs, despite his 1940 autobiography and its 1960 update, remain opaque: complex, contradictory and difficult to decipher. He must have been conflicted over his love for the German educational system, his Jewish ancestry, and what he was seeing in Germany as Hitler came to power, but he never made this plain. Nevins asks why Flexner did not openly come out against the institutionalized anti-Semitism that was reflected in the popularity of the eugenics movement in America during the first half of the 20th century.

Medical schools as businesses

The German sociologist Max Weber (1864–1920) was a contemporary of Marden, Carnegie, Dewey and Flexner. In The Protestant Ethic and the Spirit of Capitalism – first published in 1905, but not translated into English until 1930, and so probably unknown to the spearhead figures in American self-help and self-sufficiency thinking – Weber put forward a radical idea. Goethe's 1809 novel Elective Affinities took the idea, prevalent at the time, that certain chemicals were attracted to other chemicals and would react with these and not with others. Goethe took this as a metaphor for human passions and relationships; we still use the term 'chemistry' to describe such affinities. Weber poached Goethe's idea to explore social and intellectual bonds. He was puzzled as to why market-driven capitalism was so successful in the Western world, and suggested that this could be explained by capitalism's elective affinity with the central ethic of Protestant belief: 'getting ahead' through self-help and independence, or what we now call a 'work ethic'.

Protestant Calvinism in particular, popular in Scotland, England, Germany and the Netherlands, mapped on to the rapid development of capitalist economies in these European countries when compared to Catholic-dominated countries such as Spain, France and Italy. Calvinism encouraged hard work in this life, with reinvestment of profits (rather than what was seen as frivolous spending) to set up salvation in an afterlife. Moreover, buying into this doctrine of predestination eased any conscience about social and economic inequality in this life, which could be put down to others' laziness or indulgence.

We have seen that Flexner's fieldwork inquiry and subsequent report uncovered a common model amongst American and Canadian medical schools. Whatever their quality and standing, they were all profit-seeking private institutions or businesses. Paradoxically, drawing more on John Dewey's idea of independent learning than on democratic engagement as the primary pedagogy for medical education, Flexner's purging and reconceptualising of medical education drove curricula deeper into capitalist ideology, where knowledge and skills were obtained through individual effort and retained as personal capital.

With Flexner's initiative, the political body of North American (and then Western European) medical education is laid bare. In short, the dominant model of learning remained individualistic rather than social – free-market, neoliberal, capitalistic and competitive – certainly up to the dawn of the twenty-first century. Collaborative or socio-cultural learning theories were still on the horizon. Marxist dialectical materialism, in which physical objects and artefacts (including languages and symbol systems) play an equal role in learning along with the humans who draw on them, was until recently a foreign language for most medical educators. Medical educators whose pedagogies celebrated individual achievement ignored the work of American scholars who had spent time in Russia studying collectivist and dialectical-materialist learning theory, such as the psychologist Michael Cole (Cole et al. 1997). So, for example, peer assessment methods were never properly explored, while the collectivity possible in problem-based learning was consistently displaced by a focus on individuals' efforts and how these could be isolated and assessed.

Soviet learning theory

Stanley Goldfarb, above, speaking for an entrenched conservative view, complained of both political activism and pedagogical collectivism tainting contemporary medical education, where “The prospect of this ‘new,’ politicized medical education should worry all Americans”. As noted earlier, he lays the blame at the feet of social (and socialist) learning theorists: “Students are taught in the tradition of educational theorist Étienne Wenger, who emphasized ‘communal learning’ rather than individual mastery of crucial information”. Poor Étienne Wenger, who came from a medical family but decided to study psychology, and never wrote a thing about medical education until late in his career. Wenger’s work, with his colleague the anthropologist Jean Lave, was focused on craft apprenticeships in areas such as the work of butchers.

Better that he target the psychologist Michael Cole, mentioned above, who on the advice of Jerome Bruner went to study learning theory in Moscow for a year in 1962, under the direction of Alexandr Luria. Or Yrjö Engeström, the leading Finnish educator, who has done more than anybody globally to promote and develop Cultural-Historical Activity Theory (CHAT) (a descriptor coined by Cole), first developed by the Russian psychologist Lev Vygotsky after the 1917 revolution. Engeström too has worked in America on and off since the 1980s, at the University of California, San Diego, in a department founded by Cole, where Engeström (1999) says: "I had to learn about multiculturalism and to appreciate ethnic, religious, and other differences between people". Here then is a glaring paradox – Engeström, steeped in the collectivist tradition in Helsinki, learns about multiculturalism in San Diego, yet North American HPE largely fails to learn from collectivist CHAT. Social learning theories did not register in the medical education literature until the early 2000s (Bleakley 2006).

Michael Cole later became a translator and editor of Luria’s writings and, for over three decades, was editor of the journal Soviet Psychology. Cole’s political interests came partly from the influence of his father, Lester Cole, a Hollywood film screenwriter who held left-wing views and was under surveillance during the height of the McCarthy era. He refused to answer questions when interviewed about his possible Communist affiliations, along with a group of Hollywood directors and writers who became known as the ‘Hollywood Ten’.

In conclusion, Stanley Goldfarb and others should know better – all medical students and junior doctors learn in social or communal settings such as inter-professional teams on ward rounds, crash teams in resuscitation scenarios, or surgical teams in the operating theatre, and the evidence base shows, as noted earlier, that the more collaborative the team, the better the outcomes, not only for patients' health and safety but also for team members' work satisfaction. Goldfarb and others, I guess, would not welcome medical students and junior doctors challenging the system to afford innovation, but would wish for neat and quiet absorption into traditions of individualism without a hint of revolution. You would think that current generations of medical students and junior doctors, after #MeToo and Black Lives Matter, would surely change that, but lingering traditions of heroic individualism are so entrenched that they will persist (Sabin 2012).

Such entrenchment would mirror what Yrjö Engeström (2018) calls a “will-to-stability”, as opposed to the horizon of “possibility knowledge” achieved through questioning unproductive historical habits. Social learning theories follow a model of dialectical learning, in which challenge from below is invited and contradiction is the engine of change. Medicine’s major contradiction is that, within multi-professional healthcare provision, it remains a stubbornly hierarchical culture while advertising patient-centredness and inter-professionalism, or “co-ordinated care”. Medicine must democratise, and quickly.

John Dewey's dilemma was how to fuse rabid autonomy and unbridled capitalism (the American way) with democracy, to give equal weight to 'mind' and 'culture'. How should the individual mind and "habits of the heart" adapt in commitment to collaborative democratic progress? Dewey's answer was to integrate culture into individual mind, as a commitment to collective endeavour without losing individual rights and freedoms. This is the American Way (Sabin 2012). But this path morphed into what John Kenneth Galbraith (1992) derides as a "culture of contentment", one of smug self-satisfaction in the face of growing structural inequities and inequalities. Rather, as Michael Cole insisted, we should integrate mind into culture, as extended cognition and affect.

In this ‘ecological perception’ view, the ‘mind’, or thinking and feeling, is not in the skull, but is afforded by interactions between environmental context and human intentions. Context includes artefacts such as technologies, languages and semiotics (signs and symbols). Human intentions include ‘predictive processing’—the ability to link improvised activity with changing environmental cues (Clark 2015). If we take this model of cognition seriously, then ‘individualism’ is already a mis-description, as we are all embedded (and then embodied) in a wider ecological context of shared technologies and languages.

In the neoliberal capitalist model, as individuals gain more money, power and prestige, they typically abandon collectivism and drift away from the common good. The less successful are cast adrift, because we now know that a rising economic tide does not lift all boats (Piketty 2014). The socialist way, in contrast, is to sacrifice individualism for what the collective can afford the individual. This honours principles of equity and equality: while equality provides the same measures for all, equity provides differing measures depending on needs. Following Alexandr Luria's maxim that "the determining factor in the psychological development of the child is the social development of the child", if Goldfarb had been born in Cuba, for example, or even in a Scandinavian country where individualism is subsumed in the collective ideal, he probably would have followed a quite different set of values as a doctor and medical educator.

I have – often polemically – argued for the value of a collectivist HPE over an individualistic model and have provided a rationale for this. AHSE is a journal that has been at the cutting edge of thinking and practice in the arena of HPE, recognising that major advances in healthcare have been achieved through genuine patient-centred and inter-professional collective care team practices. Again, I applaud Geoff Norman’s pivotal role in advancing work in the field.