Sketches From a Sociologist’s Career: 3 – Becoming a Medical Sociologist

By | March 27, 2024

Accepting the post at St Barts established and confirmed my work and my social ‘status’ as a medical sociologist. Within the academic community I had just tentatively set foot in, this rapidly became a ‘master status’: in other words, any other components of what the American sociologist Robert Merton in his Social Theory and Social Structure termed my ‘status set’ – partner, father and so on – counted for relatively little. And every status is accompanied by a ‘role’, that is, a clutch of actual as well as normative or moral expectations about how anyone occupying a given status should behave. My employer, Anthony Hopkins, was a maverick neurologist, as appointing a sociologist to conduct a study on a topic like social stigma eloquently testified. Like most ‘medics’ he was hierarchically minded and we were destined to have a few run-ins; but the three years of our collaboration were, I think, fruitful and rewarding for us both. Unusually, Anthony – never Tony – proved willing to personally interview all the adults with epilepsy in our sample, either in his clinic at St Barts or in their homes. This meant that we were to have comprehensive clinical as well as social data on nearly one hundred people. It is a tragedy that Anthony was later to die unexpectedly and prematurely at the age of 59.

If Anthony was my day-to-day supervisor, George Brown was my academic supervisor. George’s reputation rested on his painstaking research on aspects of life events and mental illness, most notably schizophrenia and depression. His Social Origins of Depression, co-authored with Tirril Harris in 1978, was rightly to become a classic. I found him helpful and supportive, not least with the occasional hiccup in relations with Anthony. What I also discovered quite soon was that my study did not command a great deal of his attention. This became clear when he asked me one week why we weren’t going to include a control group in the study, and then, when I drafted a paper to incorporate one, he asked me why we needed a control group. Lesson learned: I didn’t mind if his focus was elsewhere, but I did decide at that point that I was going to make my own decisions.

Before I turn to the substance of this study, which was to prove significant for my subsequent career in sociology, it is worth remarking further on the nature of Ph.D supervision in the 1970s. Both George Brown and I were able to be relaxed about this process, institutionally as well as personally. Not only was he content to be largely hands-off at my instigation, but neither he nor Bedford College exerted any real pressure on completion dates. Thus it was that a Ph.D commenced in 1972 was only submitted and approved – with an optional request to correct 13 typos – in 1983. Given that I had typed my thesis on an old portable typewriter in my kitchen I certainly wasn’t going to bother about a handful of typos! But I’m jumping the chronological gun.

When I began to think about issues of stigma, or shame, facing adults with epilepsy living in the community, I was armed with sociological theories acquired during my undergraduate studies. Principal among these were two forms of interactionism, ‘labelling theory’ and ‘dramaturgy’. The former originated with and was popularised by American sociologists like Lemert and Becker, in part reacting to Parsons’ system-oriented structural-functionalism, which was seen, a little unfairly, as altogether excluding the theorising of social change and the exercise of agency. Within the field of medical sociology, Eliot Freidson, in his classic Profession of Medicine, published in 1970, comprehensively addressed the issue of doctors as state-sanctioned professional experts who possessed the ‘cultural authority’ to label patients via the making and communication of diagnoses, in the process allocating them new identities – in Merton’s terms, another status and associated role – and he traced the political, social and psychological ramifications of this whole process. It takes power, in this case in the guise of licensed rational authority, to label someone effectively; and such labelling might be said to constrain those to whom it is applied.

Amongst the offspring of the pioneers of labelling theory were the British ‘new deviance’ theorists, but it was to Freidson that I instinctively turned. The result was an embryonic framework tucked into my back pocket that accompanied me to my studies. It was probable, I reasoned, that people with epilepsy were as vulnerable to the application of the label ‘epileptic’ and its inevitable psychosocial sequelae as they were to the prospect or fact of recurring seizures. Diagnostic labelling applied professionally would surely lead to stigmatisation by others, and thenceforth to ‘secondary deviance’, that is, to those labelled in this way internalising others’ negative views of them and altering their self-perceptions and behaviour towards conformance with these negative views. It turned out I was wide of the mark. But I should say a word more about the study before outlining what I came to call the ‘hidden distress model of epilepsy’.

The community sample comprised 94 adults with active epilepsy accessed through five general practices in and around London. Each was interviewed both by me and by Anthony Hopkins, affording us an excellent data set. My own interviews all took place in people’s homes at their convenience, and I was to experience some interesting moments and challenging setbacks. Out at Thamesmead in south-east London, for example, I interviewed a former associate of the Kray twins whose seizures started when he was shot in the head by a member of a rival gang. His ‘canaries’ (seizures) had terminated his job as a driver on armed robberies. During the whole of the interview he was doing one-arm press-ups on the floor beside me. Several years later I was to read about his arrest in the Evening Standard. He had held up a bank with a shotgun, backed out of the door with his loot, stepped off the kerb and been run over. He was quoted as saying ‘It’s a mug’s game’. The principal methodological challenge was the failure of my tape recorder during part or all of 13 of the first 20 interviews, meaning I had to return to complete the conversation; and there are methodological problems about returning (especially if people have reflected on what they said previously and regretted it). It was my failure to arrange and conduct two to three interviews a day that led to one dispute with Anthony, who was more used to dealing with subservient juniors; but George Brown provided welcome support.

One other methodological, or more accurately data-processing, issue is worth commenting on. I have never been convinced by the multiple software innovations designed to help analyse qualitative data, and part of the reason for this is the faith I developed during the epilepsy study in compiling what I called ‘topic cards’. An initial set of topic cards was constructed, corresponding approximately to the series of topics or ‘classes of information’ explored during the interviews. A set was then produced for each interviewee. It was a device I was to write up much later in a paper published in Social Science and Medicine in 1990. I recorded all remarks on a given topic, often scattered throughout two-hour interviews, including direct quotations; precisely where these could be found on the cassette recordings so I could revisit them if necessary; and, where relevant, cross-references to other topic cards. Ok, it took forever, the first interview absorbing 13 hours to transfer onto topic cards, but I gradually grew more efficient, averaging 8-10 hours. I ended up with nearly 5,000 topic cards. I felt then, and feel now, that no better way of familiarising oneself with a data set from in-depth semi-structured interviews commends itself.

Returning to the hidden distress model of epilepsy, I eventually concluded that epilepsy remains a stigmatising condition, but that most people with epilepsy are, to use Goffman’s terms, ‘discreditable’ rather than ‘discredited’; in other words, their stigma is inconspicuous between observed seizures, meaning that the predominant issue they face is ‘information management’ – when to disclose their epilepsy, to whom, and whether to disclose it at all – rather than ‘impression management’. The result was my most cited paper (1,000+ citations at the time of writing) in Sociology of Health and Illness, published in 1986.

The model itself can be summarised in three propositions. First, on communication of the diagnosis people quickly come to perceive their new status or identity as ‘epileptics’ as socially undesirable. In general terms, this is because they define epilepsy as a stigma; more specifically, it derives from a ‘special view of the world’ in which a fear of meeting with active stigmatisation, what I called enacted stigma, predominates. Second, this special view of the world, the salience of which at any given time is contingent upon situational stimuli, is predispositional. It predisposes people, first and foremost, to conceal their condition and its medical label from others, to try, to deploy Goffman’s terminology once more, to ‘pass as normal’. The fear of enacted stigma, in other words, leads to a policy of non-disclosure, a policy which remains feasible for as long as they are discreditable and not discredited. Third, the policy of non-disclosure reduces the opportunities for, and hence rate of, enacted stigma, most notably in the context of personal relationships and the employment market. One general but crucial consequence of this is that felt stigma, denoting a fear of enacted stigma and a sense of shame applied to self, is more disruptive of the lives of adults with epilepsy living in the community than enacted stigma. This model gave a novel twist both to orthodox labelling theory in medical sociology and to clinicians’ thinking in general practice and neurology. The study was ultimately written up as a book in 1989, entitled, appropriately enough, Epilepsy.

Reflecting now on doing a Ph.D in the 1970s, albeit extending into the 1980s, there are several points to be made over and beyond the ‘relaxed’ academic environment and supervisory practices already alluded to. From the vantage point of nearly half a century on, and having myself examined between 40 and 50 Ph.Ds in the UK and elsewhere, it is difficult not to conclude that today’s theses are typically slighter. It is not unusual, for example, for them to rest on analyses of a dozen interviews or, to be a touch more generous, case studies, or to deal exclusively in secondary data sets. There is nothing intrinsically wrong with either of these contemporary options and methodologies; it’s just that I have found I was used to, expected and wanted more. A related point is that doing a Ph.D is now a far less relaxed process. As will become clear in later sketches, the institutional and supervisory agendas are now tied to log-books, box-ticking and a competitive ethos that pressurises students not only to stick to rigid timetables while thinking about jobs and their futures but to publish in high-impact journals prior to writing up their theses.

As far as my own Ph.D is concerned, I recall George Brown’s sound advice not to start off by clarifying and sorting basic philosophical and theoretical premises, which was my inclination, but to just get on with doing the research. What he never knew was that I had made another foolish decision at the outset, namely, to read nothing but thesis-related material for an unspecified but prolonged period, a tactic I obstinately stuck to for a whole year. On another tack, while I’m sure it never occurred to George to co-author material from my Ph.D, I had to negotiate a compromise with Anthony Hopkins, who was after all the initiator of and fully engaged collaborator on the study. It was to be a compromise that saw a mix of single and joint authored articles and other publications. Eventually publishing the results in dribs and drabs established my presence within medical sociology as ‘the stigma man’, a tag that has never left me. I had, and indeed have, no objections to this and have subsequently built on my early endeavours by re-contextualising and elaborating on the hidden distress model, of which more in later sketches. Critical to this reputational boost was the distinction between enacted and felt stigma, which has featured regularly in textbooks ever since. Looking back, I think I was half aware of the possible, though certainly not probable, significance of what I was doing when I coined the terms. Maybe there’s a moral there.

A final point before moving on is that the easy-going and extended trajectory from Ph.D registration (1972) to submission (1983) enabled me to embark on other academic projects. Under the watchful eye of Margot Jefferys, for example, Donald Patrick, then at St Thomas’ Hospital Medical School, and I co-edited one of the first textbooks in medical sociology aimed directly at the teaching of medical students: the first edition of Sociology as Applied to Medicine was published in 1982 (and it has gone on to achieve a seventh edition in 2018). But I also changed jobs and became a teacher. When the three years of funding for the epilepsy study came to an end I was fortunate to be appointed as a half-time lecturer in medical sociology at Charing Cross Hospital Medical School. How this came about now seems quite extraordinary. I have previously described Margot Jefferys and George Brown as London’s mafiosi in the world of medical sociology, and so it was to prove. The Dean of Charing Cross approached George to see if he knew of anyone who might join David Blane, who wished to remain half-time, as a second half-time lecturer with joint responsibility to teach a course in medical sociology to the medical students. George approached me, persuaded another Ph.D student who expressed interest not to apply, and I found myself seated in front of the Dean anticipating an exacting inquisition. That’s not how it turned out. He was clearly going through the motions and mostly devoted his questioning to the number of foxes to be found in Epsom, where I then lived. I left mildly confused but bordering on ecstatic. Ok, it was a part-time job, but I was now a university lecturer!

If my appointment at St Barts marked me as a medical sociologist, that at Charing Cross affirmed me as a lecturer-cum-teacher. I gave an early ‘experimental’ lecture to a group of medical students while David Blane looked on. Drawing in too much detail on my (ongoing) thesis, I was initially discouraged by the students’ blank looks; but David helped build my confidence, and we went on to combine well to offer students what I still think was a good and challenging course. Some background is important here. The Report of the Royal Commission on Medical Education (the Todd Report) had been published in 1968 and reflected a significant input from Margot Jefferys. It argued for the inclusion not only of sociology but of psychology and statistics in the medical student curriculum, advice afterwards accepted by the General Medical Council (GMC). Hence David’s and, later, my presence at Charing Cross. Gradually, fitfully, and occasionally irritably, we sociologists began to occupy positions in medical schools in London and elsewhere. We were faced with a plethora of challenges. David and I were initially given office space in labs and the like: I recall us later being housed in the one room of a block in Fulham Palace Road that was left un-refurbished and undecorated. Nor were the students predisposed to take sociology seriously. Even worse were so-called colleagues in the life sciences. An ancient Professor of Physiology with whom we were sharing a lift suddenly came out with: ‘I can never see you two without thinking of anarchy and bombs’. And this was a man who blew on the lift button to summon it out of a concern for hygiene. But we persisted and developed some ground-breaking student projects. Not that these were always trouble-free. One project involved students interviewing people in the local community about their experiences of primary care services.
We were summoned to see the Dean, who had received complaints from one or more GPs that our students were interviewing their patients without their permission. We protested in vain that we did not need GPs’ ‘permission’ to organise interviews with local people, who were not in any meaningful sense ‘their’ patients. In what we regarded as an expedient but indecent capitulation, the Dean supported the GPs, so that exercise was discontinued. After I left Charing Cross David initiated another impressive student project, in which his students drew up personal family trees to facilitate discussion of the social determinants of social position and its health sequelae. The teaching at this time was reasonably well funded, allowing David and me and our colleagues in London’s other medical schools to employ tutors to teach small groups by topic. The array of talent available was impressive and at different times the line-ups included Mel Bartley, Annette Scambler, Ann Bowling, Ruth Pinder, James Nazroo, Clive Seale, Richard Compton, Mary Boulton, Jocelyn Cornwall, Judy Green, Sarah Nettleton and many another contemporary and future luminary.

But half a post brought half an income and I had begun to search for another part-time post. I was already teaching a basic course and an advanced option in philosophy of education on the Postgraduate Certificate in Education at South Bank Polytechnic, courtesy of Joe McCarney, a world-weary Marxist who went on to write a book on ideology. This took up four evenings a week, but the financial return was meagre. I applied for a post as a venereal disease tracer at University College Hospital Medical School, but I was turned down because I was overly qualified and unlikely to stay in post for long. Eventually I landed a half-time post in the Department of General Practice at Guy’s Hospital Medical School to conduct a study of help-seeking behaviour. I’m not sure it was made entirely clear at the outset, but the focus was to be on women and menstruation. Overseen by Peter Higgins, the day-to-day running of the study was delegated to GP Donald Craig, working out of his practice at Thamesmead. I remained the sole researcher for around 18 months, devising six-week health diaries for the sample of women and collecting, processing and ruminating on pilot data. I left in 1978 when I was accepted for a full-time teaching post at the Middlesex Hospital Medical School, and when I did so I handed over the main study to Annette. The handover and its aftermath were informative and illuminating. I thought I had done well in persuading the women I interviewed to talk openly and freely, but – of course, as I was compelled to admit in retrospect – they had shown a natural gender-based reticence. What did a man in his mid-20s know or understand about what it is to have periods? Annette’s interviews in the main study highlighted the true significance of this. I wonder now whether a man would ever have been appointed to such a post as mine if the study had been initiated by female GPs. But thanks to Annette some interesting findings and publications emerged. 
Principal among the results was the gendered medicalisation of menstruation and the diversity of women’s approaches to and judgements about their periods. Based on the health diaries, 35% of the women neither associated their periods with illness nor experienced any symptom episodes; 33% both made the link with illness and experienced a high level of symptom distress; 17% made the association with illness but apparently had no symptom distress; and 15% did not think of menstruation as illness-related but had a high level of symptom distress. We explored these results in several publications and later, in 1993, in another book, Menstrual Disorders, which was later unaccountably translated into Chinese.

My appointment to a full-time lectureship at the Middlesex Hospital Medical School in 1978 eased our family’s financial situation. It also introduced me to another sociologist, Ray Fitzpatrick, as well as psychologists James Thompson and David Mulhall, the latter later replaced by Stan Newman. Sociology at the Middlesex was to be taught alongside psychology under the rubric of ‘Behavioural Science’, and we were to have what would now be called a surfeit of curriculum time at our disposal: in the event, we reined ourselves in and planned for and utilised 60 hours. That we had such time to play with was largely down to the amiable and accommodating John Hinton, Head of the Department of Psychiatry in which we were housed. Ray and I gave a series of lectures organised in conjunction with or alongside the psychologists, and we employed tutors to help us run two sets of seminars, one of which allowed students to choose specific topics of interest to them. Crucially in relation to institutional politics, students had to sit and pass an exam in behavioural science; and if they failed either its sociological or its psychological components, they were heading for an autumn resit. Ray was an excellent colleague and sociologist, and when he left in 1986 to triumph in what had been an extremely competitive field for a lectureship at Nuffield College, Oxford University, it was a real loss. In fact, his focus at Oxford was not to be on medical sociology, at which I thought he excelled, but on health services research. I thought this sensible career decision a loss for medical sociology and niggled away at him for a bit!

In an invited chapter for a book edited by Caragh Brosnan and Bryan Turner and entitled Handbook of the Sociology of Medical Education, I reflected on the changing circumstances of the teaching of sociology to medical students. I identified four phases, the first of which I called the innovative phase, dating from 1969 to 1983. This was the take-off period and covered my time at Charing Cross and my early years at the Middlesex. It was dominated by lecturers trained by Margot Jefferys and George Brown at Bedford College and was characterised by neophyte experimental courses conducted ‘against the odds’. It also featured the formation of a Special Advisory Committee in Sociology Applied to Medicine (SACSAM) of the University of London, then still a city-wide federal institution. Chaired initially by Margot, this body provided us with a rationale and excuse to meet regularly in Senate House and in the process created a sense of togetherness, collegiality and solidarity. It was an official body that also conferred on us a degree of clout, as we were to discover.

The second phase was the consolidation phase, dating from 1983 to 1995. The selection of 1983 here marks the decision by Margaret Thatcher’s education minister, Keith Joseph, to rebrand the Social Science Research Council as the Economic and Social Research Council. It was a move that symbolised governmental hostility to disciplines like sociology. In the medical schools, innovation and staffing stalled in sociology (as indeed elsewhere). Posts were typically frozen if a lecturer moved on. Sociology remained, however, a necessary component of the education of doctors as defined by the GMC post-Todd. Intriguingly, it was this GMC ‘requirement’ that underpinned SACSAM’s major skirmish in the late 1980s and early 1990s. It was a skirmish that escalated to vice-chancellor level before fizzling out in obduracy and indecision. By the time I became chair of SACSAM in 1989 my predecessor Sheila Hillier had prepared the ground. Using our representation on the committee overseeing medical education within the University of London, we had won support for exerting pressure on the medical school at Cambridge University to incorporate a medical sociology course taught and examined by sociologists into its undergraduate curriculum, a step it had been reluctant to take. In the absence of a positive response, medical students from Cambridge were stopped from transferring to any of London’s medical schools to do their clinical training, at that time a popular move for students. Cambridge would not budge and in the end it was decided that this stalemate, disrupting the training of growing numbers of Cambridge students, could not continue. We on SACSAM were reportedly dismissed as ‘a bunch of Ayatollahs’, after the then new ruler of Iran. From anarchists to theocrats! Although technically seen off, we took much encouragement from the support of our medical allies in London.

I called the third and fourth phases the rationalisation phase (1995-2006) and the corporate phase (2006-) respectively; but the exploration of these will feature later as aspects of the comprehensive ‘neoliberalisation’ of our universities as financialised or rentier capitalism gathered pace. (In the event, in a talk in Paris in 2017, I was to re-set the corporate phase as lasting from 2006 to 2010, and I added a fifth or neoliberal phase from 2010 to the present.) It would be remiss to sign off this sketch without mentioning two teaching initiatives that in their different ways were to provide me with some of the most rewarding and enjoyable teaching experiences of my career. Both began in the mid-1970s. One was medical school-based and emerged out of the synergy created by the innovative phase outlined above. David Armstrong at Guy’s Hospital Medical School was a driving force. Medical students could then opt to take a year out between their pre-clinical and clinical studies to undertake a year of specialisation in a relevant area or discipline; their reward would be an intercalated B.Sc to add to their eventual MB,BS. David and allies won approval for an intercalated B.Sc in Sociology as Applied to Medicine. I was persuaded early on to teach Max Weber on a theory unit (and I was able to recruit David Blane to teach Marx). Over time I extended my teaching, establishing my own unit on ‘Conceptual Foundations of Modern Sociological Thought’. I had initially called it ‘Philosophy of Science and Social Science’, but David Wiggins, representing London University’s philosophers, objected to my planned syllabus, which included the likes of Wittgenstein, whom he thought way beyond the intellectual reach of medical undergraduates. How wrong he was, and how appalled at his judgement Wittgenstein would surely have been. In the event I kept my syllabus, simply renamed my unit and all was well.

The medical students we recruited were exceptional. Many had to jump through hoops, most often overcoming the vociferous opposition of parents, themselves practising doctors, to sociology and all it was thought to be and to involve. Some recruits, I always thought sensibly, had decided to take a year out from their medical training because they had future career doubts, and why not enjoy this year of rethinking? In the event, almost all those of this persuasion were ready for clinical studies after this year away. It is worth dwelling awhile on what teaching these students was like, not least because it now seems like, and in many respects is, a bygone era. Usually totalling a dozen or so, the students were bright and highly motivated. Apart from the deviant who missed numerous seminars because he was busking along the south coast, attendance was excellent. The readings I personally allocated, usually by lending out my own volumes, were challenging but invariably completed with comprehensive notes taken. I remember one student reporting back on the whole of Bhaskar’s Realist Theory of Science, a dense and difficult text which he’d digested whole with no ill effects. Seminars were enjoyable, one memorable one lasting six hours, the first three in the seminar room in the Middlesex’s Department of Psychiatry in the Wolfson Building in Riding House Street and the second three in a local pub. This was teaching at its most pleasurable and I’m still in touch – usually via social media – with some who took our B.Sc, many of whom now hold senior positions in medicine or are approaching retirement.

The second teaching commitment involved visiting students from Emory University in Atlanta, Georgia. Margot Jefferys was approached – she was then and remained the main link between medical sociologists in the UK and those in the USA – by Dick Levinson from Emory with a view to hosting a six-week summer programme in London on comparative health care. Aware that I had a young family, Margot recruited me in the mid-1970s for a fee that came in very useful, in fact funding our holidays for many years. The programme itself comprised lectures from local experts on the NHS, visits to healthcare facilities and 12-day placements in settings matched, in so far as that was possible, to individual students’ interests. Many of the students were ‘pre-meds’: that is, they were undertaking a first degree with a view to applying to medical school in the US on its completion, and for them placements in clinical environments were a useful addition to their CVs. The programme was to evolve quite rapidly. Not only was I able to recruit well-known academics, including experts like Brian Abel-Smith, Michael Marmot and Ann Cartwright, but the placements afforded the students remarkable opportunities to ‘get close to clinical action’, for example, by attending ward rounds and even surgery. I was to remain coordinator of the programme for 35 years, in the process making good and lasting friendships, most notably with Dick Levinson, Mike McQuaide, Terry Boswell, Karen Hegtvedt and their families. This led to several visits to Atlanta on the part of the Scamblers, including visiting professorships for Annette and me for a semester in 1998, courtesy of the then chair of the sociology department, Terry Boswell. I had the dubious pleasure of teaching classes normally taught by two very good and popular teachers, Dick at Emory and Mike at the Emory campus at Oxford College. It was an experience Annette and I appreciated and very much enjoyed, and I shall return to it as a learning experience later.
Tragically, Terry Boswell, an established scholar, Marxist and world system theorist, was later to suffer from motor neurone disease and to die prematurely aged only 40.

It is difficult to see how such teaching is possible now. In fact, the intercalated B.Sc was to die a natural death after 20 years or so; and the Emory summer programme, which still runs, was to transmute into something significantly more diluted during the rationalisation and corporate phases of sociology teaching in medical schools alluded to above. In sociological terms this process was part and parcel of a much broader tranche of social changes culminating in the displacement of welfare state capitalism by rentier capitalism. It is important for much of what follows in this account of the unfolding of my personal career in sociology to understand what this displacement consisted of. As a babyboomer born in 1948 I was a beneficiary of a British society re-shaped by the trauma of war and loss. The Beveridge Report, which in 1942 established the principles and laid the foundations for a comprehensive welfare regime, and Attlee’s subsequent election victory in 1945, announced the birth of a welfare state, allowing for the introduction of the National Health Service in the year of my birth. These initiatives, triggered by a mix of collective suffering, post-war ‘restlessness’ and political commitment, marked the beginning of an historically unusual phase of capitalism. Capitalism has a natural tendency to make the rich richer and the poor poorer unless steps are taken to mitigate this tendency. From the years immediately following the second world war until, say, the mid-1970s, such steps were falteringly taken by means of state interventionism. The result was a significant slowing of capitalism’s tendency to growing wealth and income inequality. A degree of consensus held between the Conservative (in office 1951-64, 1970-74), Labour (in office 1964-70, 1974-79) and Liberal Parties that adequate welfare and health systems and decent housing were prerequisites for a civilised society.
But by the 1960s the political scene was growing more unsettled, and it is reasonable to take the oil crisis of the mid-1970s as marking the halting beginnings of a transition to rentier capitalism. Margaret Thatcher was elected to office in 1979 and lent her considerable political weight to this shift.

As Brett Christophers has shown in detail in his excellent Rentier Capitalism: Who Owns the Economy and Who Pays For It?, published in 2020, the British economy has had a strong rentier element historically, though it is only since Thatcher’s time that this element has come to the fore; and it has done so more rapidly and decisively in Britain than in any other country. Rentier capitalism denotes a system oriented to and privileging ‘unearned income’ deriving purely from the ownership of assets, often in monopoly markets. What rentier capitalism has done, in effect, is release rentiers’ ability to make excess profits.

In a classic tome which hit the headlines in 2014, entitled Capital in the Twenty-First Century, the French economist Thomas Piketty sets the scene for us. He estimated that the return on assets (r) globally before tax has always been greater than the rate of economic growth (g), and that for most of the history of capitalism r after tax has also been greater than g, leading him to claim that, ceteris paribus, wealth inequality increases under capitalism. Unusually, during my early babyboomer years in welfare state capitalism after the second world war, g exceeded net (post-tax) r, in the process curbing inequality (largely through a combination of exceptional growth and progressive taxation policies). Both Christophers and Piketty insist that right-of-centre governments in the Thatcher mould have cosseted rentiers, even encouraging people to become rentiers by means of tax subsidies. Both have suggestions for remedying this, but those can wait for later sketches. For now it will suffice to lay the groundwork for grasping the welfare state/rentier switch, and this requires more to be said about related causes and effects of wealth and income inequality in rentier capitalism.
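The logic of Piketty’s r > g claim can be illustrated with a toy calculation. The sketch below uses entirely hypothetical numbers, not Piketty’s data: it simply assumes that wealth compounds at the net return r while incomes grow at the rate g, with returns fully reinvested, and shows that the wealth-to-income ratio rises whenever r exceeds g and falls when g exceeds r.

```python
# Toy illustration of Piketty's r > g dynamic (hypothetical figures, not his data).

def wealth_income_ratio(r, g, years, w0=3.0, y0=1.0):
    """Return the wealth-to-income ratio after `years`, assuming returns on
    wealth are fully reinvested and income grows at rate g."""
    wealth, income = w0, y0
    for _ in range(years):
        wealth *= 1 + r   # capital compounds at the net rate of return
        income *= 1 + g   # income grows at the rate of economic growth
    return wealth / income

# 'Rentier' conditions: net return 5%, growth 1.5% — the ratio climbs.
print(round(wealth_income_ratio(0.05, 0.015, 30), 1))  # → 8.3

# 'Post-war' conditions: net return 2%, growth 4% — the ratio falls.
print(round(wealth_income_ratio(0.02, 0.04, 30), 1))   # → 1.7
```

Starting from the same wealth-to-income ratio of 3, thirty years of r > g nearly trebles it, while thirty years of g > r cuts it almost in half: a crude but serviceable picture of the contrast between the rentier era and the post-war decades.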

In a useful table from their Paying for the Welfare State in the 21st Century, which saw the light of day in 2017, Dave Byrne and Sally Ruane set out the differing employment and taxation systems characterising welfare state (or industrial) capitalism and rentier (or post-industrial) capitalism. What the items in the right-hand column unambiguously reveal is an across-the-board deterioration in the circumstances of those most poorly positioned either to cope with or to contest policy changes that further disadvantaged them. I am reproducing this table here in recognition of its articulacy.


The Principal Differences in Employment Relations and Taxation Systems in Industrial and Post-industrial Capitalism

| Industrial employment relations and tax system | Post-industrial employment relations and taxation system |
| --- | --- |
| Keynesian/Beveridge mode of regulation | Post-industrial/consolidation state mode of regulation |
| Industrial workforce approaching half of total workforce | Primarily service sector workforce; industrial workforce less than 15% of total |
| Full employment with frictional unemployment | Disguised unemployment (eg extension of higher education; early retirement); underemployment |
| Job security and substantial worker rights | Flexible labour – spread of precarious employment, limited worker rights |
| Employer-borne risk and responsibilities to workforce | Transfer of risk to workers – use of zero hours contracts and forms of self-employment |
| Large public sector and devalorised labour | Declining public sector as proportion of all employment and recommodification of labour |
| Relatively high trade union membership | Low trade union membership |
| Relatively high wages | Lagging wages and spread of low wages; heavy reliance on wage subsidy |
| Strong protections for workers in public sector | Workers in public sector exposed to market competition |
| Status and protection for professionals | Extension of Fordism into professional work and proletarianisation |
| High top rates of income tax | Relatively low top rates of income tax |
| Relatively strong link between national insurance contributions and benefits received | Weak link between national insurance contributions and benefits received |
| Higher corporation tax rates | Lower corporation tax rates |
| Avoidance and evasion practices which do not catastrophically compromise the tax system | Avoidance and evasion practices which catastrophically compromise the tax system |
| Strong and independent tax collection authorities | Weakened tax collection authorities strongly influenced by corporate lobbying |


The rationale for addressing the transition from welfare state to rentier capitalism at this juncture is not only that I have lived through it, experiencing first-hand the admittedly gradual, stop-start moves from innovation to consolidation to rationalisation to corporate to neoliberal phases of the teaching of sociology to medical students, but also that it was in Thatcher’s 1980s that I began to turn more attention to social and sociological theory. As we shall see, this led eventually to a retheorising of the welfare state-to-rentier ‘trauma’; to the emergence of what I have come to call the ‘fractured society’; and to an analysis of the roots of this traumatic cluster of events in causal mechanisms residing in the social structural, cultural and agential strata. But I must not jump the gun.

