Sartre & Co.

Richard Ashcroft is a philosopher and ethicist. He is Professor of Bioethics in the School of Law at Queen Mary University of London. Here he reviews Sarah Bakewell’s book At the Existentialist Café: Freedom, Being, and Apricot Cocktails (Chatto & Windus, 2016) for the Cultural History of Philosophy Blog.

As a teenager I began to take an interest in philosophy for some of the usual reasons: uncertainty about the existence of God, doubt about the sort of person I was or wanted to be, puzzlement about my studies, utter confusion about sexuality. I took myself fairly regularly to the public library in search of enlightenment.

After getting bored by E. R. Emmet’s Pelican paperback, Learning to Philosophise – I really wasn’t that bothered by the existence of tables, but I was bothered that people might bother about that – I had fun with A. J. Ayer’s punk rock classic, Language, Truth and Logic, and then fell off the deep end into Nietzsche’s abyss through R. J. Hollingdale’s biography. Yet there was something a bit too challenging about Nietzsche. He was too elusive. Even the greatest hits (“God is dead…”) slipped through my fingers when I tried to pick them up and examine them. What Nietzsche did give you was a sort of borrowed dangerousness. Sticking a copy of Thus Spake Zarathustra in your pocket gave you instantly the air of an Intellectual, even if you didn’t know what it was on about, in part because no one else did either, but it gave everyone something to react to. Nietzsche would have something to say about this, no doubt.

Yet it was only when I encountered the Existentialists that I began to get a sense of what philosophy might really be and how one might practically do – indeed, live – it. My route into Existentialism was through Beckett, but I quickly moved into the main writings of Camus, Sartre, de Beauvoir and eventually Heidegger as I passed through my late teens and into my twenties. By the time I began to study philosophy (as part of History and Philosophy of Science) I had become aware that the Existentialists were rather out of fashion. Derrida and Foucault were now the names to drop, though of course they had their own debts to Existentialism. The professional philosophers I was taught by largely, though not exclusively, scorned this stuff (analytic rigour or Fenland parochialism? You decide). But in the wider circle of people who were interested in philosophy, who stuck paperbacks in their pockets and got into passionate and futile arguments in pubs and parties and over endless chocolate biscuits, the Existentialists were still current.

My reason for this excursion through memoir is to underline a thing which Sarah Bakewell’s study of the lives of the Existentialists highlights: the cultural importance of Sartre and company, and the autobiographical importance of these thinkers in the lives of many readers who grew up in the post-war period. Philosophy was current. People talked about it. People had fannish relations with philosophers; if you liked Sartre, you weren’t supposed to like Camus. For men particularly Simone de Beauvoir was the Yoko Ono of thought. The pop analogy is deliberate; for much of this coincides with the rise of pop and rock culture, and the emergence of the Teenager.



Much of the language of teenage self-fashioning and evaluation of pop trends is drawn from Existentialism – either directly or through writers such as Colin Wilson. Consider the lyrics of The Who, for instance, which are deeply engaged with questions of authenticity, honesty and truth in a very Sartrean vein; the same can be said of the Sex Pistols in a more refracted and distilled form.

Existentialism mediates between the ordinary developmental crisis of trying to become an adult person in one’s own right and the more specific crisis of doing so in a consumer society which both prioritises and pathologises individualism. Another feature of the postwar teenage experience is that “the kids” know something the adults don’t, that real invention and innovation come from youth and inexperience, and that the world as we find it is corrupt and needs to be overcome through youthful energy. Again, these are very Existentialist notions: finding one’s authentic project in a world into which we are thrown, but which we can remake on our own terms, not accepting the given rules as morally binding upon us, but treating them only as constraints to be overcome.

Bakewell’s book is terrific – beautifully written, and elegant in its precise and concise portraits of the leading figures in the European Existentialist movement, their engagements – with thought and with each other – and the historical circumstances through which they moved. She is very fair to her cast, but does admit to her preferences. She is acute and tough-minded when it comes to her appraisals of their various political engagements (she’s especially good on Heidegger, but the arguments between Camus, Merleau-Ponty and Sartre are also well treated). One thing which appeals to Bakewell (and to me) is the relative prominence of women as Existentialist philosophers; arguably the most abiding influence of Existentialist philosophy as such is in feminism. I must admit that the only work of the Existentialists I would now want to go back and re-read is The Second Sex.

Writing and publishing a popular book about philosophers who were (are?) popular calls for comment in its own right. Bakewell’s book sits alongside Andy Martin’s The Boxer and the Goalkeeper: Sartre vs Camus as a popular exposition of the ideas and lives of the Existentialists. It also sits alongside Bakewell’s study of Montaigne, How To Live: A Life of Montaigne in One Question and Twenty Attempts at an Answer. Stuart Jeffries has just published a group biography of those German humourists, the Frankfurt School (Grand Hotel Abyss). And of course any visit to a bookshop will find a section on philosophy, much of which is devoted to a few Penguin classics, some popularisations of particular philosophers’ works, the ever-expanding “School of Life” series, and the omnipresent Alain de Botton.

The standard philosopher’s view of all this might be that most of these are not “real” philosophy, being neither rigorous academic texts nor much connected to current research in the field. The standard non-philosopher’s view would be that this is so much the worse for academic philosophy. I think reading Bakewell allows a more nuanced view to emerge. It shows that the “academic” and the “popular”, and indeed the “text” and the “life”, can come together, but that it takes a rather specific historical conjuncture for this to happen. Crudely put: while popularisation succeeds because there is a felt want for some kinds of “teaching” about life and its meanings and purposes which is never wholly out of fashion, it takes a lot more for work in professional philosophy (inside the academy or elsewhere) to become popular in its own right.


In the case of Sartre and company, they achieved a certain level of “cool” at a time when “being cool” was coming into focus; they danced, they drank, they published, they fought, and, in due course, they became so current and recognisable that Tony Hancock could make a film satirising and admiring them and their fans and Monty Python could do a skit in which two of their redoubtable Pepperpot ladies called on Sartre in Paris to ask him about a point of metaphysics, and millions of viewers would be in on the joke.

Some of this is probably mere historical happenstance. But the hook which enables this popularity is the engagement of this philosophy with the things of life itself – you don’t just philosophise and then go dancing. You philosophise about the dancing. And you dance philosophically.

Follow Richard Ashcroft on Twitter: @qmulbioethics

Read more about Existentialism on the Cultural History of Philosophy Blog


Edward Caddy took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post he writes about ‘existentialism’ – one of the most widely used of all philosophical keywords.

Benedict Cumberbatch performs in Director Lyndsey Turner’s production of Hamlet at the Barbican, in London. Johan Persson / Reuters

“To be, or not to be: that is the question…” As a youth, sitting in class on a hazy summer afternoon, I didn’t wholly understand the question. In fact, I very much doubt anyone in that class did – no offence meant, Mr Williams. Shakespeare’s famous soliloquy must surely be the Western world’s best-known expression of an individual in existential crisis. Martin Heidegger’s Being and Time, published in 1927, was the first philosophical work to ask after ‘the essence of what it means to be human’. For Heidegger, as for Prince Hamlet, the only question a human must answer is whether to live an authentic life or not – what it means to be human is evidenced by our everyday existence. Fortunately for most of us, Hamlet’s reservation remains one to be tackled in fiction and fantasy. Existentialist philosophers, however, share a common concern with Hamlet, asking questions such as: Why am I here? What does it mean to be human? How should I best live my life?

‘Existential’ is recorded as having first entered the English language in 1656. The term was drawn from 4th-century post-classical Latin – existentialis – meaning ‘coming into being’. Up until the mid-20th century, ‘existential’ was used primarily to refer to existence; it held no philosophical definition as such. Unsurprisingly, the term found its roots outside the relatively modern field of existential philosophy. For instance, in the field of psychology, ‘existentialist’ found use as early as 1929 to describe an advocate of an approach to the study of consciousness based on the introspective analysis of experience into its elements. It was not until the mid-1940s that the French Catholic philosopher Gabriel Marcel coined the term ‘existentialism’ to refer to the emerging philosophical approach.[1]

Clearly defining existentialism has proven difficult, not least because of profound doctrinal differences between existentialism’s principal philosophers. The general consensus has been to consider existentialism not as a philosophical system or rigid set of doctrines, but rather as a philosophical movement whose primary interest is the problem of human existence.

Despite existentialism rising to prominence in the mid-20th century, it is two thinkers from the 19th century who are considered the founding fathers of the movement. Søren Kierkegaard and Friedrich Nietzsche, though they never used the term ‘existentialist’, were the first to focus on the subjective human experience and the apparent meaninglessness of life. It was in the context of the emerging sense that science – which had made the world a mechanism governed by irrefutable natural laws – had drained any external or transcendent source of meaning, purpose or value from the universe, that Kierkegaard and Nietzsche began to look again at what it meant to be human, and how being human could be something valuable and useful. To overcome this dilemma, Kierkegaard’s knight of faith and Nietzsche’s Übermensch each defined the nature of their own existence. In this way, Kierkegaard and Nietzsche can be considered the precursors to the philosophies that would define the movement.

“Shadow Play” 1961

It was in the years following the Second World War that existentialism became a significant philosophical and cultural movement, with influential existentialists such as Albert Camus, Simone de Beauvoir, Martin Heidegger, Maurice Merleau-Ponty and, perhaps most notably, Jean-Paul Sartre. The Second World War focused attention on the kind of moral dilemma one faces without any absolutes to rest on – in a sense, the war presented individuals with a sharper form of perennial dilemmas. This period was pivotal for the transmission of existential philosophy to the public sphere; Beauvoir wrote that “not a week passed without the newspapers discussing us”;[2] existentialism became “the first media craze of the postwar era.”[3]

“[Existentialism] is an attitude that recognises the unresolvable confusion of the human world, yet resists the all-too-human temptation to resolve the confusion by grasping toward whatever appears or can be made to appear firm or familiar… The existential attitude begins with a disoriented individual facing a confused world that he cannot accept.” (From Hegel to Existentialism, Robert Solomon)

Solomon’s explanation goes some way to expressing the existentialist’s concern with the ‘human condition’ and the questions they sought to answer. Existential thinkers have differed widely in their evaluations of the ‘human condition’, which only adds to the difficulty of defining the keyword, let alone the movement.

Although existentialist thinking became more prominent, the term managed to retain, or indeed entrench further, its fluidity. As Solomon noted, the individual’s starting point is “the existential attitude”: disorientation and confusion in the face of an apparently meaningless world. For many, this disorientation only reinforced the appeal of pre-existing, all-encompassing systems or theories that purported to provide answers to the meaning and purpose of human life – religious and philosophical systems that remove the massive burden one faces in trying to create meaning and purpose for oneself in a unique and personal manner. Existential thinking is not fixed thinking.

The lack of clearly defined parameters has served to increase the accessibility and appeal of existentialism – in the wider realm of literature, theatre, film and television, ideals can be picked and chosen at will. Fyodor Dostoyevsky was the first to use literature to describe the existential condition, in his Notes from Underground. Dostoyevsky describes the underground man confronting an awful solitude and the lack of meaning and value in the universe. The protagonist, confronted with a confused world, has to look to himself to find meaning.

Dostoyevsky’s novels spawned influence in the worlds of film and television, art, literature and theatre. Jean Genet’s 1950 erotic fantasy Un chant d’amour attempts to convey the bleakness of human existence in a godless universe. More recently, The Matrix, Monty Python’s The Meaning of Life and even Toy Story, although not explicitly existentialist, tackle ideas of self, identity, freedom and authenticity. Whilst this creates difficulty in assessing the history of existentialism as a keyword, it goes some way to showing the importance of existentialism in areas where it is not traditionally acknowledged. Existential philosophy will affect almost everybody at some point in their lives, whether they are conscious of it or not.


Much like myself growing up in the 21st century, young people right across the 1950s, ’60s and ’70s were greatly influenced by existentialism – not immediately by the novels and writings of Sartre or Camus, but by a series of popularisations that were extraordinarily widely read: books by John Macquarrie, Lynne Olson, and Walter Kaufmann. The Spectator Archive reflects the rise of existentialism in the public sphere from the 1960s onwards. The works of these prolific authors served to distill and impress upon adolescents the essence of the existential outlook. Sarah Bakewell’s recent article highlights how existential thinking is still relevant to us today, going so far as to say that it offers a number of unique benefits in the realms of personal life, relationships and productivity. In an increasingly definition-rejecting age that seeks to define itself by the self rather than by external measures or values, the ideals of existentialism distilled through theatre, film and literature are more important than ever.

Despite the apparent morbidity of existentialism, there is an optimism to it. The first response to questioning the purpose or value of a life that holds no external absolutes is one of despair. However, when the existential thinker confronts an absurd or confused world and asks ‘Why am I here?’, the answer is one of positivity and, almost, enlightenment. The essence of existentialism is that human beings are able to transcend the futility of life through honesty and bravery in the face of an absurd world, advocating freedom, individuality and responsibility. To be one’s self is to be authentic – is to be an authentic human being.

“Rashomon” 1950

[1] Thomas R. Flynn, Existentialism: A Very Short Introduction (Oxford University Press, 2006), p. 89.

[2] Simone de Beauvoir, Force of Circumstance, quoted in Ronald Aronson, Camus and Sartre (University of Chicago Press, 2004), p. 48.

[3] Ronald Aronson, Camus and Sartre (University of Chicago Press, 2004), p. 48.


Connie Thomas took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘veganism’ as a keyword in philosophy, ethics, and celebrity culture.

It was the day after the 2013 Super Bowl when my 26-year-old steak-loving, Mulberry-wearing sister declared at the family dinner table that she was embracing Veganism. To say we were all shocked would certainly be an understatement. While she has never shown any particular dislike of animals, it seemed unlikely to me that she’d ever be willing to give up her leather handbag collection in their cause. Unsurprisingly, she wasn’t. Instead, when I asked her what her motivations were for undertaking such a massive change, expecting a barrage of moral and ethical arguments in return, her response was simply, “because, Beyoncé!”. Indeed, it was not an adoration of animals, but a 15-minute Super Bowl half-time performance by her favourite vegan celebrity which drew my sister to Veganism. Though this example of Veganism seems somewhat unconventional, it is certainly not exceptional in modern society. While most -isms exist primarily within the realms of philosophy and ethics, Veganism has made an interesting transition through to popular culture and lifestyle, greatly influencing its perception as both a word and a concept.


Two examples of self confessed “Beygans”: Twitter users who converted to Veganism solely to imitate Beyonce.

The term Veganism was coined in 1944 by Donald Watson, an English animal rights activist, to distinguish non-dairy vegetarians from ordinary vegetarians. Following a lengthy discussion of other possible words, Watson settled on Veganism as a contraction of the term Vegetarianism. Formed from the first and last letters of Vegetarianism, the word, according to Watson, matched its definition: “the beginning and end of vegetarianism”. [1] The first written use of the word was in Watson’s newsletter, ‘The Vegan News’, also published in 1944, in conjunction with the creation of the Vegan Society in England. However, the initial definition of the word, simply a concise form of ‘non-dairy vegetarian’, was revised in 1949 by the Vegan Society into a more explicit form. It rejected the definition of Veganism as mere abstinence from animal produce in food, and instead sought “an end to the use of animals by man for food, commodities, work, hunting, vivisection, and by all other uses involving exploitation of animal life by man”. [2] This distinction is very significant, as it marks the first use of Veganism to define a philosophical principle. From this point, Veganism became more than just avoiding certain foods: it became a concept of animal conservation and protection.

Whilst this definition of Veganism provided by the Vegan Society in 1949 remains the sole ‘official’ one, according to the Oxford English Dictionary, the term has since been interpreted in several different ways, both in theory and in practice. One of the most notable philosophers to wade in on the debate is Peter Singer, an Australian moral philosopher. Though Singer specializes in applied ethics across a broad spectrum of social issues, his written work on animal rights in the 1970s is of particular note, and has been widely regarded as the central ideology behind Veganism. In his renowned book, Animal Liberation (1975), Singer bases his philosophy of Veganism upon the concept of ‘speciesism’: human prejudice against and discrimination towards an animal due to its species. [3] According to Singer, humans frequently judge and treat other animals solely on the basis of intelligence, an arbitrary scale that initiates inequality. Thus, in this case, Veganism denotes more than just the avoidance of animal produce, taking in the entire ethos surrounding human interaction with other animals.

Animal Liberation (1975)

Whilst these uses of the term Veganism seem relatively moderate, not all interpretations of the word are quite so peaceful. Radical animal rights activist Gary Yourofsky has more recently acted as a proponent of Veganism, working frequently with PETA to give lectures promoting the cause, such as his viral “Best Speech You Will Ever Hear”. [4] Although Yourofsky shares similar sentiments with Singer, particularly on speciesism, he places far more emphasis on Veganism as active protest, controversially linking the term to ‘eco-terrorism’. [5] Although the vast majority of philosophers and activists undertake Veganism as a means of passively protesting for animal rights, for Yourofsky violent action is a necessary connotation of the term. In his mission to promote Veganism, he has famously expressed ‘unequivocal support’ for the deaths of medical researchers in arson attacks, and once declared that, “Every woman ensconced in fur deserves a rape so vicious that it scars her forever”.[6] Such radical opinions, whilst confined to a tiny minority of Vegan activists, have generated significant discussion surrounding Veganism, attaching several negative connotations to it.

Since the turn of the 21st century, the term Veganism has exploded in the English vocabulary, increasing rapidly in use despite the relative decline of the term vegetarianism. Significantly, the term has expanded not only in frequency of use, but also in context and content. Although the philosophical link to animal rights remains attached to Veganism, the word is now associated with a broader cultural movement and lifestyle. The adoption of Veganism by famous figures, possibly in conjunction with the rise of modern celebrity culture, has generated mass discussion of the term and seen the number of people subscribing to it grow. In an article listing reasons to convert to Veganism, PETA includes at number 8, “All the cool kids are doing it!”, before listing vegan celebrities to whom the public should aspire. [7] Whilst this probably tells us more about celebrity culture than it does about Veganism, the effect of popular culture on the ‘fashionable’ use of the term is undeniable, even when dissociated from its original philosophy. For example, Ariana Grande and Miley Cyrus, both incredibly influential celebrity figures and self-proclaimed followers of Veganism, controversially released lipstick collections with the make-up company M.A.C. despite its poor reputation on animal rights. This exemplifies the place of Veganism in modern culture as a tool to gain popularity and seem ‘fashionable’, as opposed to an intellectual movement.

While Veganism as a term has existed philosophically for decades, its evolution into both a tool for violent activism and a feature of celebrity culture has dominated its use in modern society. Indeed, as my sister has so aptly illustrated, association with Veganism doesn’t necessarily equate to an association with animal rights or even food avoidance, but rather with a particular celebrity or fan base. Although we can safely assume the majority of people link Veganism to its philosophical roots in theory, it would be misguided to assume those undertaking it in practice do the same. In the 2010s, it now appears the term Veganism says more about a person’s popularity and culture than about their morality and ethics.


Have the British finally learnt how to express their emotions?

Jenny Chowdhury took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘Emotion’ as a philosophical keyword, especially in the context of British culture and history.

Can we confidently admit to knowing what ‘emotion’ means? If someone asked you about its definition, what would you say? You could give examples of it such as happiness, sadness and anger. But what does it mean and how do we understand the term today?

Our use of the term ‘emotion’ today is connected to the use of emoticons on social media. Not only do we use them on a day-to-day basis, but I want to argue that the younger generation, especially, is overusing them to the extent that we no longer genuinely feel the emotions that we portray on social media. Last year, Oxford Dictionaries’ Word of the Year was an emoji. The use of emoticons/emojis is clearly taking over our use of language! This is an interesting development, linking to the study of the philosophy of emotion, which covers a number of different fields and makes the definition even harder to grasp.

To give you a quick history of emoticons and where they came from, I came across an interesting article by Tim Slavin. Slavin states that ‘the modern history of emoticons grew out of an interesting side effect of technology: typed messages on a computer screen appear neutral and can be difficult to translate emotionally.’[i] Amy-Mae Elliott describes how a satirical magazine called Puck used emoticons a hundred years ago when writing about passion and emotions. Some argue that Scott Fahlman, a computer scientist, was the first to use symbols to convey emotions through text. Our increasing reliance on emoticons when communicating with our friends indicates a new level of self-expression. The term ‘emotion’ appears in a number of discourses, such as science, philosophy, popular culture and psychology.


Whatsapp Emoticons

The key thinkers behind the philosophy of ‘emotion’ in Britain were Thomas Brown and Charles Bell in the 1800s. More recently, Thomas Dixon and Fay Bound Alberti have researched the history of emotion. Dixon concentrates on how the term has been in crisis since its connection with psychology in the nineteenth century.

The Greeks used pathos where we might use emotion: ‘that which happens to a person or thing.’[ii] ‘Emotion’ derives from the French word ‘émotion’, which in the seventeenth century meant physical disturbance and bodily movement. The term was first used in English by John Florio, a translator of Michel de Montaigne’s essays. During the eighteenth century the definition of ‘emotion’ moved from bodily movements to mental states and instinctive feelings such as pleasure and grief, and the term received close attention as it became connected to mental experiences. Its theoretical use was shaped by Thomas Brown, Charles Bell and William James in the nineteenth century. Bell’s Anatomy of Expression focused on the artistic portrayal of emotion.


Charles Bell, Essays on the anatomy of expression in painting (London: Longman, 1806), p. 142 – Wonder / Fear / Astonishment.
Credit: Wellcome Library, London. Wellcome Images. CC BY 4.0.

In 1836, William Whewell observed that his linking of the desires of human nature to the emotions had not been accepted. The history of ‘emotion’ dates back to the time of the Stoics, who believed that emotions were diseases of the soul which could only be cured through reason. For Thomas Aquinas, by contrast, passions and affections belonged to different parts of the soul: the sense appetite and the intellectual appetite.[iii]

During the eighteenth century, there was a move towards sensibility alongside the passions. Human feelings were categorised as either a violent type of ‘passion’ or a milder ‘moral sentiment’.[iv] In the nineteenth century Thomas Brown invented a new category, gathering the older connotations such as passions and affections under the single heading of ‘emotions’. This new category belonged to the science of the mind, and ‘emotion’ was used to understand feelings, pleasures, affections, etc.[v] Brown’s definition of emotions was that ‘they may be defined to be vivid feelings, arising immediately from the consideration of objects, perceived, or remembered, or imagined, or from other prior emotions.’[vi]

Charles Bell is an important figure as he linked ‘emotion’ to a movement of the mind, whereby affections of the mind were made visible through signs on the face or body. Bell and Brown differed over what constituted an emotion: whether it was primarily mental or bodily. Discussions still take place over whether the seat of the emotions lies in the heart or the brain.[vii]

Charles Darwin was interested in Charles Bell’s illustrations in his Anatomy and Philosophy of Expression. However, he noticed that Bell did not explain why different muscles are used for different emotions – for example, the arching of eyebrows and the expressions of the mouth. As Darwin put it: ‘The movements of expression give vividness and energy to our spoken words. They reveal the thoughts and intentions of others more truly than do words, which may be falsified.’[viii]

William James wrote an article called ‘What Is an Emotion?’ in 1884, which opened the question up to the public; still to this day there is no definitive meaning behind ‘emotion’. James concluded that emotions were mental feelings brought about by the perception of an object in the world. However, this was not universally accepted. Dixon’s article, ‘“Emotion”: The History of a Keyword in Crisis’, puts forward an interesting case for the importance of keywords, which serve as mirrors of and motors for social and intellectual change. Concepts are created by giving new meanings to words that are already in our vocabulary.

Britain was known for its ‘stiff upper lip’ from the death of Charles Dickens in 1870 until the death of Winston Churchill in 1965; people of this period did not show their emotions. Dixon covers the gendered nature of tears in Weeping Britannia: Portrait of a Nation in Tears: ‘The idea that there was something feminine about tears was never entirely erased.’[ix] Ute Frevert argues that gender ‘“naturalised” emotions while at the same time connecting them to distinct social practices and performances.’[x] Both genders had emotions, but they were held to differ in intensity. She argues that ‘emotions, above all social or “relational” emotions, are deeply cultural.’[xi] ‘When humans label their own feelings, those labels begin to give their feelings shape and direction. This is what culture and language do for and to us.’[xii]

In the period after the Second World War, a group of ‘angry young men’ emerged in popular culture: British novelists and playwrights expressing their dissatisfaction with the government and the class system. John Osborne’s play, Look Back in Anger, represented this new movement. The focus was on the lives of the working class and their typical daily tasks, and it shocked audiences; kitchen-sink realism was a developing cultural movement in the late 1950s.

A scene from John Osborne’s Look Back in Anger (1959). The constant props of tea-sets on set and the look of anger on the main character, Jimmy, embody the angry young man movement.

Kleenex® commissioned SIRC to find out how British people felt about displaying emotions publicly. This emphasises the change in attitudes from a previous ‘stiff upper lip’ stance, where it was shameful to show any type of emotion, especially for men. The questions that arise are: is it acceptable to cry, and is there a gendered aspect to it? Should emotions be kept private? The report discusses reality shows such as X Factor and Big Brother, which publicise a range of different emotions. How do audiences respond to the extent of contestants’ emotional journeys? There have been studies which conclude that letting out pain and relief through tears works towards improving your well-being, so why is there such a taboo over being too emotional? Should we break out of the conventional view and be able to fully express how we feel in the world that we live in today? The report states that ‘71% of women have “let it out” in the past 6 months by crying compared with 28% of men.’[xiii]

‘We are better at talking about our emotions than in previous generations.’[xiv]

Do you agree with this statement, and if you do, why? Do you think Britain has finally given in to emotions?

I typed ‘emotion’ into YouTube and a number of songs were listed. I have chosen a recent song for this post for you to ponder. The chorus focuses on emotion, and the intensity of feeling can be experienced through her voice. Have a listen and let your emotions run wild.

Word cloud image of this blog-post. A visual image to accompany this post to see what really encapsulates the term 'emotion'.



[i] Tim Slavin, ‘The History of Emoticons’, Off Beat (May 2014) <> [accessed February 2016].

[ii] A. W. Price, ‘Chapter 5: Emotions in Plato and Aristotle’, The Oxford Handbook of Philosophy of Emotion ed. Peter Goldie (Oxford: Oxford University Press, 2010), p. 121.

[iii] Thomas Dixon, ‘“Emotion”: The History of a Keyword in Crisis’, Emotion Review, Vol. 4, No. 4 (October 2012), p. 339.

[iv] Ibid., p. 339.

[v] Ibid., p. 340.

[vi] Thomas Brown, Thomas Brown: Selected Philosophical Writings, ed. T. Dixon (Exeter: Imprint Academic, 1820/2010), pp. 145-6.

[vii] Fay Bound Alberti, Matters of the Heart: History, medicine, emotion (Oxford: Oxford University Press, 2010).

[viii] Charles Darwin, The Expression of the Emotions in Man and Animals (London: John Murray, 1872), p. 366.

[ix] Thomas Dixon, Weeping Britannia: Portrait of a Nation in Tears (Oxford: Oxford University Press, 2015), p. 98.

[x] Ute Frevert, Emotions in History – Lost and Found (Budapest: Central European University Press, 2011), p. 11.

[xi] Ibid., p. 211.

[xii] James M. Jasper, ‘Emotions and Social Movements: Twenty Years of Theory and Research’, Annual Review of Sociology, Vol. 37 (April 2011), p. 298.

[xiii] ‘Britain: A nation of emotion?’, Social Issues Research Centre (January 2007).

[xiv] Ibid.

Further Reading

‘Angry Young Men’, Encyclopaedia Britannica <>.

Fay Bound Alberti, Matters of the Heart: History, medicine, emotion (Oxford: Oxford University Press, 2010).

‘Britain: A nation of emotion?’, Social Issues Research Centre (January 2007).

Thomas Brown, Thomas Brown: Selected philosophical writings ed. T. Dixon (Exeter: Imprint Academic, 1820/2010).

Thomas Dixon, ‘“Emotion”: The History of a Keyword in Crisis’, Emotion Review, Vol. 4, No. 4 (October 2012), pp. 338-344.

Thomas Dixon, Weeping Britannia: Portrait of a Nation in Tears (Oxford: Oxford University Press, 2015).

Amy-Mae Elliott, ‘A Brief History of the Emoticon’, Mashable (September 2011).

Emotion Review <>.

Ute Frevert, Emotions in History – Lost and Found (Budapest: Central European University Press, 2011).

S. Jacyna, ‘Bell, Sir Charles (1774–1842)’, Oxford Dictionary of National Biography (Oxford: Oxford University Press, 2004; online edn, Jan 2008).

James M. Jasper, ‘Emotions and Social Movements: Twenty Years of Theory and Research’, Annual Review of Sociology, Vol. 37 (April 2011), pp. 285-303.

‘emotion, n.’, OED Online (Oxford: Oxford University Press, February 2016).

A. W. Price, ‘Chapter 5: Emotions in Plato and Aristotle’, The Oxford Handbook of Philosophy of Emotion, ed. Peter Goldie (Oxford: Oxford University Press, 2010).

Tim Slavin, ‘The History of Emoticons’, Off Beat (May 2014).




Zena Gainsbury took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘utilitarian’ as a keyword, from John Stuart Mill to modern fashion magazines….

(Courtesy: ImaxTree) 2014 Paris Fashion Week: Utilitarian style by (Left-Right) H&M, Isabel Marant, Balenciaga, Balmain, and Lanvin.

When flicking through Grazia magazine on a Tuesday evening (I receive the joy of this subscription in my postbox weekly), nothing puzzles me more than the use of ‘utilitarian’ against a backdrop of khaki, pockets, and wrap belts. Between reading about Taylor Swift’s squad and the trivialities of Harry Styles’ love life, I am more than surprised to see a nod towards the utilitarians. I swiftly imagine J.S. Mill, Grazia in hand, fighting for the ‘Mind the Pay Gap’ campaign outside the Houses of Parliament alongside the likes of Gemma Arterton. Perhaps the use of ‘utilitarian’ amongst celebrity gossip is a product of the evolution of women’s magazines like Grazia, which now, rather than printing stories focused merely on the latest popstar, include informed cultural pieces. But as I scan my Grazia for a ‘normal’ use of the word ‘utilitarian’ amongst these more serious pieces on women’s rights and the plight of Syrian refugees, the only use of ‘utilitarian’ I find is to connote a fashion trend. Such a mainstream use of ‘utilitarian’ in British popular culture seems ultimately surprising: severing the word from the white middle-class figures of Jeremy Bentham and J.S. Mill seems somewhat impossible. How, then, has the fashion industry appropriated its use to connote something which appears to diverge completely from the utilitarians’ intentions?


(Courtesy of Wikipedia) The utilitarian and not-so-fashion-conscious J.S. Mill.

Although the image of J.S. Mill as utilitarian philosopher and fashion darling is an amusing and somehow pleasing one, this isn’t the usual picture that the word ‘utilitarian’ (dating back to 1781, when it was used to describe Jeremy Bentham himself) evokes in philosophical uses as either adjective or noun. As a noun, ‘utilitarian’ refers to ‘one who holds, advocates, or supports the doctrine of utilitarianism; one who considers utility the standard of whatever is good for man; also, a person devoted to mere utility or material interests.’[1] When used as an adjective, the earliest usage is again attributed to Jeremy Bentham, defined as ‘Consisting in or based upon utility; that regards the greatest good or happiness of the greatest number as the chief consideration or rule of morality.’ From these definitions, however, we do not gather a complete sense of what it is, or was, to be a ‘utilitarian’.

To really grasp what it is to be, or to be described as, a utilitarian in pre-twentieth-century Britain, it is best to consult the literature of the utilitarians themselves. Bentham formulated this ethical system in a number of his life’s works, approving or disapproving of any action according to whether it augments or diminishes the influence of ‘the two sovereign masters’, pain and pleasure.[2] ‘Utility’ itself, for Bentham, refers to the ‘property’ in any object which produces any of the following: advantage, benefit, pleasure, good, happiness or the prevention of pain, in regards to the interests of the individual or the interests of the whole community. The interest of the community is calculated by taking the ‘sum’ of the interests of its members. Bentham argues the case for his formulation of utilitarianism by asking: is there any other motive a man can have which is distinguished from the motive of utility? Perhaps there isn’t, but defining ‘utility’ becomes problematic, and many condemn Bentham’s approach as merely exemplifying hedonistic principles.


(Courtesy: Existential Comics) ‘The Auto-Icon’: the utilitarian Bentham and the ‘faults’ of Utilitarianism.

The hedonism in Bentham’s approach was critiqued and redeveloped by his utilitarian successor J.S. Mill. Mill, who grew up under the influence of Bentham, criticised him for having ‘neglected character in his ethics’, for a focus on ‘self-interest’ and for a formidable lack of ‘inspiration’, faults which Mill attempted to remedy in his own formulation of the utilitarian theory. Namely, Mill’s utilitarianism differs from Bentham’s in its emphasis on quality over quantity in regards to pleasure:

‘It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied.’[3]

Bentham’s utilitarianism was criticised as debasing the object of human life to nothing but pleasure: ‘a doctrine worthy of swine.’ Mill tackles this criticism of Bentham by arguing that a beast’s pleasures do not satisfy a human being’s conception of happiness; he goes so far as to argue that ‘few human creatures would consent to be changed into any of the lower animals, for a promise of the fullest allowance of a beast’s pleasures’, thus making a distinction between human and animal pleasures. Mill justifies this by suggesting that the possession of higher faculties in humans means a greater measure is required for happiness and fulfilment, and as such, the fulfilment of a beast’s pleasures provides a human no fulfilment, or pleasure, at all.


(Courtesy: Grazia 29th Feb 2016) ‘Fashion Utilitarianism’ in magazines included in a report on London Fashion Week.

In twentieth- and twenty-first-century philosophy this is problematic: Mill’s reference to ‘lower pleasures’ ridicules those with lower mental capacities, whom Peter Singer infers to include ‘a person with an intellectual disability.’[4] It is interesting here to go back to what I will call ‘Fashion Utilitarianism’: at which end of the scale would Mill situate it? Looking at the catwalks of Paris, Madrid or London, maybe these would be considered ‘higher pleasures’ alongside other privileged activities such as opera and theatre. What about high-street ‘Fashion Utilitarianism’? I imagine that Primark sits at the bottom of Mill’s scale. Or is the consideration and pleasure taken in choosing agreeable clothing, whether it is off an Essex high street or hot off the Versace catwalk, a higher pleasure in itself? The problematic subjectivity inherent in both Mill’s and Bentham’s utilitarianism is clear here.

Peter Singer’s approach to utilitarianism is informed by his vegetarianism and animal rights activism, which makes his ‘preference’ utilitarianism more inclusive. A case that demonstrates this inclusiveness in Singer’s ethics appears in his article ‘Famine, Affluence and Morality’ (1972).[5] Singer, writing after the cyclone in Bangladesh in 1971, argued that physical proximity should not be a factor when establishing one’s moral obligations to others. It makes no difference, he argues, whether he helps his neighbour or someone whose name he will never know, in essence collapsing the distinction between duty and charity. Hence, Singer’s utilitarian approach proposes that any act becomes a duty if it will either prevent more pain than it causes or cause more happiness than it prevents. But how can this apply to fashion? It seems as if ‘Fashion Utilitarianism’ has strayed even further from Mill’s, Bentham’s and Singer’s definitions of a utilitarian ethics.


(Courtesy of ASOS) – Fashion Utilitarianism and Utility as shown on one of the leading fashion retailer’s site.

The only thing that I can see relating to these khaki-adorned, boiler-suited and pocket-embellished page spreads is the word ‘utility’, from which ‘utilitarianism’ and ‘utilitarian’ derive. The other common factor that arises when looking through these glossies is the intertwined and juxtaposed use of the words ‘utilitarian’ and ‘military’ to describe a fashion trend which seems leaps and bounds away from the ethical theory proposed by Jeremy Bentham, and later J.S. Mill. Magazine writers flit between ‘utilitarian’ and ‘military’ to describe a look, two words that, when thinking about the principle of utility as the minimisation of pain and maximisation of pleasure, seem exceptionally polarised. Maybe if we go back to the definition of the word ‘utility’, this use in the fashion industry may become clearer:

‘The fact, character, or quality of being useful or serviceable; fitness for some desirable purpose or valuable end; usefulness, serviceableness.’[6]

Here I can find links between the uses of ‘utility’ and ‘military’ which relate to the fashion world and illuminate the meaning of ‘Fashion Utilitarianism’. The words ‘serviceable’ and ‘usefulness’ stand out in the definition of ‘utility’. Serviceableness as the ‘utility’ of a garment links to the use of ‘military’ interchangeably with ‘utility’ or ‘utilitarian’: what we’re really looking at is a fashion which is easy to maintain, whether this be the look or the garment itself (which I presume is how ‘military’ comes in). ‘Usefulness’ has a much broader spectrum of connotations than utilitarianism and hence resonates much more with the fashions and designers described as ‘utilitarian’. It implies a practical look, and finally those oversized pockets seem to make sense. ‘Fashion Utilitarianism’ seems to have adopted the language of the utilitarians but arguably not the semantics. The evolution of the word ‘utility’ resonates with words like ‘wearability’ and ‘practicality’, which relate more obviously to fashion and hence underline how the word ‘utilitarian’ has evolved in modern use. And as for the pockets, their existence on the garment is beneficial to the wearer and serves a purpose: how very utilitarian.


[1] “utilitarian, n. and adj.” OED Online. Oxford University Press, December 2015. Web. 28 February 2016.

[2] Jeremy Bentham, ‘An Introduction to the Principles of Morals and Legislation’ in Utilitarianism and Other Essays, ed. Alan Ryan (London: Penguin Classics, 1987) p. 65.

[3] J.S. Mill, ‘Utilitarianism’ in Utilitarianism and Other Essays, ed. Alan Ryan (London: Penguin Classics, 1987) p. 281.

[4] Peter Singer, Practical Ethics (Cambridge: Cambridge University Press, 1993) p. 108.

[5] Peter Singer, ‘Famine, Affluence, and Morality’, Philosophy and Public Affairs, Vol. 1, No. 3 (Spring 1972), pp. 229-243 [revised edition] <—-.htm> [accessed: 26/02/2016].

[6] “utility, n.” OED Online. Oxford University Press, December 2015. Web. 29 February 2016.

Further Reading:

Existential Comics:

In Our Time, ‘Utilitarianism’ episode:

Jeremy Bentham, ‘An Introduction to the Principles of Morals and Legislation’ in Utilitarianism and Other Essays, ed. Alan Ryan (London: Penguin Classics, 1987)

Katie Davidson, ‘Army Green Marches Down the Runway at Paris Fashion Week’ – /jqtgypb2__3/Army+Green+Marches+Down+Runway+Paris+Fashion

J.S. Mill, ‘On Liberty’ in On Liberty and other writings (Cambridge: Cambridge University Press, 1989)

J.S. Mill, ‘Utilitarianism’ in Utilitarianism and Other Essays, ed. Alan Ryan (London: Penguin Classics, 1987)

Peter Singer, Practical Ethics (Cambridge: Cambridge University Press, 1993)

Peter Singer, Rethinking Life and Death: The Collapse of our Traditional Ethics (Oxford: Oxford University Press, 1995)


Poppy Waring took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘egoism’ as a philosophical keyword.

Egoism is hardly a word that you’re likely to come across in your day-to-day life. In fact, I’d happily put bets on a high percentage of people having no definition to hand. Despite this, it’s also not exactly hard to grasp roughly what it concerns. The ‘ego’ is the self, and as egoism stems from this word, it encapsulates all philosophies linked to self-interest. These days it has also become linked to hedonism (as the press release for the egoistmobile ‘The Egoist’ tells us), probably due to the world’s continued confusion between words that differ by only a few letters.

Something which became clear when I started my research for this blog was that there is a LOT of confusion regarding the differences between egoism and egotism. So, to clarify, here are the differences:

  • The definition of egotism, according to the Oxford English Dictionary, is ‘The obtrusive or too frequent use of the pronoun of the first person singular: hence the practice of talking about oneself or one’s doings.’
  • So egoism is more abstract, a course of actions, whereas egotism requires a personal involvement to be discussed.
  • Egoism is self-interest, whereas egotism is self-obsession.

In the present day, we are probably far more familiar with the term egotism, since it so often comes up when talking about celebrity culture. In present-day popular culture, Kanye West is the prime example of egotism: his self-professed love of himself, and his placing of himself as the most important person alive right now, could be the distilled essence of egotism. Yet I suppose if he’s acting like this purely to get exposure, there is an argument that his actions are egoistic… Probably best to avoid dwelling on that for now though…

One of the earliest examples of egoism in writing can be found in the works of Hobbes. In Leviathan, he states that ‘No man giveth but with intention of good to himself; because gift is voluntary; and of all voluntary acts the object to every man is his own pleasure’. It is this that would shape psychological egoism, countering religion’s common hailing of altruism. Skipping ahead, Max Stirner would advance the discussion beyond purely psychological and ethical egoism, going on to become one of the most influential writers on the subject. His exploration of the concept in The Ego and His Own would bring egoism into contact with then-modern movements such as anarchism, libertarianism and individualism.

If you allow me to make a rather sweeping tour of the philosophical writings on egoism, one also can’t ignore the impact of Nietzsche’s writing. It would shape the writing of most who came after him on the subject, not least because of the notoriety of its later association with the ideologies of Nazism. Stirner’s work, meanwhile, would be where Marx and Engels found a source of criticism from which to build their own theories of Marxism and communism. Nietzsche argues that at our hearts we all follow egoism, but does not believe this is a bad thing; in true Nietzsche style he remains pretty silent on the moral implications of this, purely attempting to understand the way the world works.

Egoism also started to appear in popular culture, among some of the key British thinkers, writers and artists of the period, such as the Bloomsbury Group, and would carry through into the Edwardian era. Writers such as Ezra Pound, Richard Aldington and T.S. Eliot would champion the self. Many of these ideas would be published, either in their own books or in Dora Marsden’s influential publication The Egoist. Whilst The Egoist was an intellectual magazine, written for the educated middle classes, it was also published as all modern magazines would be: unlike scholarly journals, it contained advertising, and readers could write in to express their thoughts on articles from previous issues.

The objective of the Women Movement being the development of the individual Ego… it appeals to the spirit of woman – Dora Marsden

Dora Marsden’s publications would go on to shape the progression of the feminist movement. Marsden played a large role in moving feminism on from the suffragettes, who were relatively altruistic in their actions and arguments. From around 1912 onwards, as the Freewoman became the New Freewoman, and the New Freewoman became the Egoist, Dora Marsden moved away from the suffragette movement towards a feminism which has far more similarities with today’s feminism than the so-called first wave. Other writers from this era, such as Virginia Woolf, would also find a sense of their self in the principles of egoism and feminism, following in the footsteps of Marsden’s egoist philosophy. The writer Rose Young would later describe feminism as concerning individualistic self-development, crucially meaning that egoism in the realm of empowerment is not inherently selfish. Women for centuries have been expected to be altruistic, caring for their families at the expense of being able to explore their own identities and capabilities. Whilst men have been taught to be egoistic first and to strive towards altruism, women have always been expected to already be altruistic, and so egoism is a revelation to women, allowing for self-discovery and empowerment.

Whilst egoism fell out of popular favour throughout the war period, there were still a handful of writers who would carry the baton on from earlier writers such as Nietzsche and Stirner, developing their theories on egoism. Perhaps the most influential of these was Ayn Rand. In her work The Virtue of Selfishness she makes the case for selfishness, reminding us that we inherently fear the charge because we are told it is a negative trait. In popular culture, the rise of socialism, free love and hippie culture in the 60s would suggest a continuing trend away from egoism. Yet if we actually take a close look at the protests, we find that pretty much all of them had self-interest at their hearts. The most obvious example of this would be the movement for sexual liberation, and, as previously stated, the feminist movement, which was concerned with allowing women to be as egoist as men had been allowed to be for centuries. Even outside of this, the anti-war protests were often about preventing the self from being sent to war as their more altruistic parents and grandparents had been. There is probably an argument for many social protests since the 60s being more egoistic than we would like to imagine, from the miners’ strikes to the marches of students for free education.

Rand’s work, beyond her own fiction such as The Fountainhead and Atlas Shrugged, would go on to influence those in many other unlikely fields, which would lead to her understanding of egoism permeating the public psyche even to the present day. For instance, it is said that the comic book writers Steve Ditko and Frank Miller, of Spider-Man and Batman fame respectively, found some inspiration for the nature of their heroes in Rand’s writing. And with the continued popularity of comic book adaptations and the ‘antihero’, I think it would be fair to say that the idea of the self-serving anti-hero is pretty big in the present. I’ve been told that Deadpool is a pretty Randian (anti)hero…

And so we return to the present day. Whilst you might not see ‘EGOISM’ in flashing lights anywhere, it remains a constant in our understanding of actions. In psychology, we continue to analyse whether we do anything for truly altruistic purposes. Just as the self remains a continuing obsession, we continue to live in an egoistic society, with its continued championing of ‘follow your dreams’. What is clear is that in today’s society, where capitalism reigns supreme and unchallenged, egoism is just another fact of life, and has therefore fallen out of common discourse.



Further Reading:

Lucy Delap, The Feminist Avant-Garde: Transatlantic Encounters of the Early Twentieth Century

Mark S. Morrisson, The Public Face of Modernism: Little Magazines, Audiences, and Reception

Eleanor M. Sickels, The Gloomy Egoist: Moods and Themes of Melancholy from Gray to Keats (New York: Octagon Books, 1969)

Peter Singer (ed.), Ethics (Oxford: Oxford University Press, 1994)


David Ashford, ‘A New Concept of Egoism: The Late Modernism of Ayn Rand’, Modernism/modernity, Volume 21, Number 4, November 2014

Michael Slote, ‘Egoism and Emotion’, Philosophia, June 2013, Volume 41, Issue 2, pp. 313-335


Oxford English Dictionary

Internet Encyclopedia of Philosophy


Common Sense

Isabel Overton took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘common sense’ as a philosophical keyword.

As a result of studying in one of the most expensive cities in the world, upon moving to London in September 2013 I found myself working part-time in the supermarket chain Waitrose. Located in a particularly affluent area of London, it boasted a continuous stream of business professionals and upper-middle-class families. Working in such an environment, it is no surprise that I have been witness to, and recipient of, many ‘unique’ queries. Whilst some have provided vast entertainment during an otherwise tedious shift (I once had a man ask me where we kept the ‘premium houmous’), others have been baffling in their lack of what we would call ‘common sense’. It would seem that many people, so drained by their fast-paced city lives, have no energy left to manoeuvre the minor speed bumps they may face when doing a food shop. I’m sure I’m not alone in being familiar with the phrase ‘use your common sense’, but how productive are we actually being when urging someone to tap into their inner instinct? Furthermore, where does the term ‘common sense’ stem from, and how common, or sensible, is it?

The Greek philosopher Aristotle coined the term ‘common sense’ to describe a sense that unified all five human senses, such as sight and smell, allowing humans and animals to distinguish multiple sensations within the same object. This came in response to a theory put forward by Plato that all senses worked individually from one another but were then integrated within the soul, where an active thinking process took place, making the senses instruments of the thinking part of man. In Plato’s view, the senses were not integrated at the level of perception but at the level of thought, and thus the unifying sense was not actually a kind of ‘sense’ at all. Aristotle, on the other hand, attributed the common sense to animals and humans alike but, believing animals could not think rationally, moved the act of perception out of Plato’s rational thinking soul and into the sensus communis, which was something like thinking and something like a sense, but not rational. He also located it within the heart, as he saw this as the master organ.[1]

“Good sense is, of all things among men, the most equally distributed; for every one thinks himself so abundantly provided with it”[2]

– Rene Descartes, Discourse on Method

French Philosopher Rene Descartes


The contemporary definition can be summed up as the basic ability to judge and perceive things, shared by the majority of the population. The father of modern Western philosophy, French philosopher Rene Descartes, established this in Discourse on Method when he stated that everyone had a similar and sufficient amount of common sense. He went on to claim, however, that it was rarely used well, and called instead for a sceptical, logical method, warning against over-reliance upon common sense.

During the Enlightenment, common sense came to be seen more favourably as a result of works by philosophers such as the religiously trained Scottish philosopher Thomas Reid. Often described as a ‘common sense philosopher’, Reid developed a philosophical perspective that presented common sense as the source of justification for certain philosophical knowledge. Reid wrote in response to the increase in scepticism advanced by philosophers such as David Hume, who held an empiricist view of the theory of ideas, arguing that reason and knowledge were developed only through experience or active research.[3] Reid, alongside other like-minded philosophers including Dugald Stewart and James Beattie, formed the Scottish School of Common Sense, arguing that common sense beliefs automatically governed human lives and thought. This theory became popular in England, France and America during the early nineteenth century, before losing popularity in the latter part of the period.

Political philosopher Thomas Paine


In terms of American influence, Reid’s philosophy was pervasive during the American Revolution. The English-born political philosopher and writer Thomas Paine was a key advocate of common sense realism, and used it to argue for American independence. Published in 1776, his highly influential pamphlet Common Sense conveyed the message that to understand the nature of politics, all it took was common sense. Selling around 150,000 copies in 1776, at a time when illiteracy was high amongst the American population, Common Sense was a tribute both to the persuasiveness of Paine’s argument and to his ability to simplify complex rhetoric.[4]

In the early twentieth century the British philosopher G. E. Moore wrote a treatise defending Thomas Reid’s argument, titled ‘A Defence of Common Sense’, which had a profound effect on the methodology of twentieth-century Anglo-American philosophy.[5] This essay listed several seemingly obvious truths, including ‘There exists at this time a living human body which is my body’, arguing against philosophers who held that even ‘true’ propositions could be partially false.[6] He greatly influenced the Austrian philosopher Ludwig Wittgenstein who, having previously known Moore during time spent in England, was reintroduced to his work when a former student was wrestling with his response to external-world scepticism. Inspired, Wittgenstein began to work on a series of remarks, eventually published posthumously as On Certainty, proposing that some things must be exempt from doubt in order for human practices to be possible.

The philosophical debate surrounding common sense shows itself to be complex and marked by disunity, so it is important to consider how modern society has taken up the idea of common sense and on what platforms it is discussed. Whilst many people take for granted what they assume to be an innate natural instinct, it seems that over the past two decades there has been an increasingly documented lament over the decline of common sense within society as a whole. In 1998 the author Lori Borgman released an essay entitled The Death of Common Sense in which she remarked:

“A most reliable sage, he was credited with cultivating the ability to know when to come in out of the rain, the discovery that the early bird gets the worm and how to take the bitter with the sweet.”[7]

She then noted the decline of Common Sense’s health in the late 1960s, when ‘he became infected with the If-It-Feels-Good, Do-It virus’[8]. It is unsurprising that countless idle news stories reporting the ‘misgivings’ of people who appear too incompetent to make the most basic decisions have led to such a cynical outlook on how sound society’s judgment is. An example that springs to mind is the plethora of multi-million-pound lottery winners who spend all their money within a year, only to find themselves in debt wondering where it all went wrong. This is something reiterated in the psychologist and professor Jim Taylor’s article Common Sense Is Neither Common nor Sense, in which he argues for a focus on ‘reasoned sense’ as opposed to ‘common sense’.

Naturally, this perceived lack of common sense has led many to question how common it really is. According to Voltaire, common sense was not so common;[9] the more accurate claim might be that it is not so commonly used in particular situations, though on the down side this is not nearly as memorable. Nevertheless, education teaches us to make decisions based on careful, calculated thought, something that people lacking education may fail to do. Whilst almost all people have common sense, it is often those who are more intellectually capable that ‘fail’ to use it, at least initially, in the process of basic problem solving. This is through over-calculating, however, not under-calculating, and is ultimately not an accurate measure of capability.

Some theorists believe that educational intellect can clash with basic common sense

Many lighthearted studies have tested this theory, one of the most popular being to have a child and an adult solve the same simple brainteaser. Such studies suggest that a child is more likely to solve a simple problem faster than an adult, as their lack of knowledge prevents them from overcomplicating the problem in front of them. An example of this is found in the short clip featured below:

This use of common sense is of course unique to brainteasers and jovial mind games; through the course of day-to-day life, the majority of the population are unwittingly using their common sense constantly, whether to look both ways before crossing the street or to let their coffee cool down before taking a sip.

It has been argued, however, that group judgment can impair individual common sense, fuelling debate over how significant it is as a means of judgment. The social theorist Stuart Chase is believed to have remarked that common sense was that which told us the world was flat, focusing on ‘common’ in its raw sense of widespread, that is to say, consensus of opinion.[10]

Group judgment can impair individual thinking

In 1841 the Scottish journalist Charles Mackay observed in Extraordinary Popular Delusions and the Madness of Crowds: “Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one.”[11] It is natural for something to gain credibility the more universally it is agreed upon or known, but this does not necessarily make it right. For example, studies have shown that the commonly held belief that it is dangerous to wake a sleepwalker is actually a myth. Whilst they may find themselves disorientated upon being woken, the act itself is not harmful.[12]

Through all this we have seen the reshaping, acceptance, memorialising, criticism and dissection of the term, and use, of common sense, and it is ultimately a matter of personal opinion how highly an individual will regard it. Some pride themselves on intelligence developed through rigorous schooling, whereas others prefer to boast of their natural intelligence. All this aside, I am sure I stand with the majority when I admit that this natural guide has helped me manoeuvre my way out of a tricky situation on more than one occasion, and so I am happy to look upon it favourably as a silent guardian, justified or not.

[1] Gregoric, Pavel, Aristotle on the Common Sense, Word Press

[2] Descartes, René, Discourse on Method (1637), [accessed 19th February 2016]

[3] David Hume, Stanford Encyclopaedia of Philosophy, [accessed 15th February 2016]

[4] Thomas Paine, History, [accessed 19th February 2016]

[5] Philosophy of Common Sense, New World Encyclopedia, [accessed 15th February 2016]

[6] Mattey, G. J., Common Sense Epistemology, UC Davis Philosophy 02

[7] Borgman, Lori, The Death of Common Sense, [accessed 15th February 2016]

[8] Ibid.

[9] Voltaire, A Pocket Philosophical Dictionary (Oxford World’s Classics, 2011)

[10] Hayakawa, S. I. and Hayakawa, A. R., Language in Thought and Action (Harvest Original, 5th ed., 1991), p. 18

[11] Mackay, Charles, Extraordinary Popular Delusions and the Madness of Crowds (Wordsworth Reference, 1995), p. xv

[12] BBC Future, [accessed 19th February 2016]


Further Reading

Lori Borgman, The Death of Common Sense

Ian Glynn, An Anatomy of Thought: The Origin and Machinery of the Mind (Oxford University Press, 2013)

Pavel Gregoric, Aristotle on the Common Sense (Oxford University Press, 2007)

S. I. Hayakawa and Alan R. Hayakawa, Language in Thought and Action (Harvest Original, 5th ed., 1991)

Philosophy of Common Sense, New World Encyclopedia


Sophia Patel took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘materialist’ as a philosophical keyword.

What do Kanye West, Karl Marx and Lucretius have in common?

While this may sound like the set-up for a terribly unfunny joke rather than the introduction to the history of a philosophical word, it is in fact the latter. What these three share is that they could all be identified by the word ‘materialist.’

Although I cannot comment on whether or not Kanye believes that ‘physical matter is the…fundamental reality and that all being…can be explained as manifestations… of matter,’ in the philosophical sense of the word, the popular economically focused definition, of ‘a preoccupation with material…things’ seems to fit him to a tee.[1]

The fact that materialism as a word contains three very different concepts (philosophical, historical and economic) makes its history fascinating to track, and raises questions about the links between them: if someone holds the belief that all there is is matter, is that individual more likely to value material things? It seems this is not the case.

Philosophical materialists, in the simplest sense, argue that all of reality can be reduced to the organisation of matter in a certain way. While idealists such as Plato argued that mind and matter were separate and that mind ruled over matter, materialists have argued that there is no such thing as a separate ‘mind.’ Mechanical materialism as a theory states that the world is made up of imperceptibly small objects that interact with each other, and denies the existence of immaterial things such as the mind. However, since more is now understood about the difference between matter and energy, the usage of the term materialism has evolved to include those who base their philosophy on physics to explain existence. This physics-based materialism has been termed ‘physicalistic materialism.’[2]


Democritus, one of the fathers of materialism. Image Credit: Internet Encyclopaedia of Philosophy, 2016

The materialist tradition entered both Eastern and Western philosophy in approximately the fifth century BCE, with the Carvaka school of philosophy in India and the ancient Greek philosopher Democritus in the West. Democritus argued that the world and everything in it, including the soul, was made up of nothing but atoms and empty space. Epicurus went on to develop Democritus’ views, adapting and expanding his idea of atoms and of how their collisions allowed for free will. These ideas continued to be built upon in the first century BCE in Rome, where the Epicurean Lucretius penned On the Nature of Things, outlining his materialist views.

Materialism enjoyed little popularity through the medieval period, possibly because, as a theory, it is naturally secular, denying the existence of anything but matter. In Britain, it was not until the early seventeenth century that Thomas Hobbes, remembered often for his political philosophy, asserted his materialist ideas. He believed that humans and their minds were entirely material, later even extending this to the belief that God too was a material being. However, this was not a popular or mainstream strand of philosophical thinking in Britain at this time. The tradition gained more popularity in France through the eighteenth century, though this could largely be ascribed to the negativity facing orthodox Christianity at this time.

Materialist thought was strengthened in Britain by Charles Darwin’s theory of evolution, which helped to prove the similarities that existed between animals and humans, as well as provide an alternative explanation to the design argument.


U.T. Place and his brain. Is his consciousness stored in there? Image Credit: University of Adelaide

The mid-twentieth century, with its increasing knowledge of physics and biochemistry, saw a resurgence in the popularity of philosophical materialism. The British philosopher U. T. Place penned ‘Is Consciousness a Brain Process?’ in 1956 while teaching in Australia. This paper, with its thesis regarding the relationship between mental states and neural states, helped to cement Place and his theory at the centre of modern materialist philosophy. However, while earlier scientific advancement frequently served philosophical materialism’s interests, advancing knowledge of physics towards the end of the twentieth century has meant that materialism has increasingly come under fire. This criticism has frequently originated in Britain, with the English philosopher Mary Midgley criticising materialism for failing to define what matter actually consists of.[3] Similarly, the English physicist Paul Davies has argued that materialism has been disproven by scientific findings, as demonstrated in his 1991 book The Matter Myth, written with John Gribbin, in which they argue: ‘Quantum physics undermines materialism because it reveals that matter has far less “substance” than we might believe. But another development goes even further… this development is the theory of chaos.’[4]

While the philosophical concept of materialism may not seem like a topic that would easily translate into popular culture, some materialist ideas do occasionally disseminate into popular discourse, particularly in America. The American Atheists founder Madalyn O’Hair based her form of atheism on the materialist idea that nothing exists but natural matter. Materialism is also the subject of the American punk rock band Bad Religion’s 2002 song ‘Materialist.’ The song, while obviously not focussing on the minute details of philosophical materialism, does raise several of its concepts, stating ‘mind over matter, it really don’t matter.’ That this is one of the few mentions philosophical materialism has received in popular culture suggests that it is a more learned theory that has not spread very far into wider society.

Materialist in the economic or ethical sense is used to identify someone interested in the enjoyment and comforts that material possessions bring. Economic materialism is closely linked to social issues such as class, in which social positioning and success is often determined by one’s material possessions.

As early as the fifth century BCE, Socrates spoke out against economic materialism, asking what the point was of walls and warships and glittering statues if the men who built them were not happy. Now more than ever, however, we exist in a society where materialism and the constant drive to possess more ‘things’ are ever-present: from television, to popular music, to targeted advertising on social media. Material culture is so pervasive that an American survey found that one in every fourteen people would murder someone in exchange for three million dollars.[5]

A key area for the dissemination of materialistic ideologies is rap and hip hop culture. The British rapper Tinie Tempah’s 2010 hit Miami 2 Ibiza at times reads more like a list of brands and labels than a song, as he sings ‘I got a black BM, she got a white TT, she wanna see what’s hiding in my CK briefs.’ While this may be a slightly more subdued list when compared to the yachts and Bentleys listed by American rappers, the fact remains that it links the importance of material possessions to success and status in society.


Macklemore and Ryan Lewis encouraging us to be thrifty. Image Credit: Macklemore, 2012

In response to materialistic culture and the recent financial difficulties that have plagued America and Britain, artists such as Macklemore, in his 2012 hit ‘Thrift Shop’, have attempted to distance themselves from such themes. In the song he praises bargain hunting while simultaneously mocking materialist culture: ‘They be like, “Oh, that Gucci – that’s hella tight.” I’m like, “Yo – that’s fifty dollars for a T-shirt”… I call that getting tricked by a business.’ Alongside the song, Macklemore instructed fans to look for his merchandise at thrift shops to avoid the increased prices at venues. Indeed, while popular culture is often responsible for driving economic materialism, it is also a conduit for spreading ideas against it. As Chuck Palahniuk lamented in his 1996 novel Fight Club, echoed in the 1999 film of the same name: “Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need.”

Although within philosophy ‘materialist’ has usually been used in the philosophical sense, that does not mean philosophers have ignored economic materialism as a concept. Bertrand Russell condemned economic materialism, stating: ‘It is a preoccupation with possessions, more than anything else, that prevents us from living freely and nobly.’

Dialectical, or historical, materialism has been another common usage of the word, popularised by its status as the official philosophy of the communist countries. Dialectical materialism is a theory of how changes occur throughout the history of humanity. Karl Marx was the first to identify as a historical materialist, in the nineteenth century. His theory suggested that the material conditions of a society’s mode of production ultimately defined its ability to develop and organise itself. He stated that people “make their own history, but they do not make it just as they please; they do not make it under circumstances chosen by themselves.” Since Marx created this concept, however, the theory has been modified and adopted by Marxists and non-Marxists alike.[6]

One word with three very different meanings and even more divergent histories. Though there is apparently no real link between economic and philosophical materialism, their evolution as words in relation to one another is fascinating. Philosophical materialism is constantly advancing to make use of new discoveries in physics and the sciences, but still remains rooted in the same ideas expressed by Democritus. Economic materialism, though popular culture occasionally attempts to fight against it, has however become an almost inescapable part of modern life.


[1] Merriam-Webster Dictionary, ‘Materialism’

[2] J. J. C. Smart, ‘Materialism,’ Encyclopaedia Britannica

[3] Mary Midgley, The Myths We Live By (London: Routledge, 2003)

[4] Paul Davies and John Gribbin, The Matter Myth: Dramatic Discoveries that Challenge Our Understanding of Physical Reality (London: Simon & Schuster, 1991)

[5] Bernice Kanner, Are You Normal about Money? (Princeton: Bloomberg Press, 2001)

[6] Paul D’Amato, ‘Why was Marx a materialist?’ (2011)

Further Reading

Charles Darwin, On the Origin of Species (London: John Murray, 1859)

Paul Davies and John Gribbin, The Matter Myth: Dramatic Discoveries that Challenge Our Understanding of Physical Reality (London: Simon & Schuster, 1991)

Thomas Hobbes, Leviathan (1651)

Bettany Hughes, ‘Socrates – a Man For Our Times,’ The Guardian, (2010),

Lucretius, On the Nature of Things (50 BCE)

Mary Midgley, The Myths We Live By (London: Routledge, 2003)

Madalyn O’Hair, Why I Am an Atheist: Including a History of Materialism (New Jersey: American Atheists Press, 1991)

U.T. Place, ‘Is Consciousness a Brain Process?,’ British Journal of Psychology, 47:1 (1956)


Holly Meldrum took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘atheism’ as a philosophical keyword in recent British cultural history.

‘Atheism’ – the disbelief or lack of belief in the existence of God or gods.

This clip of Christopher Hitchens is taken from a debate with his brother Peter Hitchens on 3rd April 2008. The late Christopher Hitchens (1949-2011) was one of the most notorious and outspoken proponents of the ‘New Atheism’ movement that developed in the 2000s. In the clip Hitchens, in his uniquely unapologetic and contemptuous style, not only rejects the existence of God but condemns the idea of God as ‘totalitarian’, likening such a dictatorship to a ‘celestial North Korea’. In one of his bestsellers, God is Not Great: How Religion Poisons Everything, Hitchens explains how belief in the supernatural and religion is detrimental to the progression of humanity, because it destroys individual freedom, free expression and scientific discovery[1].

Hitchens has been grouped with three other contemporary intellectuals: Sam Harris, Daniel Dennett and Richard Dawkins, collectively known as ‘The Four Horsemen of the Non-Apocalypse’. The infamous ‘Four Horsemen’ have brought their ideas on atheism and religion into mainstream media through their television appearances, bestselling books and debates that are watched worldwide. Central to their thinking is that there is no god, supernatural or divine being. They also contend that the world of religion and religious belief is often morally corrupt and needs to be open to criticism. Current world affairs concerning religious fundamentalism have propelled these intellectuals into the spotlight as authorities on secular opinion. Their prominence has given the ‘New Atheism’ movement substantial momentum and has given atheist ideas recognition in today’s world.

Dawkins, Hitchens, Harris and Dennett

The word ‘atheism’, or ‘atheist’, was first used in the English language in the 16th century, deriving from the French ‘athéisme’, which came from the Greek ‘a’, meaning ‘without’, and ‘theos’, meaning ‘god’. In the 16th century, however, ‘atheist’ was used as an insult; nobody would have used it as a description of their own beliefs. Atheism itself can be traced back to the ancient Greeks, particularly to Diagoras of Melos, a 5th-century BCE poet and sophist. Although Diagoras never wrote about atheism, it is widely recognised that he was publicly very candid about his views and was well known for demystifying the Eleusinian mystery religion in order to provoke his contemporaries into thought[2].

Despite atheism being traceable as far back as the ancient Greeks, the key period for atheist thinking came during the Age of Enlightenment. The nature of the period, with its growing examination of religious orthodoxy, made it possible for intellectuals and philosophers to be forthright with their arguments. Key to the atheism movement during the Enlightenment was the philosopher Baron d’Holbach (1723-1789), whose salon was renowned as the haunt of the philosophers of the age and their revolutionary ideas on the existence of God. One of d’Holbach’s best-known works, Système de la Nature, published in 1770, has been described by some as the “bible of atheism”. It openly denies the existence of God and attributes such beliefs to fear and a lack of understanding, which was ground-breaking for its time. D’Holbach argues,

“What, indeed, is an atheist? He is one who destroys delusions which are harmful to humanity in order to lead men back to nature, to reality, to reason. He is a thinker who, having reflected on the nature of matter, its energy, properties and ways of acting, has no need of idealized powers or imaginary intelligences to explain the phenomena of the universe and the operations of nature.”[3]

Later, in Victorian Britain, John Stuart Mill (1806-1873), a prominent English philosopher who wrote little on religion during his lifetime for fear of alienating his audience, had his ‘Three Essays on Religion’ (1874) published posthumously. Mill’s decision to have his essays published only after his death shows us that society was not always accommodating of critical views on religion. In his essay ‘The Utility of Religion’, Mill argues that religion’s utility in society derives from the moral code it imposes, not from its dogma or theology. He contends that belief in the supernatural is no longer needed: though in the past it was useful in upholding the moral code, it now hinders the progress of humanity by preventing social change. Herbert Spencer (1820-1903), another English philosopher of the Victorian era and a contemporary of Mill, advocated the pre-eminence of science over religion. Spencer, often a controversial figure, argued that knowledge of a phenomenon requires empirical demonstration; we cannot know anything non-empirical, and therefore we cannot know whether there is a God or what his character may be[4]. Notably, neither of these Victorian thinkers, both extremely prominent in their time for their views, expressly denies the existence of a God; rather, they highlight problems with absolute faith in a supernatural God. In an era of avid churchgoing in Britain, it is safe to assume that neither wanted to risk his reputation by expressing too much doubt in God’s existence.

One of the most distinguished British philosophers of the 20th century, Bertrand Russell (1872-1970), who was also godson to J. S. Mill, marked a move within the discussion of atheism towards its difference from agnosticism. Russell very publicly discussed why he turned his back on the Christian faith and how he felt there was no adequate logical evidence for the existence of God or an afterlife[5]. Determining his own position by distinguishing between atheist and agnostic proved a difficult question for Russell to answer. Recalling his time in prison during the First World War, Russell described a warden asking his religion, to which he replied “agnostic”; having had the word spelled out for him, the warden remarked, “religions are many, but I suppose they all believe in the same God”[6]. In his 1953 essay ‘What is an Agnostic?’, Russell argues that the atheist claims we can know there is no God, while agnostics recognise that there is no conclusive argument for God’s non-existence. Yet in many respects Russell struggled with the difference: while describing himself as an agnostic, in the practical sense he could be described as an atheist, as he saw no logical grounds for believing in God in the Judeo-Christian sense. This change in philosophical thinking expressed by Russell clearly defined atheism as the complete disbelief in a supernatural God, whereas agnostics recognise that it cannot be argued absolutely that there is no God.

Bertrand Russell

Today a clear line has been drawn with regard to what it means to be an atheist. The founding premise of atheism, as set out by modern thinkers, is that atheists do not believe in the existence of God or gods. In popular culture today, when a ‘celebrity’ describes themselves as an atheist, we know exactly what they are referring to, which has enabled many to take their arguments for atheism beyond the foundational question of God’s existence. This has, to some extent, removed the social stigma that once surrounded the likes of Mill. An excellent example of this in Britain today is how mainstream British comedians have used the topic of atheism in their comedy. The likes of John Cleese, Ricky Gervais, Jimmy Carr, Stephen Fry and Eddie Izzard have successfully promoted atheism by ridiculing the ideas of religion and God as a thought-provoking tool. Below, in a clip from Irish television’s Late Late Show of 18th September 2009, Jimmy Carr discusses being an atheist; through comedy, Carr manages to mock the audience when they show distaste for his jokes about Jesus.

To finish: the rise of the ‘New Atheism’ movement and the widespread ridicule of religion by comedians have led many to argue that atheists today have become ‘militant’ or ‘fundamentalist’ in their forthright views on God and religion. This, I would argue, shows there is still room for society to progress towards accepting ideas that disagree with traditional religious dogma. The stigma surrounding atheism has condemned those who are opinionated on the subject as extremists. Equally, the world of atheism as it stands today could take a few lessons from the religious world. As the Swiss-British philosopher Alain de Botton has argued, the atheist and religious sectors need not be enemies of each other, and he has promoted a much gentler philosophy of atheism: one which moves away from the traditional debates over God and the morality of organised religion, and looks to the lessons that can be drawn from religion. He concludes his TED talk, appropriately titled ‘Atheism 2.0’, by saying, “You may not agree with religion, but at the end of the day, religions are so subtle, so intelligent in many ways, that they are not fit to be abandoned to the religious alone; they are for all of us”[7].

[1] Christopher Hitchens, God is Not Great: How Religion Poisons Everything (London: Atlantic Books, 2007)

[2] Jennifer Michael Hecht, ‘Whatever Happened to Zeus and Hera? 600 BCE – 1 CE’, in Doubt: A History (New York: HarperOne, 2003)

[3] Paul Henri Thiry (Baron d’Holbach), Système de la Nature (1770)

[4] Internet Encyclopaedia of Philosophy

[5] Bertrand Russell on Religion (1959)

[6] Bertrand Russell Society (2012)

[7] Alain de Botton, ‘Atheism 2.0’, TED Talks (July 2011)



In this post, Alfie Turner, who took the ‘Philosophical Britain‘ module at Queen Mary in 2015, writes about ‘Orwellian’ as a philosophical keyword.

Aged 17, I got my first paid job washing up in my local village pub. Describing the working conditions on the phone to my aunt – the anti-Semitic chef and the cohort of Slovenian dogsbodies jumping to every foul-mouthed order – she remarked, “it sounds Orwellian!” Later that year she gave me ‘Down and Out in Paris and London’. Being, I suspect, the only student in England to have escaped school without having ‘Animal Farm’ or ‘1984’ surreptitiously shoved down my throat, my engagement with the word “Orwellian” was both literary and practical. Feeling as I did, I equated it with humanism: the human spirit of the Parisian plongeur, the Wigan miners, and the Communist and Fascist soldiers shouting insults at each other about buttered toast over the frontline trenches in the Spanish Civil War.

The shouting of propaganda to undermine the enemy morale had been developed into a regular technique…

Generally they shouted a set-piece, full of revolutionary sentiments which explained to the Fascist soldiers that they were merely the hirelings of international capitalism…

Sometimes, instead of shouting revolutionary slogans he simply told the Fascists how much better we were fed than they were…’Buttered toast!’–you could hear his voice echoing across the lonely valley–‘We’re just sitting down to buttered toast over here! Lovely slices of buttered toast!…It even made my mouth water, though I knew he was lying.[1]

Orwell cuts through the mirage of claim and counter-claim, using humour in the midst of war, with the men of one side indistinguishable from the other, finding dignity in man’s common humanity. It brought home what Orwell once said about his art: ‘good prose is like a windowpane’[2]. This humanist interpretation of Orwell aligns with his appreciation of the late-Victorian humanism of James Joyce in ‘Inside the Whale’ and the ever-present influence of Aldous Huxley.

Alas, “Orwellian”, the adjective he has left us, is more complex in its usage and meanings. According to the New York Times, “Orwellian” is ‘more common than “Kafkaesque,” “Hemingwayesque” and “Dickensian” put together. It even noses out the rival political reproach “Machiavellian”, which had a 500-year head start’[3]. “Orwellian” is used because of a hegemonic Anglo-American adoration of Orwell endorsed in every school this side of Siberia. Orwell’s prophetic verdicts on Imperialism, Totalitarianism and Communism render him a genius, so “Orwellian” is used less to evoke what Orwell said than the recognisable man who said it. His readers are often labelled “Orwellians”: if we are to believe YouGov’s profile, a cohort of young, middle-class, right-of-centre males, working in law and the media, and most likely to own a cat.[4]

Orwell profile

Putting this aside, the way the word has been used is somewhat more complex. To be an “Orwellian” writer is certainly a good thing; the Orwell Prize is a coveted award for writers. As the Polish intellectual Czeslaw Milosz, who illegally smuggled Orwell’s books into Poland, said of him: ‘Even those who know Orwell only by hearsay are amazed that a writer who never lived in Russia should have so keen a perception into its life’.[5] This ability to cut through the fog, to take on totalitarianism from outside a communist state, is at the heart of the term’s positive usage. The polemicist Christopher Hitchens wrote that ‘we commonly use the term “Orwellian” in one of two ways. To describe a state of affairs as “Orwellian” implies crushing tyranny, fear and conformism. To describe a piece of writing as “Orwellian” is to recognize that human resistance to these terrors is unquenchable’[6]. To many, “Orwellian” is about precise and simple language, derived from his novels but also from some of his most famous essays. It was Orwell who made this readable style of language so easily applicable to nearly everything in the modern world, not least because he used it to describe the everyday, even precise instructions for brewing tea.

George Orwell at the BBC in 1941

The principal usage of “Orwellian” now is as a byword for evil. It is summoned to attack people, organisations and regimes that bear no relation to one another, other than being held in contempt by somebody. The Republican Party have branded the NHS “Orwellian”, Samsung TVs are “Orwellian”, and recently Bill Maher said of Isis that “this idea that we cannot even call it Islamic terrorism seems Orwellian to me”. Even Peppa Pig has been called Orwellian, with some animals consulting Doctor Brown Bear whilst others see Dr Hamster the Vet, leading many to ask the appropriately Orwellian question: “are some animals more equal than others?”

At the 1995 Tory conference John Major attacked Blair, suggesting ‘Labour has been reading 1984…the brainchild of another public-school-educated Socialist’.[7] In the light of the Iraq war, it is interesting to reflect on Major’s prophetic words. The mantra of ‘1984’ that “War is Peace, Freedom is Slavery, Ignorance is Strength” has echoes in George Bush’s claim that “I just want you to know that, when we talk about war, we’re really talking about peace.”[8] The “Orwellian” nightmare cast by the shadows of ‘Animal Farm’ and ‘1984’ has a clear political saliency that persists today; recent evocations of Orwell regarding WikiLeaks and the NSA revelations are a case in point.

“Orwellian”, born out of Orwell’s thinking, is rooted in 20th-century philosophy, but there is a dualism between Orwell’s theoretical ideas and his views as a political activist. For example, he had a life-long contempt for intellectuals, calling Sartre ‘a bag of wind and I am going to give him a good boot’[9]. However, he saw politics as ubiquitous, infecting all other aspects of life. Writing to Orwell in 1949, Huxley observed that ‘1984’ ‘hints of a philosophy of the ultimate revolution — the revolution which lies beyond politics and economics, and which aims at total subversion of the individual’s psychology and physiology’[10]. Orwell believed that literature and philosophy were inseparable; ‘no one, now, could devote himself to literature as single-mindedly as Joyce and Henry James’[11]. But it is his activism that marks him apart: he was unable to watch the 20th century unfold as a passive spectator. Having read Nietzsche on the master-slave relationship, he reappraised his own position as a colonial officer in Burma and, as related in ‘Burmese Days’, duly quit to become a full-time writer.

“Orwellian” also carries with it a cynicism and hatred towards the stultifying Edwardian class structure into which Orwell was born. At Eton he ‘had to suppress his distrust and dislike of the poor, his revulsion from the “coloured” masses who teemed throughout the empire, his suspicion of Jews, his awkwardness with women and his anti-intellectualism’[12]. But he could not abide the establishment that underpinned such attitudes, rebelling against and then abandoning his class. Orwell, as the ultimate “outsider”, was a precursor to the “Angry Young Men” of the 1950s and 60s, and possibly to the punks of the 1980s.

“Orwellian” is not static; its meaning and connotations have changed, nowhere more so than since the collapse of Communism. Where once it conjured up a futuristic nightmare, it is now used to describe dystopian times gone by. The term’s first recorded use, by Mary McCarthy in 1950, in a piece later collected in ‘On the Contrary’, was in the context of ‘the Orwellian future’[13]. Since 1991, it has been used, much like “Nazism” or “Stalinism”, to recall foregone perils. Ironically, Orwell himself, seen through the “Orwellian” prism, has become synonymous with the totalitarian regimes he once satirised.

Given its elusive meaning, “Orwellian” has curiously always said more about its user than about Orwell. Hitchens, for example, used it to justify his support for the Iraq war, ironically evoking a writer famous for exposing the horrors of war. Another irony was the evocation of Orwell in the final edition of the News of the World, brought down by the “Big Brother is listening” phone-hacking scandal. The paper quoted Orwell on its final page: ‘The wife is already asleep in the armchair and the children have been sent out for a nice long walk. You put your feet up on the sofa, settle your spectacles on your nose and open the News of the World’[14]. Here, as the Telegraph points out, the editor cut the end of the quotation, which reads: ‘In these blissful circumstances, what is it that you want to read about? Naturally, about a murder’.[15]

We have hijacked Orwell’s name to lend legitimacy to our own, often controversial, views. But I don’t think Orwell would have been much surprised. For him, language was merely ‘an instrument, which we shape for our own purposes’[16], and the abuse of Orwell, for financial profit, political point-scoring, or the waging of war, is the greatest “Orwellian” crime of all. I rest my case.

Suggested Further Reading:

George Orwell: A Life in Pictures Full Documentary

Orwell Essays

Why Orwell Matters: Lecture

The Road to Nineteen Eighty-Four

George Orwell, Down and Out in Paris and London (London, Penguin, 2001)


[1] George Orwell, Homage to Catalonia (London, Penguin, 1962), pp. 42-43.

[2] George Orwell, Why I Write (London, Penguin, 2004), p. 10.

[3] Geoffrey Nunberg, ‘Simpler Terms; If It’s “Orwellian”, It’s Probably Not’, accessed online, 22 June 2003.

[4] YouGov, ‘Readers of George Orwell’, accessed online.

[5] Czeslaw Milosz, ‘Quotes about 1984’, accessed online.

[6] Christopher Hitchens, Why Orwell Matters (New York, Basic Books, 2002), p. 5.

[7] Christopher Hitchens, Why Orwell Matters (New York, Basic Books, 2002), p. 117.

[8] George W. Bush, quoted remarks, accessed online.

[9] George Orwell, letter, accessed online.

[10] Aldous Huxley, letter to George Orwell, accessed online.

[11] George Orwell, Essays (London, Penguin, 2000), p. 454.

[12] Christopher Hitchens, Why Orwell Matters (New York, Basic Books, 2002), p. 9.

[13] Mary McCarthy, On the Contrary (New York, Farrar, Straus and Cudahy, 1961), p. 187.

[14] Michael Deacon, ‘George Orwell would be proud of how his words were twisted’, accessed online, 11 August 2011.

[15] Ibid.

[16] George Orwell, Essays (London, Penguin, 2000), p. 348.