In this post, Alfie Turner, who took the ‘Philosophical Britain‘ module at Queen Mary in 2015, writes about ‘Orwellian’ as a philosophical keyword.

Aged 17, I got my first paid job washing up in my local village pub. When I described the working conditions on the phone to my aunt – the anti-Semitic chef, the cohort of Slovenian dogsbodies jumping to every foul-mouthed order – she remarked, “It sounds Orwellian!” Later that year she gave me ‘Down and Out in Paris and London’. Being, I suspect, the only student in England to have escaped school without having ‘Animal Farm’ or ‘1984’ surreptitiously shoved down my throat, my engagement with the word “Orwellian” was both literary and practical. Feeling as I did, I equated it with humanism: the human spirit of the Parisian plongeur, the Wigan miners, and the Communist and Fascist soldiers shouting insults at each other about buttered toast across the frontline trenches of the Spanish Civil War.

The shouting of propaganda to undermine the enemy morale had been developed into a regular technique…

Generally they shouted a set-piece, full of revolutionary sentiments which explained to the Fascist soldiers that they were merely the hirelings of international capitalism…

Sometimes, instead of shouting revolutionary slogans he simply told the Fascists how much better we were fed than they were…’Buttered toast!’–you could hear his voice echoing across the lonely valley–‘We’re just sitting down to buttered toast over here! Lovely slices of buttered toast!…It even made my mouth water, though I knew he was lying.’[1]

Orwell cuts through the mirage of claim and counter-claim, using humour in the midst of war, the men of one side indistinguishable from those of the other, finding dignity in man’s common humanity. It brought home what Orwell once said about his art: ‘good prose is like a windowpane’[2]. This humanist interpretation of Orwell is in alignment with his appreciation of the late-Victorian humanism of James Joyce in ‘Inside the Whale’ and the ever-present influence of Aldous Huxley.

Alas, “Orwellian”, the adjective he has left us, is more complex in its usage and meanings. According to the New York Times, “Orwellian” is ‘more common than “Kafkaesque”, “Hemingwayesque” and “Dickensian” put together. It even noses out the rival political reproach “Machiavellian”, which had a 500-year head start’[3]. “Orwellian” is used because of a hegemonic Anglo-American adoration of Orwell, endorsed in every school this side of Siberia. Orwell’s prophetic verdicts on Imperialism, Totalitarianism and Communism render him a genius; “Orwellian” is used less to evoke what Orwell said than to evoke the recognisable man who said it. His readers are often labelled “Orwellians”: if we are to believe YouGov’s profile, a cohort of young, middle-class, right-of-centre males, working in law and the media, and most likely to own a cat.[4]

Orwell profile

Putting this aside, the way the word has been used is somewhat more complex. To be an “Orwellian” writer is certainly a good thing, the Orwell Prize being a coveted award for writers. As the Polish intellectual Czeslaw Milosz, whose countrymen illegally smuggled Orwell’s books into Poland, said of him, ‘Even those who know Orwell only by hearsay are amazed that a writer who never lived in Russia should have so keen a perception into its life’.[5] This ability to cut through the fog, to take on totalitarianism from outside a communist state, is at the heart of the term’s positive usage. The polemicist Christopher Hitchens wrote that ‘we commonly use the term “Orwellian” in one of two ways. To describe a state of affairs as “Orwellian” implies crushing tyranny, fear and conformism. To describe a piece of writing as “Orwellian” is to recognize that human resistance to these terrors is unquenchable’[6]. To many, “Orwellian” denotes precise and simple language, derived from his novels but also from some of his most famous essays. It was Orwell who made this readable style so easily applicable to nearly everything in the modern world, not least because he used it to describe the everyday, even giving precise instructions for brewing tea.

George Orwell at the BBC in 1941


The principal usage of “Orwellian” is now as a byword for evil. It is summoned to attack people, organisations and regimes that bear no relation to one another, other than that they are held in contempt by somebody. The Republican Party has branded the NHS “Orwellian”, Samsung TVs are “Orwellian”, and Bill Maher recently said of Isis that “this idea that we cannot even call it Islamic terrorism seems Orwellian to me”. Even Peppa Pig has been called Orwellian, some animals consulting Doctor Brown Bear while others see Dr Hamster the Vet, leading many to ask the appropriately Orwellian question: “are some animals more equal than others?”

At the 1995 Tory conference John Major attacked Blair, suggesting ‘Labour has been reading 1984…the brainchild of another public-school-educated Socialist’.[7] In the light of the Iraq war, it is interesting to reflect on Major’s prophetic words. The ‘1984’ mantra “War is Peace, Freedom is Slavery, Ignorance is Strength” finds an echo in George Bush’s claim that “I just want you to know that, when we talk about war, we’re really talking about peace.”[8] The “Orwellian” nightmare cast by the shadows of ‘Animal Farm’ and ‘1984’ has a clear political salience that persists today; recent evocations of Orwell regarding WikiLeaks and the NSA revelations are a case in point.

“Orwellian”, born out of Orwell’s thinking, is rooted in 20th-century philosophy, but there is a dualism between Orwell’s theoretical ideas and his views as a political activist. For example, he had a life-long contempt for intellectuals, calling Sartre ‘a bag of wind and I am going to give him a good boot’[9]. At the same time, he saw politics as ubiquitous, infecting all other aspects of life. Writing to Orwell in 1949, Huxley observed that ‘1984’ ‘hints of a philosophy of the ultimate revolution — the revolution which lies beyond politics and economics, and which aims at total subversion of the individual’s psychology and physiology’[10]. Orwell believed that literature and philosophy were inseparable: ‘no one, now, could devote himself to literature as single-mindedly as Joyce and Henry James’[11]. But it is his activism that marks him apart, unable as he was to watch the 20th century unfold as a passive spectator. Having read Nietzsche on the master–slave relationship, he reappraised his own position as a colonial officer in Burma and, as related in ‘Burmese Days’, duly quit to become a full-time writer.

“Orwellian” also carries with it a cynicism about, and hatred of, the stultifying Edwardian class structure into which Orwell was born. At Eton he ‘had to suppress his distrust and dislike of the poor, his revulsion from the “coloured” masses who teemed throughout the empire, his suspicion of Jews, his awkwardness with women and his anti-intellectualism’[12]. But he could not abide the establishment that underpinned such attitudes, rebelling against and then abandoning his class. The “Orwellian” figure, as the ultimate “outsider”, was a precursor to the “Angry Young Men” of the 1950s and 60s, and possibly to the punks of the 1980s.

“Orwellian” is not static, its meaning and inferences changing, never more so than since the collapse of Communism. Where once it conjured up a futuristic nightmare, now it is used to describe dystopic times gone by. The term’s first recorded use, in Mary McCarthy’s ‘On the Contrary’, was in the context of ‘the Orwellian future’[13]. Since the early 1990s it has been used, much like “Nazism” or “Stalinism”, to recall foregone perils. Ironically, Orwell himself, through the “Orwellian” prism, has become synonymous with the totalitarian regimes he once satirised.

Given its indefinable meaning, “Orwellian” has curiously always said more about its user than about itself. For example, Hitchens used it to justify his support for the Iraq war, ironically evoking an Orwell famous for exposing the horrors of war. Another irony was the evocation of Orwell in the final edition of the News of the World, brought to its demise by the “Big Brother is listening” phone-hacking scandal. It quotes Orwell on its final page: ‘The wife is already asleep in the armchair and the children have been sent out for a nice long walk. You put your feet up on the sofa, settle your spectacles on your nose and open the News of the World’[14]. Here, as the Telegraph points out, the editor cut the end of the quotation, which read ‘In these blissful circumstances, what is it that you want to read about? Naturally, about a murder’.[15]

We have hijacked Orwell’s name to lend legitimacy to our own, often controversial, views. But I don’t think Orwell would have been much surprised. For him, language is merely ‘an instrument which we shape for our own purposes’[16], and the abuse of Orwell, for financial profit, political point-scoring or waging war, is the greatest “Orwellian” crime of all. I rest my case here:

Suggested Further Reading:

George Orwell: A Life in Pictures Full Documentary

Orwell Essays

Why Orwell Matters: Lecture

The Road to Nineteen Eighty-Four

George Orwell, Down and Out in Paris and London (London, Penguin, 2001)


[1] George Orwell, Homage to Catalonia (London, Penguin, 1962), pp. 42-43.

[2] George Orwell, Why I Write, (London, Penguin, 2004) p. 10.

[3] Geoffrey Nunberg, Simpler Terms; If It’s ‘Orwellian,’ It’s Probably Not, Accessed at: http://www.nytimes.com/2003/06/22/weekinreview/simpler-terms-if-it-s-orwellian-it-s-probably-not.html, (22, 6, 2003).

[4] YouGov, Readers of George Orwell, Accessed at: https://yougov.co.uk/profiler#/George_Orwell/demographics.

[5]  Czeslaw Milosz, Quotes about 1984, Accessed at: http://www.goodreads.com/quotes/tag/nineteen-eighty-four.

[6]  Christopher Hitchens, Why Orwell Matters, (USA, Basic Books, 2002), p. 5.

[7]  Christopher Hitchens, Why Orwell Matters, (USA, Basic Books, 2002), p. 117.

[8] George W. Bush, Quotes, Accessed at: http://www.goodreads.com/quotes/50967-i-just-want-you-to-know-that-when-we-talk.

[9] George Orwell, Letter, Accessed at: http://www.lettersofnote.com/2011/12/bag-of-wind.html.

[10] Aldous Huxley, Letter, Accessed at: http://www.dailymail.co.uk/news/article-2111440/Aldous-Huxley-letter-George-Orwell-1984-sheds-light-different-ideas.html.

[11] George Orwell, Essays, (London, Penguin, 2000), p. 454.

[12] Christopher Hitchens, Why Orwell Matters, (USA, Basic Books, 2002), p. 9.

[13] Mary McCarthy, On the Contrary, (New York, Farrar Straus and Cudahy, 1961), p. 187.

[14] Michael Deacon, George Orwell would be proud of how his words were twisted, Accessed at: http://www.telegraph.co.uk/news/uknews/phone-hacking/8630799/George-Orwell-would-be-proud-of-how-his-words-were-twisted.html, (11, 8, 2011).

[15] Ibid.

[16] George Orwell, Essays, (London, Penguin, 2000), p. 348.


Charlie Roden took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘happiness’ as a philosophical keyword, with the help of Charlie Brown.

Extract from the comic-strip ‘Peanuts’. Image from http://www.philipchircop.com/post/15448312238/incidentally-what-is-happiness-do-whatever

According to the Oxford English Dictionary, ‘happiness’ is defined as ‘the state of being happy’, that is, ‘feeling or showing pleasure or contentment.’[1]  Happiness is a universal concept which, I believe, most people aspire to achieve. However, since happiness is so subjective, everyone interprets it in different ways.

Many people believe that they attain happiness when they eat their favourite food, buy new clothes or earn a lot of money. Although these are all experiences that can be enjoyed, they don’t actually cause happiness – they only bring us pleasure. Of course, the official definition of ‘happiness’ does include pleasure; however, I agree with Happiness International’s suggestion that pleasure is only short-lived and externally motivated. If happiness relied on pleasures such as these, would that suggest that without a lot of money or material possessions people are unhappy?

I don’t believe that anyone can truly define ‘happiness’, and by looking at the history of this word we can see how its cultural and philosophical meanings have changed over time, demonstrating that happiness cannot simply be understood as a single concept.

‘Happiness’ stems from the late fourteenth-century word ‘hap’ meaning ‘good luck’ or ‘chance’. [2] This suggests then that in the Middle Ages, a person was believed to be happy if they had good fortune.  Already, we can see how a modern perspective of ‘happiness’ is different to this idea, as although being lucky can promote happiness, we can often feel happy without being fortunate.

One of the earliest predecessors of the modern idea of ‘happiness’ was proposed by Aristotle (384-322 BC). In his Nicomachean Ethics, Aristotle emphasised that the ultimate aim in life is ‘Eudaimonia’, an Ancient Greek term usually translated as ‘happiness’ or ‘human flourishing.’ [3]

Unlike an emotional state such as pleasure, Aristotle asserted that Eudaimonia is about reaching your full potential and flourishing as a person. To do this, you need to live a life that is wholesome and virtuous in order to attain the best version of yourself. [4] Virtue can be achieved through balance and moderation, as this way of life leads to ‘the greatest long-term value’ rather than pleasure that is merely short-lived. [5] From a modern-day perspective, this would be the difference between earning vast sums of money but spending it all at once, as opposed to spending money wisely, ensuring it will last and provide you with a good life. [6]

In the early modern era, the importance of happiness began to emerge in the political sphere. [7] In 1726, the Scottish philosopher Francis Hutcheson (1694-1746) wrote:

‘that Action is best, which procures the greatest Happiness for the greatest Numbers; and that, worst, which, in like manner, occasions misery.’ [8]

This utilitarian principle, which aims at the greatest happiness of the greatest number, essentially asserts that an action is right if it produces happiness and wrong if it produces the reverse of happiness. [9]

Jeremy Bentham 1748 – 1832

Jeremy Bentham, image from http://sueyounghistories.com/archives/2010/06/13/jeremy-bentham-1748-%E2%80%93-1832/

The most famous advocate of utilitarianism was the English philosopher and jurist Jeremy Bentham (1748-1832). Bentham proposed many social and legal reforms, such as complete equality between the sexes, and put forward the idea that legislation should be based on morality. [10] Identifying the good with pleasure, in his 1789 book An Introduction to the Principles of Morals and Legislation, Bentham wrote:

‘Nature has placed mankind under the governance of two sovereign masters, pain and pleasure. It is for them alone to point out what we ought to do, as well as to determine what we shall do.’ [11]

By stating that happiness can be understood in terms of the balance of pleasure over pain, Bentham makes an ethically hedonistic claim: the notion that only pleasure is valuable, and displeasure or pain is valueless. [12]

In 1861, the English philosopher and economist John Stuart Mill (1806-1873) published one of his most famous essays, Utilitarianism, written to support the value of Bentham’s moral theories. The general argument of Mill’s work proposed that morality brings about the best state of affairs, and that the best state of affairs is the one with the largest amount of happiness for the majority of people. Mill also defined happiness as the supremacy of pleasure over pain; however, unlike Bentham, Mill recognised that pleasure can vary in quality. Whereas Bentham saw simple-minded and sensual pleasures, such as drinking alcohol or eating luxurious foods, as just as good as complex and sophisticated pleasures, such as listening to classical music or reading a piece of literature, [13] Mill argued that:

‘the pleasures that are rooted in one’s higher faculties should be weighted more heavily than baser pleasures.’ [14]

Mill’s version of pleasure also links back to the tradition of Aristotle’s virtue ethics, as he stated that leading a virtuous life should be counted as part of a person’s happiness. [15]

Ultimately, ‘happiness’, at least from a political viewpoint, took its deepest roots in the New World. Thomas Jefferson (1743-1826) asserted that:

‘The care of human life and happiness and not their destruction is the first and only legitimate object of good government.’ [16] 

He believed that a good government was one that promoted its people’s happiness by securing their rights.

First Printed Version of the Declaration of Independence

First Printed Version of the Declaration of Independence, 1776, image from http://www.loc.gov/exhibits/creating-the-united-states/interactives/declaration-of-independence/pursuit/enlarge5.html

‘Life, Liberty and the Pursuit of Happiness’, the three ‘unalienable rights’, is the phrase most often quoted from the 1776 American Declaration of Independence. Today, Americans interpret ‘the pursuit of happiness’ as a right to follow one’s dreams and chase after whatever makes one subjectively happy. [17] However, Professor James R. Rogers of Texas A&M University argues that happiness in the public discourse of the late eighteenth century did not simply refer to an emotional state. Instead, it meant a person’s wealth or well-being. [18] It included the right to meet ‘physical needs’, but it also encompassed an important religious and moral aspect. The Massachusetts Constitution of 1780 affirmed that:

‘the happiness of a people and the good order and preservation of civil government essentially depended upon piety, religion and morality, and… these cannot be generally diffused through a community but by the institution of the public worship of God and of public instructions in piety, religion and morality.’ [19]

Statements like these can be found in many documents of the time. Essentially, ‘happiness’ in the Declaration should be understood as a virtuous happiness, again similar to Aristotle’s ‘Eudaimonia’. Although the ‘pursuit of happiness’ includes a right to material things, it goes beyond that to include a person’s moral condition. [20]

After searching for the philosophy of happiness in twentieth-century Britain, I came across Bertrand Russell’s (1872-1970) The Conquest of Happiness, published in 1930. To my surprise, I found his beliefs on happiness rather modern, and similar to the sort of ideas about happiness you can read about in self-help books today. Nevertheless, I found his work inspiring. Russell wrote this book to ‘suggest a cure’ for the day-to-day unhappiness that most people suffer from in civilised countries.  [21]

The key concept of happiness that I took away from Russell’s book was to stop worrying:

‘When you have looked for some time steadily at the worst possibility and have said to yourself with real conviction, “Well, after all, that would not matter so very much,” you will find that your worry diminishes to a quite extraordinary extent.’ [22]

This also means no longer worrying about what other people think of you, since most people will not think about you anywhere near as much as you imagine [23], essentially suggesting that people overestimate other people’s negative feelings about them.

With around two thousand self-help books published every year, it can be argued that happiness is more central to modern-day society than at any other time in history. [24]

However, as well as aiming to achieve happiness, there is now a huge emphasis on reducing the symptoms which prevent happiness, such as anxiety and depression. According to the Huffington Post, around 350 million people around the world are affected by some form of depression. These staggering statistics have led to the creation of organisations such as Action For Happiness, whose aim is to reduce misery in people’s lives and to encourage people to create more happiness and less unhappiness in the world.

Russian writer Leo Tolstoy (1828-1910) once said:

“If you want to be happy, be.” [25]

The idea that we can simply choose to be happy, regardless of certain aspects of our life that we want to change, is also a prevalent idea today. The best-selling song of 2014, Pharrell Williams’ Happy promotes this idea:

‘Because I’m happy, clap along if you feel like a room without a roof.’ [26]

When asked what these lyrics meant, Williams stated that happiness has no limits and can be achieved by everyone.


Pharrell Williams’ reply. Image from https://twitter.com/Pharrell/status/431011318737698816?ref_src=twsrc%5Etfw

Finally, the idea that everyone can achieve happiness was a theme taken up by Sam Berns. Berns suffered from Progeria and helped raise awareness of the disease. He died in 2014 at the age of seventeen, a year after appearing in a TEDx talk called ‘My Philosophy for a Happy Life’. In this inspiring video, Berns shares the four key concepts that helped him lead a happy life.

1) Overcome obstacles that prevent happiness.

2) Instead of focussing on what you can’t do, focus on what you can do.

3) Surround yourself with people who bring positive energy into your life.

4) Don’t waste energy on feeling bad for yourself.

Overall, it appears that there is no such thing as a single concept of ‘happiness.’ From classical antiquity all the way through to the present day, the idea of what happiness means culturally and philosophically has developed, and it will most likely continue to change in the future.



Muhammad Domun took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post he writes about ‘radical’ – one of the most politically charged of philosophical keywords.

“I was determined that my Radicalism should not be called in question.”

– Charles Dickens (The Letters of Charles Dickens)

From the newly elected FIFA president’s promise of a radical realignment of football, to David Cameron’s nonsensical arguments about the “traditional submissiveness of Muslim women” being key to the radicalisation of Muslim men, the words radical, radicalism and radicalisation remain some of the most overused, and in all likelihood misused, words in politics and media today. Stemming from the late Latin word radix, the word radical originated in the 14th century, meaning “of or having roots”. However, the general use of the word, where it usually means going to the origin of a concept, dates from the 1650s. The political sense of the word, which refers to change from the roots, was first recorded in 1802, whilst “radical reform” had been a current phrase since 1786. It was not until 1921 that the word radical gained the connotations of the unconventional. Nowadays radical generally means relating to or affecting the fundamental nature of something; far-reaching or thorough. Radicalism stems from 1817, coined by none other than Bentham himself in his “Plan of Parliamentary Reform, in the form of a Catechism”. Bentham and the legacy which led to the concept of philosophical radicalism is something we will discuss in greater detail later on. Radicalisation has its political origins in 1820. It also has a much more damning interpretation, with the Home Office defining radicalisation as “The process by which people come to support terrorism and violent extremism and, in some cases, then join terrorist groups.” In essence, while the word radical itself intimates change, the concepts of radicals, radicalism and radicalisation remain politically indeterminate, since they are defined solely by the fundamental character of the desire for change, rather than by any particular political principles or description of the desired end-state of the reform. It challenges the status quo and asks for more.

Philosophical radicalism.

The term philosophical radicalism refers to a group of utilitarians who had a very idealistic social formula: reduce pain and increase pleasure. Amidst accusations of hedonism, the Philosophical Radicals maintained that all laws and institutions should exist on the basis of their usefulness, or utility, to the general happiness of the masses. This felicific calculus stemmed from the cornerstone of Jeremy Bentham’s manifesto of utilitarianism, the Introduction to the Principles of Morals and Legislation, in which Bentham famously remarked that:

“Nature has placed mankind under the governance of two sovereign masters, pain and pleasure. It is for them alone to point out what we ought to do, as well as to determine what we shall do.”

Whilst this may seem generally in the order of things, and certainly not radical, the second premise of Bentham and the Philosophical Radicals may have been slightly more troublesome. By their logic, as all the contemporary systems in the world were failing, or had failed, to accommodate the greatest happiness for the greatest number, the Philosophical Radicals concluded that all existing systems should be abolished and substituted with systems more beneficial to the perpetual happiness of the people. Yet, following true to the steps of Bentham, they maintained that this could only be done through gradual reform and not violent revolution. The Philosophical Radicals did not draw only on Bentham; they also took from Malthus, Ricardo and Hartley. Among their midst are famous names such as John Stuart Mill and his father James Mill, George Grote and John Roebuck. They spread their ideas through a series of publications, namely the Morning Chronicle, the Westminster Review and the London Review. Their main focus was electoral reform. Whilst their effort to establish a radical party in Parliament did not succeed, their utilitarian ideas did permeate politics and played a leading role between 1820 and 1850; some even argue that it was an age of reform.

The philosophical anarchist William Godwin (yes, the same person who wrote Jack and the Beanstalk) was similarly attracted to the radical concepts of Bentham. Whilst he accepted the notion of a world composed of rational men who seek pleasure and avoid pain, Godwin concluded that the greatest happiness could only become a reality if there were no law, no state, no marriage, no state-sponsored morality and no church. In essence, Godwin was against anything that kept man from being truly human. In response to conflict, Godwin argued that all could be solved through “disinterested benevolence”. This disinterested benevolence consisted, of course, of intellectual discourse, honesty and sincerity, which could all be achieved if men discussed things rationally. Alas, according to Godwin, under the modern system men take what they do not have, kill what they do not like and exploit others for their own materialistic gains.

Godwin remarked:

With what delight must every individual friend of mankind look forward to the auspicious period, the dissolution of political government, of that brute engine which has been the perennial cause of the vices of mankind, and which . . . has mischiefs of various sorts incorporated with its substance, and not otherwise removable than by its utter annihilation.

The question remains as to what individuals like Bentham, Godwin and the rest of the philosophical radicalism movement were calling for. The short, and perhaps hugely sweeping, answer would be that this was a group of men trying to solve mankind’s qualms through rationality; the longer, and perhaps more accurate, answer is that these were individuals trying to reform and change society.

Radicalism in Politics.

Whilst radicalism may have had its origins in politics, its connotations have morphed into something quite unrecognisable since the 1980s. Whilst prior to the 80s, in Britain at least, radicalism was mostly associated with the left, the radical Right, briefly forgotten in Britain, reared its all-too-familiar head again in the 1980s. Such has been the legacy of this radicalism of the right in Britain, in fact, that it was thought somewhat convenient, and even correct, to invent the oxymoronic idea of a radical centre in order to combat it, appropriating the rhetoric of radicalism for its opposite: a white-collar government of gradual sectoral changes aimed at maintaining the status quo – in essence, conservatism in the most Burkean sense possible. This antithesis of radicalism still holds strong, as many liberal parties in Europe still name themselves “radical” but are very much centrist.

In Britain, contrary to popular belief, it is not such a bad thing to be regarded as a radical, although which side of the fence you stand on does matter. For one, the late Geoffrey Howe was celebrated as a quietly spoken radical by George Osborne, whilst David Cameron was urged to be “fearlessly radical” when it came to a few of his reforms. Yet Labour is very scared of portraying itself as such. Why? Maybe because the Tories have four more years in power whilst Labour is still reeling from the loss of supporters at the 2015 general election.


Infographic published by French government in the wake of the Charlie Hebdo attacks warning people of the signs of Radicalisation

For some reason, calling Cameron or Osborne radical suggests strong leadership, whilst calling Corbyn radical insinuates “crazy leftie”. Similarly, on the very wrong side of radicalism you have the likes of ISIS and Iran, who, last time I checked, do not really like each other. Yet they are still portrayed as radical, as it seems a perfect umbrella under which to place their viciously authoritarian approach. Indeed, the decision to name the process of a British Muslim becoming sympathetic to ISIS “radicalisation” seems horribly problematic, as you are figuratively endorsing ISIS’s dubious assertion that they represent Islam. However, it seems that this may just have slipped the mind of the government’s think tanks, as the Government strategy paper “Prevent” mentions radicalisation no fewer than one hundred and eighty times. Yet it completely fails to mention that ISIS do not represent the true roots of Islam, and the piousness of ISIS fighters is itself very dubious. To the government, the route that Muslims should take is that of becoming a “moderate Muslim”. Well, “moderate”, as Raymond Williams argued in his brilliant Keywords, “is often a euphemistic term for everyone, however insistent and committed, who is not a radical.” Thus all Muslims should hold their beliefs half-heartedly and less than half committed, or risk being accused of being radicalised. Yet it seems that it is for the good of all of us if the Conservative party revels in radical extremism.

Hence, in conclusion, the word radical and its derivatives all call for change, whether that change is towards the status quo or against it. It is no surprise that a word of such volatile nature is misused and misunderstood. Nonetheless, it is still a crying shame to see how politicians are using such ambiguous words in such a circular and highly racialised manner.


Maya Bhogal took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘mystic’ as a philosophical keyword.

Today, when someone says the word mystic, the most common connotation might be the figure of Mystic Meg, astrologist and psychic for The Sun. As in the picture below, Meg is often depicted holding a crystal ball in an ethereal setting. She is the archetype of a popular understanding of the word mystic. However, the original ‘mystics’ would be somewhat puzzled over this definition.

Mystic Meg with her crystal ball.


The online Oxford English Dictionary defines a mystic as a person who can grasp spiritual truths; i.e. something beyond the realm of ordinary knowledge. This definition perhaps explains why Meg is adorned with the title of ‘Mystic’, beyond its mellifluous alliteration, obviously. Through her (alleged) ability as a psychic to tell the future, she is able to transcend the boundaries of her own world in order to gain knowledge or insight into another realm.

However, earlier definitions of the word mystic, with roots in fifteenth-century ecclesiastical history, define a mystic as an exponent of mystical theology. So traditionally, mystics were a phenomenon related to religion and God. The most prominent expounder of the mystical ‘experience’ as a religious one was William James, in his influential work The Varieties of Religious Experience (1902).[i] He writes that it is an experience found across many world religions: Hinduism, Buddhism, Judaism, Islam, and, most prominently in Britain, Christianity. Yet somewhat paradoxically, he also highlights the view that these experiences are so profound that they cannot be communicated through ordinary language, making them very isolated and individualistic experiences.

The OED’s subsequent definition says that, in later usage, a mystic was a person who seeks union with, or absorption into, God. We can see this understanding of the word emerge around 1856 in relation to the Christian Mystics. When this is woven together with James’ idea of the mystical ‘experience’, we can understand the nineteenth-century philosophical understanding of mysticism as the ineffable experience of becoming one with God. Talk of mysticism coincided with the rise of spiritualism in Britain.

In 1848, Kate and Margaret Fox of New York used a primitive system of knocks to attempt to establish a dialogue with the spirit of Charles B. Rosma, who they believed had been haunting their home; after excavating their cellar, they found what they believed to be his skeleton. This episode, for many scholars, represented the turning point at which spiritualism began to gain a real following. By 1853 there were over 100,000 converts to spiritualism in the United States, and by 1900 the trend had caught on across the Atlantic with another 100,000 followers.

Looking at the Google Ngram Viewer, we can see that the use of the word ‘mystic’ in books began to increase greatly from this moment onwards. A facet of spiritualism is the belief that a medium has the ability to channel a spirit which is separate from themselves, often allowing them to be overcome by that person. This draws distinct parallels with the mystical experience, which is defined by the Stanford Encyclopedia of Philosophy as:

“A (purportedly) super sense-perceptual or sub sense-perceptual unitive experience granting acquaintance of realities or states of affairs that are of a kind not accessible by way of sense-perception, somatosensory modalities, or standard introspection.”[ii]

Google Ngram depicting the use of the word 'Mystic' from 1800 - 2000


As the Google Ngram Viewer shows, writings about ‘mysticism’ reached their peak at the beginning of the twentieth century. Evelyn Underhill was an influential figure during this period with her 1911 book, Mysticism: A Study of the Nature and Development of Man’s Spiritual Consciousness, amongst other works. Mysticism in particular became a staple introductory text on the topic for the following fifty years.[iii] In it, she argues that mysticism is “the expression of the innate tendency of the human spirit towards complete harmony with the transcendental order.” Following the tendency to regard mysticism as an ineffable experience, Underhill expounds the means to achieve this experience rather than the nature of the experience itself. Underhill’s extensive writings about mysticism, in which she defines it explicitly as a transcendental experience, influenced the aforementioned William James amongst others.

 A rather saintly portrayal of Evelyn Underhill by Ron Dart, no doubt to emphasise her transcendental connection


However, Bernard McGinn argues that this nineteenth- and early twentieth-century definition of mysticism is not accurate. McGinn, and later Ralph Norman, contend that during the Romantic era mysticism was an academic endeavour, predominantly pursued by German Romantics. They interpreted mystical texts in the context of romanticism, thereby imbuing them with their own notions of creating unity and harmony between the self and nature to overcome the evils of modernity. “It became a form of religious romanticism concentrated on the themes of individual subjectivity and heightened emotional or psychological states.” Norman instead argues that the only correct context for interpreting these texts is within Christian doctrine, because Christian mysticism is filled with expressions of theological reflection.[iv]

The definition of the ‘mystic’ continued to be misconstrued throughout the majority of the twentieth century. This is evident in Steven Katz’s collection of essays Mysticism and Philosophical Analysis (1978), whose authors had all relegated the importance of the primary texts of mysticism in their analysis. These sources normally range from the Hellenistic to the medieval period. Denys Turner argues that modern interpretations of mysticism have essentially invented the word, and that “we persist on reading back the terms of that conception upon a stock of mediaeval authorities who knew of no such thing.”[v] On such a reading, established mystics including Meister Eckhart, John of the Cross, Julian of Norwich and Dionysius the Areopagite are not in fact mystics at all!

After the realisation of this misunderstanding, the attention of philosophers of mysticism shifted back to the primary texts, revealing what traditional believers in mystical theology actually subscribed to. The word mystikos is derived from the Greek myo, meaning “to close”, especially one’s eyes. When we apply this understanding to the Hellenistic mystical text Mystical Theology, written by Dionysius the Areopagite, it becomes apparent that the pursuit of the mystic was understanding the philosophy behind the idea of an ineffable God, who is hidden from us and therefore beyond our understanding.

Dionysius the Areopagite


This definition of mysticism then, would arguably make Thomas Aquinas one of the most important Christian mystics as he advocated the theory of negative theology. His theory argues that Christians should not understand God by what he is, but rather what he is not. Aquinas’ teachings are fundamental in Christian theology, which views the essential nature of God as being beyond the capacity of the human mind.

Modern notions of mysticism tend to be based on the idea of a parting of ways between the theologian and the mystic, but the traditional view does not separate the two. The Russian philosopher Vladimir Solovyov argued that mysticism was the remaining aspect of knowledge which reconciled the speculative gaps between rationality and empiricism. In his work Philosophical Principles of Integral Knowledge, he rejects rationality and empiricism as the only foundations of knowledge, arguing that mysticism provides a more transcendental understanding of philosophy. Interestingly, writing within the framework of the modern conception, he argues that it is important to distinguish this from ‘mysticism’ proper, which denotes reflection on that connection with the transcendental.

This is somewhat in line with the direction of early theological and philosophical discussions of mysticism, which, according to Jerome Gellman, do not obey traditional laws of logic and reason.

In modern times, mysticism, with its association with experiences connecting one to the divine or to supernatural knowledge, has been considered an irrational and superstitious pursuit. Many have dismissed it as a “queer” phenomenon of sociological or psychological behaviour, as it does not operate within modern conceptions of epistemology.[vi] Interestingly, Grace Jantzen has argued that the stress on mystical experiences as ineffable has largely been an attempt to consign mysticism to the status of an “irrational” belief and remove it from rational discourse into the realm of emotions.

As such, discussion of mysticism has dwindled back to the levels of the eighteenth century. The common perception of the mystic as an eccentric fortune-telling figure – but ultimately not someone (one should hope!) that people take too seriously – appears a huge anti-climax in the history of mysticism, which was once a philosophy that championed a huge section of Christian theology.


Further Reading

Stanford Encyclopedia of Philosophy – Mysticism

Stanford Encyclopedia of Philosophy – Dionysius the Areopagite

Oxford Dictionary of National Biography – Evelyn Underhill

William James, The Varieties of Religious Experience, (1902).

Ralph Norman, “Rediscovery of Mysticism” in, ed. Gareth Jones, The Blackwell Companion to Modern Theology, (Blackwell Publishing Ltd: Oxford, 2004).

Evelyn Underhill, Mysticism: A Study of the Nature and Development of Man’s Spiritual Consciousness, (1911)


[i] William James, The Varieties of Religious Experience, (1902).
[ii] http://plato.stanford.edu/entries/mysticism/
[iii] Evelyn Underhill, Mysticism: A Study of the Nature and Development of Man’s Spiritual Consciousness, (1911)
[iv] Ralph Norman, “Rediscovery of Mysticism”
[v] Denys Turner, The Darkness of God: Negativity in Christian Mysticism, (Cambridge: Cambridge University Press, 1995).
[vi] Constantinos Athanasopoulos, “The Validity and Distinctness of the Orthodox Mystical Approach in Philosophy and Theology and Its Opposition to ‘Esse ipsum subsistens’”, Revista Portuguesa de Filosofia, (68:4, 2012).


Emmeline Wilcox took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘moral’ as a philosophical keyword.

Growing up as a Harry Potter fanatic, I lapped up all the moral messages that littered the books. The importance of empathy, friendship, loyalty, love, and acceptance were set out as qualities that should be fostered. Finding moral lessons in stories is not unfamiliar ground when you’re a child. In fact, as children we are bombarded with moral messages. A children’s TV show, for example, will most likely take you on a journey of moral discovery. Characters make mistakes, and lessons are learned. In this way, the child can reap the moral benefits without the anguish of making mistakes. Teaching moral behaviour in children is essentially teaching a child how to live in our society, and what is expected of them. To my delight, a whole book has been dedicated to child moral development using Harry Potter as a basis. However, this use of the word ‘moral’ in terms of a lesson is just one of the ways we use the word. Exploring other definitions and usages of the word takes us deep into our history.

‘The Potter books in general are a prolonged argument for tolerance, a prolonged plea for an end to bigotry. And I think it’s one of the reasons that some people don’t like the books, but I think it’s a very healthy message to pass on to younger people that you should question authority and you should not assume that the establishment or the press tells you all of the truth.’ – J.K. Rowling.

The word ‘moral’ first appeared in the English language around the mid to late 14th century. It is thought that the word was first used by John Gower to translate the Latin title of St. Gregory the Great’s moral exposition on the Book of Job (Moralia, sive Expositio in Job) for use in his poem, The Lover’s Confession (c. 1393). The Latin word moralis (the singular of moralia) literally means ‘pertaining to manners’ – with mos meaning manner, custom, or law. The word ‘moral’ was subsequently used in translating many other classical works. The use of moral as an adjective can similarly be traced back to the late 14th century – first appearing in Chaucer’s The Canterbury Tales with the term ‘moral virtue’ (c. 1387–95). Crucially, the word emerged as both an adjective and a noun around the same time: Gower used it as a noun, and Chaucer as an adjective.

St. Gregory the Great

“Forthi Gregoire in his Moral / Seith that a man in special” – Gower.


Moral: Human Behaviour

The earliest use of the word as an adjective merely indicated human behaviour and was unchanged from its Latin meaning. It is this definition from which the term ‘moral philosophy’ has sprung. Moral philosophy and ‘ethics’ are often seen as one and the same and are frequently used interchangeably. Surprisingly, the two words actually share a common evolutionary ancestor: the Latin word moralis was coined by Cicero when translating the ancient Greek word ethikos. Moral philosophy, being concerned with the big question of how we should live our lives, is a topic that has been taken up by many philosophers over the course of history: Aristotle’s Nicomachean Ethics, Immanuel Kant’s Groundwork of the Metaphysics of Morals, and G.E. Moore’s Principia Ethica are just a few of the works that explore moral philosophy and ethics. Moral philosophers have traditionally been divided into two categories: moral universalists and moral relativists. Moral universalists believe that there is a moral code which all humans share, and that it can be arrived at through rational thought. Moral relativists (and later moral sceptics) denied that such universal moral laws exist, noting that different cultures have had very different moral codes.

In the mid 20th century, the meaning of ‘moral’ which referred to behaviour was taken up by scientists and transformed into something that could be tested and measured. With the advent of behavioural psychology in the 1920s, psychologists began to test morals and explore moral development. Psychologists such as Jean Piaget and Lawrence Kohlberg came up with theories of moral development in children. In adults, the famous 1963 Milgram experiment tested the strength of a person’s ‘moral imperative against hurting others’. Milgram’s experiment is certainly a mirror of its time – the study was set up as an attempt to explain how the Holocaust could ever have occurred.

“Stark authority was pitted against the subjects' [participants'] strongest moral imperatives against hurting others, and, with the subjects' [participants'] ears ringing with the screams of the victims, authority won more often than not.”


Moral: good behaviour or having good principles 

In the mid 15th century, the word took on a positive definition – meaning virtuous, or good behaviour and beliefs. We can infer a great deal about a society or person when the term moral is used as a justification for views or behaviour. Most starkly, we can begin to deduce someone’s general political views when he or she states that something is ‘moral’. For example, in 2004(?), Tony Benn stated that ‘there is no moral difference between a stealth bomber and a suicide bomber. Both kill innocent people for political reasons.’ Conversely, and more recently, the exact same word was used by David Cameron to justify an opposing stance: when attempting to justify his position on airstrikes in Syria, Cameron claimed that there was a ‘moral case’. Even Katie Hopkins, famed for causing outrage by expressing views which oppose generally accepted morals, still likes to think she has a ‘moral compass’. Hopkins claims that her moral compass is only different because it is ‘not to do with religion […], it’s something to do with Britishness.’ When people talk about what they believe is moral, we can make a good stab at where they are on the political spectrum. When Katie Hopkins associates her morals with nationalism, we can infer that she is a nationalist and therefore right wing. When Tony Benn associates morality with an anti-war agenda, we can reasonably guess that he is sympathetic to the left. George Lakoff’s book Moral Politics: How Liberals and Conservatives Think ultimately illustrates how the difference in morals is the fundamental factor when it comes to differing political views.

What is perhaps most illuminating is that no one wants to come across as someone who is not in possession of morals. ‘You have no morals’ is an insult – one often aimed at the youth – so apparently all those moral lessons in Harry Potter wear off by adolescence. A world full of these immoral youths is a very frightening place – so much so that this fear of immorality has been given its own term: moral panic. The term first appeared in Galaxy Magazine in 1877, when it was stated that the financial crisis of 1873 ‘has been followed by a moral panic.’ However, it wasn’t until the mid 1970s that the western world really entered a state of moral panic, with use of the term increasing dramatically from this period. The increase could possibly be attributed to growing crime rates and illegal drug use – drug use reached its peak in 1995, when it was reported that almost half of young people in Britain had tried an illegal drug, compared to 1950, when only 5% of young people were reported to have done so. Of course, the HIV/AIDS epidemic of the 1980s, nicknamed the ‘gay plague’, created an enormous amount of moral panic around sexuality and promiscuity.

Moral Panic

The moral panic of the HIV/AIDS epidemic feeds into the definition that surprised me the most during my research on the word ‘moral’. One of the definitions given to me by dictionary.com was that ‘moral’ meant ‘virtuous in sexual matters; chaste’. Out of the many definitions, this was the only time that a dictionary had given me a precise example of moral behaviour as its definition. What seems very illuminating is that it was in sexual matters that the dictionary was telling us how to behave. Why is being chaste seen as the pinnacle of being moral? Why had none of the behaviours that J.K. Rowling had taught me made it into the definition? Is this definition just a hangover from a pre-1960s age? Luckily, a quick search for chaste in the Oxford English Dictionary confirmed for me that one definition of chaste was ‘Morally pure, free from guilt, innocent. Obs.’ – with ‘Obs.’ actually meaning obsolete (and not ‘obviously’, as I began to believe, thinking that the OED was giving me attitude through text language). The strong association of morality and sex in western culture can be traced back to the beginnings of Christianity. Even before this, chastity was encouraged as a way of increasing paternal certainty and encouraging ‘male paternal investment’. Thus, we can infer that the use of the word ‘moral’ to mean ‘chaste’ was almost inevitable, with roots dating back thousands of years.

“In Western culture, at least since its Christian formation, there has been a perduring tendency to give too much importance to the morality of sex… ‘morality’ is almost reduced to ‘sexual morality’” – Margaret Farley

‘The Moral of the Story…’

The word moral is delightfully versatile. It is used to justify, to insult, and to scaremonger. The way people use the word can reveal their views about politics, or even their views about sex. It is used in Parliament, in scientific reports, and in philosophical treatises. Yet it is simultaneously used in tabloids – with even the likes of Katie Hopkins appearing to give some thought to its meaning.


Aristotle, Nicomachean Ethics, (London: Penguin Classics, 350 BC/2004)

Benedictus, Leo, ‘How the British Fell Out of Love With Drugs’, The Guardian, 24/02/2011, <http://www.theguardian.com/society/2011/feb/24/british-drug-use-falling> [Accessed: 26/02/2016]

Burkart, Gina, A Parents Guide to Harry Potter, (Illinois: InterVarsity Press, 2009)

Cameron, David, quoted in, Andrew Sparrow, ‘Cameron sets out ‘moral case’ for airstrikes against Isis in Syria – Politics live’, The Guardian, 26/11/2015, <http://www.theguardian.com/politics/blog/live/2015/nov/26/cameron-statement-syria-isis-air-strikes-not-a-sign-of-weakness-politics-live> [Accessed: 26/02/2016]

Chaucer, Geoffrey, The Canterbury Tales, (London: Penguin Classics, 1475/2003)

Gower, John, Confessio Amantis, (Charleston: Nabu Press, c. 1399/2012)

Hopkins, Katie, quoted in, Hesham Mashhours, ‘‘Do You Have a Moral Compass?’ – BlewsWire Interviews Katie Hopkins’, BlewsWire,  <http://blewswire.com/do-you-have-a-moral-compass-blewswire-interviews-katie-hopkins/> [Accessed: 26/02/2016]

Kant, Immanuel, The Groundwork of the Metaphysics of Morals, 2nd edn, (Cambridge: Cambridge University Press, 1785/2012)

Lakoff, George, Moral Politics: How Liberals and Conservatives Think, (Chicago: University of Chicago Press, 2002)

Moore, G.E., Principia Ethica, (Kent: Dover Publications, 1903/2004)

Price, Michael E., ‘From Darwin to Eternity: Why Does Morality Focus So Much On Sex?’, Psychology Today, (2013), <https://www.psychologytoday.com/blog/darwin-eternity/201307/why-does-morality-focus-so-much-sex> [Accessed: 26/02/2016]

Thatcher, Margaret, ‘The Moral Foundations of Society’, Speech delivered at Hillsdale College, 11/1994.

Thompson, Gavin, Hawkins, Oliver, Dar, Aliyah, Taylor, Mark, Olympic Britain: Social and Economic Change Since the 1908 and 1948 London Games, (London: House of Commons Publications, 2012)

Westacott, Emrys, ‘Moral Relativism’, Internet Encyclopedia of Philosophy, <http://www.iep.utm.edu/moral-re/> [Accessed: 27/02/2016]


Chloe Pritchard took the ‘Philosophical Britain‘ module at Queen Mary in 2016. In this post she writes about ‘Aesthetics’ as a philosophical keyword.

Aesthetics has been defined by the Oxford English Dictionary as ‘a set of principles concerned with the nature and appreciation of beauty, or the branch of philosophy which deals with questions of beauty and artistic taste’.

Appreciation of pleasing aesthetics, or ‘beauty’, has been perceived by some academics as a means of communicating that begins as early as childhood. As Barbara Herberholz hypothesises, aesthetic judgement involves an awareness of our environment, and early childhood is the time for aesthetic beginnings and foundations. The appreciation of aesthetics thus resonates deeply within our society and culture.

To understand how the appreciation of pleasing aesthetics has become so thoroughly permeated into our society, we must study its philosophical and historical origins. The appreciation of beauty, or of pleasant aesthetics, arose primarily from prominent ancient Greek philosophers such as Plato, Pythagoras and Aristotle. These philosophers explored aesthetics through a combination of empirical and moral approaches: counting pleasing aesthetics among the ‘highest forms’ of goodness on Earth, attempting to quantify the mathematical properties of ‘beauty’, and treating perfect aesthetics in a human as a paragon of virtue.

Sandro Botticelli, The Birth of Venus (c. 1486). Tempera on canvas, Uffizi, Florence. In Roman mythology, Venus was the Goddess of beauty, love, sex, fertility, prosperity and desire. The Goddess Venus is often seen as the classical personification of beauty.


From the 1700s onwards, the study of aesthetics became a topic of philosophical interest. Prominent philosophers such as Hume and Kant both addressed the philosophical appreciation of beauty: they argued that aesthetic pleasure was something universally shared, and that pleasure is derived from a universal agreement of good taste. This universality of aesthetics has been questioned by some modern feminists, who argue that the 18th-century philosophers who could afford (and who had the leisure time) to produce supposed universal truths were in a position of privilege; these elitist philosophers’ ‘universal truths’ were therefore shaped by their own class and gendered perceptions, often unknowingly excluding the opinions of other classes and of the opposite gender.

During the 1800s, ‘aestheticism’, the movement around the appreciation of aesthetics, promoted the ideal of beauty over social-political themes as the inspiration for literature, fine art, music, and other arts. The discussions around aestheticism became more varied and complex during this era: for instance, music, previously deemed beautiful but trivial in aesthetic terms by Kant in his Critique of Judgement, became divided into two contrasting philosophies in the 19th century: formalism and anti-formalism.

Formalists believed that music could only be appreciated as an intelligent and beautiful manipulation of sound, essentially a pleasant sound aesthetic. Although formalists appreciated music, they believed that it held no deep philosophical meaning. Those philosophically opposed to this concept of music, or the aesthetics of sound, as nothing more than an arrangement of pleasant noises were referred to as anti-formalists. Famous anti-formalists included prominent composers and philosophers such as Richard Wagner and Nietzsche. They believed that the aesthetics of sound, i.e. music, could be utilised as a form of expressing alternative artistic meanings and emotions; however, they believed music could only be used to enhance other artistic mediums, like plays or works of art, and not as its own form of art. The clip below is an extract from Wagner’s famous opera cycle ‘The Ring Cycle’, and shows how music was successfully used to enhance the theatricality of the story.

In the 19th-century literary movement, writers of the ‘aesthetic style’ adopted the bohemian principle that art should exist for art’s sake (from the French expression l’art pour l’art) and that its only requirement was to be beautiful, as demonstrated in an extract from Edgar Allan Poe’s ‘The Poetic Principle’:

“We have taken it into our heads that to write a poem simply for the poem’s sake […] and to acknowledge such to have been our design, would be to confess ourselves radically wanting in the true poetic dignity and force: – but the simple fact is that would we but permit ourselves to look into our own souls we should immediately there discover that under the sun there neither exists nor can exist any work more thoroughly dignified, more supremely noble, than this very poem, this poem per se, this poem which is a poem and nothing more, this poem written solely for the poem’s sake”

This expression, ‘art for art’s sake’, made an artistic comeback in 1975, when it became the title of a single released by 10cc which reached number five on the UK Singles Chart.

By the onset of the 20th century, artists, musicians and philosophers were challenging the assumption that the purest form of ‘art’ was the aesthetically pleasing. The term aesthetics became much broader during the 20th century as new artistic concepts, such as modernism and postmodernism, were introduced into society. The term also became increasingly popular during the 20th century and began to be integrated into discussions on the arts.

My exploration of the term aesthetics in the 20th century will focus primarily upon art, music and literature. If your personal interest in 20th-century aestheticism lies in other subjects, I would recommend books such as Judith Wechsler (ed.), Aesthetics in Science (1988) and Eckart Voland and Karl Grammer (eds.), Evolutionary Aesthetics (2013), which explore scientific and evolutionary aesthetics in the 20th century. (Previews of the books can be found by following the links below.)

Aesthetics in science

Aesthetics of evolution

Musicians, philosophers, critics and poets of the 20th century debated questions which arose from the 19th-century discussion on the aesthetics of music. Prominent composers like Stravinsky and poets such as Ezra Pound warned that listeners should not look for meaning in music, as it distracts them from truly experiencing the music. Many current philosophers, however, disagree with this approach to the aesthetics of music. For instance, the philosophers Stephen Davies and Peter Kivy both believe that music can be an expression of the artist’s emotions and can offer a reflection of the internal state of the performer or composer. Discussions in the late 20th and 21st centuries on the aesthetics of music have focused upon a much larger variety of issues. Current musical aestheticism touches upon topics such as the capacity for music to express emotion, the ‘decline’ of musical complexity in popular music, the concept of ‘bad music’, and whether musical aestheticism can actually be found in current music and the music industry.

Pablo Picasso, Three Musicians (1921), Museum of Modern Art. An example of synthetic-cubism, it is generally believed to represent Picasso, Guillaume Apollinaire, and Max Jacob (Jacob and Apollinaire were both poets).

Artists in the early 20th century attempted to use new and alternative forms of expression which rejected the traditional concept of pleasing and universal aesthetics, a far cry from the Kantian and Humean theories on the universality of aesthetics. Traditional notions of aesthetics were challenged and re-sculpted in early 20th-century artistic movements such as cubism and surrealism. Aesthetics within modernist art in the 20th century became a matter of individual taste, as the principles of modernism stressed freedom of expression, experimentation, and radicalism. The term aestheticism had therefore changed significantly compared to its usage in the 18th and 19th centuries, particularly in losing its philosophical universality.

Literature of the 20th century also attempted to break away from pre-existing notions of beauty and aestheticism. For instance, the famous author Virginia Woolf advocates in her essay on modern fiction the importance of freedom of expression and originality. She also endorsed a move away from societal constrictions on the aesthetics of literature. The aesthetics of literature in the 20th century was intermingled with other cultural influences like art and music. Movements like cubism, originally considered only an artistic movement, became so culturally significant that their influence broadened into other cultural fields, like literature. For instance, William Faulkner’s novel As I Lay Dying and Gertrude Stein’s The Making of Americans contain cubist elements, as they both employ repetition and block passages (which were often rearranged) to construct an overall piece of literature.

This move away from traditional aestheticism within artistic movements can be seen as having potentially led to the increasing usage of the term aestheticism throughout the 20th century, as its place within the arts was discussed, debated and often discredited.

Although this ‘break away’ attitude towards aestheticism was most prevalent towards the beginning of the 20th century, the debating and questioning of aestheticism’s place within artwork and the traditional role of beauty within society has continued to this day. The term aesthetics seems to have changed entirely since its origins: as the importance of traditional beauty in artwork fluctuates, so too do the philosophy and cultural applications of the term.


B. Herberholz, Early Childhood Art, 2nd ed. (Dubuque, IA, 1979).

C. Dahlhaus, The Idea of Absolute Music (Chicago, 1991).

Edgar Allan Poe, The Poetic Principle (1850).

Feminist views on aesthetics

Google Ngram for the word aesthetics

Immanuel Kant, The Critique of Judgement, trans. J. H. Bernard (1892).

Internet Encyclopaedia of Philosophy on aesthetics

Oxford dictionary definition of aesthetics

Simon Frith, Taking Popular Music Seriously: Selected Essays (Surrey, 2007).

S. Olsen, ‘Literary Aesthetics and Literary Practice’, Mind, Vol. 90 (1981).











Lauren Purchase took the ‘Philosophical Britain’ module at Queen Mary in 2016. In this post she writes about ‘Vegetarian’ as a philosophical keyword.

Undoubtedly, I was just one of the 1.2 million individuals in the United Kingdom whom the Vegetarian Society recognises as abstaining from both meat and fish who were recently targeted by the January 2016 advertising campaign for the restaurant chain Gourmet Burger Kitchen. Through multiple images displayed on public transport, including the one below, the marketing campaign perpetuated the dominant societal attitude that eating meat is both the tastiest and the most natural food option.

This campaign inevitably caused a backlash from vegetarians, resulting in media attention from major papers such as the Independent and a subsequent withdrawal by GBK of the most offensive images. However, the damage to the image of the contemporary vegetarian had already been done. That the undermining of the vegetarian extended far beyond GBK’s public images and their connotations is clear from the statement the company released on its Facebook page. Although GBK attempted to apologise, it evidently did not understand the hostile reaction of the nation’s vegetarians, claiming the campaign was ‘light-hearted’. Vegetarians’ reception of the campaign was thereby presented as melodramatic and as representative of the humourless nature which has continually been attributed to them in modern British culture.

Vegetarians themselves undeniably recognised the existence of this stereotypical image in the 1960s, with the 1961 founding of the meatless restaurant Cranks. ‘Crank’ was evidently a derivative of the word ‘cranky’, which deemed all vegetarians simultaneously ill-tempered, odd and sickly. The term was thereby adopted ironically by vegetarians in the 1960s in order to gain a form of control over their own image and humorously combat its negative associations.

Despite the best intentions in the 1960s, however, to literally bend or ‘crank’ societal attitudes towards vegetarians, it is evident that derision has continually prevailed. Vegetarianism has long been mocked in public culture through many different mediums, including not only advertising images such as GBK’s, but also theatre, television and popular public personalities. The ability of these latter forms to ascribe a negative image to vegetarians is demonstrated in the televised 1999 stand-up performance by comedian Jack Dee at London’s Gielgud Theatre, in which he claimed that the only point at which vegetarians were able to ‘muster enough energy to smile for the day’ was upon the discovery of mad cow disease, which reinforced their case for vegetarianism. This sketch again highlights the vegetarian as miserable and sickly from a poorer diet.

Nevertheless, the negative portrayals of vegetarians within the media could also be argued to raise awareness of vegetarianism and to hint at the potentially persuasive philosophical reasoning behind its adoption. This is true even of the GBK marketing campaign, especially with regard to the image caption ‘they eat grass so you don’t have to.’ The contemporary Australian moral philosopher Peter Singer, for example, has recognised that ‘a reduction in the amount of animal flesh consumed by Westerners would release enormous amounts of grain, soybeans and other high-quality plant foods, now being fed to animals, for hungry and malnourished humans.’ [1]  Singer believes this argument proves that the theory of utilitarianism, in which pain is minimised and pleasure maximised, naturally leads to the necessity of becoming a vegetarian. Although abstaining from meat has its drawbacks, including great losses for animal producers, Singer argues that the positive long-term consequences of vegetarianism in easing world hunger far outweigh them, concluding that vegetarianism is the logical choice.

However, contemporary philosophical debate exists with regard to vegetarianism, with the American philosopher Tom Regan challenging Singer’s argument. Regan claims that Singer’s utilitarian stance fails to justify his adoption of vegetarianism, owing to its lack of proof, and has instead offered an ethical theory based on rights to substantiate vegetarianism. Regan believes that the population should become vegetarian because all living things ‘have equal inherent value and an equal prima facie right not to be harmed.’ [2] Evidently, both Singer and Regan have become influential in the British philosophical debate over the adoption of a vegetarian diet.

Nonetheless, Britain is not solely dependent upon notable foreign philosophers to publicly discuss vegetarianism, as the British moral philosopher Mary Midgley has also made a significant contribution to the debate. Midgley has criticised the natural superiority which humans believe they have over animals, due to the continuing existence of ‘the model of concentric circles dividing us from them’,[3] despite the fact that ‘we are not just rather like animals; we are animals.’ [4]

The philosophical debate over the righteousness of vegetarianism, however, long precedes contemporary philosophers, with the sixth-century BC thinker Pythagoras being ‘the father of philosophical vegetarianism.’ [5] In fact, Pythagoras was so influential in the defence of a meatless diet that ‘Pythagoreanism or the Pythagorean diet were synonymous for vegetarianism well into the nineteenth century.’ [6] As the Oxford English Dictionary confirms, the word vegetarian only came into general use from 1847 onwards, with the formation of the Vegetarian Society at Ramsgate in that year. Pythagoras undeniably remains representative of many modern vegetarians, having put forth the three main arguments which still act as paramount in the justification of vegetarianism – namely religion, health and ethics. Firstly, Pythagoras believed in transmigration, which formed the religious basis of his vegetarianism, as this doctrine recognised the ability of the soul after death to be transferred into another body, including that of an animal. Furthermore, Pythagoras believed that it was morally wrong to cause unnecessary pain to harmless animals, even more so when the aim was purely for food, which he believed resulted in an unhealthier body and mind.

Although Ethical Vegetarianism: From Pythagoras to Peter Singer notes that the reasoning behind vegetarianism has changed from antiquity to the present day, it is indisputable that Pythagoras’s three key arguments continued, and still continue, to be used to give credence to vegetarianism. For instance, the predominant arguments for vegetarianism were ethical in antiquity, religious in the Middle Ages and Renaissance, increasingly ethical again during the eighteenth and nineteenth centuries, and multiple in the modern day, encompassing both health and ethical concerns.[7] Perhaps more importantly, this historical study of vegetarianism highlights the constant presence and discussion of a vegetarian diet, thereby undermining the previously demonstrated societal attitude deeming vegetarians unnatural.

The means through which vegetarians substantiated their position initially revolved around literature, including Percy Shelley’s 1813 A Vindication of Natural Diet. In more recent times, however, vegetarians have increasingly been able to use more creative means both to portray their beliefs and to attempt to persuade others to embrace their lifestyle.

In the case of the contemporary English illustrator Sue Coe, art has been effectively utilised both to evidence and to justify her adoption of a meatless diet in this 2011 image:

Sue Coe’s 2011 pro-vegetarian artwork, Image from http://www.troutgallery.org/exhibitions/detail/15

Music has also acted as an important medium through which vegetarianism has been promoted and the societal status quo questioned. This is most explicitly evident in the 1985 song Meat is Murder by the Manchester band The Smiths, in which lead singer Morrissey sings ‘the flesh you so fancifully fry is not succulent, tasty or kind, it’s death for no reason, and death for no reason is murder.’


Despite the evolving means of presenting vegetarianism, however, as has already been noted, key elements of vegetarianism have remained consistent throughout history. Not only has the reasoning behind a meatless diet largely remained consistent, but vegetarians have also continuously been linked to subversive movements which challenged the highest societal authorities. Tristram Stuart has noted that during the Middle Ages the two key heretical sects opposing the Church, the Cathars and the Manicheans, adopted a meatless diet, and even the French Revolution of 1789–1799 became associated with vegetarianism, as it represented equality by denouncing the privileged aristocratic carnivorous diet. [8] That vegetarians continue in modern society to be linked to causes which challenge the societal norm is evident in the work of the American feminist Carol Adams. Adams claims that feminism and vegetarianism are intrinsically linked, as both women and animals are viewed by the current patriarchal society as merely possessions to be dominated and controlled. [9]

The link between the oppression of animals and that of women is constantly expressed throughout everyday culture, as is evidenced in the advert for the Manchester burger restaurant Filthy Cow:

The Burger Chain Filthy Cow’s 2016 Advertising Campaign, Image from: https://twitter.com/filthycowuk?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor

Yet even though vegetarianism has consistently been associated with radicalism, it has never gained enough adherents within Britain to bring about a revolutionary, widespread adoption of vegetarianism and a subsequent mass decline in both animal deaths and industrial livestock production. This is perhaps due to its connotations of revolution, which have threatened society and thereby led to vegetarianism’s constant undermining and mocking throughout history. Furthermore, many significant authorities have also denied the necessity of adopting a vegetarian diet, including St Thomas Aquinas, who believed that humans, being more rational, were naturally superior to animals.

Nevertheless, I can fortunately conclude that there still exists both a point and a hope for individuals who have chosen to become vegetarian. As Singer has acknowledged, the philosopher Jonathan Glover has demonstrated ‘the absurdity of denying that we are each responsible for a share of the harms we collectively cause’ [10], confirming that each vegetarian brings about a slight reduction in the demand for slaughtered animal meat. Ultimately, although the 2% of the British population who are vegetarian are not realistically making a great impact upon the supply of and demand for animal meat, ‘becoming a vegetarian is a way of attesting to the depth and sincerity of one’s belief in the wrongness of what we are doing to animals.’ [11] It is, therefore, only from the position of a vegetarian that one can begin to persuasively encourage others to stop consuming meat, which could eventually create real change.

Further Reading

David DeGrazia, Animal Rights: A Very Short Introduction (Oxford: Oxford University Press, 2002)

Cultural Encyclopedia of Vegetarianism, ed. by Margaret Puskar-Pasewicz (California: Greenwood, 2010)

Tom Regan, ‘Utilitarianism, Vegetarianism and Animal Rights’, Philosophy and Public Affairs, 9/4 (1980), pp. 305-324

Peter Singer, Animal Liberation (London: Pimlico, 1995)

Colin Spencer, The Heretic’s Feast: A History of Vegetarianism (Hanover: University Press of New England, 1996)


Books and Articles

Carol Adams, The Sexual Politics of Meat (New York: Bloomsbury, 1990)

Daniel Dombrowski, The Philosophy of Vegetarianism (Amherst: The University of Massachusetts Press, 1984)

Mary Midgley, Animals and Why They Matter (Georgia: University of Georgia Press, 1998)

Mary Midgley, Beast and Man: The Roots of Human Nature (London: Routledge, 2002)

Tom Regan, The Case for Animal Rights (California: University of California Press, 2004)

Peter Singer, ‘A Vegetarian Philosophy’, in Consuming Passions: Food in the Age of Anxiety, ed. by Sian Griffiths and Jennifer Wallace (Manchester: Mandolin, 1998)

Peter Singer, ‘Utilitarianism and Vegetarianism’, Philosophy and Public Affairs, 9/4 (1980), pp.325-337

Tristram Stuart, The Bloodless Revolution (New York: W.W. Norton & Company, 2007)

Ethical Vegetarianism: From Pythagoras to Peter Singer, ed. by Kerry Walters and Lisa Portmess (New York: State University of New York Press, 1999)

Websites and Online Databases

The Vegetarian Society, https://www.vegsoc.org/ [accessed 7 February 2016]

Definition of vegetarian in the OED, http://0-www.oed.com.catalogue.ulrls.lon.ac.uk/view/Entry/221880?redirectedFrom=vegetarian#eid, [accessed 8 February 2016]

Internet Archive, Percy Bysshe Shelley, A Vindication of Natural Diet, 1813, https://archive.org/details/vindicationofnat00shelrich,  [accessed 6 February 2016]


YouTube, Jack Dee Live and Uncut, 1999, https://youtu.be/luEPXmRJwxA?t=2m8s, [accessed 10 February 2016]

YouTube, The Smiths, Meat is Murder, 1985, https://youtu.be/xacRTqk5QFM,  [accessed 10 February 2016]


[1] Peter Singer, ‘Utilitarianism and Vegetarianism’, Philosophy and Public Affairs, 9/4 (1980), pp.333-334

[2] Tom Regan, The Case for Animal Rights (California: University of California Press, 2004), p. 324

[3] Mary Midgley, Animals and Why They Matter (Georgia: University of Georgia Press, 1998), p.32

[4] Mary Midgley, Beast and Man: The Roots of Human Nature (London: Routledge, 2002), p.xxxiii

[5] Daniel Dombrowski, The Philosophy of Vegetarianism (Amherst: The University of Massachusetts Press, 1984), p.35

[6] Ethical Vegetarianism: From Pythagoras to Peter Singer, ed. by Kerry Walters and Lisa Portmess (New York: State University of New York Press, 1999), p.13

[7] Ethical Vegetarianism: From Pythagoras to Peter Singer, ed. by Kerry Walters and Lisa Portmess (New York: State University of New York Press, 1999)

[8] Tristram Stuart, The Bloodless Revolution (New York: W.W. Norton & Company, 2007)

[9] Carol Adams, The Sexual Politics of Meat (New York: Bloomsbury, 1990)

[10] Peter Singer, ‘A Vegetarian Philosophy’, in Consuming Passions: Food in the Age of Anxiety, ed. by Sian Griffiths and Jennifer Wallace (Manchester: Mandolin, 1998), p.75

[11] Peter Singer, ‘Utilitarianism and Vegetarianism’, Philosophy and Public Affairs, 9/4 (1980), p.337




REVIEW: Women in Philosophy: What Needs to Change?

Katherine Angel is the author of Unmastered: A Book On Desire, Most Difficult To Tell (Penguin/Allen Lane, Farrar Straus & Giroux). She is a Lecturer in Creative Writing at Kingston University, and is completing her second book, an exploration of subjectivity and selfhood in contemporary sex research. She has a PhD from the University of Cambridge’s History and Philosophy of Science Department, and has held Wellcome Trust and Leverhulme research fellowships at the University of Warwick and Queen Mary, University of London. Her writing has appeared in The Independent, Prospect, The New Statesman, Aeon, The Los Angeles Review of Books, and Five Dials, and she reviews for the Times Literary Supplement and Poetry Review.

Katrina Hutchison and Fiona Jenkins (eds), Women in Philosophy: What Needs to Change? (Oxford: Oxford University Press, 2013).

It took me ages to start writing this review. I read Women in Philosophy quite fast, buffeted by many feelings: frustration, sadness, rage. I felt not just frustration at the evident tenacity of philosophy’s problem with women, as documented in the book, nor just exasperation with the disingenuous, self-serving responses that so many in the field have persistently offered, but also a sense of unease at the act of engaging, philosophically, with those indifferent to the problem, or offering woeful apologies for it. I began, in fact, to be fascinated by how some of the book’s essays implicitly raise this question: in engaging with a problem, how much do you engage, patiently, doggedly, with blinkered and lame machinations? Do you patiently invoke every weak and problematic argument in order to dismantle it? Or do you turn your back on these?

There are very few women in academic philosophy. The numbers, as quoted in the book, are closer to those in STEM disciplines than other humanities subjects. In the US, figures from 2003 and 2009 show that women make up 21% of academic philosophers – comparable to figures of 20.6% in the physical sciences and 22.2% in the life sciences in 2009. In (relative) contrast, however, women make up 41% of academics in the humanities as a whole. The figures for the overwhelming whiteness of philosophy, as for many humanities disciplines, must be equally, if not more, dispiriting; and the book acknowledges the racial disparities as similarly pressing.

In the introduction to the book, Jenkins and Hutchison write of the feelings of frustration and anger amongst women about the discipline’s paltry success in improving this imbalance. They write that feelings of rage ‘should not be forgotten’, but that ‘training in the discipline’ can in fact ‘become a resource for careful analysis of the questions at stake.’ And yet, the question of what the discipline is – what methods it prioritizes, what it considers to be legitimate subject-matter, and what kind of world-view it assumes – is precisely what is at issue. The volume itself is torn between approaches to gender imbalance which, in their careful discussion of potential causes and solutions, remain rather timidly respectful of the discipline as a whole, and those which mount more substantial objections to the very core of the discipline.

The exciting essays in the book are the ones that pursue, hard, the question of philosophy’s own self-sustaining image, fundamentally challenging the cultural assumptions that might prevent women from penetrating the field more fully, and men from taking the problem seriously. Any bafflement at the fact that, despite good intentions and little overt desire to exclude women from philosophy, the numbers nonetheless remain pathetically low, needs to be replaced by a serious scrutiny of the culture of a discipline which, in the very manner in which it conceives itself – its rationale, its raison d’etre, and its everyday modus operandi – inherently resists a mode of thought that could enable both a clearer perception of why the problem has arisen, and of what might be done to change it.

But, perhaps understandably, as part of a discipline and a community, professional philosophers themselves undertake this task at times ambivalently, or at least cautiously.  An example is Marilyn Friedman’s chapter in the book. Given that philosophy is often conceived as the exercise of reason in the pursuit of wisdom, she writes, specific images of reason might work implicitly to exclude women. Philosophical reason is often represented as adversarial dialogue, as a contest of wills; an image prevails of the philosopher as a gadfly, a clever reasoner, challenging the errors and confusions of others. The emphasis on adversarial engagement, or on gadflies, or on wise contemplative sages is worrisome because students and/or their teachers may find it harder to see women in these roles, and women themselves, because they are not encouraged to, might not go around thinking of themselves as masterful thinkers achieving synoptic insight, nor as fearsome tearers-down of the confusions of others.

Friedman’s article, though, depressed me. It strikes me as almost unbearable that a woman philosopher has to write an article including a sentence such as this: ‘What are some of the counterbalancing benefits of promoting women’s greater participation in professional philosophy?’ Friedman’s careful, cautious reasoning – ‘First, current female students in philosophy classes might be more likely to be attracted to the field if women were recognizably present in substantial numbers…; Second, most women who are already professional philosophers would likely benefit from an increase in their numbers in the field’ – comes to seem like a symptom of the problem. Her gentle delineation of the problem feels like a way of keeping a resistant community on side; of not alienating them, of not raising their hackles. She writes that these reasons already presuppose that it is valuable to increase the numbers of women in philosophy; ‘We still need independent reasons for thinking that an increase in the number of women in philosophy will, at the very least, improve philosophy and, it is to be hoped, also be good for women.’ And so ‘My third and most important suggestion’ is ‘an independent reason to increase the number of women in professional philosophy, namely, that doing so would improve philosophy.’ Finally! We get to an acknowledgement that perhaps this discipline, its very constitution, is shot through with problems itself. But the manner in which this is done illustrates something of the methodological ambience permeating so much philosophy; a kind of hand-wringing, a slavish love of distinction-drawing. Friedman’s careful, hand-holding raising of this possibility – that there might be something wrong with a philosophy which enables such marginalization of women, and one which is so resistant to self-reflection and change – is an instance of the hemmed-in, constrained nature of philosophical argument. 
The mania for outlining steps (‘this suggestion has two parts’) is in one light simply the mechanics of clear writing; in another, it is the discourse of someone constantly pre-empting the interruptions of a pedant bent on pointing out inconsistencies, on finding a smuggled-in premise, on discovering a non sequitur. It’s a discourse that is anxious not to provoke the ire, irritation, or dismissal of the prototypical philosopher. Friedman’s chapter is haunted by the ever-hovering possibility of a clever young man pointing out a flaw in her argument; where the pernickety pointing out of flaws is a substitute for deep engagement with a big question of huge political importance – who gets to speak and why. That women end up having to write in such constrained tones about such an important flaw in the discipline is evidence of the discipline’s flaws to start with.

A similar atmosphere besets Jennifer Saul’s chapter – an interesting discussion of implicit bias and stereotype threat. Anonymous review of journal articles has, research suggests, led to a 33% increase in representation of female authors; implicit bias is clearly at work, regardless of whether reviewers see themselves as sexist readers. Stereotype threat ensures that, when the low numbers of women are emphasized or drawn attention to, women underperform. If circumstances elicit for someone an awareness of their stigmatized and under-represented status – circumstances that, in male-dominated (and overwhelmingly white) environments, will repeatedly, endlessly do so – then they will underperform. As Helen Beebee explains in her chapter, ‘doing or saying anything that might suggest that you think women are worse at philosophy than men is not needed, any more than seven-year-old girls need to be told that girls are worse at maths in order to underperform on the maths test. They just need to be reminded that they are girls.’

But again, the hemmed-in sense of women having to gently make the case for why this matters is troubling. Saul asks: ‘why should philosophers care?’, and answers that implicit bias and stereotype threat ‘demand action from philosophy simply on grounds of concerns for either a) fairness or b) philosophy, even philosophy as traditionally conceived. No particular concern for women, for feminist philosophy, or even for enriching philosophy with a diversity of perspectives is needed in order to motivate action on the basis of these phenomena.’ There is something shocking about the possibility of writing this; it functions as a reassurance, to an audience hostile to reflecting on itself, that they don’t have to care about women’s experience to pay intellectual attention to these phenomena.

Some chapters are bolder in tone. Fiona Jenkins, in ‘Singing the Post-Discrimination Blues: Notes For a Critique of Academic Meritocracy’, discusses the fact that meritocracy often ends up reproducing the status quo and the dominance of elite groups. It is a common experience that attempts to improve gender balance (or indeed racial balance) are construed as a threat to or a dilution of quality (a concept troublingly laden with notions of purity and mixity). Jenkins argues against seeing gender equity as ‘an “add-on goal” divorced from “aims of excellence”’, where it then ‘seems to compete with selection for qualities presumed to be essential to the discipline’. She suggests instead that the idea of what is essential to the discipline is what needs to be examined, and frames the issue as one of asking the discipline to imagine more inclusive futures for itself. She does this by refusing the way philosophy tends to cast gender as a political/moral consideration external to the ‘neutral and impartial deliberations of academies’. The gendering of philosophy, Jenkins argues, ‘has gone hand in hand with the marginalization of certain orders of critical questions, including those of feminist theory, within the discipline’. Jenkins’ approach, then, is that much more radical and questioning; it looks to examine and change philosophy’s own self-conception and its own content, rather than taking philosophy’s self-rhetoric as a given and tinkering about at the edges, idly wondering why women don’t excel within a field whose rationale, content, and self-image are simply taken for granted as a necessary, un-historical given. In effect, Jenkins suggests that philosophy thinks about the production of knowledge in such a way that it ‘rules out the very relevance of gender in the production of knowledge’. Hence its lackadaisical response to the problem of the dominance of men in philosophy.

There are many interesting chapters in the collection; Dodds and Goddard, as well as Mackenzie and Townley, raise the issue of listening to student experience, and changing details of the curriculum. Dodds and Goddard ask: ‘Are the conceptual issues with understanding personal identity really best captured by examples involving standard-bearers in battle or Martian teletransportation?’ And Mackenzie and Townley argue that ‘some students may be well placed to identify some of our unconscious biases, as they might be highly sensitized in relevant domains’ – perspectives that could do some work in undermining some of the current glibness about students’ political awareness, glibness encapsulated in claims that today’s students are coddled and bullying in their interventions into the way universities are run and the kind of knowledge and perspectives they give space to.

Samantha Brennan’s chapter defends the significance of micro-inequities, arguing that an individualism about acts and their results prevents us from seeing cumulative harms – phenomena such as gestures, tone of voice, patterns of interruption, all the ‘little issues’ that can accumulate in a person’s experience of an environment until they reach breaking-point, their reaction then often cast as an over-reaction because it doesn’t explicitly reveal the sheer multitude of previous micro-harms. (Claudia Rankine discusses analogous processes in relation to racism, in her reflections on Serena Williams, in her award-winning Citizen.) Brennan also discusses the detailed phenomena that can affect how a particular philosopher might be seen by peers and seniors: she asks whether ‘those of us who are socialized to smile, to always make others comfortable, can ever really look smart in that deep-in-thought, furrowed-brow kind of way’, or ‘if we can “hear” soft voices, southern accents [she means the American south; the UK equivalent would be northern accents], and lilting speech as “smart”’.

Justine McGill, in a fine chapter, casts these problems in terms of ‘repeated experiences of having speech acts fail’. She argues that ‘the gendered aspect of embodied experience is something that must be recognized and addressed, not dismissed as irrelevant to philosophical achievement’. It is, she suggests, ‘a privileged male philosopher’s fantasy’ to believe that ‘his body doesn’t matter’. Undeniably true: not least, the experience of being in a room dominated by men, not feeling conscious of one’s gender as one walks into a seminar. And Michelle Bastian disputes the assumption that embodied experience makes little difference to the philosophy one produces by reflecting on the experience of time for women in philosophy; women, she argues, are ‘continually refused a place in the flow of philosophy’s time’.

I studied philosophy, two decades ago, in Cambridge. There was a lot to feel uncomfortable about, to say the least – some of it to do with the bizarre, self-perpetuating and self-mythologising ethos of Cambridge itself, and some of it to do with the particular culture of analytic philosophy that characterized the department there. Occasionally, a ‘special paper’ on feminist philosophy was allowed to feature on the syllabus; there was a distinct sense of this eccentricity, pushed by one of the few female philosophers on the faculty, being tolerated with raised eyebrows by colleagues. I had some very supportive and encouraging teachers, though increasingly I felt baffled by what exactly this discipline was, frozen in time, insulated from other critical thought. Moreover, I had repeated experiences of discomfort and alienation; I was the only woman in the philosophy of logic lectures, week in, week out, taught by a professor who was a total letch, and who made me feel extremely uncomfortable by directing his entire lectures at me, the only woman in a room of about 80 men. In one particularly delightful episode, an esteemed Nietzsche scholar walked around the lecture room on the first day of classes, stopping to chat to his acolytes, and came to my desk, where I sat with another woman (the only two in the room); he stood there for what felt like an age, looking at us with a withering contempt, bordering on disgust. The message was clear. I had been told that he wouldn’t supervise us because we were women; I assumed someone else would be available to teach us creatures; no-one was. I still got the top first in my year.

That felt satisfying; but also meaningless, because a large part of me did not want to be in this constrained, reactionary club, did not want to endorse the ethos, did not want to be part of a community that did not challenge the indifference or hostility of many of its members to the well-being of its women students. I left philosophy, migrating into various departments which, while beset by familiar gender problems, at least acknowledged to some extent the importance of feminist scholarship, as well as their own historical and geographical specificity as disciplines. I have long had an ambivalent and complex relationship to academia, in part because of an emerging writerly identity that pulled me away from ultimately identifying enough with any particular discipline to really want to make a place within it. But I’m relieved that I don’t have to confront these questions in a discipline whose very identity is both so entangled with the problem itself, by casting itself as a supremely neutral method of cleverness, and so invested in being baffled and dismissive of the possibility of its entanglement. A philosophy over-identified with a naïve picture of the discipline as simply the pursuit of wisdom and the interrogation of thought, without location, without context, without constitutive characteristics that allow it to understand itself as situated, contingent, and flawed, is exhausting to behold. Imagine a lifetime of being patronised by advocates of this naivety! Thank God I made my escape early.

In her chapter, Beebee notes the dominance of ‘argument-as-war’ metaphors in philosophy, and a tenacious notion of philosophy as a kind of performative virtuosity. She also queries the accusation that suggesting that an atmosphere of combative virtuosity might ‘drive women out’ of philosophy is ‘demeaning to women’. I too have long found this accusation self-serving. Whether or not a culture of performative combativeness is due to a discipline’s domination by men – and whether or not aggressiveness is a characteristic of men more than of women – the point is that this accusation (that it’s insulting to women to say they cannot tolerate combativeness) in fact assumes that it is weak not to be able to cut it in this environment. The more relevant issue is, I think, that of judging a culture to manifest an ethos that you yourself find demeaning. I despise the self-satisfied culture of going in for the kill because I hold to a different set of principles; I don’t consider ‘tearing an opponent down’ for show to be a virtue, and, moreover, I am aware of how the romanticisation of this combativeness relies on an experience of not being in a marginalized group. Men and women, and whites and people of colour, have vastly differing social experiences; an attachment to a culture of triumphalist take-downs involves a refusal to consider how the dynamics of such take-downs look and feel to those in the minority, given the unequal demographic profile of the community. In other words, this attachment relies on a refusal on the part of the dominant group to acknowledge its dominance. I made my way out of philosophy because I knew its atmosphere of combative virtuosity to hide self-serving universalist premises, and because I found its atmosphere to be beneath me; to be depressing, limiting, and not conducive to interesting thought or conversation. There’s nothing demeaning about not wanting to be part of a culture that you disrespect.

Feminist philosophy is a large field, but has not persuaded the discipline as a whole to rethink the philosophical categories and intuitions that are cast as being universal and self-evident, but are in fact dependent upon the particularities of the ‘typical’ philosopher: male, white, middle-class. In fact, philosophy as a discipline has tended to remain deaf to the swathes of work in some of its own fields as well as other disciplines – feminist philosophy; standpoint epistemology; the philosophy of science; the history of science, medicine and psychiatry; the history of sexuality; sociological studies of knowledge and “science studies”; anthropological and ethnographical studies, to mention just a few – that have established that method and content are not unrelated; that the location and context of knowledge production shapes that knowledge itself, as well as its reception and journey through the world; that who makes knowledge matters to what knowledge is made, and to where it goes; that time, place, gender, race, socioeconomic status, matter. Not just the Foucaults and Kuhns of the much-disputed boundary wars of philosophy-history, but also some of even analytic philosophy’s own darlings – Searle and Wittgenstein – articulated insights that spawned entire approaches to the history of thought which emphasized the crucial importance of culture, context, and ethos to what many have longed to see as unsullied by such messiness.

Dodds and Goddard, in their chapter, discuss a conception of philosophy as ‘not concerned with the particular, with the contingent’, as requiring personal opinion ‘to be overcome or to be transformed’ (this is quoting Gatens 1986). This affects, they suggest, ‘the capacity for feminist challenges to be heard by philosophers as philosophy, and for these challenges to inform developments in philosophical argument or practice’. That the feminist philosophical challenges to philosophy remain unheard by the core of the discipline is on a disturbing par with women’s repeated experiences of ‘having speech acts fail’ (as Jenkins puts it) – for example, having a significant question interpreted as a naïve demand for help, or being repeatedly talked over and interrupted in discussions. McGill invokes the work of Rae Langton and Caroline West on pornography as a form of silencing of women; as a language game that ensures that there will be certain speech acts that women ‘can no longer successfully or easily perform’. Drawing on their analysis, McGill suggests that women’s speech is commonly either ‘ignored, treated as incompetent’ or ‘heard as if it had been scripted by a man’.

Sadly, no discipline is free from these dispiriting dynamics. But some are worse than others. The authors in this collection suggest various piecemeal solutions that will help the dire situation. They, presumably, want to stay in the field. But the suggestions at times feel to me too accommodating; as if courage has been lost. My response is more violent, more sweeping: ditch philosophy! What is this absurd discipline, this remnant of a positivist idealisation of a science whose positivism has been shown by all the disciplines worth taking seriously (including, in fact, the philosophy of science, much of which owes more to the history of science and the sociological and anthropological study of scientific knowledge) to be illusory and misplaced; what is this discipline that refuses to look at itself, that displays an obtuseness, a lack of historical self-awareness that should make it, even by its own lights, blush? I want no part of it; I almost think that trying to improve, by small, non-alienating gestures, this small-minded discipline, is a waste of time. I want to raze it to the ground and start again.

Follow Katherine Angel on Twitter: @KayEngels

Read Katherine Angel’s review of Ronald de Sousa’s Love: A Very Short Introduction for the Cultural History of Philosophy blog.



Lawrence Charlesworth took the ‘Philosophical Britain‘ module at Queen Mary in 2015. In this post he writes about ‘Agnosticism’ as a philosophical keyword.

We are quite familiar in twenty-first-century Britain with talk of the existence or non-existence of God. It is no surprise to us if, while browsing the Guardian website, we find an article describing Stephen Fry’s thoughts on God: “utterly evil, capricious and monstrous” – as though He were a dictator criticised for human rights abuses. You can watch the interview, first broadcast on Irish television, here:

Nor are we surprised to hear religious leaders talk about the subject – see, for example, Pope Francis’ thoughts on atheism: “God’s mercy has no limits if you go to him with a sincere and contrite heart. The issue for those who do not believe in God is to obey their conscience.” The headline proclaims: “You don’t have to believe in God to go to heaven” – a comforting thought for atheists everywhere.

Statistics on religious belief, too, reveal a pervading sense of uncertainty. According to the 2011 Census, a quarter of the population have ‘no religion’, and 7% did not answer the (voluntary) question.[i] However, the British Humanist Association has criticised the Census for using a leading question: “What is your religion?” They conducted their own survey in 2011 in order to prove the disingenuousness of the official results. They first asked people, “What is your religion?” and received similar results to the Census. They then asked the same people, “Are you religious?”, but this time around two thirds answered no and only one third answered yes. Furthermore, of those who said they were Christian, only 48% “said they believed that Jesus Christ was a real person who died and came back to life and was the son of God.”[ii] (Although their scruples with leading questions apparently don’t extend to their own website. You can take their quiz, ‘how humanist are you’, found on their homepage – but don’t be too surprised if you discover that you are, in fact, a humanist.)

Richard Dawkins is probably the most famous proponent of atheism today. His view of agnosticism is that it is a bit wishy-washy – at least religious believers have the courage of their convictions. His argument against agnosticism, in ‘The Poverty of Agnosticism’ – a chapter in ‘The God Delusion’ (2006) – is that even if we cannot know for certain that God doesn’t exist, his existence is highly improbable; furthermore, he thinks, one day science will prove God’s inexistence.[iii] However, even he, the public face of atheism, doesn’t quite know what he is: in a debate with Rowan Williams, he said that he “preferred to call himself an agnostic”.

Clearly, modern Britain is uncertain about its religious beliefs. Yet, despite this, we do not often hear the word ‘agnostic’ used. The word itself was first coined in 1869, by Thomas Henry Huxley – a biologist nicknamed ‘Darwin’s Bulldog’ for his defence of Darwin’s theories against his, often religious, opponents. He styled himself ‘agnostic’ in contradistinction to ‘gnosis’ – the Greek for knowledge. However, although Huxley coined the word himself, the first modern agnostic might be said to be Auguste Comte (1798-1857). Comte was a French philosopher, especially influential in the development of sociology and positivism, and it is on account of his positivism that he has been thought of as the first agnostic. Positivism, influential in the late nineteenth and early twentieth century, is a philosophical doctrine that elevates the scientific method as the only arbiter of truth. Knowledge consists of those things which are empirically verifiable – that is, things which can be proven or disproven through the senses (understanding tools such as microscopes or MRI machines as simply extensions of the senses). This means that anything which cannot be proven or disproven in such a manner does not even have the merit of being considered ‘true’ or ‘false’. It is beyond the limits of our knowledge, and to consider the truth or falsity of such things is essentially meaningless. This is an attack on metaphysics – on everything beyond the immediate physical world. God, by definition, is not a part of the physical world, and so, according to the positivist world view, is thrown on the rubbish heap of ‘meaningless concepts’.

A pensive Bertrand Russell.

In this light, to be an agnostic is not so much a middle position, directly between theism and atheism, belief and un-belief, but rather a clarification of un-belief. Bertrand Russell (1872-1970), a towering figure of twentieth-century British philosophy, and the quintessential analytic philosopher, himself expressed such a view. In ‘Am I an Atheist or an Agnostic?’ (1947), he says that he would describe himself as an agnostic to a “purely philosophic audience”, but to an “ordinary man in the street” he would call himself an atheist. This reduces agnosticism to a sort of philosophical specificity. Indeed, he goes on to say that, strictly speaking, he is also agnostic about the existence of the gods of Ancient Greece. This position might be characterised as an ‘agnostic atheism’: ‘One cannot really know whether God exists or not; but I don’t believe he does.’ Russell goes on to speak of the virtues of agnosticism. Although some things are more probable than others, and there exist degrees of certainty, it is often felicitous to remind ourselves that there are no such things as certainties. He finishes the piece warning of the dangers of dogma and persecution – here he must have had in mind both the religious persecutions of the past and the (atheistic) intolerance of the Soviet Union in his own day. For Russell then, agnosticism represents the ideals of tolerance and liberalism – it functions as a kind of ‘memento mori’.[iv] He is not alone in giving agnosticism a broader context. The word, after all, means simply ‘a lack of knowledge’, and so others have used the word in its literal sense, for example, in clarifying their position on an unconfirmed scientific theory.

Agnosticism is not only used by un-believers, however. In the modern world there has been an increasing sense of conflict between religion and science, and this is even more obvious to us today: in twenty-first-century Britain, no-one can escape the vociferous tirades of the New Atheists. However, it would be a mistake to understand religion and science as fundamentally in conflict. Agnosticism might then be taken up as a position, for un-believers, to distance themselves from the aggressive un-belief of Dawkins and co., and for believers, to intimate their belief in the compatibility of God and a scientific universe. In 1997, the scientist and self-described ‘Jewish agnostic’ Stephen Jay Gould wrote the essay ‘Non-Overlapping Magisteria’. His view is best summarised in his own words: “The net of science covers the empirical universe: what is it made of (fact) and why does it work this way (theory). The net of religion extends over questions of moral meaning and value. These two magisteria do not overlap, nor do they encompass all inquiry (consider, for starters, the magisterium of art and the meaning of beauty).”[v] In writing his essay, he was inspired by Pope Pius XII’s papal encyclical, ‘Humani Generis’ (1950) and Pope John Paul II’s address to the Pontifical Academy of Sciences, ‘Truth Cannot Contradict Truth’ (1996). Both are concerned with evolution, and both, claims Gould, support his idea that religion and science are two separate areas of learning.

On the other side of the fence, many theologians of the twentieth century have philosophised in support of a rapprochement between science and religion. This should come as no surprise: considering the successes of the scientific method in recent centuries, it is religion that has the most to lose by any supposed conflict between the two. One such theologian is Paul Tillich (1886-1965), considered one of the most influential of the twentieth century. Like Gould, Tillich clearly delineates between science and religion. Fundamental to his theology is the notion that before creating any theology, we must first understand the contemporary situation in which we live. For twentieth-century theologians, this means understanding the empirical character of reason in the modern world. For Christianity to succeed in this situation, it must undergo a process of ‘demythologisation’ – the purpose of this being to prevent the language of Christianity being confused with the language of science. Scientific knowledge is of a fundamentally different type to knowledge of God. One can possess knowledge of the world. One cannot possess knowledge of God; knowledge of God possesses you.

Although Paul Tillich was not an agnostic, we can understand him in such a way. Whether God does or does not exist cannot be known – certainly it cannot be known in the same way things about the physical universe can be known. We might understand something of God, through revelatory experiences, but we do not know anything of him. For Tillich, to talk about God in the same way we talk about the universe, even to talk of God as a ‘first cause’, is to understand God as finite – it is idolatry. Tillich was an agnostic in the same sense Bertrand Russell was – understanding knowledge as limited by the bounds of the physical universe, neither had any knowledge of God.

Though Tillich didn’t call himself an agnostic, many Christians since have done so. One has only to type the two words together into Google to see the proliferation of blogs, books, and articles written on the subject. Clearly, agnosticism is a term which resonates with many Christians today. For a recent example of this, see this interview with Ian Hislop: “‘Is it an intellectual belief?’ ‘Someone once said, you’re a classic C-of-E agnostic. I’m not sure what I believe, but it’s definitely C-of-E.’“[vi] Or, from his chapter in ‘Why I am Still an Anglican’ (2007): “I’ve tried atheism and I can’t stick at it: I keep having doubts. That probably sums up my position.”[vii] If there is such a thing as an ‘English spirit’, it is much more akin to the mild-mannered, reasonable Anglicanism of, say, Rowan Williams, Giles Fraser, or Ian Hislop, than the rude, intolerant atheism of Richard Dawkins. After all, in the 2011 Census, around 60% of people did affirm themselves as Christian. I mentioned earlier in this post a quiz on the British Humanist Association’s website, which tries to show that any reasonably minded Briton is actually a humanist. Well, I will hazard the suggestion that we are a nation of agnostics – we just don’t know that we are.

Further Reading

Kloe Fowler, ‘Humanism’, Cultural History of Philosophy Blog, 2015.

Smart, J. J. C., “Atheism and Agnosticism”, The Stanford Encyclopedia of Philosophy (Spring 2013 Edition), Edward N. Zalta (ed.)

Dougherty, Trent, “Skeptical Theism”, The Stanford Encyclopedia of Philosophy (Spring 2014 Edition), Edward N. Zalta (ed.), 

The Cambridge Companion to Paul Tillich, Edited by Russell Re Manning, Cambridge University Press, 2009 – especially chapter 14, ‘Tillich in dialogue with natural science’, by John F. Haught

‘Agnosticism’ (1889), by Thomas Henry Huxley.

Science and Religion: A Very Short Introduction, Thomas Dixon, Oxford University Press, 2008


[i] http://www.ons.gov.uk/ons/rel/census/2011-census/key-statistics-for-local-authorities-in-england-and-wales/rpt-religion.html

[ii] https://humanism.org.uk/campaigns/religion-and-belief-some-surveys-and-statistics/

[iii] Richard Dawkins, The God Delusion, 2006

[iv] Bertrand Russell, ‘Am I an Atheist or an Agnostic?’, 1947, https://www.andrew.cmu.edu/user/jksadegh/A%20Good%20Atheist%20Secularist%20Skeptical%20Book%20Collection/Am%20I%20An%20Atheist%20Or%20An%20Agnostic%20-%20Bertrand%20Russell.pdf

[v] http://www.colorado.edu/physics/phys3000/phys3000_fa11/StevenJGoulldNOMA.pdf

[vi] http://youtu.be/GBfWQnjZeLY?t=12m20s

[vii] Ian Hislop, ‘Why I Am Still an Anglican: Essays and Conversations’, Edited by Caroline Chartres, A&C Black, 2007, p.99



Kloe Fowler took the ‘Philosophical Britain‘ module at Queen Mary in 2015. In this post she writes about ‘Humanism’ as a philosophical keyword.

Professor Richard Dawkins on a London bus displaying the Atheist message. Photograph: Anthony Devlin/PA


You may remember, back in 2009, seeing London buses adorned with a message reading ‘There’s probably no God. Now stop worrying and enjoy your life’. The campaign was the creation of the British Humanist Association (BHA), a national Humanist group whose campaigners felt that the adverts would be a ‘reassuring antidote’ to religious adverts which ‘threaten eternal damnation’ to passengers.

The word ‘humanist’ preceded the word ‘humanism’ and began life with different connotations to its present-day meaning. The word umanista was first employed in fifteenth-century Italy as a slang term to describe a university professor of the studia humanitatis – the humanities. However the word ‘humanism’, in roughly the same context we speak of it today, first appeared in the German form humanismus as late as 1808.[1]

To define… humanism briefly, I would say that it is a philosophy of joyous service for the greater good of all humanity in this natural world and advocating the methods of reason, science, and democracy. – Corliss Lamont.[2]

The roots of humanist philosophy lie with the ‘fathers of humanism’. The early fourteenth-century Italian scholar Francesco Petrarch is often credited with being one of the first humanists. Petrarch rejected traditional medieval scholarship in favor of a revival of Classical authors like Aristotle, Plato and Tacitus and, by doing so, laid the foundations for a philosophy which would eventually manifest in a formally recognized movement, punctuated by national organizations and institutions.[3]

Desiderius Erasmus's Collectanea Adagiorum (1518) Photograph: Bavarian State Library


Petrarch’s ideas were developed further during the Renaissance. The word ‘Renaissance’ derives from the French term for ‘rebirth’ and it is accepted by most to mean the revival of classical scholarship and learning in Western Europe between 1400 and 1600.[4] Renaissance humanists were followers of literary, Christian humanism, where the pursuit of self-knowledge was seen as a way of getting closer to God whilst also emphasising the principles of the Holy Trinity, namely the role of Jesus as the human son of God the Father.[5] The popularity of humanism during the Renaissance can be seen in the rising popularity of humanist literature. Desiderius Erasmus, for example, emphasised the importance of Greek scholarship via works like the Adagia. The Adagia began life in 1500 as a collection of eight hundred and eighteen Greek proverbs. By the death of Erasmus in 1536, the collection had grown to some four thousand, one hundred and fifty-one proverbs.[6]

The eighteenth century witnessed a departure from Christian humanism. The age of Enlightenment was characterised by scientific discovery, which consequently ushered in a new definition of humanism. Eighteenth-century humanism became fixated on ideas of rationality, reason and the natural order. Auguste Comte, the ‘father of sociology,’ even attempted to rationalise the twelve-month Gregorian calendar by devising a new thirteen-month calendar with an extra day to worship the dead and a system for leap years. Finally, Comte proposed that each month be named after a Classical Graeco-Roman scholar; hence, March would become ‘Aristotle’.[7]

The philosophy of ‘Positivism’ was the brainchild of Comte. It stipulated that any attempts to prove truths about the world were futile unless they were based on sense perception or empirical scientific evidence.[8] Over time, Comte combined his Positivist ideals with existing humanist ones and embodied them in the formation of the ‘Religion of Humanity’. The Religion of Humanity was an atheistic religion which aimed to eliminate transcendence and superstition but maintain religious rituals and ethical teaching.

A Positivist Church in Porto Alegre, Brazil (2014) Photograph: the Science University of Iceland


 “On or about December 1910, human nature changed… all human reactions shifted… and when human relations change, there is at the same time a change in religion, conduct, politics and literature” – Virginia Woolf.[9]

The disasters of the early twentieth century resulted in widespread pessimism and disillusionment in Europe. People began to recognise that humanism had been unable to prevent any of the disasters which had affected them.[10] However, this disillusionment proved short-lived, and the humanist philosophy made a quick recovery, particularly among the Allied nations. In 1952, in Amsterdam, the humanist movement became formally recognised as the ‘International Humanist and Ethical Union’ (IHEU).

At around the same time, Marxist principles were brought to bear on Humanism. Karl Marx’s early writings, like The Economic and Philosophic Manuscripts of 1844 and The German Ideology, concentrated on Marx’s theory of ‘alienation,’ the idea that humans had lost their natural attributes and abilities as a result of living in an artificial, class-based society. In 1956 Nikita Khrushchev espoused these ideas at a party congress and consequently delivered a current of class-conscious ‘Marxist Humanism’ across Western Europe. By 1966, Erich Fromm and several other Marxist and ‘new left wing’ Humanists had outlined their world-view in Socialist Humanism: An International Symposium.

In 1961 Julian Huxley (grandson of ‘Darwin’s Bulldog’ Thomas Huxley, and brother of novelist Aldous Huxley) further consolidated the Humanist world-view by compiling works by himself and twenty-five other leading thinkers into a comprehensive volume called The Humanist Frame. Unusually, Huxley was a somewhat fanatical believer in Humanism. He viewed Humanism as a revolutionary, world-unifying philosophy: ‘Humanism is seminal. We must learn what it means and then disseminate humanist ideas and finally inject them where possible into practical affairs as a guiding frame’.[11]

“Because humanists believe in the unity of humanity and have faith in the future of man, they have never been fanatics” – Erich Fromm.[12]

As we emerge into the twenty-first century, for the first time in history, mankind has been brought together by global problems. Issues like international security, the population explosion and the needs of third-world countries have recently begun to be addressed.[13] The international climate of humanitarian humanism has been most recently demonstrated by the global response to the ongoing Ebola epidemic in Western Africa.

However, Humanism has still not acquired the recognition it deserves. Since the millennium, western society has been dogged by the threat of terrorism. The paradoxical rise of religious fundamentalism in the Middle East not only threatens the west with disruption and violence but it also threatens the west with the internal indoctrination of its peoples. Since the Charlie Hebdo attack in Paris, reports of atrocities committed by the radical Islamic group Islamic State (IS) have appeared daily in British news. Moreover, the UK Foreign Office estimates that around four hundred Britons have travelled to Syria to fight for IS.

A motif which embodies the contemporary definition of Humanism (2014) Photograph: International Humanist and Ethical Union

A motif which embodies the contemporary definition of Humanism (2014). Photograph: International Humanist and Ethical Union

In light of this threat it seems to me that now is the perfect time for British society to adopt the principles of humanism. The atheist, collectivist and humanitarian qualities of humanism are invaluable to a modern nation tormented by the threat of violence and radicalism. The Atheist Bus campaign was launched six years ago and the BHA has not launched a repeat campaign since. Unfortunately, the marketing department at the BHA seems to be redundant at a time when British society is desperate for the reassurance of a shared, rational belief system.

Whilst writing this post I stumbled upon a motif which was of great interest to me. To my understanding, it perfectly exemplifies our contemporary understanding of the word Humanism. It advocates knowledge, rationalism, human rights, liberty and collectivism, is contextualised by the recent Charlie Hebdo tragedy, and is set against the universally recognised humanist logo.

So, I will conclude with a proposal. I propose that the BHA awake from their redundancy and advocate the updated definition of Humanism. I suggest that they renew their campaign and use this motif, combined with public transport, as a vehicle for disseminating humanist ideas. With the help of evolved humanist philosophy, perhaps the East and West will one day be able to reconcile their differences in a rational and humanitarian way.

Further Reading

A. Bullock, The Humanist Tradition in the West (New York and London: W. W. Norton and Company, 1985).

N. Everitt, The Non-Existence of God (London: Routledge, 2004).

E. Fromm (ed) Socialist Humanism: An International Symposium (New York: Anchor Books, 1966).

P. O. Kristeller, Renaissance Thought and the Arts: Collected Essays (New Jersey: Princeton University Press, 1964).

C. Lamont, The Philosophy of Humanism, Eighth Edition (New York: Humanist Press, 1997).

J. S. Mill, Auguste Comte and Positivism (London: Trubner and Co, 1865).

L. and M. Morain, Humanism as the Next Step (Washington: The Humanist Press, 2012) (Click here to download your free copy from the American Humanist Association)


[1] A. Bullock, The Humanist Tradition in the West (New York and London: W. W. Norton and Company, 1985) p.12.

[2] C. Lamont, The Philosophy of Humanism, Eighth Edition (New York: Humanist Press, 1997) pp.12-13.

[3] R. Morris, ‘Petrarch, the first humanist’ The New York Times, http://www.nytimes.com/2004/05/29/style/29iht-conway_ed3__0.html [accessed: 14/02/2015 01:05].

[4] W. R. Estep, Renaissance and Reformation (Michigan: William B. Eerdmans Publishing, 1986) p.20.

[5] J. Zimmermann, Incarnational Humanism: A Philosophy of Culture for the Church in the World (Illinois: InterVarsity Press, 2012) p.118.

[6] J. McConica, ‘Erasmus Disiderius, c.1467-1536’ in H. G. Matthew and B. Harrison (eds) Oxford Dictionary of National Biography (Oxford: Oxford University Press, 2007).

[7] E. Asprem, ‘Positivism and the Religion of Humanity’ Heterodoxology, http://heterodoxology.com/2010/03/08/positivism-and-the-religion-of-humanity/ [accessed: 14/02/2015 01:35].

[8] H. B. Acton, ‘Comte’s Positivism and the Science of Society’ Philosophy, Vol.26, No.99 (1951) pp.291-310.

[9] Bullock, The Humanist Tradition in the West, p.133.

[10] J. Vanheste, Guardians of the Humanist Legacy: T. S. Eliot’s Criterion Network and its relevance to our Postmodern World (Leiden: Brill, 2007). p.3.

[11] J. Huxley (ed) The Humanist Frame (New York: George Allen and Unwin, 1961) pp.11-49.

[12] E. Fromm (ed) Socialist Humanism: An International Symposium (New York: Anchor Books, 1966) p.vii.

[13] H. J. Blackham, Humanism (Middlesex: Penguin Books, 1968) pp.22-23.