Free will scepticism is undeniably in vogue in pop philosophy at the moment. Indeed, a 2015 study by Scientific American showed that 41% of its readership did not believe in free will. So what implications (if any) could this wave of free will scepticism have for our key liberal traditions and the ideas that underpin them? Liberal concepts of individual freedom, law and order, moral accountability, and consumer choice form the foundations of Western societies. Our justice systems, electoral processes and, often, public morality are firmly rooted in these ideas. At first glance, these concepts seem inherently at odds with determinism. I began writing this article as an exploration of what I saw as my own internal inconsistencies: I found myself utterly convinced by arguments in favour of determinism, whilst also living by assumptions based on concepts such as free choice, moral accountability and law and order. Admittedly, my hope was to find a way of reconciling these ostensibly incompatible positions and to get rid of that icky cognitive dissonance I’d been feeling. In this article, I try to understand how free will scepticism interacts with liberal ideas, and whether the two can conceivably inhabit the same logical realm. I conclude that maybe they can – but that it doesn’t matter anyway.

But first, a quick clarification of terms is in order:

Determinism is a theory positing that every event is causally determined by events that occurred before it.

Free will is the idea that individuals can control their own actions and make free choices which are not entirely determined by past causes or external factors.

Free will scepticism is the position that true free will (as described above) does not exist.

The rise of free will scepticism

Prior to the advent of Charles Darwin’s work on evolution, free will scepticism had remained firmly sealed within the hallowed halls of physics and philosophy departments. Darwin’s theory, though, sparked a flurry of research on the roles of nature and nurture in human development. Some believed that nurture, upbringing, culture, and surroundings were the primary factors in the characteristics, behaviour and personality of each individual. Others claimed that genetic factors were the primary indicators of a child’s future success.

The nature-versus-nurture debate that ensued dragged free will scepticism into the limelight. Scientists and philosophers saw the deterministic implications of the ever-growing evidence in favour of the ‘nature’ side of the debate: if genes are the key indicators of how a person will develop, then free will and random chance have little to no role in that person’s future success. Further advances in free will research came in the 1980s, when neuroscientist Benjamin Libet completed ground-breaking research. In his now famous experiment, subjects were asked to flick their wrist at a random moment and to record the time at which they consciously made the decision to do so. Libet found that subjects’ brains seemed to be preparing for action 0.35 seconds before the conscious mind ‘decided’ to act. The study suggests that the experience of consciously deciding to act could be an ex post facto reconstruction that gives us the illusion of free will. It is certainly worth noting that, in recent years, this study has faced growing controversy over its validity. Several critics have suggested that the ebb and flow of normal brain activity caused the decision-making process, rather than the other way round.

Nevertheless, in an article for The Atlantic, Stephen Cave, in reference to Libet’s study, observes that ‘this research and its implications are not new. What is new, though, is the spread of free will scepticism beyond the laboratories and into the mainstream’. Until recently, the debate had generally remained within the confines of academia. However, in the West’s unprecedentedly secular and information-driven era, free will has become a much more mainstream discussion, with books such as Elbow Room by Daniel Dennett, Free Will by Sam Harris and The Science of Fate by Hannah Critchlow hitting the shelves in recent decades.

So a new debate has emerged: what are the effects of the rising belief that free will does not exist, and is it morally permissible to disseminate deterministic ideas? In their paper The Value of Believing in Free Will: Encouraging a Belief in Determinism Increases Cheating, psychology researchers Kathleen Vohs and Jonathan Schooler found that, when primed with the idea that free will does not exist, subjects were more likely to cheat on a test. This is, of course, concerning at a time when free will scepticism is making its way into popular thought. Could belief in determinism make people less moral? Vohs and Schooler issue a caveat at the end of their paper:

‘Although the study reported here raises concerns about the possible impact of deterministic views on moral behavior, it is important not to overinterpret our findings. Our experiments measured only modest forms of ethical behavior, and whether or not free-will beliefs have the same effect on more significant moral and ethical infractions is unknown. In addition, a deterministic viewpoint may have a host of possible consequences, and only some of these may be unfavorable’.

This is really the crux of the matter: understanding how free will scepticism could affect our institutions will be crucial in finding out how we can protect them or even whether they need protecting. For the contemporary West, whose societies are based on classical liberal ideas, how can these ideas be reconciled with free will scepticism? And, more importantly, will the ideological foundations of our societies crumble if we cannot reconcile these opposing concepts?

Liberalism without liberty of choice?

Central to liberal thinking is the importance placed on individual liberty. People should be able to live as they choose, without coercion and as freely as is possible within an organised society. Positive liberty (see, for example, Isaiah Berlin’s Two Concepts of Liberty) is the type of freedom most discussed in free will philosophy, as it concerns both the availability of choice and the nature of choice itself.

To put the concepts of agency and choice into real-world context, we can examine their effect on liberal institutions such as free markets and representative democracy. Both are based on the freedom of each person to make decisions, whether choosing which toothbrush to purchase at the supermarket or which member of parliament to vote for. The American liberal experiment was envisioned as an opportunity for people to move up in the social order, to freely choose their own paths in life and to be free from oppression and tyranny. According to Barack Obama in The Audacity of Hope:

‘These American values are rooted in a basic optimism about life and a faith in free will – a confidence that through pluck and sweat and smarts, each of us can rise above the circumstances of our birth. But these values also express a broader confidence that so long as individual men and women are free to pursue their own interests, society as a whole will prosper’.

If determinism is true, if we have no free will and if every ‘choice’ we make is determined by various internal and external factors, can we really say that we have made a free choice at all? As Yuval Noah Harari puts it: ‘I can choose what to eat, whom to marry and whom to vote for, but these choices are determined in part by my genes, my biochemistry, my gender, my family background, my national culture, etc – and I didn’t choose which genes or family to have’. In other words, as much as we have the feeling of being free to choose, our choice is determined by factors outside of our control.

So, is it possible to make ‘free’ decisions? Martin Heisenberg, in an article for Nature, makes the case that there is, in fact, evidence of indeterministic, or ‘random’, events occurring in the brain. He gives the examples of ‘the random opening and closing of ion channels in the neuronal membrane, or the miniature potentials of randomly discharging synaptic vesicles. Behaviour that is triggered by random events in the brain can be said to be truly ‘active’ — in other words, it has the quality of a beginning’. Put simply, neuroscientists have detected random events at the quantum level which could be evidence for behaviour that is not subject to determinism. Free will could be said to exist in the random opening of ion channels in the brain, which results in some of the choices humans make. However, neuroscientist and philosopher Sam Harris argues that this by no means constitutes proof of the existence of free will. He asks:

‘How can we be “free” as conscious agents if everything that we consciously intend is caused by events in our brain that we do not intend and of which we are entirely unaware? We can’t. To say that “my brain” decided to think or act in a particular way, whether consciously or not, and that this is the basis for my freedom, is to ignore the very source of our belief in free will: the feeling of conscious agency. People feel that they are the authors of their thoughts and actions, and this is the only reason why there seems to be a problem of free will worth talking about.’

Even if it is true that decisions are made at the quantum level by the random opening of ion channels, this no more means that one has command of one’s own actions or decisions than if they were controlled by rolls of the dice. ‘You can do what you decide to do’, says Harris, ‘but you cannot decide what you will decide to do’. Ultimately, random choices made by an individual’s subconscious brain could hardly amount to the rational free choice that is assumed in liberalism.

So can free choice be reconciled with free will scepticism? I would argue that it can, in a way. Each person’s ‘choice’ is based upon their existing tastes, desires, experiences, and genes. One would expect people to weigh these factors in order to reach the decision that best suits them. The brain weighs up the pros and cons of each possible decision, and once one of these outcomes reaches a threshold of pros, it is decided upon. The reasoning that liberalism expects the rational individual to take part in does occur in the brain – although the conscious ‘self’ seems not to play a part in the process. In other words, on a subconscious level, the brain makes decisions based on the interests of the individual, shaped by their genes, previous experiences, desires, and external influences. Although this may not be considered ‘free’ in the conventional liberal sense, I don’t think it makes much practical difference when applied to activities associated with positive liberty such as market choice and democratic processes.

Moral accountability

With great power comes great responsibility, they say. Indeed, with freedom of choice comes moral accountability. Isaiah Berlin, in his essay Historical Inevitability, questions the idea that history moves in patterns and that its course is pre-determined and unalterable. Although Berlin does not reject the concept of determinism, he suggests that acceptance of it would call for a re-evaluation of ideas regarding moral responsibility, law, order, and justice. He argues that, if determinism is true, to blame a human for wrongdoing is as irrational as blaming wild animals, since neither human nor beast has moral responsibility. This is a common criticism of free will scepticism – that systems of law and order such as retributive justice are no longer coherent if blame cannot be placed upon the perpetrator of a crime. In their paper Is Belief in Free Will a Cultural Universal?, Sarkissian et al. investigated whether people would absolve wrongdoers of moral responsibility in a deterministic universe: 86% of subjects thought that people living in a universe in which determinism is true are not fully morally responsible for their actions. If, for example, we see murderers as lacking responsibility for their crime, since they have no free will and therefore did not consciously choose to murder, how could it be fair to punish or incarcerate them?

Sam Harris thinks that criminals must still be incarcerated, making the utilitarian argument that ‘everyone else will be better off this way. Dispensing with the illusion of free will allows us to focus on the things that matter – assessing risk, protecting innocent people, deterring crime, etc.’. Neuroscientists Joshua Greene and Jonathan Cohen also explored the issue of free will and judicial processes in their paper For the law, neuroscience changes nothing and everything (2004), which gives an optimistic view of determinism. Greene and Cohen hypothesise that changing views on free will could encourage people to see justice as a ‘consequentialist’ rather than a ‘retributive’ process. Understanding that criminals have no conscious control over the ultimate causes of their own actions could, Greene and Cohen suggest, lead to justice reforms that take a more humane approach. In essence, a better understanding of free will and the way the mind works can allow policy makers to find solutions that will deter crime, protect the rights of victims, treat criminals in a more compassionate way, and, as Harris argues, maybe even find a neurological cure for conditions such as psychopathy.

Patricia Churchland, a Canadian philosopher, claims that free will scepticism may not necessarily do away with retributive justice entirely, as the biological need to punish others for their transgressions overrides the logical conclusion that people cannot have moral accountability if determinism is true. She says that ‘from an evolutionary perspective, punishment is justified by the value all individuals place on their social life, and by the limits on behaviour needed to maintain that value’.

Although one can fairly coherently reach conclusions that reconcile free will scepticism with classical liberal values and institutions, it does take a certain amount of reading and reasoning to arrive at this point. In addition, each topic requires its own careful response to free will scepticism. My concern is that if too many people doubt the existence of free will, they may do so with an over-simplified understanding of what that means. As the tides of free will scepticism continue to rise, it seems imperative to me that we find ways to guard against an over-simplification that could have unfortunate consequences.

The good, the bad and the ugly: solutions in Western philosophy

Luckily, over its long history of engagement with free will scepticism, Western philosophy has devised a number of blanket solutions to the problem of free will. Many philosophers agree that determinism is true; however, views on the extent to which determinism undermines free will vary hugely amongst free will thinkers. This section looks at three strands of thought on free will scepticism with the aim of finding an ‘all-purpose’ way of reconciling free will scepticism with our existing liberal institutions and ideas.

The good: reframing free will scepticism

Some free will sceptics argue that the way we frame determinism is a deciding factor in how it will colour existing views on justice, freedom and tolerance. The term determinism, says philosopher Daniel Dennett, is often wrongly confused with fatalism. However, the potentially negative consequences of free will scepticism can be avoided by reframing the concept. ‘Determinism’, Stephen Cave explains, ‘is the belief that our decisions are part of an unbreakable chain of cause and effect. Fatalism, on the other hand, is the belief that our decisions don’t really matter, because whatever is destined to happen will happen’. It seems logical that, when determinism is framed in a fatalistic way, people will have pessimistic reactions. For example, in the experiment by Vohs and Schooler that found that subjects were more likely to cheat after reading passages explaining the non-existence of free will, the passages portrayed free will scepticism in a fatalistic way.

Sam Harris, in an interview with Cave, suggests that many of these experiments measure introspective elements of free will scepticism rather than looking at how subjects treat others. As Cave puts it:

‘Whereas the evidence from Kathleen Vohs and her colleagues suggests that social problems may arise from seeing our own actions as determined by forces beyond our control—weakening our morals, our motivation, and our sense of the meaningfulness of life—Harris thinks that social benefits will result from seeing other people’s behavior in the very same light. From that vantage point, the moral implications of determinism look very different, and quite a lot better.’

The bad: compatibilism

Approaches to free will in general tend to take one of three positions: determinism, libertarianism, or compatibilism. Both determinism and libertarianism are incompatibilist approaches, as they both view free will as incompatible with determinism – they assert that determinism is true and, therefore, free will does not exist (determinist) or that determinism is not true and, therefore, free will does exist (libertarian). The compatibilist position aims to reconcile free will with determinism. If an action comes from you then you can call it a free action, say compatibilists.

Compatibilism, or soft determinism, is the view that if one is able to make a decision without external factors affecting it, then that decision counts as an act of free will – making the debate one over sourcehood. The simplest way of understanding this stance is through the following Frankfurt-style thought experiment (named after compatibilist philosopher Harry Frankfurt):

In a non-deterministic universe, a woman is on her way to vote for either political party A or party B. What she does not know is that a chip has been implanted in her brain that will make her vote for party A if she chooses to vote for party B. In scenario 1, she gets to the voting booth and chooses party A of her own accord. In scenario 2, she gets to the voting booth having decided to vote for party B, so the chip is activated and she votes for party A. The woman could not have done otherwise, but in scenario 1 she chose freely and in scenario 2 she did not. Compatibilists see an action as free if the individual acts unencumbered by external coercion (negative liberty).

A number of free will thinkers see compatibilism as incoherent within the usual confines of the debate. Even if a decision appears to be based solely on internal factors and deliberation, those internal factors are themselves the product of everything that has led up to that moment, including our genes, experiences and desires, which are out of our control. If we are looking for a simple, blanket way of framing free will scepticism, compatibilism seems too complex and controversial to be a viable option.

The ugly: illusionism

Saul Smilansky is the originator of illusionism, the theory that humans hold illusory beliefs about free will but that losing faith in these is dangerous to the individual and to society. He argues that ‘humanity is fortunately deceived on the free will issues, and this seems to be a condition of civilised morality and personal value’. Smilansky makes the case that humans tend to over-simplify ideas, and that if the view that free will does not exist prevailed, it might be taken to imply a pure determinism or nihilism, which are wholly incompatible with agency or moral responsibility.

According to the illusionist position, if free will scepticism becomes widespread:

‘a broad loss of moral and personal confidence can be expected. The idea of action-based desert, true internal acceptance of responsibility, respect for effort and achievement, deep ethical appreciation, excusing the innocent – all these and more are threatened by the ‘levelling’ or homogenising view arising from the ultimate perspective’.

In other words, the inevitable simplification of the free will problem could lead to the breakdown of key liberal ideas and institutions. Smilansky offers the suggestion that, since the illusion of free will seems to be part of human nature, ‘scientists and commentators merely need to exercise some self-restraint, instead of gleefully disabusing people of the illusions that undergird all they hold dear’. Illusionists argue that liberal traditions and institutions might be so damaged by determinism that it is better for society as a whole to keep free will scepticism out of public consciousness.

The illusionist position is not particularly popular amongst philosophers, in part because many disagree that free will scepticism is incompatible with the values listed above, and also because denying the public access to ‘truths’ uncovered by philosophers and scientists is often considered to be paternalistic and unethical. After all, the point of science and academia is to shed light on the truth rather than to obscure it for the perceived greater good of society.

Moving forward

A slightly kinder alternative, although still a somewhat pessimistic one, is this: maybe free will scepticism does not need to be hidden from the public, because people (myself included) will not be able to draw its full implications by themselves anyway. Although when thinking rationally about the subject I am fairly sure that I have no free will, I have been unable to internalise this belief. I wonder if it is possible for anyone to internalise it completely, or if the illusion of conscious thought is just too strong to ignore. In the words of Greg Boyd: ‘People may sincerely think they believe in determinism, but they act otherwise, every time they deliberate’. His argument comes from a theological perspective, but putting that aside, I interpret it in this way: determinism is incredibly complex and runs counter to many of the human instincts and evolutionary mechanisms that cause us to value ideas such as freedom, rights, law and order and organised society. Perhaps public scepticism of free will indeed has no profound effect on how people view liberal concepts, simply because it is too counterintuitive and complex to internalise.

Isaiah Berlin said that ‘there are those whose determinism is optimistic and benevolent, and those whose determinism is pessimistic’. To me, determinism is of the former kind. I hope not only that the classical liberal ideas that form the bedrock of our societies can survive free will scepticism, but also that they will be strengthened by a refreshed view of the human brain. For me, understanding our own minds better does not mean inevitably casting aside our existing institutions. Rather, it can help us to be more tolerant, more forgiving and, ultimately, to have strong neuroscientific and philosophical bases for our liberal institutions.

I leave you with a quote from Einstein that generally makes me feel a whole lot better about it all…

‘I do not believe in free will. Schopenhauer’s words: ‘Man can do what he wants, but he cannot will what he wills,’ accompany me in all situations throughout my life and reconcile me with the actions of others, even if they are rather painful to me. This awareness of the lack of free will keeps me from taking myself and my fellow men too seriously as acting and deciding individuals, and from losing my temper.’- Albert Einstein

 

After studying Architecture at the University of Nottingham, Laura Walker-Beaven worked in fundraising and international development. She recently completed a masters in Human Rights, during which she became increasingly concerned about the impact of Critical Social Justice on universities.

 

In the introduction to his anthology Censorship and Silencing (1998), Robert Post makes the case that attitudes towards censorship, both in academia and beyond, have become much more politically and ideologically controversial in the past few decades. Previously, censorship had been largely associated with the political right, whereas the political left generally advocated for quasi-libertarian attitudes towards free expression. More recently, however, members of groups that are ideologically different in almost every way have converged on pro-censorship views. Post gives the examples of feminist scholars, such as Catharine MacKinnon and Andrea Dworkin, supporting the censorship of pornography, a cause generally associated with Christian fundamentalism, or Critical Race Theory scholars joining the Jesse Helms-led conservative movement to ban hate speech. Post was writing over twenty years ago, and yet his observations are just as relevant today.

So, what caused this shift by many left-leaning scholars and public figures towards a classically conservative stance on free speech? I argue that, in part, the change in the way that censorship was perceived in the social sciences was catalysed by Foucault’s writings on power and censorship. Hitherto, censorship in academia had been seen as a generally repressive and coercive act, indeed as the antithesis of academic freedom. Viewed through a liberal lens, preventing censorship is a matter of protecting the individual’s negative freedom, i.e., their freedom from restriction. Thus, censorship was defined as the coercive blocking of certain topics and content by an individual, group or state that takes place after the act of expression (see Freshwater, 2003). Michel Foucault and other postmodernist thinkers, though, sought to redefine censorship, showering it with a postmodern dose of radical scepticism and preoccupation with power.

Foucault on power and censorship

For Foucault, censorship was not the coercive and repressive act that modern liberal thinkers described it as. Instead, he saw it as a productive act whose power could be harnessed. Foucault dedicates the first chapter of The History of Sexuality (1978) to discussing the censorship of content relating to sex during the classical period. Central to his hypothesis is that, during this era, loci of power utilised the censorship of sexual content to control the discourse relating to sex, rather than to quell it completely. By inhibiting casual discussions on the topic, powerful entities were able to regulate sexual knowledge production itself. Through censorial means, they dictated the terms and framework through which sex was discussed. Foucault argues that this sparked an “explosion” in the production of discourse regarding sex. He gives the example of schools during the classical period. Every aspect of their design and management, such as classroom arrangements, bedtime monitoring and dormitory layouts, was, he says, influenced by this sexual taboo. He calls this “incitement to discourse”, as, according to his thesis, the censorship of sex-related content paradoxically produced an era that was obsessed with sexuality. Foucault began to redefine censorship, uncoupling it from repression and characterising it instead as a positive force.

Another significant contribution Foucault made towards the postmodern redefinition of censorship was the idea that sousveillance can be a more effective mechanism of censorship than surveillance. During the period in question, the expurgation taking place was not nearly as powerful a censorship tool as were the changing cultural norms that labelled sexual discourse taboo. Foucault writes:

“The forbidding of certain words, the decency of expressions, all the censorings of vocabulary, might well have been only secondary devices compared to that great subjugation: ways of rendering it morally acceptable and technically useful.”

In other words, Foucault argues that social norms, and the self-censorship they encourage, can be more coercive in controlling speech than explicitly repressive devices. In a similar vein to this latter observation, in Discipline and Punish (1975), Foucault uses the analogy of Bentham’s panopticon, a prison in which inmates never know when they are being watched. Foucault sees this as a useful metaphor for power relations in contemporary societies, where citizens self-censor and modify their behaviour to fit in with social norms, constantly sousveilled by those around them.

Foucault planted the seed of postmodern doubt in the minds of social scientists and thus emerged New Censorship Theory, a branch of postmodern thought that framed censorship as productive rather than repressive.

New Censorship Theory

Pierre Bourdieu was a particularly notable figure in the emergence of New Censorship Theory. Inspired by Foucault’s observations on panoptic societies, Bourdieu developed an account of self-censorship as a process of pre-empting and anticipating the way society will respond to certain discourses. The movement subsequently evolved through the contributions of several key figures in the social sciences, including Judith Butler and Michael Holquist. There are several key ways in which the New Censorship movement re-theorised censorship that are fundamental to how the contemporary social sciences perceive freedom of speech and censorship.

First, the movement re-framed censorship as an important and productive force. Much as Foucault saw power and censorship as productive, the academic proponents of New Censorship Theory saw the constructive potential of these concepts, too. In Excitable Speech (1997), Judith Butler explains exactly what is meant by censorship as a “productive” force. This is not to say that censorship is necessarily positive, rather that it is “formative”, assisting in the production of discourse, instead of simply the denial of liberty. She describes how censorship could be conceived of as necessary to the realisation of certain social and political aims, giving the example of marginalised communities exercising censorship over others in order to regain control over their own representation. In other words, rather than being purely repressive, censorship can be harnessed as a productive and emancipatory tool. Herbert Marcuse’s influence on this topic is also important to note. In his widely cited chapter Repressive Tolerance, Marcuse proposes the concept of “liberating tolerance”, which would involve the censorship of speech that argues against “the extension of public services, social security, medical care, etc.”, and of other right-wing ideas. He makes the case that this “liberating tolerance” must be primarily enacted in the academic realm, where the freedom of right-wing academics must be limited in order to correct the power imbalances between minority and majority groups. Marcuse was a member of the Frankfurt School of critical theory and, though this article concerns itself primarily with postmodern thought, his influence on the shift in attitudes towards censorship in the academy must also be acknowledged.

Censorship was also re-characterised by postmodern scholars as ubiquitous. The aforementioned liberal idea of censorship understands uncensored expression to be “free”. The pioneers of the New Censorship movement subverted this definition and argued that no expression is ever truly free, as orators, writers and artists constantly self-censor as they create. According to Bourdieu, market conditions (i.e. what others will find valuable) affect speech before the act of speaking has even begun. Speakers, artists and writers self-censor according to what society will accept and appreciate. Likewise, Bourdieu argues that, in academia, when scholars wish to generate discourse in their given field, they must self-censor in order to adhere to the etiquette expected of academic literature. An oft-quoted passage from Holquist affirms this idea of censorship as a ubiquitous force, stating:

“To be for or against censorship as such is to assume a freedom no one has. Censorship is. One can only discriminate among its more and less repressive effects.”

Beyond self-censorship in the service of linguistic and stylistic expectations, the New Censorship scholars argued that social and cultural norms constitute another form of ubiquitous censorship. These norms produce what Bourdieu calls structural censorship, where one pre-emptively alters the content of one’s expression in order to fit what is deemed to be socially acceptable. Previously, censorship had always been considered an explicit and repressive act. The new theory labelled some forms of censorship as implicit. For Bourdieu, structural censorship forces the individual to self-censor in order to meet the demands of the market (the listener). Accordingly, Bourdieu presents censorship as necessary, and argues that it is most effective when it is implicit and shrouded in cultural norms and customs:

“Censorship is never quite as perfect or as invisible as when each agent has nothing to say apart from what he is objectively authorised to say […] he is […] censored once and for all, through the forms of perception and expression that he has internalised and which impose their form on all his expressions.”

Butler echoes this sentiment six years later, claiming that implicit forms of censorship can be more effective in limiting “speakability”. Part of the reason is that, when censors explicitly ban certain content, it conspicuously draws attention to that which has been censored. The effect is that censors self-sabotage their own work through the very act of explicit censorship. Social norms, on the other hand, censor by rendering certain utterances impossible. “Impossible speech”, for Butler, is that which is socially unacceptable to say, that which renders the speaker “asocial” or “psychotic” in the eyes of society.

I find it important to note that these proponents of New Censorship Theory are particularly influential in their fields, and, in fact, their writings on the topic are generally contained within some of their most celebrated works. In 2007, Foucault, Bourdieu and Butler were ranked first, second and ninth respectively among the most cited authors in the humanities. For this reason, the implications of this redefinition of censorship are certainly not to be downplayed.

Implications for academia

That censorship is omnipresent is the inevitable terminus of the ideas raised in this strand of scholarship. As Post asserts, “If censorship is a technique by which discursive practices are maintained, and life largely consists of such practices, it follows that censorship is the norm rather than the exception”. Post acknowledges that once one sees censorship as ubiquitous, one must differentiate between different forms of censorship in order to accept or reject them. However, taken to its extreme, the general thesis of the New Censorship literature has the potential to render the term “censorship” meaningless. As mentioned, these theorists make the point that self-censorship is fundamental to the process of writing and content production, as a writer selects their words through a process of inclusion and exclusion. Although this may be the case, the process of self-censoring due to fear of reprisal or judgement is hardly the same as self-censorship as an artistic choice. Within the New Censorship framework, though, these are presented essentially as one and the same, since societal norms and values are seen as both effective and potentially productive tools of censorship. A gender-critical feminist scholar, for example, may self-censor in order to avoid being ostracised by colleagues. This is far from an artistic choice; rather, it is a type of censorship based on academic social norms. By redefining censorship as a ubiquitous and inevitable element of discourse, one takes the negative connotations away from the term “censorship”. In addition to this, Butler’s (and Marcuse’s) suggestion that censorship can be used as a tool for social justice could conceivably be used to justify censorship according to the political and ideological whims of censors (or violent student demonstrators at universities, for example).

My concern is that, once censorship is normalised as a concept, the moral outrage previously provoked by incidents of censorship may be eroded. I believe that classifying social norms as a form of censorship leads us down a slippery slope that ends with advocating for the use of explicit censorship as a tool for enforcing the popular dogma or theory of the day. Unfortunately, I suspect we’re already well on our way down said slippery slope, as 40% of millennials in the US believe that government censorship of potentially offensive content is justified.

There is now a growing disconnect between those who believe that self-censorship is a form of repression and those who see it as a positive tool for achieving progress. I believe that addressing this divide will be impossible without an understanding of the theories and literature that have led a generous portion of the population down this path.

After studying Architecture at the University of Nottingham, Laura Walker-Beaven worked in fundraising and international development. She recently completed a masters in Human Rights, during which she became increasingly concerned about the impact of Critical Social Justice on universities.

 


Academic freedom has faced numerous challenges over time. In the UK, a 1963 report on higher education identified political influence as the greatest threat to academic freedom. Recently, the debate surrounding academic freedom has been resurrected, as threats to freedom of speech appear to emerge from within academia itself. In December 2020, Civitas published a report on academic freedom in UK universities, which reviewed incidents of speech censorship between 2017 and 2019. At universities across the UK, Civitas observed instances of speech restriction and censorship by way of restrictive campus speech codes, no-platforming, and ‘cancel culture’ petitions and letters. Other think tanks and advocacy organisations have similarly expressed concern over the reported decline of academic freedom in universities in the UK and beyond (see UCU; Policy Exchange; Heterodox Academy; Gallup; Woman’s Place UK).

Of course, some deny the existence of a free speech crisis, often claiming that right-wing speakers have fabricated it in order to protect their ability to continue espousing questionable views on university campuses (see Fowles; Smith). These critics, however, fail to consider the wealth of literature coming from left-leaning academics and journalists also discussing the threats to academic freedom. For instance, the Harper’s Letter, which argues in favour of protecting free and open debate, counted Noam Chomsky, Salman Rushdie and Samuel Moyn amongst its signatories, indicating bipartisan support for this cause.

Amidst this growing public concern over campus censorship, the UK government recently appointed a Free Speech and Academic Freedom Champion. As so often happens in public discourse, views on the topic are highly polarised and attribute the “free speech crisis” at universities to various single-issue and highly politicised concerns. Instead, in this article I present some of the research and contemporary thought exploring potential factors that could threaten academic freedom. In an effort to avoid dichotomous thinking, this article considers different approaches to the issue and accepts that the cause of declining academic freedom is likely a complex amalgamation of the issues discussed here—and many more.

The political economy of censorship

A substantial body of literature suggests that threats to academic freedom, particularly in the UK and the US, have increased due to the neoliberal privatisation of universities. Professor Anna Traianou argues that the commercialisation of knowledge production has negatively impacted the intellectual freedom of academics. In the 1980s and 90s, a considerable shift took place in how higher education was conceptualised. The UK government began encouraging more of the population to attend university in order to compete globally in the emerging knowledge economy. According to Traianou, public sector funding for universities could not meet the resource demands incurred by the sudden influx of students. As a result, tuition fees were introduced, compelling universities to compete for funding. The Civitas report similarly proposes that privatisation has led university management to treat students as customers, producing a paradigm that prioritises customer satisfaction over academic freedom and knowledge production. Although the issue of university privatisation is undeniably multifaceted, the arguments made by Traianou and Civitas make a compelling case for why public sector funding may allow universities to better pursue truth and knowledge unencumbered by the whims of their student customers.

Sociologist Adam Hedgecoe suggests that the establishment of research ethics committees in UK universities has also contributed to the threat to academic freedom. For those unfamiliar with the inner workings of universities, research ethics committees are groups of academics who judge whether proposed research projects are ethical. While the necessity of these committees may be abundantly obvious within the context of medical research, for instance, the encroachment of these into the social sciences has been the subject of many debates. In his study, Hedgecoe finds evidence that these committees tend to prioritise the reputation of the university rather than the ethics of research praxis. He theorises that sensitive or controversial research projects may be denied permission to proceed by these committees because they pose reputational risks to the university. Hedgecoe further suggests that these bodies promote academic cultures that prioritise public image over academic freedom and the pursuit of knowledge, the effects of which become even more significant when combined with the commercialisation of higher education. While Hedgecoe takes a decidedly measured approach to his research, other academics have spoken out more vocally about the potential for research ethics committees to curb academic freedom. Sociologist Martyn Hammersley, for example, has written a number of articles arguing that ethics committees tend to emphasise the individual autonomy of the research subject, but ignore that of the researcher. Hammersley also questions whether the ethics of a social sciences proposal can truly be decided by such a committee. He makes the point that this very act is hubristic in that it assumes that one could conclusively decide whether a proposal is ethical or not. Hammersley asserts that the existence of such committees “amounts not only to a bureaucratization of research but also to unwarranted restriction on the freedom of researchers”.

The Civitas report suggests that the development of equality policies, which rightly seek to protect minority students from discrimination, may go too far in limiting speech. “The university institution,” states the report, “is not created for the primary purpose of prohibiting discrimination—its founders do so for the purposes of providing places of higher education and learning”. Law professors Ian Cram and Helen Fenwick make a similar point, arguing that recent changes to UK counter-terrorism law could incentivise universities to cancel potentially controversial events. Cram and Fenwick explain that UK law changed to place a duty on universities themselves to actively prevent potential instances of radicalisation and extremist content. The authors claim that the law already contained sufficient protection against such content and that including the “prevent” clause encourages universities to pre-emptively cancel or prevent campus events that may include or lead to “extremist expression”. They further claim that this policy is excessively broad and ill-defined, potentially leading to universities restricting ideologically diverse content in accordance with the new laws.

Ideological influences

Taking a rather different approach to the works outlined above, other scholars adopt positions that correlate declining academic freedom with the rise of certain ideological movements. Greg Lukianoff and Jonathan Haidt, in The Coddling of the American Mind, posit that free speech in universities may be placed in jeopardy by growing cultures of “safetyism”, resulting in over-protective attitudes towards children both at home and at school. Lukianoff and Haidt argue that protecting children against adversity instils a mindset in young people that seeks to avoid discomfort or feelings of offence. Subsequently, according to the authors, when students arrive at university, they lack the cognitive toolkit required to confront ideas that fundamentally contradict their own worldviews and react by demanding “safe-spaces” free from intellectual discomfort. This, combined with privatisation, means that universities may be more inclined to compromise their commitment to academic freedom and open enquiry in order to appease their increasingly “coddled” customer base.

Others attribute the reported decline of academic freedom to the rise of postmodernism within academia. Public intellectuals Helen Pluckrose and James Lindsay, in their book Cynical Theories, outline the principles of postmodernism and map the journey of these concepts from their roots in the poststructuralism of Foucault and Derrida through to critical theorists in fields such as postcolonialism and, more recently, intersectional feminism, critical race theory and queer theory. They argue that these fields have formed an academic oligopoly that poses a fundamental threat to Enlightenment and scientific forms of reasoning. Although the authors touch only briefly on academic freedom directly, the implications are clear: the modus operandi of these critical social justice theories seems to be at odds with the established norms and justifications for academic freedom.

Philosopher John Sanbonmatsu makes a similar case, arguing that nihilism, stemming from the poststructuralist denial of the existence of objective truth, has come to monopolise academic thought. He claims that the effect of this has been to “blunt the critical imagination and to erode our capacity for truth-telling”. Sanbonmatsu suggests that the reason for postmodernism’s success in universities, despite opposition from almost all other ideological perspectives within the academy, has been its ability to morph and adapt into new forms. These ideas have penetrated each field of the humanities so profoundly that the term theory itself, argues Sanbonmatsu, has become synonymous with postmodern. This postmodern domination of academia, he claims, has shut down academic enquiry and “penalises those who dissent from its ideological frame”. Indeed, if Sanbonmatsu is right, the pervasive nature of this subset of academic theories could be seen as hegemonic in its monopolisation of academic thought. This could result in a “tyranny of the majority” effect in academia, whereby academics are openly discouraged from conducting research that opposes popular philosophies stemming from postmodernism. A particularly topical example of this is Kathleen Stock of Sussex University, my alma mater, who has been publicly denounced by the University and College Union for her critical stance on postmodern theories of gender and has now resigned after harassment by activist students and a lack of support from colleagues. Although this, and many contemporary examples of campus censorship, may stem from postmodernism and its offshoots, the same could conceivably be said for any theory or ideology that comes to dominate academia.

This article has touched on research and thought from two approaches to the academic freedom debate. As I mentioned in the introduction, I fear the discourse on the topic tends to place the blame for declining academic freedom on single issues rather than considering its complexity, stemming as it does from an amalgamation of intersecting causes. Perhaps by broadening our view of the university free speech debate, we can better consider the steps necessary to bolster academic freedom in UK universities and beyond.


After studying Architecture at the University of Nottingham, Laura Walker-Beaven worked in fundraising and international development. She recently completed a masters in Human Rights, during which she became increasingly concerned about the impact of Critical Social Justice on universities.

 


It can be difficult to find clear information about gender dysphoria online, so we have compiled answers to some questions you may have about the topic. This text should serve as an introduction for readers who hope to engage with the literature discussing gender, but who know very little about the issues to begin with.

What is gender dysphoria?

Gender dysphoria is a condition in which a person feels as though their sex at birth is mismatched with the gender they identify most with. According to the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), 0.005–0.014% of biological males suffer from gender dysphoria and 0.002–0.003% of biological females do so, although this is changing due to recent surges in gender dysphoria diagnoses. There is no doubt about the existence of gender dysphoria as a legitimate condition that can be alleviated by going through a series of medical processes known as “transitioning”, in which an individual’s body is made to look and feel more similar to that of the gender they identify with. The causes of gender dysphoria are still unclear.

What does “transitioning” mean?

Transitioning can take several different forms and does not always involve surgery or irreversible procedures. The two types of transitions are social and medical.

What is the difference between a medical transition and a social transition?

Social transitioning is when a person chooses to present themselves as a member of the opposite sex. This may be through aesthetic changes such as clothing and hairstyles and could also include going by a different name and pronouns. Steps taken to socially transition are generally reversible. A medical transition is characterised by the use of medical interventions such as puberty blockers, hormone treatment and surgery. Some types of medical transitions, such as puberty blockers, are reversible, whereas surgery is generally permanent.

What is the difference between “sex” and “gender”?

Although the words “sex” and “gender” are often used interchangeably, they refer to different concepts. “Sex” is assigned at birth and describes the difference between men, who have XY chromosomes, and women, who have XX chromosomes. “Gender”, on the other hand, is broadly defined as the social elements of what it means to be a man or woman. This concept of gender as malleable and shaped by environmental factors was developed in the 1960s and 70s. The field of Gender Theory, which has been popularised since that period, generally posits that masculinity and femininity are socially constructed. Some gender academics even propose that sex is socially constructed. This field is not without criticism. Studies have shown that men and women differ in many ways, including in psychological traits, hormone levels and other observable characteristics. In addition to this, there is evidence that babies as young as three months old prefer toys associated with their sex. The same preference has been observed in rhesus monkeys, casting doubt on the idea that these traits develop solely through socialisation.

What does it feel like to suffer from gender dysphoria?

Children suffering from gender dysphoria can exhibit a number of symptoms, including refusing toys typically associated with their sex, urinating in the position of the opposite sex and repeatedly stating they are really of the opposite sex. In teenagers and adults, gender dysphoria can cause deep disgust at one’s own genitals and the strong belief that one’s real gender is not the same as one’s biological sex. This is often accompanied by feelings of isolation and depression.

What’s the difference between the terms “gender dysphoria”, “transgender” and “gender non-conforming”?

As described above, “gender dysphoria” is a condition in which an individual feels that they do not identify with their sex. A “transgender” person is someone whose gender identity is different to their sex. A “gender non-conforming” person is someone whose appearance or behaviour does not align with what one might expect from a member of their sex.

What about parents of children who are questioning their gender?

One can only imagine how difficult it must be for people struggling with gender dysphoria. It must also be challenging for parents and family members of these individuals to respond to the revelation that a child does not identify as their biological sex. On top of the difficulty of understanding their children’s desire to change gender, parents can face further pressures from those around them. Social media shaming and excessively PC school policies are just two of the challenges faced by parents who question their child regarding issues of gender.

Is gender dysphoria permanent?

Around 80% of gender non-conforming children grow up to be cisgender (i.e. identifying with the sex they were born as). However, when gender dysphoria persists through puberty, it is usually permanent. Some research has shown that adults with gender dysphoria can have brain structures somewhere in between those typical of males and females.

Do body image and mental health play a role?

One study on teenage girls in the US found that 50% of 13-year-olds felt unhappy with their bodies, a figure that rose to 80% by the time the girls reached the age of 17. Adolescence is a difficult time for young people, and being uncomfortable in one’s own body is perfectly normal.

Indeed, certain mental health problems have been found to correlate with feelings of gender dysphoria. There is anecdotal evidence of young people mistaking body image issues and depression for gender dysphoria.

In 2020, Keira Bell brought a case to the UK High Court. Keira had begun transitioning into a boy when she was a teenager but had changed her mind and de-transitioned. She claimed that she had been given false hope that transitioning would solve all her problems. She said:

I made a brash decision as a teenager, as a lot of teenagers do, trying to find confidence and happiness, except now the rest of my life will be negatively affected. Transition was a very temporary, superficial fix for a very complex identity issue.

Of course, disentangling causality in those who are—or simply believe themselves to be—suffering from gender dysphoria can be tricky. Those who genuinely and persistently believe themselves to be in the wrong body would doubtless be vulnerable to mental health and body image issues.

What’s the role of sexuality?

Around 75% of boys who are gender-nonconforming grow up to be gay adults. In fact, gender-nonconformity in childhood is one of the best predictors of homosexuality in adulthood.

What treatment is given to children who question their gender?

When a child is thought to have gender dysphoria and is brought to a gender clinic, they will generally be required to see a number of different specialists over the course of a few months. As stated on the NHS website, many interventions involve psychological support rather than medical treatment, because gender dysphoria has a tendency to subside during puberty. Children may also be given puberty blockers, designed to pause the physical changes that usually occur during puberty. In the UK, after the age of 16, children may be given cross-sex hormones. These have more permanent effects and involve giving female patients the male hormone (testosterone) and male patients the female hormone (oestrogen). In the US, puberty blockers and hormone therapy can be administered to children under the age of 16 in the majority of states, while in some states treatment for children under 16 is banned. In Arkansas, doctors cannot provide gender-transition treatments such as puberty blockers or hormones to patients under the age of 18.

Are there any side effects of these treatments?

Puberty blockers do have a number of side-effects, some of which are relatively benign, for example headaches, muscle aches and changes in weight. Other more serious side-effects include lower bone density, changes in mood and delayed growth plate closure. The effects of puberty blockers are thought to be reversible – once a patient stops taking these blockers, puberty will resume, although longer term effects are still unknown and researchers are yet to discover whether puberty blockers affect brain development.

The side-effects of taking cross-sex hormones are still largely unknown. However, the few studies done have suggested that transgender women who were on the hormones had a greater risk of mortality, while transgender men had increased blood pressure and insulin resistance. In addition to this, infertility is a very common side-effect of long-term use of cross-sex hormones.

Although not a treatment per se, chest binders (tight pieces of cloth wrapped around the chest) are sometimes used by biological women who wish to give the appearance of a flat chest. Chest binding has been shown to come with considerable side-effects: in one study, 53% of participants experienced back pain and 47% reported shortness of breath as a result of binding. Rarer side-effects can include spinal misalignment and rib damage.

Can children give informed consent for gender dysphoria treatments?

Despite the potential for harm, UK gender clinics have found a legal loophole through which pharmacies can dispense hormone medication prescribed by doctors in any part of the EU, outside the jurisdiction of UK regulators. This was discovered when a reporter posing as a 15-year-old girl was prescribed testosterone by a gender clinic without parental knowledge. This raises the issue of informed consent. In the UK, children under the age of 16 cannot consent to sexual intercourse, while those under 18 cannot marry, drink alcohol or vote in elections. These age limits are put in place because adolescents are not deemed capable of weighing up decisions in the same way a fully developed adult would; the frontal cortex, the brain’s decision-making powerhouse, is not yet fully developed. While an adult should be well within their rights to transition if they decide to, a child may not be capable of making a decision with such life-altering consequences, infertility among them.

Science has a long way to go in researching the causes and treatments of gender dysphoria, so open and honest dialogue about these issues will be increasingly important as we move forward. We can do this by learning the facts, keeping an open mind and treating everyone with dignity.

After studying Architecture at the University of Nottingham, Laura Walker-Beaven worked in fundraising and international development. She is currently studying a masters in Human Rights, during which she has become increasingly concerned about the impact of Critical Social Justice on universities.

When reading about speech censorship on university campuses, both in the UK and across the pond, I wondered if it was all just catastrophising based on a few outlier events. That was until I began my master’s degree and realised we may indeed be heading towards a semi-Orwellian nightmare. References to Nineteen Eighty-Four and Orwell’s writing more generally seem to be rather fashionable at the moment and are becoming something of a cliché. Nevertheless, Orwell’s Newspeak seems like the perfect term for the phenomenon occurring on our campuses.

Newspeak is a language invented by the fictitious totalitarian regime in Nineteen Eighty-Four, designed to restrict the vocabulary of citizens. Syme, one of the protagonist’s colleagues in the book, says:

Don’t you see that the whole aim of Newspeak is to narrow the range of thought? In the end we shall make thoughtcrime literally impossible, because there will be no words in which to express it.

This is beginning to feel oddly familiar. In one particular master’s module, my lecturer gave us a snowballing list of words we were no longer allowed to use. “Global” and “local” are both on the taboo list: apparently they are offensive to locals, as they imply the superiority of cosmopolitan “global” elites. Ironically, this seems to mean that I am no longer allowed to utter the name of the department in which I am studying: Global Studies.

In one class, I used the phrase “developing country” and was chastised by my lecturer, who saw this as an opportunity to educate me on the evils of the paternalistic language of our colonial legacy. In progressive parlance, the terms “developing/developed countries” have now been replaced with the “global South/North” (am I allowed to use “global” in this setting?). Confusingly, these new terms do not even correspond particularly coherently to a geographical North/South divide.

Many foreign students seem baffled by this type of censorship. Some of my classmates from the “global South” still refer to their countries as “third world” and seem to have little interest in learning the “correct” jargon designed by elite academics to avoid offending this very same group of people.

On another course, students were told they must not use the phrase “poor people” when discussing, well, poor people. According to lecturers, this phrase imposes western standards of what “poor” means on those who may live very “rich” lives in terms of family, happiness and community.

This culture of vocabulary suppression being bred in universities does not stay within the campus gates. Last December, I mentioned to some friends that it was the start of Hanukkah, as I was writing a short story aimed at teaching children about the holiday. In doing so I referred to Jewish people as “Jews” and was advised that this might be considered an offensive slur. I was stunned by the idea that using the word “Jews” in the context of celebrating Jewish culture could be considered “harmful”. And even this is not where the problematisation of everyday discourse stops. Far from it.

In fact, the phenomenon is gradually seeping out of academia and into the mainstream. As of January this year, the US Congress Standing Rules use only gender-neutral language: “mother”, “uncle”, “sister” and “nephew” are amongst the banned words. In the UK, Brighton’s NHS Trust now advises health workers to use gender-neutral language, particularly when discussing motherhood… uh, sorry, parenthood. Maternity unit staff should refer to breastfeeding as “chestfeeding” and breast milk as “chest milk” in a bid to become more trans-inclusive. Although the Trust has emphasised that the language guidelines are not compulsory, I worry that rules limiting vocabulary may spread to more and more domains.

Note: the NHS Trust report also contained the following passage:
Please note that these language changes do not apply when discussing or caring for individuals in a one-on-one capacity where language and documentation should reflect the gender identity of the individual. When caring for cis women it is good practice to use terminology that is meaningful and appropriate to the individual; this may include terms such as woman, mother or breastfeeding.
*This passage was added after publication. The author had already specified that the language guidelines were not compulsory, but the passage seems important for extra context.

Of course, certain phrases may go out of use as public sensibilities adapt. I understand this and think that it can be a good thing: we alter how we speak in order to reflect our ever-expanding perspectives. However, the language censorship I have described goes further. It stems from an over-zealous attention to discourse and the problematisation of any vocabulary that might not be in line with the Critical Social Justice agenda. In such a worldview, any language that may be deemed offensive to someone or that supposedly reinforces existing power hierarchies is absolutely verboten. Unfortunately, the criteria that these academics use to determine whether language is offensive or hegemonic are usually informed by a cocktail of identity politics and self-flagellation from which almost anything can be seen as problematic. If all you have is a hammer, everything looks like a nail.

Treading on eggshells when trying to discuss complex topics in class is not conducive to learning. Neither is making language more blurred and confusing than it already is. Perhaps in twenty years’ time, I will be required by law to refer to my niece as “my sibling’s offspring”. Or perhaps we will all realise this is bonkers and manage to pull back from the brink before we are all dominated by a (new) Newspeak.
