“Sexuality is complicated, Honey”: The Works of Lisa Cholodenko

This article was published in Issue 46: Profiles of Bright Wall/Dark Room. In it I survey the varied presentations of sexuality in the films of Lisa Cholodenko (‘High Art’, ‘Laurel Canyon’ and ‘The Kids Are All Right’). It was fantastic to have the chance to write about such a complex and talented auteur.

Read the full article at Bright Wall/Dark Room (and check out the rest of the magazine while you’re there – it’s really good).


Between the lines; On Self-Harm

(First Published in Exetera Magazine, Issue 12 (The Body Issue); Spring 2015, pp. 16–17)


“I think, therefore I am” – René Descartes

“My mind’s telling me ‘no’ / But my body, my body is telling me ‘yes’” – R. Kelly

 

 

R. Kelly’s R&B classic “Bump and Grind” immediately sets out a clear distinction between the body and the mind. Philosophers as far back as Plato have championed this division, with René Descartes providing its most lucid formulation. The French thinker, prefiguring R. Kelly, wrote in the 17th century that “it is certain that I am really distinct from my body, and can exist without it.” This mind/body dualism formed the existential starting point for much Western thought throughout the modern era, and continues to shape the way we talk about illness – particularly in the distinction made between mental and physical health.

 

For all the scientific and technical advances in the medical industry, mental health continues to occupy an awkward place in our socio-cultural imagination. Problems begin at an early age. While young children are given the language to talk about physical injury, our vocabulary for mental pain is limited to the hyperbolic and parodic. Words such as “mad”, “insane” and “mental” are meant to articulate complex and distressing psychological states, but do not even begin to describe the destructive and debilitating emotions that mental illness can produce.

 

This gap in language can, I think, be put down to a wider structural privileging of physical health over mental wellbeing. If the quantifiable results of my own mental illness – a regular prescription, deadline extensions and several trips home – were attributed to, for example, a broken bone or glandular fever, then such events would surely be readily acknowledged as a necessary step towards recovery.

 

As it is, I know that some people, by their own admission, dismiss such mitigation as sitting somewhere between a mild annoyance and an undeserved academic privilege. Even when friends express sympathy for my condition, I cannot help but detect a strong undercurrent of suspicion: they may truly wish me the best, but, deep down, this sincerity is often tinged with cynicism. Whether I am paranoid or not, there is no ambiguity about the result: I often feel like a fraud when asking for help in a way that would never occur to me with normalised, physical illnesses. It seems that suffering is legitimised in the prevailing medical discourse only when the healthy body is physically violated.

 

With this in mind, it’s little wonder that self-harm is seen as one of the few options available to those whom language has failed. Self-harm unites the mind and the body by giving inner turmoil a physical – and therefore legitimate – symptom. Pain, in other words, is represented in a way that does not require conventional language. For once, mental discomfort erupts on the surface of the body in its most visceral and visible form.

 

This fresh appearance of pain on the skin of the body is undoubtedly cathartic, providing inflictors with momentary physical release. The pain one feels offers an immediate corporeal reprieve from invisible inner torture. But perhaps more significant is the social signposting: self-harm makes pain intelligible without the need for language. The many forms of self-inflicted abuse – from cuts and scratches to bruises and burns – turn bodies into things that speak for themselves. In this understanding, the landscape of mental illness can be crudely mapped onto a corporeal canvas that declares: “look at me, I need help.”

 

Nevertheless, this desperate plea is often lost in translation; dominant cultural narratives about self-harm immediately absorb and neutralise any intelligible meanings. As soon as self-harm is mentioned, stereotypical images spring to mind: the angst-ridden teenager with a penchant for My Chemical Romance and black hair-dye, or perhaps the attention-seeking bulimic adolescent displacing one self-destructive strategy with another in a “phase” of self-loathing. Whatever the image, self-harm is usually dismissed as an immature reflex to the ageing process. This reductive cultural narrative aestheticises self-harm, reading scars as little more than a fashion accessory for superficial subcultures rather than as potent corporeal expressions of inner distress.

 

While this line of thought may be true of some piercings and tattoos, cutting oneself is premised on an entirely different teleology. Rather than prioritising the finished product, the act of self-harm often finds its locus in the process of mutilation. It is through the action of slicing the skin that individuals – my 15-year-old self included – experience a moment of clarity: the inner turmoil has, for an all-too-short moment, taken a visible form. But unlike tattoos, the damage is not usually meant to be permanent.

 

Instead, such fleeting acts are perhaps better understood as signalling a profound disillusionment with the way we talk about mental health. It is in this vein that many troubled individuals are, in my view, forced to adopt a corporeal vocabulary to try to express the inexpressible. This includes 13% of 12-to-18-year-olds – a figure that the Royal College of Psychiatrists warns is extremely conservative, given the immense number of unreported cases.

 

This is not to promote self-harm as a productive way to cope with depression, anxiety or any other codification of inner turmoil. Instead, it’s a call to understand self-harm as a regressive but understandable reaction to the poverty of language when it comes to discussing mental health. Cuts and scars are perhaps best seen as a last-ditch attempt to render the inner workings of the mind intelligible. But such cries for help are ultimately mediated through a limited and restrictive body of language that is – tragically, ironically, unspeakably – all too quick to dismiss them.

The Findus Lasagne of Global Food Politics

(First Published in Exetera Magazine, Issue 11; Autumn 2014)

 

What we’re swallowing along with our food isn’t always on the ingredients list, as the Findus horsemeat scandal proved. But these unexpected extras don’t all contain calories. In The Pervert’s Guide to Ideology, made in collaboration with Sophie Fiennes, the rock-star philosopher (and Marxist bastard child of Jacques Lacan and G.W.F. Hegel) Slavoj Zizek touches on the politics of food in Late Capitalism. His suggestion is that, as subjects of this system, we consume food on two levels: physically, for the calories that perpetuate our lives (and also Capitalism, but, like, through our continued existence), and ideologically, meaning we participate in relations of production that persuade us to buy food wildly misaligned with our nutritional requirements. This second level creates a “weird perverted duty” to enjoy our food. It is not enough simply to take pleasure in eating; we must actively transform it into a performance in which getting energy is secondary and ideological engagement – and therefore ‘enjoyment’ – becomes primary, as we define ourselves by what we consume, as with any other aspect of Late Capitalism.

 

Zizek takes the example of the Kinder Egg, whose food and toy parts are both, objectively, a bit shit. If someone on the street offered us 20g of low-quality chocolate and a piece of moulded plastic in any other situation, I for one would probably brush them off with a quick “I’ve already voted, thanks”. Wrapped in orange and white foil, however, and packaged as an essential childhood treat, parents, children, and the odd nostalgic student (as we must assume from their otherwise inexplicable presence in the Guild shop) will fling money at them. But what is this nostalgia? My only memory of eating a Kinder Surprise is one of disappointment: both parts left me unsatisfied, my four-year-old body deprived of the massive sugar hit required to make the gruesome hybrid of a human head and a car, or the banal miniature jigsaw, an object of fun. This didn’t prevent me from demanding another from whichever unfortunate parent happened to be looking after me the following day, week or month. I was, even then, a good subject, playing my part in a cultural – and marketing – narrative: craving an object that I knew would bring me no satisfaction. And yes, I have in the past year wondered what it would be like to have one again, momentarily recalling a childhood joy that I never experienced. Or, more accurately, experienced only in the thrilled faces of children in TV adverts. I’m not the only one, apparently, based on their continued presence in the Guild shop.

 

This seems to be a trend in food, and especially in food marketing, in our current moment: a hark back to an un-experienced past. Take Innocent Smoothies: who, even as a child, actually experienced the patronising bullshit that defaces the side of their cartons? No adult has ever spoken to a child like that, but the marketing team at Innocent Towers (or whatever) persuade us that this is how adults universally behaved towards us. Don’t get me wrong, I love a good fruit pulp as much as the next guy, but I don’t give a fucky-wuck how many guavas were ‘squished’ to go into it. But clearly I’m one of the few. The underlying marketing point here is not, of course, the language of a self-parodic toddler, but the direct link between the homogenous liquid and its ‘natural’ ingredients in an increasingly industrialised food culture. Again we experience nostalgia for a past that almost no one alive today could have experienced. When we plant that allotment, or go to the organic family-farm butcher in a part of the country where everyone has the same surname, what we’re really trying to do is find a physical connection with our food. Fads such as the Paleo Diet, which attempt to emulate the “hunter-gatherer” diets of our ancestors over 10,000 years ago, are clearly clutching at the same idea: that we have moved away from a natural (whatever that means) relationship with our food. This is, of course, a relationship we can only create ideologically, because even if you’re a Paleo devotee, chances are you’re not giving up your Barbour for the hide of a mammoth or digging in the woods for berries. Instead you end up hunter-gathering by proxy, assuring yourself that the food was sourced ‘naturally’.
I don’t think this is an inherently awful idea: making sure your eggs weren’t from a chicken that was punched in the face every hour of its life by buying free-range, or buying a bar of fair-trade chocolate to limit the capitalist larceny of the produce of developing economies, is probably, on balance, a good thing – and therefore a good way (within the confines of a pretty shitty system) to define ourselves through the products we consume. My problem comes when the ostensible ideological meaning is in direct contrast to its real-world effect.

 

Quinoa is the obvious example of this. Since the mid-2000s we (and by ‘we’ I mean especially those of us freaks who don’t eat meat, but also quite a few of you normals) have loved the little chunks of cardboard that are apparently really high in protein and good for the environment. The issue is not, as the Guardian suggested, that the Peruvian farmers who grow it can no longer afford to eat it after the tripling of the international market price; the money added to the Peruvian economy at the local level has more than outweighed this effect. No, it’s the fact that the country is becoming reliant on the crop for export, creating a monoculture, which is really bad news for the land. Although this is not quite as disastrous as the denuding of the Amazon rainforest to feed our beef habit, it’s really not a good thing to happen. And of course, depending so heavily on a single crop means those farmers will be fucked as soon as Yotam Ottolenghi introduces us to the next miracle grain. We become – again by proxy – food tourists, trying on different cultures that have a more natural (that word again) relationship with their food, while not truly engaging with, and in fact damaging, the people we are emulating.

 

This tourism brings us back to the ideological function of food: we are forced to define ourselves by what we put in our bodies – not in the sense that it physically makes us up through amino-acid molecules, but in the sense that our ‘taste’ constructs our identity. Given the arbitrary relationship between the physical reality of the food and what it signifies, what we believe ourselves to be representing or supporting and the real-world impact of our consumption are often misaligned, even directly opposed. This is to say that while it is important to be aware of the physical things we put into our bodies, probably more important is the ideology we swallow along with them. Defining ourselves by what we eat is fine, but just make sure you’re not eating ideological Big Macs thinking they’re Fairtrade organic kale.