I was talking yesterday with some young students about fictional characters, and one told me emphatically that no, I was wrong – Harry Potter is definitely a real person because she’s seen him on TV.

This morning, the Kony 2012 campaign video has gone viral, and provoked some important responses from those far more knowledgeable about the situation than I am. What I have noticed, though, is that the sudden violent outpouring of emotion and anger and sympathy over this issue strangely harks back to my conversation yesterday. Child abuse and exploitation is hideous and contemptible, and raising awareness of these crimes, condemning perpetrators and actively exploring solutions to prevent them are of paramount importance. And yet, there is something in the narrative being created to get the general public emotionally and financially involved, and in the audience reaction it has provoked, that demonstrates how purposeful story-telling can be substituted for the complexities of reality.

‘It’s only a movie/book/show’ is a familiar dismissal levelled at the scholarly analysis and critique of fiction, and there is some power in this. There are profound and important things happening in the real world that need attention and close examination, and it can sometimes be hard to see why exploring and unsettling the foundations of what seems like simple escapism is significant, let alone useful.

But in presuming that fiction doesn’t affect us, our lives, and the way we think about our world, we leave ourselves open to other more subtle forms of narrative propaganda that, however noble their intentions, are also only giving us a simple, evocative and constructed story, not a complex, difficult, contradictory real-world history. In fiction, we are so often offered reassuring moral dichotomies: good versus evil, hero versus villain. We are also relieved to find good usually triumphs over evil – the hero kills the villain, and peace and happiness are restored to the world.

But we don’t believe that in real life, we scoff. That’s the charm and appeal of fiction: the escapism, the departure from real-world problems. We know that doesn’t happen beyond the page or the screen. It’s only a movie.

Yet isn’t this a very familiar narrative in the campaigns of world politics, and don’t we want very much to continue believing it? Hitler, Saddam, Bin Laden and now Kony, for example: terrible and fearsome war criminals who deserved to face international justice, but strangely elevated in our world narratives to super-villain status – as though they were inexplicably the single force of evil behind all the wrongdoing during their reigns, all their followers were simply blank-faced extras, and all other influencing factors were deleted scenes. Unless there is a final confrontation, and a big boss or head bad guy to defeat and kill, there is no sense of triumph and completion in our shared social narrative. The deaths of the first three men were presented in the media and political narratives as a definitive end to evil, though in each case the man had already become virtually powerless and pathetic within his respective regime.

In fiction, the death of the bad guy is always inherently satisfying: good has won, and the narrative is drawing to a close. In real life though, the narrative is multifold and multifaceted and ongoing: there will be a new world power, a new heir, a new brand of terror, a new generation. The death of one does not in any way guarantee a neat, appealing close to a sprawling, complex, frightening international situation.

Moreover, in killing these symbolic figures, the world loses the chance to hold them accountable. For militants and extremists, why would death be a punishment when you look fervently towards greater rewards? There is perhaps even some satisfaction in dying as a martyr in the narratives of your followers where, in comparison to the world story, the assigned roles of good and evil are clearly reversed and have now been buoyed up by your sudden and violent death. Your successor will be spurred to carry on your work: this will not be an end.

There is no accountability, no humbling, no public humiliation at standing trial and being demonstrably held responsible for your horrific crimes. No imprisonment, no regret, no remorse, no remonstrance.

We may not believe Harry Potter is real, but from the narrative we are being offered, we do find ourselves believing, for example, that bringing about the downfall of Kony is a complete and satisfactory solution to a terrible international problem. And we might think that bit of light-hearted entertainment we saw on the weekend is ‘only a movie’, but however well-intentioned, so is the latest viral video, complete with its own neat rhetorical narrative about good and evil, hero and villain, and what will bring about a finite and finished happy ending.

If we learn anything from a study of fiction, it is to recognize how it actually affects us and our ways of thinking at a profound level, to identify how fictional constructs are being wrought and worked in real-world situations, and to acknowledge that we always need to think critically about all the kinds of stories we are told in our lives.


The Arts of Teaching

May 17, 2009

University teaching in the arts has come to a strange pass. With the increasing bureaucratization of academic tertiary institutions, funding arrives upon receipt of tangible research output. Moreover, this output is weighted towards autogenous production: writing produced within the confines of a select scholarly arena for a minute academic audience. Judith Brett addressed this concern in her superb article “The Bureaucratization of Writing: Why so few academics are public intellectuals” in Meanjin (50, 4, 1991), and I would like to consider further the implications of her claim that “(a)cademic writing is writing that never leaves school”.

Scholarly conferences, refereed journal articles and specialized books are acceptable and lucrative ways for a university discipline to prove its fruitfulness and public productivity, and gain financial and institutional approval. This is despite the fact that such conferences are usually populated only by colleagues, such journals might gather dust in a library or end up scattered under a ‘free – please take’ sign outside an ex-employee’s gutted office, and such books often find their way to the clearance tables of remaindered bookstores. The outside world is cruel and contemptuous of academia in action, and the scholarly act becomes one performed only for fellow academics, students, and the bureaucratic powers that be: it never leaves school.

However, in instigating this self-perpetuating model, university research output suffers the fate of all closed communities: it fails to reach new initiates eager to be indoctrinated into the system; it does not create the next generation. This is where the importance of the university’s status as school is reasserted: if the writing cannot leave the school, students must be brought in, as readers and eventually as new writers. Although much academic research material is careless of the extent of its readership in favour of peer approval and departmental funding rewards, the act of researching is considered worthy in itself, whereupon training up new researchers becomes imperative.

All tertiary institutions are therefore necessarily a dual enterprise: they must function successfully and simultaneously as both research facility and educational institute. In recent years, however, academic funding schemes have become increasingly weighted towards the products of the former, and not of the latter. Scholars are being pressed to write and research more, because it pays the department and discipline better than student numbers do. Rising undergraduate enrolments, graduate numbers and post-graduate achievements warrant hearty self-congratulation, but research will bring in the funding.

And here arises the paradox. The best of Australia’s academics are not sharing their expert knowledge and scholarly experience with a new and upcoming audience in any kind of immediate or openly accessible way. They are retreating into offices to produce research, and the research they produce guarantees the funding necessary to hire the sessional tutors needed to take over the classes the researching academics were going to teach. And in this house that bureaucracy built, we find structural flaws and subcontracting threatening the very foundations of contemporary tertiary education.

The prevalence of sessional tutoring is a growing evil, and I say this advisedly and self-consciously. I myself have been teaching in this capacity now for a decade, and the university will be increasingly reluctant to employ me in any kind of permanent position because I currently lack a prolific academic research profile. As a sessional tutor, I am paid only for the hours I teach. Not for my additional contact time with students when I schedule office hours, make extra appointments, and answer copious concerned emails from distressed and confused first-years and numerous students struggling with illnesses, injuries and all manner of personal problems and traumas. Not for my hours of marking, where I take care to give full and detailed feedback to students on their work, for I know from personal experience that this is the only significant commentary they receive on their academic writing until post-graduate publication. Not for my paperwork, which increases annually as classes expand, students face more administrative complications, and keeping records becomes its own job. I am paid only for the hours I teach. And this has occasioned the need for other work. I write prolifically for non-academic outlets, I teach pragmatic rather than scholarly classes at non-academic institutes, I devise and co-ordinate events and productions related to my discipline that further my experience and knowledge, and allow me to share my empirical research with new audiences and readers. I must write and teach for a sustainable income.

But this is not recognised as academic research. This is not acknowledged as a scholarly work history that makes me employable within the university. My extensive teaching experience also does not make me permanently employable, even with the kudos of the Dean’s 2008 award for excellence in sessional tutoring. Instead, the institution will hire a researcher, whose research will pay for me to teach their classes for them.

As academic writing never leaves school, how can we continue to pursue excellence in scholarship when the art of teaching is not given any merit? For whom are we researching if we are not concerned with the level of teaching the next generation of readers is receiving? If we are not protecting and instigating high standards of education, we are not creating scholars of a calibre to continue the research we are so anxious to perpetuate. It is time the bureaucracy saw the inherent contradiction in its fiscal rewards and employment structures. For research to continue, education must be maintained. Universities must function as schools in order to further the pursuit of knowledge, not as a secondary and inferior concern, for if academic writing never leaves school, the school becomes the necessary force in propagating research. Hiring researchers over teachers is fundamentally impractical and ultimately unsustainable, for sessional tutors cannot realistically remain at the forefront of their academic field on a casual wage, which means new students are not even exposed to the latest in research and scholarly thought, nor to the disciplinary expertise of the permanent staff. The university is failing in its necessary position as higher-than-school education, by shuffling the burden of this role off onto post-students who are thus financially thwarted from becoming actual scholars.

Universities should not be hiring researchers who could teach, but won’t.   They should be hiring teachers who could research, and will.