“We haven’t got recourse to any kind of secure methodology”

12/Jun/2015

The digital humanities don’t just offer us new research methods; they also enable us to ask new research questions, says the literary scholar Gerhard Lauer. Although he describes himself as ‘conservative’, he’s one of the leading thinkers in the field of computer-based analysis. By Urs Hafner

(From "Horizons" no. 105, June 2015)

Prof. Lauer, ‘digital humanities’ is a broad umbrella. What does it mean in your eyes?

Quite simply: the use of computer-based methods for realising digital editions and for the quantitative analysis of large corpora.

Has your research been changed by the digital humanities?

Yes, but not suddenly. The digital humanities are beginning to change our research by gradually expanding our range of methods and the questions we ask. To take my own field, literary studies, as an example, we’re beginning to apply quantitative methods to the analysis of literature, such as Goethe’s Werther or Kafka’s novellas. The demand for computer-based methods is increasing among students, and I’m now supervising the first bachelor theses in this field.

The ‘text’ is traditionally at the heart of the humanities. Scholars analyse the manifest and latent meanings of documents, and offer up their findings for discussion in a narrative, argumentative format. But the digital humanities bypass this dimension of the text.

At first glance, you’re right. Above all, the digital humanities count words. But their distribution in texts actually says something about those texts themselves. We can learn a lot about people through their word profile or, rather, the word profile of their texts. We tend to think that the use of articles or pronouns would tell us very little – but the opposite is in fact the case.
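The word-profile idea Lauer describes can be sketched in a few lines: count how often a fixed set of function words (articles, pronouns) occurs relative to all tokens in a text. This is a minimal illustration, not his group’s actual method; the function-word list and the sample sentence are invented for the example.

```python
from collections import Counter
import re

# A small, illustrative set of English function words (articles and pronouns).
FUNCTION_WORDS = {"the", "a", "an", "i", "you", "he", "she", "it", "we", "they"}

def word_profile(text):
    """Relative frequency of each function word among all tokens in the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return {}
    counts = Counter(t for t in tokens if t in FUNCTION_WORDS)
    return {w: counts[w] / len(tokens) for w in sorted(FUNCTION_WORDS)}

profile = word_profile("I saw the fox. It ran, and we followed it into the woods.")
```

Comparing such profiles across authors or periods is one simple form of the stylometric fingerprinting he alludes to.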

What new findings have you made thanks to the digital humanities?

Let me give you an example: we’re now looking at how many long and short words Kafka uses in comparison to other authors of the time, and how he uses specific function words. The statistical distribution of word frequencies in his novellas allows us to identify what is special about his style. Words that are frequently used also tell us something about the period in which a work was written. And we’re no longer just exploring the canon – not just Goethe’s Elective Affinities, for example – but the many other books that were also read at the time. This brings into focus the cultural history of what was read, not just the canon itself. These are initial insights, not earth-shattering findings. What’s new at the moment are the methods we’re using.
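The long-versus-short-word comparison he mentions can likewise be made concrete: compute the average word length and the share of long words for two text samples and compare them. A minimal sketch, with two invented German sentences standing in for different authors’ corpora (the length threshold of seven letters is an arbitrary illustrative choice):

```python
import re

def style_stats(text):
    """Average word length and share of 'long' words (>= 7 letters)."""
    words = re.findall(r"[A-Za-zÄÖÜäöüß]+", text)
    lengths = [len(w) for w in words]
    avg = sum(lengths) / len(lengths)
    long_share = sum(1 for n in lengths if n >= 7) / len(lengths)
    return avg, long_share

# Invented samples, not actual Kafka or period prose.
a_avg, a_long = style_stats("Er ging langsam durch die dunkle Stadt.")
b_avg, b_long = style_stats("Die Untersuchungskommission veröffentlichte umfangreiche Berichte.")
```

Run over whole corpora rather than single sentences, such distributions are exactly the kind of statistic that lets one author’s style be set apart from his contemporaries’.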

Isn’t historical, hermeneutic analysis superior to just counting words?

That is perhaps the case at the moment, but it’s changing bit by bit as we make new findings. It’s becoming increasingly clear that the text patterns we’re recognising allow us to read the development of the human ability to tell stories. This allows us to ask new questions, such as questions about the differences between European and Asian narrative styles.

How do you proceed when you want to find out what people in Europe read in the early 19th century?

We evaluate various sources and metadata, such as aggregated library catalogues. This allows us to deduce which books were printed, bought and borrowed, and thus probably read too. Or we use Google’s Ngram Viewer, which allows us to search through almost five million books in different languages.

But the Ngram analysis is actually selective. And we don’t know what criteria Google applies when deciding whether to scan a given book.

That’s true. The corpora with which we work have often not been created in a systematic, statistically balanced manner. Google Books is especially problematic because Google has gone through whole libraries from start to finish, digitising them indiscriminately. Although that doesn’t constitute a recognisable corpus, the company is using it to define our cultural heritage. This is why universities and libraries have to stand up and insist that our cultural heritage not be privatised. We have to be able to analyse it according to the criteria of source criticism. But in the humanities, we aren’t active enough when it comes to these matters of cultural politics.

Many adherents of the digital humanities talk of a ‘revolution’. Do you?

Digitisation is a revolution; the digital humanities aren’t. The humanities are being transformed just like chemistry, physics, medicine and biology were transformed when they integrated computer-based methods. Biology has changed radically since people began using computers, without the discipline itself ceasing to exist. The same will happen in the humanities. Archaeology and linguistics have already taken this step. The digital humanities are a kind of revolution of feeling. Something new is coming – numbers and statistics in particular – and in many subjects, people don’t yet know how to deal with them.

You mention the natural sciences as an example, because they’ve been doing quantitative work for a long time. Are the humanities now copying them so that they can get more research funding?

You have to differentiate between two conflicting interests. On the one hand, funding policies follow trends, favouring what promises to be the next new thing. The digital humanities currently seem to constitute a highly promising field, so money is being invested in them. But on the other hand, it’s difficult to find a job in this field within the established faculties. They’re often hesitant and prefer to appoint someone who corresponds to the traditional self-created image of the humanities. You could say that the humanities are modernising themselves with the handbrake on. It’s different in libraries and in editing and publishing.

Do you feel part of an avant-garde?

No, although along with others I’m being pigeon-holed in such a role. This is despite the fact that I’m actually a rather conservative literary historian. Of course, almost everyone in this new field knows each other, and that gives you the feeling of belonging to a kind of group. I also have contact with the Swiss centres in Lausanne, Basel and Bern. Because of their methods, the digital humanities are more collaborative in spirit than is the norm in the humanities.

Do the digital humanities think enough about how, for example, digital methods can alter the status of a text – that the text itself can seem to ‘slip through your fingers’ in the process?

The deficiencies here are due to the basic questions and the practical work often not being brought together, despite the whole debate about algorithmic criticism and the text as a digital object. But you mustn’t forget that we’re only at the beginning of things. Often, we stand on the sidelines and haven’t even got recourse to any kind of secure methodology.

So the humanities overall are ‘on the sidelines’?

If they carry on as they have up to now, then they won’t have a brilliant future before them. They’re in a difficult position. In Anglo-American countries, public funding for them has all but dried up, and they have to be financed almost completely through tuition fees.

Do you see the digital humanities as a safety net?

No, they’re not that. But in a best-case scenario they’re part of the solution. The big question is what the humanities will teach in future. This has to be answered independently of digital methods, but at the same time bearing in mind the dramatic changes that digital modernisation will demand of us.

Urs Hafner is a historian and a science journalist.