[First published in Computers and the Humanities 34 (2000): 265-78. References updated July 2004.]
Dr Michael Fraser
Humanities Computing Unit, OUCS,
13 Banbury Road
Oxford OX2 6NN
Keywords: higher education, research, undergraduate, computer-aided learning, resource discovery, textual studies
This paper discusses selected aspects of the work of the CTI Centre for Textual Studies, a Centre which has its roots in a 1984 initiative and ceased to operate in 1999. The work of the Centre was grounded in humanities computing, a subject area which has itself developed over time. The article compares earlier observations made by Joseph Raben and Susan Hockey about the integration of resources within humanities teaching and learning with current realities. Its focus is the development of access to distributed resources, beginning with an interface between the early PC and the mainframe and ending with a vision of a humanities portal to distributed resources.
This paper addresses the integration of computing technologies into the teaching of text-centred humanities disciplines and draws on the lengthy experience of the Computers in Teaching Initiative (CTI) Centre for Textual Studies, a nationally funded Centre whose funding ceased at the end of 1999. Two early references to the use of computing in the humanities are worthy of discussion as a prelude to looking at the current integration of computing in the humanities. The first is an article by Joseph Raben written in 1973 and the second is the final report of the first phase of the CTI in Oxford, written by Susan Hockey in 1989.
Over twenty-five years ago the first issue of the Bulletin of the Association for Literary and Linguistic Computing was published. Joseph Raben’s opening article was entitled ‘The humanist in the computing lab: thoughts on technology in the study of literature’. The article provides a canny overview of the application of computing technology to the humanities and calls for further development in a number of areas based on what seemed to be most effective at that time. We have come a long way since that article was written, in terms of computing hardware at least (the monitor and keyboard are two obvious examples, as well as surging processing power coupled with plunging bulk). A fair distance has also been travelled as regards the application of the technology within the humanities. The difficulty has lain in cultivating deep and stable developments rather than surfing with each emerging technology. The ever upwards progression of computing hardware and software (and the speed at which one is pressed to adopt them) has often resulted in the decay of unfinished and untested techniques and resources.
What Raben termed the ‘age of analysis’ has evolved into the age of information. However, computing in the textual subjects continues to find itself concerned with details. Computing applications have a tendency to demand radical approaches, returning the textual scholar to the fundamentals as well as the seemingly trivial aspects of text. It might be surprising that we still grapple with transliterated texts, encoding systems and incompatible text processing systems. On the other hand, the attention paid to fundamental aspects of the text, usually of no concern to the reader of the printed text, has developed almost into an area of study in itself. Nevertheless, whilst we may have beautifully encoded texts on the one hand, a frequent complaint to the CTI Centre was the lack of accessible software which would allow the user to display, navigate, and perform text analysis activities. There remains a gap between the text prepared in a popular word-processor and the text prepared for indexing - although it is now less of an abyss and more of a crevice. Other technologies, as they are applied to the humanities, encourage scholars to examine the processes which underlie humanities scholarship, particularly where the technology relates to the communication of ideas. The radical tendencies offered by new technologies give humanities computing a similar potential for the humanities as the application of historiography has for history, and critical theory for literary studies.
In the humanities the greatest impact of new technologies has been on scholarly research rather than teaching, and Raben’s article reflects that trend as it was then. He does, however, make a significant prediction concerning the use of computers to assist teaching. If, as he says, humanists do not seize the opportunities offered by the computer for teaching purposes then there is a risk that computer-aided learning will dwell amid the realms of drill and practice, the teaching of undisputed facts. The humanities deal in few undisputed facts, and Raben poses this challenge to humanities scholars:
Humanists trained to approach subjects which have no readily visible hierarchical structure may already have mastered the philosophy of multi-branched searching techniques that will bring computer assisted instruction out of the drill and practice phase into the broad realm of true learning - that is, self-teaching. (Raben, 1973: 8)
Significantly, Raben recognised that the greatest obstacle to the development and use of computing technology in the humanities is the ignorance of its methods by the funders and promoters: “cultural lag is nowhere more prominent than in the promotion committees, deans, presidents, and trustees. In their eyes the preparation of a text seems like secretarial work, but the publication of a book comes within the definition of scholarship” (p. 5). Although national funding agencies for research now not only recognise but encourage the use of technology, there is still an inability amongst many to recognise in practical terms what is involved in the development of digital resources for teaching and learning - unfortunately, this has been known to include assessment through peer-review (see Solopova, 1999) - or indeed adequately to fund their development in the first place.
The computer has been integrated within the humanist’s office, a development from Raben’s humanist installed in the computer lab (unless that lab also happens to be in the humanities faculty). The computer is present on the desktops of academics, librarians and administrators alike, and computing software is essential for the administrative life of the department. Everything from budget control and admissions to lecture notes and other forms of writing is undertaken on computer. In this, the computer has become indispensable. It is gradually becoming indispensable for research, and here the greatest benefit has been access to and dissemination of knowledge from the desktop. Bibliographic catalogues, journals in electronic form, personal email, word processing, placing lecture notes on the web, text databases - taken individually, these express convenience more than they represent dependency. However, taken together, and with the addition of many other applications, they amount to one environment in which to undertake learning (for both staff and students).
As others have noted, it can be rather difficult to find solid evidence of the normal use of computing within humanities research. The journals dedicated to humanities computing (such as ‘Computers and the Humanities’ or ‘Literary and Linguistic Computing’) do not report ‘normal’ humanities research, and other peer-reviewed journals in the humanities carry few articles in which the authors discuss the dependency of their research on digital technology (see Warwick, 1999). However, it is entirely possible that it does not occur to most scholars to document their specific use of, for example, databases for bibliographic research or digital resources which are treated as surrogates for their printed counterparts. Indeed, if personal observation or anecdote is any authority, where a printed equivalent of a digital resource exists (for example, the works contained within the Thesaurus Linguae Graecae database), then for the purposes of citation it is preferred to grappling with discrete parts of an extensive database, even when use of the digital resource has enabled research simply impossible to undertake with printed sources alone.
The level of dependency on computing technology, visible or invisible, is not demonstrated only when the technology fails to work. Evidence of dependency lies rather in the recognition that the humanities requires not only more technology but also, crucially, support staff who combine an understanding of the technology they apply, and about which they advise, with a deep understanding of the subject matter.
For over ten years the CTI Centre for Textual Studies has combined subject knowledge and a realistic enthusiasm for computing technology towards advising and supporting humanities subjects within UK universities. The Centre evolved from pioneering work in ‘arts computing’ undertaken by Susan Hockey at Oxford University, in a middle period between Raben’s reflections and the present time. The CTI has gone through a number of different phases. The first phase, beginning in 1984, saw the funding of a large number, and wide range, of development projects connected with the use of computers in higher education teaching. Oxford received funding in 1986 to develop a system which provided students (and staff) with a means to run Oxford Concordance Program commands via a PC interface, known as the Oxford Text Searching System. The project was a collaboration between Arts Computing staff and tutors in humanities faculties, particularly Literae Humaniores and Modern Languages where a number of courses revolved around the study of set texts, many of which were already available in machine-readable form via the Oxford Text Archive. Like the majority of CTI projects in this initial phase, the ‘Languages and Literature Project’ was developed as an institutional project, tailored to the courses and materials available within Oxford University. Key to its success, and especially its continuation beyond the three years’ funding, was its integration with both existing teaching practice and the computing support available at the University.
In September 1989, Susan Hockey, Director of the project, submitted her final report to the funding bodies with this conclusion:
The Project’s main impact has been to increase awareness of text analysis computing .... However it has also become clear that some staff who are prospective users can lose interest when they find that they need to do some work, either to put in new texts or prepare course material, or even to find funding for new texts. Because computing is not part of the normal syllabus at Oxford, it competes for time and resources with other activities which have a higher priority. To make OTSS become and remain widely used within the University will probably need someone to look after it, to maintain interest and to identify sources for new texts. (§ 14.4)
Of course, the quotation has been torn from its context, but these words might also offer a conclusion to the work of the subsequent CTI Centre. The report detailed the full integration of the Project’s software with courses in Classics, Italian and German studies, and with further courses planned in English and Theology. In terms of integration, with the involvement of academic staff and students, distributed access, dissemination and publicity, one can only conclude that Project 77 was a success. But success depended on a strong support team who were able to identify and then persuade potential enthusiasts within individual departments, and this team was only available for the duration of the project funding. Like many other projects, its continued success depended on institutional change. Computing in the text-based humanities was not a normal part of the syllabus so staff had other priorities. Hardly a surprising conclusion in 1989, when most humanities academics did not have computers on their desks or had to contend with the complexities of a mainframe computer. The same interlinked issues of normality and priority have been at the forefront of CTI Textual Studies’ work ever since.
Ten years later, is the integration of humanities teaching and digital technology now a normal part of the undergraduate syllabus? The humanities has something which on empirical evidence alone might suggest happy integration. The humanities has humanities computing, and it is clear that humanities computing exists as an accepted subject area. There is plenty of evidence in support of humanities computing as a taught discipline, with institutions offering part degrees, modules, and other courses (see McCarty and Kirschenbaum, 1999). However, although humanities computing has a high profile within the community, the institutions actually offering courses which are as well integrated into the undergraduate curriculum as other offerings in the humanities faculty are relatively few in number. Humanities computing does not, for example, appear within UK higher education’s classification of taught subject areas. But, even leaving that aside as simply a UK issue, the very fact that humanities computing is evolving as a definable discipline in its own right, complete with its own teaching and research programmes and its own physical centres, surely argues against the proposition that new technologies are now integrated and a normal part of the humanities syllabus. Humanities computing, because it appeals across the range of arts and humanities subjects, and because undergraduates can, in some institutions, opt for a humanities subject with humanities computing, emphasises the non-humanities aspect of the discipline - the computing. Humanities computing is, in many cases, a type of computing defined by its application to humanities subjects. That is, computing technologies such as hypertext, databases, desktop publishing and multimedia authoring are taught with examples and applications drawn from the humanities.
If humanities computing were instead a type of humanities, defined by its basis in digital technologies, then one would expect it to be a type of computing found only within the humanities, or originally developed from within the humanities and only subsequently reused for other purposes. Does such a type exist? It is pertinent to ask: who would have been interested in concordances and indexes if Fr Busa had not made the connection between Aquinas’ Latin style and the computer’s innate ability to count? And who would have made the connection between an ability to count and the prospect of proving authorship or otherwise if Morton and others had not had the determination to settle the Homeric or the Pauline question once and for all? As the technology itself has become more accessible, so the tendency has grown to emphasise the adaptation and adoption of existing technologies rather than pairing a humanist with a computing scientist and writing a specific program from scratch. Thus, students of humanities computing are rarely recommended to learn a programming language and there is not, as far as I am aware, a Perl for Humanists analogous to Susan Hockey’s Snobol Programming for the Humanities (OUP, 1986).
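The computer’s ‘innate ability to count’, and the concordances it made possible, are easily illustrated. The sketch below is hypothetical and written in present-day Python rather than the SNOBOL or OCP commands of the period; it shows a minimal word-frequency table and keyword-in-context concordance of the kind those early tools mechanised.

```python
# A minimal word-frequency count and keyword-in-context (KWIC) concordance.
# Illustrative sketch only: not a reconstruction of any named tool.
import re
from collections import Counter

def frequencies(text):
    """Word-frequency table: the computer's 'innate ability to count'."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def concordance(text, keyword, width=3):
    """Return each occurrence of keyword with `width` words of context."""
    words = re.findall(r"[a-z]+", text.lower())
    lines = []
    for i, w in enumerate(words):
        if w == keyword:
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            lines.append(f"{left} [{w}] {right}")
    return lines
```

Counting of this sort is trivial to a computer and tedious to a human, which is precisely the asymmetry Busa exploited; authorship studies of the kind Morton pursued begin from the same frequency tables.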
On the other hand, the broad area of computer-assisted language learning (occasionally omitted altogether from typologies of humanities computing) has a long history of developing specific types of software for learning grammar and vocabulary and, in many ways, pioneered interaction between the computer and the user (now extended from the keyboard to the microphone). There are other areas of humanities computing which may have commenced with existing technologies but have modified them enough to make them almost exclusive to the humanities. The Text Encoding Initiative has driven SGML, and soon probably XML, to its limits by beginning with the needs of the humanities, making scholarship drive the computing rather than the other way round. Hypertext (and to a lesser degree text encoding) receives a notable amount of attention from those who see in such digital manifestations an embodiment of the link and the reader’s freedom to choose, supported by literary theories of the interrelationships between author and reader (e.g. Landow, 1997; Sutherland, 1997).
The ubiquity of email and the Web has resulted in a surge of new technologies fit for humanities adoption. As these are communication technologies, it is particularly fitting that humanities scholars have, on the whole, been enthusiastic about their potential. Some of the earliest email discussion lists were in the humanities (including Humanist itself but also forums for Classics, Theology and Medieval Literature), and one of the first professional Web sites released was the Library of Congress’ online exhibition relating to the Dead Sea Scrolls. The Web is a means of disseminating digital material (or disseminating information in digital form about non-digital material). The digitization of humanities materials (namely cultural artefacts) has required the adoption of standards and applications for digitizing text, audio, and video. The subsequent representation of artefacts on the Web or elsewhere has adopted emerging image formats, streaming audio and video, and 3D visualisation. And, of course, it is not only in the representation of a physical object that digital technologies have been adopted but also in the creative process itself. The imagination is projected and recorded as digital pictures, animation, music and text, and as the basis for architecture, sculpture and other physical manifestations. In film and broadcasting the digital is intertwined with celluloid and analogue. The final act of printing a book on to paper might be the only ‘physical’ part of a process which up to that point has been entirely digital. The production of a scholarly collection of essays is now likely to be based on email, word-processors, attachments, and an associated web site, as well as a disk delivered to the publisher backed up by hard copy. It is normal for students to word-process essays - many departments insist upon it - and to attend computer-literacy courses.
Computers have been integrated into the creative process, not only creation by an individual but increasingly the creative acts of a collaborative group.
The act of writing an essay on a computer comes towards the end of a creative process that generally begins with activities such as listening, reading, conversing and linking, which are later formed into (hopefully) a coherent whole. These activities are less well integrated with the digital medium; not necessarily because they are ill-suited to the medium, but because developing an environment in which digital technologies reproduce and enhance these activities is itself a costly and ill-tested undertaking. The lecture still proves to be an effective means of communicating ideas to a large number of students. The tutorial is still considered an ideal, if expensive, means of encouraging dialogue and understanding. Seminar groups fall between the two, but are ever growing in size, so that thirty students in a seminar is not unusual (Condron, 2000). A typical course might be expected to comprise a range of these communication types. In general, computers have been employed to assist rather than replace these activities. The Web provides a means of presenting timetables, lecture notes, reading lists, and other material which would otherwise have been delivered on paper. Paper is only postponed. The increase in electronic document delivery systems, whether local (e.g. the electronic reserve collection) or global (e.g. full-text journals), is comparable to the electronic reading list or lecture notes. The computer has been used extensively in the creative process but ultimately the results will be printed to paper. The priority here is access. One copy on the web, accessible by one hundred students or two thousand others, is better than one copy confined to the library and reserved for fifteen-minute slots, or one hundred photocopies, or no copies at all.
Access to primary source materials ranks high as an opportunity afforded by digital technologies to the humanities, especially the text-based humanities which has a growing body of electronic texts and digital facsimiles. Subjects such as Cultural and Media studies, or Modern History can draw upon newspaper archives, the raw text from press associations, and WordPerfect files of US government legislation, for example.
Despite the debates about quality and the complex copyright inhibitors, there is now a diverse range of audio and video material available online (Hunter, 1999). Current affairs material (e.g. news archives and commentary) abounds via the BBC, CNN, and others. But there is also an increasing amount of archive footage, clips of which might conceivably be integrated into a film or television studies course or essay (e.g. Richard Burton’s Hamlet at http://www.aentv.com/home/hamlet/mainstage_body1.htm or XOOM’s Classic Movies at http://xoom.com/cobrand/classicmovies/classicmovies). In a similar manner to the independent music industry, the Web is quickly becoming a source of new films, especially shorts and animations, which are not easily available in any other format. StudentFilms.com (http://www.studentfilms.com/), as the name suggests, provides an outlet for the results of film production courses (and the chance for students to receive critical reviews from the general public). One of the best examples of archive radio on the Web is the Niuean Centre for American Pop Culture’s digitised productions of the Mercury Theater (founded by Orson Welles and John Houseman), including the infamous War of the Worlds and also Les Miserables (http://www.unknown.nu/mercury/).
In the world of textual computing, however, the Perseus Project (http://www.perseus.tufts.edu/) remains an exemplary means of providing access to primary resources with its integrated environment of Greek texts, texts in translation, lexical tools, image databases, and a means by which users can link from their own web pages to precise points within the immense database. The Perseus Project is on a grand scale, but there are also notable attempts by individuals to provide access to primary sources. Medieval History and Literature is especially well served by the ORB Project (Online Reference Book for Medieval Studies - http://orb.rhodes.edu/), which has developed a significant amount of original material, particularly essays on specific subjects, as well as providing access to other online materials. Integrated into the ORB Project is Paul Halsall’s Internet Medieval Sourcebook, an ambitious attempt to collect together (and if necessary digitize) public domain texts for Medieval studies (http://www.fordham.edu/halsall/sbook.html). EuroDocs, developed by Richard Hacken at the library of Brigham Young University, provides access to translations, transcriptions and facsimiles of historical documents relating to Western Europe (http://library.byu.edu/~rdh/eurodocs/). And, of course, the Oxford Text Archive’s founding purpose was the preservation of machine-readable texts created by contemporaries of Busa and Morton for use by future scholars. The earliest deposited electronic text in the Archive has undergone at least four migrations across different systems, been described with a TEI Header, and been converted from one encoding format to another at least once. And the process of migration continues even if no one has requested that text since it was deposited by its creator. There always remains the possibility, as formats come and go, that someone will (see further Morrison, 2000).
The Web has long passed the point at which the cataloguing of its objects began. And whilst search engines suffice where the object of desire is precisely known, they do not offer the human intervention inherent in a subject gateway. The creation of subject gateways is a particularly popular activity for individual scholars, an activity they share (though rarely collaborate on) with librarians. The two groups tend to approach the development of subject gateways from different angles. The scholar is used to issuing reading lists which point students towards books and articles relevant (and often essential) to understanding a given subject. It is a natural development, where potentially useful resources might be online, to create an online reading list as part of a course Web page that also notes and points to Web-based resources. Such course-based gateways have been known to develop into larger subject-based gateways which have credibility beyond the boundaries of the developer’s department (e.g. Mark Goodacre’s New Testament Gateway at http://www.bham.ac.uk/theology/goodacre/links.htm). That credibility is based on the developer’s knowledge of the subject. However, if gateways are to be useful for discovering resources as well as evaluating them, as they grow in size they require some cataloguing mechanism. Given that many scholarly gateways are delivered as basic HTML files with little rigorous cataloguing data, we might term them ‘amateur gateways’ - not a derogatory term, but one which reflects that the scholar’s expertise generally lies with the subject rather than the gateway, and that such gateways receive little or no funding or formal institutional support.
The library community developed the means to exclude ‘trash’ centuries ago, through collection development policies of one form or another. The online library catalogue and the library collection development policy offer a framework for the selection and description of internet resources. Many of the subject gateways which receive formal support from funding bodies or institutions have been created within a library or cataloguing community, as an extension of the traditional library service (making use of, for example, existing classification systems and library protocols). The collection policy contains within it the quality control mechanism and the catalogue exists to facilitate speedy access to the library’s holdings. The cataloguing data is descriptive and factual. Such gateways, which receive formal support and adhere to library-based policies and standards, might be termed ‘professional gateways’.
Selection criteria are an important element of any library collection development policy. One criterion is usually the need for subject expertise in determining a resource’s suitability for inclusion. Such expertise might vary from acting on the recommendations of teaching and research staff through to consulting published reviews. Whilst academic book reviews abound, Web resources have few reviews in either printed or online publications. On the other hand, an increasing number of proposals for funding teaching and research projects aim to develop online resources. Whilst the proposals are frequently peer-reviewed in order to determine whether they are fit for funding, it is less common for a digital resource to be reviewed post-publication. Part of the problem lies with the publishers of reviews, namely the scholarly journals, many of which rarely accept digital resources for review and especially not if they are Web sites (see Scott, 1999). A merging of the amateur and professional gateways might result in some concept of post-publication peer-review for online resources.
An academic subject gateway requires a set of criteria against which any resource is measured at the moment of inclusion, together with the structured, descriptive metadata which one finds in the professional gateway. In addition, a scholarly gateway might go beyond a catalogue and include the explicit evaluation of resources, as often found within the annotated links of the amateur gateway as well as throughout other forms of scholarly publication. It might seem strange to suggest attaching reviews to records of Web resources when users can so easily discover the content for themselves. Perhaps an apt analogy is the second-hand book into whose front cover a previous owner has pasted a review of that book. Reviews provide a unique record of an individual’s interpretation of a work and something of its fitness for a particular purpose, whether teaching, research, or general reading. As such, their intention, especially when attached to the resource itself, is not only to influence a potential reader’s choice but also to supply a particular way in which the resource might be viewed, noting its original contributions and its omissions. Reviews, like evaluative annotations on reading lists, are a valuable source of scholarly interpretation for students, against which they might compare their own experience.
The gateway brings together information about distributed resources which otherwise would be time-consuming to locate. In the early (short) history of the Web, the first resources, for the most part, were placed there by enthusiasts. The range of resources currently available includes most subscription-based bibliographic services, image and text databases, multimedia encyclopaedias, news agencies and law reports, government documentation and museum collections. The web is a distributed environment with the potential to link together any two or more remote objects within a single Web page. A natural progression from the gateway, giving information about resources, is the portal which pulls in and combines content from multiple sources dynamically and transparently to the end-user. This concept is already present within the large Internet Service Provider portals which combine data from news agencies, weather stations, offer searches across distributed online stores, and blend the different streams into a seamless whole.
The Humbul Humanities Hub, a successor service to the CTI Centre, is re-developing the long-standing Humbul Gateway. One component will be an enhanced subject gateway offering descriptions of humanities Web resources based on the Dublin Core metadata set. A major component, however, will be development of a humanities portal which will clump together a range of databases frequently used (and occasionally under-used) by humanities researchers and students for the purposes of searching, retrieving, and exporting of results. A further development of this concept will enable the Hub’s users, especially librarians and subject experts who are undertaking some form of local subject gateway activity, to dynamically retrieve previously selected records from the Hub and have those records automatically included as part of a local Web page each time that page is served. The subject specialist continues to evaluate and select sites relevant for teaching or research; the Hub undertakes to maintain the records. The effort is shared and needless duplication is avoided.
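What a Dublin Core description of a humanities Web resource might look like can be sketched briefly. The sketch below is illustrative only: the field values are drawn from a resource mentioned above, the element names are from the standard Dublin Core set, and the rendering shown is the common convention of embedding DC elements as HTML meta tags in a page; it does not claim to reproduce the Hub’s actual record format.

```python
# An illustrative Dublin Core record for a Web resource, rendered as the
# HTML <meta> tags conventionally used to embed such metadata in a page.
# Field values are for illustration only.

DC_ELEMENTS = ("Title", "Creator", "Subject", "Description",
               "Type", "Identifier", "Language")

record = {
    "Title": "Internet Medieval Sourcebook",
    "Creator": "Halsall, Paul",
    "Subject": "Medieval studies; primary sources",
    "Description": "A collection of public domain texts for Medieval studies.",
    "Type": "Text",
    "Identifier": "http://www.fordham.edu/halsall/sbook.html",
    "Language": "en",
}

def to_meta_tags(record):
    """Render a record as DC.* <meta> tags, one per element, in order."""
    return "\n".join(
        f'<meta name="DC.{element}" content="{record[element]}">'
        for element in DC_ELEMENTS if element in record
    )
```

A local gateway page carrying such tags could, in principle, have its descriptions harvested by a hub rather than maintained as a separate catalogue, which is one way the shared-effort model described above might be realised.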
The present emphasis on distributed and client-server computing lends itself to the humanities which is itself concerned with the construction of knowledge from sources of different types, scattered across different subject areas. The Web has the potential to draw together disparate sources of knowledge on to a single screen, despite those same objects remaining at their source. The culture of the time is to package raw data for different purposes, to re-present it over and over again and, especially on the Web, with each representation different from previous presentations. Database driven, dynamic, user-centred, customisable Web sites offer their audience something verging on multiple performances of the same virtual document, phenomena which one might expect to provide fodder for further developments in the theoretical side of humanities computing.
There is little doubt that in its practice humanities computing has evolved into a definable discipline area. However, as such, humanities computing cannot be identified with the full integration of computing into humanities research and teaching. Humanities computing, as one might expect, works with emerging technologies. On the one hand, the role of humanities computing professionals is to filter and convey new techniques, standards, and applications to colleagues in more traditional humanities disciplines. On the other hand, humanities computing professionals, precisely because they have both humanities and computing specialisms, are in a good position to analyse methods and problems within the humanities, and to locate - and, if necessary, extend - available technologies. In some sense, therefore, humanities computing as a subject, and the researchers and teachers who work within the field, inform the humanities. Gradually, in certain areas, this process appears to be working. For example, the major funder of the arts and humanities in the UK, the Arts and Humanities Research Board (AHRB), is at ease accepting project proposals which contain a high digital content (in the methods or the outputs). In order to keep itself informed concerning the appropriateness of such applications, the AHRB collaborates with the Arts and Humanities Data Service (AHDS), which is itself staffed by a combination of subject and technical experts (see http://www.ahrb.ac.uk/citpol.html for the AHRB/AHDS Draft Joint C&IT Policy). In the sphere of research, at least, funding for so-called ‘pure’ humanities research is becoming integrated with the type of research or output for which applicants may previously have had to seek other sources of funding.
The UK higher education funding bodies have funded a number of C&IT-related initiatives over the past ten years, not least the Computers in Teaching Initiative. It is also worth noting that something close to forty million pounds was invested in the Teaching and Learning Technology Programme (TLTP), a programme aimed at funding the development of courseware and other teaching and learning projects. The Programme did not explicitly favour any particular subject areas (another way of saying that there was no subject-level funding strategy), though it is clear that only a small proportion of the funding went to arts and humanities projects. There were successful consortium projects in Archaeology, History, and Modern Languages, for example, but very little within the text-based humanities disciplines. More recently, there has been a gradual convergence of initiatives and programmes, especially at the subject level. The CTI Centres, at one time the only subject-based initiative, are being replaced by Learning and Teaching Support Network (LTSN) Centres, which will continue with a subject focus (the subjects formerly supported by CTI Textual Studies have mostly been divided up amongst them). The LTSN Centres have a remit not only for supporting and promoting the integration of C&IT but also for non-C&IT areas such as assessment, placements, and transferable skills, as well as subject-specific issues. The whole programme is being co-ordinated by the new Institute for Learning and Teaching, a professional association for all teaching and support staff in higher education. The LTSN Centres generally also have the involvement of subject associations. The convergence currently being encouraged by the funding bodies at the subject level is to the advantage of the arts and humanities.
Collaboration between the LTSN, the Humanities and Creative Arts Hubs (part of another subject-based service, the Resource Discovery Network), the Arts and Humanities Data Service, and the Arts and Humanities Research Board has the potential to serve the whole academic, as both researcher and teacher.
Convergence of national programmes and services at the subject level will be less than effective if a similar convergence does not take place on the ground, in the working life of departments. The CTI always prided itself on supporting so-called grass-roots academics rather than climbing into the trees of senior management. Without deep changes at the institutional level, however, there was always a risk that supporting individual academics would not bring about the wider changes in perception, infrastructure, and processes which were required if the work of the CTI was to outlast itself. The arts and humanities have benefited from one or two activities which have attempted to determine the needs of humanities scholars at different levels. The AHDS and CTI Textual Studies undertook a joint study in 1998 which outlined the opportunities offered by digital resources and the barriers to their further development and use. In a paragraph which has echoes of Hockey’s earlier observations, the report noted:
Workshop participants reported that they felt guilty about taking time to search for, experiment with, or create digital resources unless as part of formal research or teaching programmes which produced measurable outputs. Yet as one participant put it: ‘we need time to play with resources, to get to know them and see what they can do’. It is clear that experimentation is essential if scholars are to make fully informed decisions about how and to what extent they can effectively integrate IT into their research and teaching. Some mechanism for encouraging appropriate experimentation is accordingly as important as full professional recognition of computer-based research and teaching. On this point too, workshop participants spoke with one voice. They were particularly concerned that measures of scholarly performance (e.g. the Research Assessment Exercise, the Teaching Quality Assessment, and institutional pay and salary awards) took inadequate account of computer-based research and teaching and, as such, acted as a disincentive for scholarly exploitation of IT. (Greenstein and Porter, 1998, chapter 3, §4.5)
The report made a number of recommendations throughout. Relating to the section quoted above, it recommended that professional recognition be accorded to computer-based research and teaching; that there be agreed mechanisms for evaluating computer-based research and teaching; and that there be better collaboration with professional bodies. The partnership between the AHRB and the AHDS will greatly assist the evaluation of computer-based research, and the partnership between the ILT and the LTSN Centres will do likewise for computer-based teaching and learning. It is also evident that the funding bodies are taking the opinions of the subject associations and other professional bodies more seriously. There is still some way to go, in the UK at least, especially concerning the funding of an adequate support infrastructure for the arts and humanities within institutions. At the national level, the renewed emphasis on recognising that the home of the scholar lies within the subject community brings together both strategists and practitioners, and promises greater equality in the distribution of central resources across all the disciplines.
There is a tension, however, between encouraging the input of subject-based communities to the development of subject-based services and resources, and the likely effect of the digital upon those defined subject areas. Effective integration of C&IT within the arts and humanities lies in the commixture of what is traditional with what is new at all levels, not only in the classroom or computer lab but also in the committee room or email forum. It is a process of evolution. At first it seems that we are doing little more than replicating old technology, but as use evolves, and as the technology itself becomes ingrained, we look back and realise not only how dependent many of the courses we teach have become on digital resources, but also how the application of communication and information technologies is changing the discipline itself. New technologies, independent of any humanities discipline and intensive in their resource demands, contribute to blurring the boundaries between traditional subject areas. Out of this blurring develop new topics of study, which then evolve, through the normal process of solidifying human communication, into defined subject areas.
Humanities computing might be one such topic which is asserting its identity as a discipline (McCarty, 1999). But humanities computing itself has a tendency to evade definition, given that much of its fodder is the impact of ever-evolving technology on the effects of the imagination. Boundaries between humanities computing and other related subject areas (for example, art and design, publishing, media and cultural studies) are increasingly blurred, a result of the diffusion of digital technologies throughout the public arena. The Web and email might now be considered normal (and for many also a priority), but from these are evolving digital broadcasting, Internet-controlled home appliances, and hybrid portable communication devices. This is human computing, and it may yet prove to have a stronger relationship with the humanities than concordancing and stylistic analysis ever did.
AENTV. Richard Burton’s Hamlet. <http://www.aentv.com/home/hamlet/>, 1999. [no longer available?]
Arts and Humanities Research Board. ‘Draft Joint C&IT Policy of the Arts and Humanities Research Board and the Arts and Humanities Data Service’. <http://www.ahrb.ac.uk/citpol.html>, 1999. [now see http://ahds.ac.uk/ahrb/ahrb-advice.htm]
Condron, F. ‘A survey of small-group teaching in the humanities: the ASTER Project.’ Computers & Texts, 18/19 (forthcoming, 2000). [now see http://users.ox.ac.uk/~ctitext2/publish/comtxt/ct18-19/14condron.pdf]
Crane, G. (Ed.-in-Chief). Perseus Project: An Evolving Digital Library. <http://www.perseus.tufts.edu/>. Tufts University, 1995-.
Goodacre, M. The New Testament Gateway. <http://www.bham.ac.uk/theology/goodacre/links.htm>. University of Birmingham, 1999-. [now see http://www.ntgateway.com/]
Greenstein, D., and Porter, S. ‘Scholars’ information needs in a digital age.’ The New Review of Academic Librarianship, 4 (1998): 147-214. Also at <http://www.ahds.ac.uk/public/uneeds/un0.html>. [no longer online?]
Hacken, R. EuroDocs: Primary Historical Documents From Western Europe. <http://library.byu.edu/~rdh/eurodocs/>, 1996-.
Halsall, P. Internet Medieval Sourcebook. <http://www.fordham.edu/halsall/sbook.html>, 1996-.
Hockey, S. ‘CTI Languages and Literature Project at Oxford University: Final Report’. Unpublished draft. Oxford University Computing Services, September 1989.
Hunter, P. ‘Tiny TV: Streaming Video on the Web’. <http://www.ariadne.ac.uk/issue22/tiny-tv/>. Ariadne, 22 (1999).
Landow, G. Hypertext 2.0. The Convergence of Contemporary Critical Theory and Technology. 2nd ed. Baltimore; London: Johns Hopkins University Press, 1997.
McCarty, W. ‘Humanities computing as interdiscipline. Is Humanities Computing an Academic Discipline?’. <http://ilex.cc.kcl.ac.uk/wlm/essays/inter/>. IATH, University of Virginia, 5 November 1999. [see http://www.iath.virginia.edu/hcs/mccarty.html]
McCarty, W. and Kirschenbaum, M. ‘Humanities computing units and institutional resources.’ <http://ilex.cc.kcl.ac.uk/wlm/hc/> London: KCL, 1999. [Now see http://www.allc.org/imhc/]
Morrison, A., Popham, M. and Wikander, K. Creating and Documenting Electronic Texts: A Guide to Good Practice. AHDS Guides to Good Practice. <http://www.hcu.ox.ac.uk/ota/public/publications/ahds/C&Det/>. Oxford: Oxford Text Archive, 2000. [See http://ota.ahds.ac.uk/documents/creating/]
Niuean Centre for American Pop Culture. The Mercury Theatre on the Air. <http://www.unknown.nu/mercury/>, 1999.
Raben, J. ‘The humanist in the computer lab: thoughts on technology in the study of literature.’ Bulletin of the Association for Literary and Linguistic Computing, 1:1 (1973): 3-9.
Schriber, C. (ed.). ORB: The Online Reference Book for Medieval Studies. <http://orb.rhodes.edu/>, 1999-. [Now see http://www.the-orb.net/]
Scott, B. ‘Reviewing Reviews of Electronic Products: how reliable are they?’. Digital Resources for the Humanities Conference. King’s College, London, September 1999.
Solopova, E. ‘Fit for Purpose: Issues Surrounding the Use of Digital Resources in Research and Teaching’. Joint ACH-ALLC Conference, University of Virginia, June 1999. [See http://www.iath.virginia.edu/ach-allc.99/proceedings/fraser.html]
Sutherland, K. (ed.). Electronic Text: Investigations in Method and Theory. Oxford: Clarendon, 1997.
Warwick, C. ‘English literature, electronic text and computer analysis: An impossible combination?’. Joint ACH-ALLC Conference, University of Virginia, June 1999. [See http://www.iath.virginia.edu/ach-allc.99/proceedings/warwick.html]
Wright, C. (ed.). Studentfilms.com. <http://www.studentfilms.com/>, 2000.
XOOM.COM. XOOM’s Classic Movies Community. <http://www.xoom.com/cobrand/classicmovies/classicmovies>, 1997-. [No longer available?]