I currently work as Head of Film Access at the Bundesarchiv in Berlin. Between 2016 and 2018 I was the administrative head and a researcher at the Brandenburg Center for Media Studies in Potsdam. From 2010 to September 2016 I worked as a researcher, curator and archivist at the Austrian Film Museum in Vienna. My main areas of expertise include database development and metadata structures as well as the publication of archival films on DVD and online (e.g. Kinonedelja – Online Edition). I obtained my PhD in Russian Studies and a Masters in Comparative Literature from the Universities of Innsbruck and Vienna. In 2016 I also completed a degree in Library and Information Science at the Humboldt University of Berlin. I am the author of the book Kollision der Kader. Dziga Vertovs Filme, die Visualisierung ihrer Strukturen und die Digital Humanities (2016) and have published on Russian cinema, archival collections and the visualization of filmic structures.
Melissa Terras is Director of the UCL Centre for Digital Humanities, Professor of Digital Humanities in UCL’s Department of Information Studies, and Vice Dean of Research in UCL’s Faculty of Arts and Humanities. With a background in Classical Art History, English Literature, and Computing Science, her doctorate (Engineering, University of Oxford) examined how to use advanced information engineering technologies to interpret and read Roman texts. Her publications include “Image to Interpretation: Intelligent Systems to Aid Historians in the Reading of the Vindolanda Texts” (2006, Oxford University Press) and “Digital Images for the Information Professional” (2008, Ashgate), and she has co-edited various volumes such as “Digital Humanities in Practice” (Facet 2012) and “Defining Digital Humanities: A Reader” (Ashgate 2013). She currently serves on the Board of Curators of the University of Oxford Libraries and the Board of the National Library of Scotland, and is a Fellow of the Chartered Institute of Library and Information Professionals and a Fellow of the British Computer Society. Her research focuses on the use of computational techniques to enable research in the arts and humanities that would otherwise be impossible. You can generally find her on Twitter @melissaterras.
Guest presentation in the Projects in Rare Book Digitization course (Pratt Institute, LIS 666) on analyzing digitized books and printed materials with digital humanities methods (primarily text analysis).
Increasing numbers of primary and secondary source texts have been digitized in recent years. Scholars who want to study these new collections in depth need computational assistance because of their large scale. The non-programmer tools for text analysis currently available operate at the word level; they offer tables of counts and lists of occurrences, but rarely interactive visualizations. We propose to build a text analysis tool that includes visualizations and works on the grammatical structure and stylistic features of text, applying highly accurate technology from computational linguistics and authorship identification to extract this information. We will develop our tool for a collection of slave narratives whose authorship is ambiguous. In doing so, we will find out whether visualizations of grammatical and stylistic features are useful to literary scholars, and whether this information allows them to make satisfying large-scale analyses of their texts.
Early American Literature, Nineteenth-Century American Literature, Digital Humanities, Book History, Textual Scholarship, Bibliography, Distant Reading, Text Mining.
This syllabus was my second version of a course aimed at introducing the digital humanities at an undergraduate level. The course was organized around four projects: mapping a novel; text analysis with archival sources; reading a novel collaboratively with courses at other colleges and universities while building a multimedia response; and text extraction and analysis from a large, in-copyright corpus.
Dr. Matthew Lincoln is the Digital Humanities Developer at dSHARP, the digital scholarship center at Carnegie Mellon University, where he focuses on computational and data-driven approaches to the study of history and culture. His current book project with Getty Publications, co-authored with Dr. Sandra van Ginhoven, uses data-driven modeling, network analysis, and textual analysis to mine the Getty Provenance Index Databases for insights into the history of collecting and the art market. He earned his PhD in Art History at the University of Maryland, College Park, and has held positions at the Getty Research Institute and the National Gallery of Art. He serves on the editorial board of The Programming Historian, to which he also contributes. He has previously worked as a curatorial fellow with the National Gallery of Art in Washington, DC, and as a graduate assistant in the Michelle Smith Collaboratory for Visual Culture in the University of Maryland’s Department of Art History and Archaeology. He has received Kress and Getty Foundation grants for their summer institutes in digital art history, and served on the steering committee for the Kress- and Getty-funded symposium Art History in Digital Dimensions at the University of Maryland in October 2016. He is a member of the College Art Association’s Student and Emerging Professionals Committee. In addition to conference papers at ADHO’s annual meeting, the College Art Association, and the Renaissance Society of America, his work has appeared in the International Journal for Digital Art History, British Art Studies, and Perspective: Actualité en histoire de l’art.
I spend most of my time thinking about and working on digital technology and its power to inform, educate and entertain. Like any tool, digital technology can be progressive and creative, advancing and improving our lives in many ways; but it can also be disruptive and even dangerous depending on how it is used. These effects can be intentional or unintended. My aim is to understand how best to use technology to engage and empower as many people as possible, whilst preventing or mitigating the information disorders that degrade digital environments. My scholarship spans history, cultural and media studies, information science, social science and computing. My research interests centre on the evolution of documentary and communication media, the adoption of technology and the associated socio-cultural shifts. My research has explored different advances in digital media: the web and digital publishing, digital television and narrowcasting, and the growing use of data sensors to quantify and analyse environments and behaviours. Working as a business analyst, I’ve applied a wide range of methods and techniques from both my research training and professional certifications to design and develop various systems and services. I have a growing interest in behaviour-driven design, data ethics and accessibility.
This essay describes the popular Bechdel Test—a measure of women’s dialogue in films—in terms of social network analysis within fictional narrative. It argues that this form of vernacular criticism arrives at a productive convergence with contemporary academic critical methodologies: surface and postcritical reading practices, on the one hand, and digital humanities, on the other. The data-oriented character of the Bechdel Test, which a text rigidly passes or fails, stands in sharp contrast to identification- or recognition-based evaluations of a text’s feminist orientation, particularly because the former does not prescribe the content, but merely the social form, of women’s agency. This essay connects the Bechdel Test and a lineage of feminist and early queer theory to current work on social network analysis within literary texts, and it argues that the Bechdel Test offers the beginnings of a measured approach to understanding agency within actor networks.