Teaching the teachers

During the winter semester of 2018, our Doctoral Training Unit on Digital History and Hermeneutics decided to introduce digital history to the first-year master's students. We divided the lessons among pairs or groups of three PhD researchers, coordinated by postdoc researcher Tim Van Der Heijden.

Preparing classes

Armed with the structure and experience of the previous teacher, fellow PhD researcher Max Kemman, we discussed the general content as a team and each took on two or three lessons to share the workload. For example, I took on the theoretical lesson on Publishing for the Web and two practical sessions focussed on the When?-question: Timelines and Databases, and Queries and Data Visualisation. To create a uniform structure for the students even with twelve different teachers, I first transformed the Hillary Clinton emails from a spreadsheet into a relational database for the practical sessions, together with my colleague Shohreh Haddaddan.[1]
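Conceptually, the spreadsheet-to-database step means splitting one flat table into related tables. The sketch below illustrates the idea with invented column names and rows; the actual pipeline we used is linked in footnote 1 and differs in detail.

```python
# Hypothetical, minimal normalisation sketch: turn a flat spreadsheet
# export into two related tables (persons, emails). All data invented.
rows = [
    {"sender": "H", "subject": "Schedule", "date": "2009-06-17"},
    {"sender": "Abedin, Huma", "subject": "Re: Schedule", "date": "2009-06-17"},
    {"sender": "H", "subject": "Libya", "date": "2011-03-02"},
]

# Build a lookup of unique senders with surrogate keys (a persons table).
persons = {}
for row in rows:
    persons.setdefault(row["sender"], len(persons) + 1)

# Replace each repeated sender string with a foreign key into persons.
emails = [
    {"person_id": persons[row["sender"]],
     "subject": row["subject"],
     "date": row["date"]}
    for row in rows
]
```

Storing each sender once and referencing it by id is what makes the database "relational", and it is the step students later had to reproduce by hand.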

Preparing Writing for the Web with my colleague Marleen de Kramer was relatively straightforward, since we set our goal together and then divided the session into what (to publish on the web) and how (to set up a website). We provided the students with optional readings and a mandatory HTML module on Codecademy so they could learn the basics on their own before class. We prepared group brainstorms to determine the goals and audience of the introdigitalhistory.dhlab.lu website.

On the 5th of October we attended a workshop by Dr. Robert Reuter, organised specifically to prepare us for teaching this course. We were asked to explain what and how we would be teaching, as well as how we would assess the learning and teaching afterwards. Together with Antonio Fiscarelli and Kaarel Sikk we answered these questions for the practical When?-sessions. In general we explained concepts in the format of a traditional lecture. Furthermore, we introduced practical tools, such as the database software MariaDB with the Navicat interface and the data visualisation software Tableau, in tutorials the students could follow at their own pace.

Trying to teach

We structured the Writing for the Web lecture around questions such as: What should we write? Who are we writing for? How do we write for the web? We also briefly introduced the Hillary Clinton emails and showed the website we had created for the course. To determine the goal and audience of the course website, students brainstormed in groups of two and together we created a persona. We tested their understanding of web accessibility in the form of storyboards and wireframes. Students cooperated well in class discussions, but due to time restrictions had some trouble understanding the assignments.

In Timelines and Databases we included historical examples of timeline visualisations and discussed the concept of time in a traditional lecture, based on the article "Is Time Historical?".[2] We wanted students to understand the link between a primary source and data, as well as the principles of a relational database. We therefore prepared a step-by-step tutorial for them to create a database from scratch and manually insert data from the first 10 Hillary Clinton emails. Despite screenshots and a step-by-step explanation, the students still struggled, although part of the struggle came from fear of the unknown. We also discovered some minor mistakes in the Windows tutorial and were delayed by students who didn't or couldn't install the software beforehand.
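The tutorial itself ran on MariaDB through Navicat; as a rough, self-contained illustration of the kind of structure the students built, here is a sqlite3 sketch. Table names, column names and the sample row are invented for the example.

```python
import sqlite3

# Minimal sketch of the relational structure from the tutorial,
# using sqlite3 instead of MariaDB; all names and data are invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""
    CREATE TABLE persons (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    )""")
cur.execute("""
    CREATE TABLE emails (
        id        INTEGER PRIMARY KEY,
        sender_id INTEGER REFERENCES persons(id),
        subject   TEXT,
        sent_date TEXT
    )""")
cur.execute("INSERT INTO persons (id, name) VALUES (1, 'H')")
cur.execute("INSERT INTO emails (id, sender_id, subject, sent_date) "
            "VALUES (1, 1, 'Schedule', '2009-06-17')")
con.commit()

# Joining the two tables recovers the flat spreadsheet view.
row = cur.execute("""
    SELECT p.name, e.subject
    FROM emails e JOIN persons p ON p.id = e.sender_id
""").fetchone()
```

The join at the end is the payoff of the manual data entry: once senders live in their own table, one query reconnects them to their emails.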

For Queries and Data Visualisation we transformed research questions into SQL queries and discussed best practices in data visualisation by criticising bad examples and introducing Edward Tufte's principles. The query walkthrough also recapped the structure of the dataset, and students started to understand the dataset assignment better. We were still short on time in class. Fortunately, the Tableau tutorial was much easier and most of the students finished quickly. At the end of class we interpreted the timeline visualisation they had created together, and gave them a month to repeat the exercise and reflect on their timelines in a blogpost.
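Turning a research question into SQL typically means turning "when?" into a GROUP BY. The sketch below shows the pattern behind a timeline query, again with sqlite3 standing in for MariaDB and with invented dates and subjects.

```python
import sqlite3

# Hedged sketch: a "when did email traffic peak?" research question
# expressed as a GROUP BY over the month. Data and names are invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE emails (sent_date TEXT, subject TEXT)")
cur.executemany("INSERT INTO emails VALUES (?, ?)", [
    ("2011-03-02", "Libya"),
    ("2011-03-19", "Re: Libya"),
    ("2011-04-01", "Schedule"),
])

# Count emails per month; the result feeds a timeline visualisation.
counts = cur.execute("""
    SELECT substr(sent_date, 1, 7) AS month, COUNT(*) AS n
    FROM emails
    GROUP BY month
    ORDER BY month
""").fetchall()
```

A table of (month, count) pairs like this is exactly the shape Tableau needs to draw a timeline.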

Evaluating assignments

The storyboards depicted scenarios of use focussed on web accessibility and tested the students' understanding of the course, as well as their creativity in problem solving. With a few minor exceptions, we were positively surprised by their first assignment. As for the wireframes, we noticed that we should have provided clearer guidelines on web design, but we hope the feedback was useful.

Despite initial protests about the database assignment, nearly all students scored very high, partly because evaluating a database is generally easier and more objective. For next year we do need more time to properly explain relational databases in general, and the context and structure of the Hillary Clinton emails in particular. A separate session would furthermore ease the workload and give the students a little more time to build the database.

The timeline blogposts were challenging to grade, even though we had set out clear criteria from the start. This assignment sparked a serious discussion on bibliographic referencing in blogposts. We generally expected students to reference literature and emails either by including hyperlinks or by adding a bibliography, but we should have specified the format beforehand. Overall, we were impressed by their historical analysis of the events in their timelines of the Hillary Clinton emails.

[1] The procedure and code used to scrape and clean the Clinton emails can be found here: https://github.com/C2DH/A-Republic-of-Emails. We built a relational dataset by reformatting and cleaning the data from the spreadsheets using Python.

[2] Hunt, L. (2007). "Is Time Historical?", in Measuring Time, Making History. Budapest; New York: Central European University Press.

Experimenting with the Phonograph

During our last DTU skills training we experimented with the phonograph as a form of media ethnography. Simulating the learning-by-doing teaching technique that Kirsten Haring introduced, we didn't waste much time reflecting and got hands-on immediately during the second day of the training. The one thing we did reflect on the day before was what we wanted to record. Both Marleen and I were keen on combining our hobbies with our work, so we decided to record live music. I was particularly motivated to try out the violin, because Dr. Stefan Krebs had mentioned that at the time the violin couldn't be recorded with the Edison phonograph because it wasn't loud enough. Ever the historian, I wanted to put that finding to the test (perhaps driven to prove it wrong).

When I got home that evening I got my dusty violin out of its case and started looking through my stack of sheet music for a piece that was easy to play and did not exceed the 2-minute limit of the wax cylinder. Eventually I decided on the Carnival of Venice, because it is such a recognisable tune. I practised for about an hour the night before the second training day and packed my violin and the sheet music before heading to the skills training. Although we were given a copy of the original user manual the night before, I only took a brief look at it. I generally don't bother with user manuals for long unless I have to put together IKEA furniture, and this manual didn't help me much either. That morning, the first thing we did was take a long hard look at the Edison phonograph and figure out what each component did. To understand the recording process, we watched a YouTube instruction video and, after a presentation on video-reflexive ethnography by Professor Jessica Mesman, attempted our first recording.


Instead of sticking to the original schedule and having one group observe the other, we all gathered around for the recording. First I had to tune the violin while Marleen warmed up the flute she had brought. Jessica held the sheet music for me, Kaarel announced the title of the piece, Stefan started the recording and measured the decibels with his phone, and the others either recorded the first try with their phones or observed the procedure. At the end of the first recording I stopped the phonograph in time to leave part of the cylinder for Marleen, and she then recorded the song she knew by heart.

Listening to the recording afterwards, the violin was indeed hard to hear, whereas the sound of the flute was slightly easier to pick up, so we decided to try out a few different techniques. First, we added a piece of cardboard around the recording horn to capture the sound better. That didn't work, because the cardboard absorbed the sound rather than extending the reach of the horn. Furthermore, the piece of cardboard made it harder to stand close enough to the recording horn. Next, we heated the wax cylinder right before our final test, and that improved the recording far more than the cardboard addition had.

In order to listen to the recordings we had to change the horn and the 'reproducer', and both of these elements influenced the quality of playback. When I visited the Heinz-Nixdorf museumsForum in Paderborn a month later, the Dictaphone caught my attention. This device, which otherwise looks very similar to the phonograph, seemed to use headphones instead of a reproducer horn.

Dictaphone on display at the Heinz-Nixdorf museumsForum in Paderborn.

The phonograph was used at home, whereas the Dictaphone was used in the office. While I was looking at the images accompanying the display, two thoughts popped into my head. First, I realised that this constellation of a manager speaking into the Dictaphone and a secretary afterwards transcribing the recording must have inspired Vannevar Bush when he described certain features of the 'memex' in As We May Think. Second, transcribing speech to text – whether by typewriter or modern-day computer – requires absolute concentration and noise-cancelling headphones. The headphones of this Dictaphone must have been inspired by the stethoscope, because the metal part that you would expect to go over the secretary's head hung below her chin. Whether for entertainment or professional purposes, this analogue medium has profoundly influenced our experience of listening to music, to the point where modern songs usually last only between 2 and 4 minutes, the maximum length of a recording on a wax cylinder.

Mapping Leuven in 1649 with QGIS

On the 11th and 14th of May, Dr. Catherine Jones introduced us to Geographical Information Systems (GIS) with the help of Kerry Schiel and Kaarel Sikk. On the first day we mostly learned how to work with QGIS, uploading existing maps and exploring the Bombsight maps. We also started from scratch, 'stitching' an old map to OpenStreetMap or Google Maps based on points that we could still recognise and overlap. Depending on how recent and/or accurate the historic map was, we had to choose a different transformation type. After wasting some time locating buttons during the tutorial, because the interface wasn't entirely intuitive and many of the concepts were new, we got the hang of it.

For the assignment, I decided to repeat the process with a historic map of Leuven, where I was born and where I spent six years at the university. Since the resolution of the map I had tried to use during the tutorial was too low, I found a more recent map (from 1649, rather than 1541) from the Atlas van Loon. Before uploading the historic map, I set the projection to ETRS89 / Belgian Lambert 2008. Although Leuven has changed a lot over the last four centuries, I could stitch the map mostly based on the locations of churches still in existence today. I have to admit that being a local really helped me locate some of these rather quickly. Once the eleven points were identified on the OpenStreetMap layer, I used the Thin Plate Spline transformation, which stretches the historic map as if it were made of rubber, because that made most sense for this rather old map.
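To give an idea of what a transformation type does under the hood, here is a sketch of the simplest one QGIS offers, an affine (linear) transformation, fitted exactly from three ground control points. The coordinates below are invented; a Thin Plate Spline, by contrast, bends the map locally around each point instead of applying one global formula.

```python
# Sketch of an affine georeferencing transform fitted from three
# ground control points (GCPs). All coordinates below are invented.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_affine(gcps):
    """Solve X = a*col + b*row + c (and likewise for Y) via Cramer's rule."""
    A = [[c, r, 1.0] for (c, r), _ in gcps]
    d = det3(A)
    coeffs = []
    for axis in (0, 1):  # 0 = easting, 1 = northing
        rhs = [target[axis] for _, target in gcps]
        params = []
        for col in range(3):
            M = [row[:] for row in A]
            for i in range(3):
                M[i][col] = rhs[i]
            params.append(det3(M) / d)
        coeffs.append(params)
    return coeffs

# Pixel (col, row) -> map (easting, northing), e.g. in Lambert metres.
gcps = [((0, 0), (1000.0, 2000.0)),
        ((100, 0), (1200.0, 2000.0)),
        ((0, 100), (1000.0, 1800.0))]
(a, b, c), (d, e, f) = fit_affine(gcps)

def to_map(col, row):
    """Convert a pixel position on the scan to map coordinates."""
    return (a * col + b * row + c, d * col + e * row + f)
```

With more than three points (like my eleven), QGIS fits such transforms by least squares, and the leftover error per point is what the Georeferencer reports as residuals.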

Screenshot: the 1649 map of Leuven georeferenced in QGIS.

At the end of the first day and the beginning of the second day we learned how to insert and adapt data points, such as the locations of bombs that fell on a certain day of the week, or flight paths and areas affected by larger bombs in London. For my own project I decided to locate the original items of the legend, using the number of the location in the original legend as an id and the name of the item as a second field in the data table. At first I tried to find the buildings and locations (mostly churches, colleges and squares) in the order of the original legend, but after a serious struggle to find no. 11 on the list, I decided to work through areas of the map systematically, locating the numbers of the legend first before adding a data point.

Once I had located most of the points, I looked at the data table once more and realised I had made at least two mistakes by identifying the same building twice at different locations. Luckily, the points on the map light up when you select a row in the data table, which made it easy for me to correct the errors. In the end I made a list of all the legend items I hadn't found, and found a list with links to the heritage inventory that was useful in a final attempt to locate them. I managed to find two more buildings, but 11 out of 74 were not to be found (even though for some I knew for sure where they had been).
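The duplicate-building mistake is easy to catch programmatically as well: any legend name appearing more than once in the attribute table is suspect. A small sketch, with a hypothetical excerpt of the table (the real attribute table lives inside QGIS):

```python
from collections import Counter

# Hypothetical excerpt of the legend attribute table: (id, name) pairs.
features = [
    (1, "Sint-Pieterskerk"),
    (2, "Collegium Trilingue"),
    (3, "Sint-Pieterskerk"),  # same building digitised twice by mistake
]

# Flag any legend name that occurs more than once, so the duplicate
# point can be selected in the table and corrected on the map.
duplicates = [name
              for name, n in Counter(name for _, name in features).items()
              if n > 1]
```

Selecting the flagged rows in the data table then highlights the offending points on the map, which is exactly how I corrected my two errors.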

Screenshot: legend items located as data points on the georeferenced map.

I could still add labels and more metadata to the dots, for example the type of building (university or religious) and give each type a different colour. Another option is to outline the original city walls (the 'binnenring' and 'buitenring'), or to identify which areas contained housing, where the gardens were, and which plots were used for agriculture. In any case, I really enjoyed the workshop and learned a lot; only a lack of time kept me from adding extra data layers. The interface has a learning curve, yet can be very useful.