Category Archives: Other

CFP: Digital Humanities and Listening

Hi all! I’ve been eagerly following along with your progress on your projects. I wanted to share a CFP that seemed appropriate for the Civil War Sound group: http://soundstudiesblog.com/cfp-digital-humanities-and-listening-due-41016/.

For this year’s annual “World Listening Month” Forum, we are interested in posts considering the role of “listening” in the digital humanities. How have particular digital studies, projects, apps, and online archives addressed, challenged, expanded, played with, sharpened, questioned, and/or shifted “listening”? What happens to digital humanities when we use “listening” as a keyword rather than (or alongside) “sound”?

GIS workshop

A few weeks ago I was fortunate to attend an all-day GIS workshop offered by Frank Connolly, the Geospatial Data Librarian at Baruch College. It was very thorough, and by the end everyone had finished a simple choropleth map. For those of you interested in continuing with map-related DH who can spare a Friday, I recommend the workshop, which is free and offered several times a semester.

Most professional GIS projects use ArcGIS, made by ESRI, and many institutions subscribe to it to support that work. It’s not cheap. But (yay!) there is an open-source alternative called QGIS, which anyone can download; this is the software we used in the workshop. QGIS is far more versatile than CartoDB, but it also has a complex interface and a steep learning curve.

In the workshop, we covered the pros and cons of various map projections (similar to some of our readings); different types of shapefiles, the vector files that store map layers; GPS coordinates vs. standard latitude/longitude (sometimes they differ); how to georectify old maps so that they line up with modern maps and geocoordinates; open data sources; and how to organize and add information to a QGIS database.
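
The coordinate-systems point stuck with me, so here is a quick illustration (not part of the workshop, which stayed inside QGIS’s interface) of how differently the same spot can be expressed under two coordinate reference systems. It uses the pyproj Python library, which is my own choice for the example, not something we used in class:

# reproject_example.py -- illustrative only; the workshop itself worked through QGIS's GUI.
# Requires the pyproj library (pip install pyproj).
from pyproj import Transformer

# WGS84 latitude/longitude (what a GPS reports) -> New York Long Island State Plane (US feet).
transformer = Transformer.from_crs("EPSG:4326", "EPSG:2263", always_xy=True)

lon, lat = -73.9857, 40.7484  # roughly midtown Manhattan
x, y = transformer.transform(lon, lat)
print(f"{lat}, {lon}  ->  {x:.0f} ft E, {y:.0f} ft N")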

The entire workshop tutorial, which participants took home, is available on the Baruch library website. If you’re comfortable learning complicated software on your own, it’s a great resource. Personally, I would need to spend a lot more time working with QGIS, with someone looking over my shoulder, to get a feel for the program. But practicing with the manual over the winter break will be on my ever-growing “to-do” list.


Data Presentation: Content Analysis and “In the Country”

Officially, my data set project is an attempt at content analysis using a short story collection as my chosen data set. In reality, this was me taking apart a good book so I could fool around with Python and MALLET, both of which I am very new to. In my previous post, I indicated that I was interested in “what the investigation of cultural layers in a novel can reveal about the narrative, or, in the case of my possible data set, In the Country: Stories by Mia Alvar, a shared narrative among a collection of short stories, each dealing specifically with transnational Filipino characters, their unique circumstances, and the historical contexts surrounding these narratives.” I’ve begun to scratch at the surface.

I prepared my data set by downloading the Kindle file onto my machine. This presented my first obstacle: converting the protected Kindle file into something readable. Using Calibre and some tutorials, I managed to remove the DRM and convert the file from Amazon’s .azw format to .txt. I stored this .txt file, along with a .py file I found in a tutorial on performing content analysis with Python, in the same directory, and started by identifying a keyword in context (KWIC). After opening Terminal on my MacBook, I typed the following command:

python kwic1.py itc_book.txt home 3

This reads my book’s text file and prints every instance of the word “home,” along with three words on either side, to the shell. The abbreviated output from the entire book can be seen below:

Alisons-Air:~ Alison$ ls
Applications Directory Library PYScripts Test
Calibre Library Documents Movies Pictures mallet-2.0.8RC2
Desktop Downloads Music Public
Alisons-Air:~ Alison$ cd PYScripts/
Alisons-Air:PYScripts Alison$ ls
In the Country
Alisons-Air:PYScripts Alison$ cd In\ the\ Country/
Alisons-Air:In the Country Alison$ ls
itc_book.txt itc_ch1.txt itc_ch2.txt kwic1.py twtest.py
Alisons-Air:In the Country Alison$ python kwic1.py itc_book.txt home 3
or tuition back [home,] I sent what
my pasalubong, or [homecoming] gifts: handheld digital
hard and missed [home] but didn’t complain,
that I’d come [home.] What did I
by the tidy [home] I kept. “Is
copy each other’s [homework] or make faces
my cheek. “You’re [home,”] she said. “All
Immaculate Conception Funeral [Home,] the mortician curved
and fourth days [home;] one to me.
was stunned. Back [home] in the Philippines
farmer could come [home] every day and
looked around my [home] at the life
them away back [home,] but used up
ever had back [home—and] meeting Minnie felt
shared neither a [hometown] nor a dialect.
sent her wages [home] to a sick
while you bring [home] the bacon.” Ed
bring my work [home.] Ed didn’t mind.
“Make yourself at [home,”] I said. “I’m
when Ed came [home.] By the time
have driven Minnie [home] before, back when
night Ed came [home] angry, having suffered
coffee in the [homes] of foreigners before.
of her employer’s [home] in Riffa. She
fly her body [home] for burial. Eleven
of their employers’ [homes] were dismissed for
contract. Six went [home] to the Philippines.
the people back [home,] but also: what
she herself left [home.] “She loved all
I drove her [home,] and then myself.
we brought boys [home] for the night.
hopefuls felt like [home.] I showed one
She once brought [home] a brown man
time she brought [home] a white man
against me back [home] worked in my
the guests went [home] and the women
I’d been sent [home] with a cancellation
feed,” relatives back [home] in the Philippines
we’d built back [home,] spent our days
keep us at [home.] Other women had
Alisons-Air:In the Country Alison$
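
For anyone curious what a script like this is doing under the hood, here is a minimal sketch of a keyword-in-context function in Python. It is illustrative only; the actual kwic1.py from the tutorial is almost certainly organized differently, but the idea is the same:

# kwic_sketch.py -- a minimal KWIC script (illustrative; not the tutorial's kwic1.py).
# Usage: python kwic_sketch.py itc_book.txt home 3
import sys

def kwic(path, keyword, window):
    """Print every word that starts with `keyword`, with `window` words of context on each side."""
    with open(path, encoding="utf-8") as f:
        words = f.read().split()
    for i, word in enumerate(words):
        # Strip surrounding punctuation so tokens like "home," and "home." still match.
        if word.lower().strip('.,;:!?"\u201c\u201d').startswith(keyword.lower()):
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            print(f"{left} [{word}] {right}")

if __name__ == "__main__":
    kwic(sys.argv[1], sys.argv[2], int(sys.argv[3]))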

I chose the word “home” without much thought, but the output reveals an interesting pattern: back home, come home, bring home. Although this initial analysis is simple and crude, I was excited to see the script work, and the output could suggest that the book’s characters focus on returning to the homeland, or are preoccupied, at least subconsciously, with being at home, memories of home, or matters of the home. In most of In the Country’s chapters, characters are abroad as Overseas Filipino Workers (OFWs). Although home exists elsewhere, identities and communities are created on a transnational scale.
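
One way to quantify that pattern would be to count the words that appear immediately before “home.” The snippet below is a hypothetical follow-up, not something I have run for this post:

# home_bigrams.py -- sketch: count which words immediately precede "home".
from collections import Counter

with open("itc_book.txt", encoding="utf-8") as f:
    words = [w.lower().strip('.,;:!?"\u201c\u201d') for w in f.read().split()]

preceders = Counter(words[i - 1] for i, w in enumerate(words) if w == "home" and i > 0)
print(preceders.most_common(10))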

Following an online MALLET tutorial for topic modeling, I ran MALLET from the command line, preparing my data by importing the same .txt file into a readable .mallet file. Navigating back into the MALLET directory, I typed the following command:

bin/mallet train-topics --input itc_book.mallet

…and received the following abbreviated output:

Last login: Sun Nov 29 22:40:08 on ttys001
Alisons-Air:~ Alison$ cd mallet-2.0.8RC2/
Alisons-Air:mallet-2.0.8RC2 Alison$ bin/mallet train-topics --input itc_book.mallet
Mallet LDA: 10 topics, 4 topic bits, 1111 topic mask
Data loaded.
max tokens: 49172
total tokens: 49172
LL/token: -9.8894
LL/token: -9.74603
LL/token: -9.68895
LL/token: -9.65847
0 0.5 girl room voice hair thought mother’s story shoulder left turn real blood minnie ago annelise sick wondered rose today sit
1 0.5 didn’t people work asked kind woman aroush place hospital world doesn’t friends body american began you’ve hadn’t set front vivi
2 0.5 back mother time house can’t you’re home husband thought we’d table passed billy family hear sat food stop pepe radio
3 0.5 day i’d made called school turned mansour manila don’t child things jackie mouth wasn’t i’ll car air boy watch thinking
4 0.5 hands years water morning mother head girl’s sound doctor felt sabine talk case dinner sleep told trouble books town asleep
5 0.5 he’d life man bed days found inside husband country call skin job reached wrote york past mind philippines chair family
6 0.5 time knew looked it’s she’d girls felt living i’m floor president fingers jim’s john young church jorge boys women nurses
7 0.5 baby hand city jaime door words annelise andoy heard he’s gave put lived that’s make white ligaya held brother end
8 0.5 milagros night face couldn’t year son brought men head money open they’d worked stood laughed met find eat white wrong
9 0.5 jim father home children eyes mrs milagros told long good years left wanted feet delacruz she’s started side girl street
LL/token: -9.62373
LL/token: -9.60831
LL/token: -9.60397
LL/token: -9.60104
LL/token: -9.59628
0 0.5 voice room you’re wife mother’s he’s story wrote closed walls stories america father’s ago line times sick rose thought today
1 0.5 didn’t people asked kind woman place hospital work city body doesn’t started front milagros american you’ve hadn’t held set watched
2 0.5 mother back house school thought can’t days bed minnie parents billy we’d table passed read sat stop high food they’re
3 0.5 day i’d made manila called don’t turned mansour child head hair jackie mouth dark wasn’t car stopped boy watch bedroom
4 0.5 man hands morning water reached doctor real sabine dinner sleep town asleep isn’t told dead letters loved slept press standing
5 0.5 husband he’d life family found inside call country skin live past daughter book mind chair wall heart window shoes true
6 0.5 time it’s knew looked felt she’d living i’m floor close president fingers things young began church boys women thing leave
7 0.5 baby hand jaime annelise door room words andoy hear heard lived put brother make that’s paper ligaya city end world
8 0.5 milagros night face couldn’t white son year brought men work job open stood they’d met money worked laughed find head
9 0.5 jim girl home years father children eyes aroush left good long mrs told she’s wanted girls love gave feet girl’s
LL/token: -9.59296

LL/token: -9.59174

Total time: 6 seconds
Alisons-Air:mallet-2.0.8RC2 Alison$

It doesn’t make much sense yet, but I would consider this a small success, if only because I managed to run MALLET and have it read the file. I would need to work further with my .txt file’s content to get better results. At the very least, this MALLET output could be used to identify possible categories and category members for dictionary-based content analysis.
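
For reference, the whole MALLET pipeline looks roughly like the two commands below. The import flags follow MALLET’s standard documentation and may differ slightly from the tutorial I used, and the extra train-topics options (with placeholder filenames) simply write the topic keys and per-document proportions to files instead of only printing them to the screen:

# convert the plain-text book into MALLET's binary .mallet format
bin/mallet import-file --input itc_book.txt --output itc_book.mallet --keep-sequence --remove-stopwords

# train 10 topics and save the topic keys and document-topic proportions
bin/mallet train-topics --input itc_book.mallet --num-topics 10 --optimize-interval 20 --output-topic-keys itc_keys.txt --output-doc-topics itc_composition.txt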

User Experience and Addiction

I stumbled across this article the other day, which seemed like something that would be of interest to the class. It also brought me back to the User Experience workshop with Samantha Raddatz, which I attended a few weeks ago. In her workshop, she explained several methods for testing programs (focus groups, randomly approaching people in coffee shops, etc.) and described how program/site/app designers often miss the most glaring issues with their own interfaces. She stressed how integral it is to test everything and to be open to the possibility of reorganizing a site’s or app’s information architecture in order to best serve the people using it. A rigorous and diverse testing phase is necessary (although the best results actually come from the first five people who test an app) in order to ascertain that the interface supports user expectations and enables a positive, simple user experience.

The linked article looks at user experience and questions what drives it: not just the experience of using an app or site, but the experience of wanting to check it, of wanting to be constantly connected. Apparently the concept of internet addiction has been discussed by psychologists since the advent of the first mainstream web browser, but as our technology becomes more and more streamlined and streams into the palms of our hands, it has become a serious issue.

Is intuitive information architecture partially to blame for this? Does the ease of use create the sort of dependence we see when, on a Friday night, half the people in a bar are on their phones rather than interacting with the people around them? Do that ease of use and the resulting expectation of constant accessibility cause the frustration or anxiety many people feel when they don’t have service or wifi to quickly check their social media accounts on a smartphone?

Workshop: User Experience with Samantha Raddatz

I had a lot of questions about user experience of digital tools and Samantha Raddatz answered them all during her User Experience workshop. Raddatz is a user experience consultant for the CUNY Academic Commons.

The workshop was divided into two parts: what good user experience is and how to do user testing. Not surprisingly, focus on the user is key to a tool’s success. User experience therefore has to be as seamless as possible; the user should not have to think about how to use the tool. Unintuitive architecture, for example, can be frustrating and reduce use of the app. Since the aim is to maximize the time a user spends on your page or program, the developer needs to spend the necessary time and resources testing throughout the development process. Raddatz laid out the steps clearly on her PowerPoint, which was very helpful for my comprehension and retention. She spoke in an engaging and accessible manner and recommended multiple resources for further assistance. Overall, I found the workshop valuable for my understanding of digital tool development.

The main takeaway for me was that whatever you’re developing, do user testing frequently throughout the process. If you wait until the last minute, it might be too late to rework the architecture. And if you’re working with limited resources, do guerrilla testing in a coffee shop with cookies as treats for volunteers! 🙂

Data Project: Reading Transnationalism and Mapping “In the Country”

Last week, we discussed “thick mapping” in class using the Todd Presner readings from HyperCities: Thick Mapping in the Digital Humanities, segueing briefly into the topic of cultural production and power within transnational and postcolonial studies (Presner 52). I am interested in what the investigation of cultural layers in a novel can reveal about the narrative, or, in the case of my possible data set, In the Country: Stories by Mia Alvar, a shared narrative among a collection of short stories, each dealing specifically with transnational Filipino characters, their unique circumstances, and the historical contexts surrounding these narratives.

In the Country contains stories of Filipinos in the Philippines, the U.S., and the Middle East, some characters traveling across the world and coming back. For many Overseas Filipino Workers (OFWs), the expectation when working abroad is that you will return home permanently upon the end of a work contract or retirement. But the reality is that many Filipinos become citizens of and start families in the countries that they migrate to, sending home remittances or money transfers and only returning to the Philippines when it is affordable. The creation of communities and identities within the vast Filipino diaspora is a historical narrative worth examining and has been a driving force behind my research.

For my data set project, I hope to begin by looking at two or more chapters from In the Country and comparing themes and structures using Python and/or MALLET. The transnational aspect of these short stories, which take place in locations that span the globe, adds another possible layer of spatial analysis that could be explored using a mapping tool such as Neatline. My current task is creating the data set – if I need to convert it, I could possibly use Calibre.

Highly Recommend

On Wednesday, October 27th, I attended The Lexicon of Digital Humanities workshop. It was great. The fellows, Mary-Catherine and Patrick, were professional, helpful, and obviously very knowledgeable in the subject area. What I liked most, and did not expect, was the interactive atmosphere of the workshop. It is easier to focus in a classroom after a long workday if you are an active participant rather than a passive recipient. During The Lexicon of Digital Humanities we were introduced to a number of tools. Unfortunately, there was not enough time to explore them all. Nevertheless, I found it helpful. At the DH seminars we are asked to search for tools and describe them, but it is rather hard to decide what exactly you want to work on after opening DiRT. At the workshop, the fellows showed us what was available and gave us time to look into whatever seemed interesting. That way, I discovered Neatline a few days before opening the homework page :). Now I am considering it as an essential part of my data project.

The Lexicon of Digital Humanities workshop delivered a huge amount of information in a very short time span to a full classroom of participants. I cannot speak for everybody, but it is unlikely that anyone felt left out. The digital fellows get an A+.

DH Grammar

Last night I attended my first workshop this semester – The Lexicon of DH – and I found it extremely helpful. I was expecting something like a PowerPoint survey of terms, tools, and basic categories, with a bunch of tired people in a classroom. Instead we had an interactive and hands-on workshop in a computer lab, with fellows who have a comprehensive understanding of the material and really know how to teach effectively. That the material could comprise a DH “grammar” was a perspective I hadn’t considered before. I would have called it an “arsenal” – tools. But grammar is especially fitting, because grammar structures meaning at its most basic level, and each tool structures meaning and mediates information – in its own way – at a basic level that should be thoroughly understood before it is used.

Actually, the workshop was a PowerPoint survey of terms, tools, and categories. But having very engaged people coach us through an exploration of this material “in situ” – that is, online, where it lives – made it far more accessible. Too often I find I am still stuck in the “stand-alone” mindset when it comes to digital tools. For instance, although I have used a number of the tools we covered, like Zotero, I actually haven’t taken advantage of Zotero’s online functionality very much in terms of capturing bibliographic info and metadata. Sometimes you need someone to show you what is right in front of your face. (I do, anyway.)

Being introduced to so many resources for different types of DH tasks and projects in a single two-hour session was a little frustrating.* And the plethora of possibilities led me yet again to rethink what my data set should include or explore. That said, I’ve already been exploring a number of different tools on my own – I even have an academic subscription to Tableau, a visualization program – yet I have at best a novice’s sense of how to use any of them. So I found that even this short summary of certain tools’ capabilities was helpful in terms of winnowing out what may not be as useful for me right now. It’s easy for me to get distracted by pretty, shiny websites, and it finally dawned on me that perhaps I should not let the tool I like most determine my data set – at least while I have minimal or no experience as a user.

In addition to the material we covered, it was helpful simply to describe my areas of interest in a couple of sentences, look at some examples of projects online, and hear about what other people do or want to do. I was able to step away from the monitor briefly (metaphorically speaking) and affirm that indeed history, texts/material artifacts, and geo-spatial mapping are “my bag,” and that I want to work on something that uses all of these components.+ On the other hand, I still feel the need to connect my rather dusty academic interests (18th C English literature) to contemporary experience and/or socially relevant issues, and this pressure doesn’t help when it comes to figuring out what data sets to find or create and play with.

So, to be continued…

* I know that workshops for specific tools and programs are held, but they are so limited in terms of space and scheduling that what is an extremely necessary academic resource for most new DH students is not as accessible as it should be. This is especially true for those of us who work 9-5 and can’t frequent office hours or workshops held earlier in the day. I really hope that this situation will be remedied. I would also suggest some in-depth workshops that focus on different types of projects – specifically, which tools and programs can facilitate research, enhance content, and improve functionality for various project types.

+ Last semester, I attempted to do something a little like this in Lev Manovich’s visualization class. For our final project, we each had to create a data visualization reflecting our own experience, so history with a big H was not involved. My project comprised a short, reflective essay with sound files and an interactive map in CartoDB. I had hoped to put everything together on one page, but it is either not easy or else impossible to embed a CartoDB map on a WordPress site. As a hybrid visualization exercise, it is fine. But my goal for this class is to develop a project that employs these elements (history, text, which is an elastic term, and mapping) in a more comprehensive, meaningful, and engaging way, and that, most importantly, has both historical and contemporary relevance. If anyone is curious what that looked like, it’s on my long-neglected blog.

You Are Listening To New York: Reflections on Open APIs

The Digital Fellows workshops here at the GC have far exceeded my expectations of what a two-hour seminar tends to be; there is only so much technical material that can be absorbed in such a small window of time. That being said, the real strength of these workshops comes from the capable Digital Fellows leading the discussions and the superb, thorough documentation they provide.

Of the workshops I’ve attended so far (Server Architecture, Introduction to Webscraping, etc.), I’ve found the Lexicon to be the most useful, as it touched, very briefly, on a range of DH tools and approaches. In fact, it was so successful in communicating an overview of the emerging field that it has thrown my dataset/final project planning for a loop (more on that in another blog post).

One fairly important aspect of DH project development that the Lexicon glossed over was open APIs. I wanted to share a project that uses them to wonderful effect. The “You Are Listening To” project uses open APIs to curate an immersive user experience centered on a mashup of ambient music and real-time transmissions of police radio and other airwave communications from cities around the world. Check out this link for You Are Listening to New York.

What I like so much about this site is its simplicity. It’s an elegant digital curation of various streaming media. When you load the page, a JavaScript file pulls in an audio stream from radioreference.com, which provides the police radio feed. It also pulls in a SoundCloud playlist that has been screened by the site’s creator, Eric Eberhardt, to ensure that it only incorporates ambient, dreamy soundscapes that contrast with and complement the police scanner audio. And it loads the page’s background image (of the user’s chosen city) from Flickr’s API. This is all legal, free, and only possible because each of these companies made an effort to provide access to their services through simple web APIs.
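
To give a sense of how little code an open API demands, here is a rough Python sketch of the kind of request the Flickr step involves. It is purely illustrative: the real site does this in client-side JavaScript, the parameters and photo URL pattern are my assumptions based on Flickr’s public REST API, and you would need your own API key:

# flickr_sketch.py -- illustrative only; the actual site handles this in JavaScript.
# Assumes Flickr's public REST API and the requests library; the API key is a placeholder.
import requests

FLICKR_API_KEY = "your-api-key-here"

def city_photo_url(city):
    """Ask Flickr for one photo tagged with the city name and build a displayable URL."""
    resp = requests.get(
        "https://api.flickr.com/services/rest/",
        params={
            "method": "flickr.photos.search",
            "api_key": FLICKR_API_KEY,
            "tags": city,
            "format": "json",
            "nojsoncallback": 1,
            "per_page": 1,
        },
    )
    photo = resp.json()["photos"]["photo"][0]
    # Classic Flickr photo URL pattern (it may have changed since this was written).
    return "https://farm{farm}.staticflickr.com/{server}/{id}_{secret}_b.jpg".format(**photo)

print(city_photo_url("new york"))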

There’s also a ton of additional information in the “i” info dropdown on the website. It looks like the site is accessing Twitter and Reddit feeds, a geotracking tool that provides metrics about and for listeners, some Google reference info, and various news trackers.

Have a look!