Release of Experimental Stimuli and Questions for Evaluating Facial Expressions in Animations of American Sign Language

Introduction

We have developed a collection of stimuli (with accompanying comprehension questions and subjective-evaluation questions) that can be used to evaluate the perception and understanding of facial expressions in ASL animations or videos. The stimuli were designed as part of our laboratory's ongoing research on synthesizing ASL facial expressions such as topic, negation, yes/no questions, WH-questions, and rhetorical (RH) questions.

How to Obtain the Files

Please send email to matt at cs.qc.cuny.edu to inquire about accessing the corpus.

What format of files do we release?

The corpus consists of four types of files for each story that we have recorded.

How many stories and signers are included in this collection?

This collection consists of 48 stimulus passages performed by a male signer. Each stimulus is accompanied by four comprehension questions, and each comprehension question is performed by both a male signer (the same signer who performs the stimulus passage) and a female signer.

Citations and More Information

If you make use of this collection, please cite the following publication:

Matt Huenerfauth, Hernisa Kacorri. 2014. "Release of Experimental Stimuli and Questions for Evaluating Facial Expressions in Animations of American Sign Language." Proceedings of the 6th Workshop on the Representation and Processing of Sign Languages: Beyond the Manual Channel, The 9th International Conference on Language Resources and Evaluation (LREC 2014), Reykjavik, Iceland.

Examples of the Data

Excerpts of the data contained in the corpus may be available by request. Please send email to matt at cs.qc.cuny.edu to request access.

Funding Support

This material is based upon work supported in part by the National Science Foundation under award number 1065013.