SLAAASh, ASL Signbank, ASL-LEX oh my! Interested in learning more? Join us Friday, Aug 18 from 1 to 6 (Gallaudet, SLCC 3rd floor open area)

The Sign Language Acquisition: Annotation, Archiving & Sharing (SLAAASh) project involves the construction of machine-readable annotated videos of Deaf children acquiring ASL while interacting with their Deaf parents and/or researchers using ASL. The videos were collected in the 1990s and are being re-annotated for consistency and accuracy. The annotations used in the SLAAASh project crucially involve ID glosses: labels used to consistently identify a sign lemma regardless of changes in its use across contexts. The ID glosses, along with videos and lexical and phonological information about each sign, are housed in a new ASL Signbank, to be open to researchers, built on the basis of Signbanks previously constructed for other sign languages. The ASL Signbank is being constructed to be mutually compatible with the existing publicly available ASL-LEX database, which contains overlapping phonological information as well as frequency and iconicity ratings, with a unique set of visualization options. Technological improvements to ELAN provide a bridge between an annotation file and the Signbank, permitting a close integration of these components for improved consistency in data coding and further research. For example, all instances of a lemma can be called up, modified, or coded through the ELAN/Signbank bridge. The SLAAASh infrastructure can also be used for other projects annotating ASL data. (Follow us on Twitter @ASLSLAASH)
On Friday, August 18, three presentations will be given in the open area of the Department of Linguistics at Gallaudet University. These are open to members of the department and others who are interested. (Note: if you are unable to attend the entire afternoon, you are welcome to join for whichever presentations interest you.)
1:30-2:30 “Overview of SLAAASh” by Diane Lillo-Martin
2:45-3:45 “ASL Signbank and linking to ELAN” by Julie Hochgesang
4:00-5:00 “ASL-LEX” by Zed Sehyr
5:00-6:00 “Lab,” with more in-depth demonstrations of ASL-LEX or the ASL Signbank, one-on-one assistance with getting linked to the ASL Signbank, etc.
Interpreting will be provided. If close-vision or tactile interpreting is required, please email Julie (julie.hochgesang at) and Paul (paul.dudis at) as soon as possible.
Posted in Linguistics, Presentation, Research

Dissertation defense – Casey Thornton “The status of palm orientation in the phonological representation of American Sign Language” 3/23/17, 2 pm LLRH6 101

Casey Thornton, a Ph.D. candidate in the Department of Linguistics, will defend her dissertation on “The status of palm orientation in the phonological representation of American Sign Language” on Thursday, March 23, at 2 p.m. in Living and Learning Residence Hall 6 (LLRH6) Room 101. The first forty minutes of the dissertation defense are open to the Gallaudet community.

Ms. Thornton’s dissertation examines the status of palm orientation in the phonological representation of signed languages through three distinct but related studies, using the Prosodic Model of sign language phonology as its theoretical foundation. The first study looks at how palm orientation behaves in natural signing; the second takes a psycholinguistic approach, examining how native signers compensate when the target joints responsible for orientation are restricted; and the third aims to determine whether native signers are able to correctly identify signs modified to block orientation change. Results from the three studies combined indicated that, in line with the Prosodic Model, there are two types of palm orientation to be represented, and each is uniquely specified in how it functions within signed languages. This work contributes to the ever-growing field of sign linguistics, bridging the gap between theoretical models and linguistic experimentation.

The members of Ms. Thornton’s dissertation committee are Dr. Gaurav Mathur, chair of the dissertation committee, Department of Linguistics; Dr. Deborah Chen Pichler, Department of Linguistics; Dr. Julie Hochgesang, Department of Linguistics; Dr. Daniel Koo, Department of Psychology; and Dr. Diane Brentari, Department of Linguistics, University of Chicago.

Ms. Thornton joined the Gallaudet University community in 2010, when she entered the master’s program in linguistics. After completing her M.A. degree in 2012, she entered the doctoral program in linguistics. During her graduate studies, Casey has done extensive research on universal phonotactic constraints in signed languages and has taken a keen interest in bridging the gaps between theoretical models of phonology and linguistic experimentation. She has also worked as an adjunct professor at Gallaudet University and as a graduate assistant in the Brain and Language Laboratory for Neuroimaging (BL2). In 2015, Casey returned to her hometown and has been teaching at California State University, Northridge as an adjunct professor in Deaf Studies.

Posted in Linguistics, Presentation, Research, Students

News: register now for FEAST 2017 June 21-22 in Reykjavík


FEAST 2017 Local organizing committee


Dear all,

Registration is now open for FEAST (Formal and Experimental Advances in Sign Language Theory) in Reykjavík, June 21-22. Please consult the conference website for further information.

We will post a preliminary programme on our website very soon.

If you plan to attend the conference, we strongly advise you to book accommodation as soon as possible, because Reykjavík has become a very popular tourist destination in recent years. For further information on booking accommodation, see the conference homepage. Please note that Sunna Guesthouse and Hótel Reykjavík Natura have reserved rooms for conference guests until March 15th.

Local organizing committee:
Jóhannes Gísli Jónsson
Kristín Lena Þorvaldsdóttir
Rannveig Sverrisdóttir
Þórhalla Guðmundsdóttir Beck
Scientific committee:
Chiara Branchini
Diane Brentari
Anna Cardinaletti
Carlo Cecchetto
Caterina Donati
Karen Emmorey
Carlo Geraci
Meltem Kelepir
Gaurav Mathur
Roland Pfau
Christian Rathmann
Josep Quer
Markus Steinbach
Ronnie Wilbur
Bencie Woll

Posted in Conferences, Linguistics, Research

CFP: The 6th Meeting of Signed and Spoken Language Linguistics (SSLL2017) Dates: 22-24 September 2017 in Osaka, Japan

Conference Announcement via Keiko Sagara on SLLS list

Conference: The 6th Meeting of Signed and Spoken Language Linguistics
Dates: 22-24 September 2017
Location: National Museum of Ethnology, Osaka, Japan
Organizers: HARA Daisuke (H), IIZUMI Naoko (H), IKEDA Masumi (D), KIKUSAWA Ritsuko (H, Chair), MATSUOKA Kazumi (H), SAGARA Keiko (D)

Abstract due: 31 March 2017

SSLL2017 will be held to promote sign language linguistics and to foster a better understanding of human language through the comparison and analysis of signed and spoken languages. English/Japanese, ASL/English, and JSL/Japanese interpretation will be provided. We invite presentations (a 30-minute presentation followed by 10 minutes of questions and answers) on any topic related to sign language linguistics and/or the comparison of signed and spoken languages.

Invited Presenters:
LYNN YONG-SHI HOU (University of California, San Diego)

Past Events:
The 5th Meeting of Signed and Spoken Language Linguistics (SSLL2016)

Posted in Conferences, Linguistics, Research

SHARE: Native or early signer (including CODAs)? 18 and older. Please take this online survey about ASL

Calling all native and early signers (those who started signing before the age of 6), including CODAs, who are 18 and older: you are invited to participate in a research study on American Sign Language as used by Deaf and signing communities in the United States. The survey is online, so you can take it in your own home, and it should take no longer than 30 minutes. We are offering a $5 reimbursement to participants who complete the survey.

To take the survey, click on the following link or copy and paste it into your browser:

UPDATE (3/2/17) – The survey can currently only be taken on a desktop computer. We are working on making it possible to take the survey on a mobile device.

If you have any questions, you can contact Heather Hamilton or Dr. Julie Hochgesang via email.

This study has been reviewed by the IRB at Gallaudet University.

Posted in Research, Students

SHARE: Media piece on Pro-Tactile ASL in Quartz, “A language for the DeafBlind”


Quartz wrote about Pro-Tactile ASL (link below), featuring faculty member Terra Edwards; her colleague Oscar Serna, a DeafBlind user of PT; and PEN faculty member Clifton Langdon.

DeafBlind Americans developed a language that doesn’t involve sight or sound

“Pro-tactile ASL borrows bits and pieces from ASL, adapting them to be useful for people who can’t see. Rather than using their own hands as a reference for communication, people who convey information with pro-tactile ASL use the perceiver’s hands and body. The speaker will touch the perceiver’s body and move his or her hands; in doing so, the speaker takes advantage of the perceiver’s proprioception, or sense of where his or her limbs are. “When we’re talking about a particular shape, instead of showing the shape in space, you’d show [by moving] the perceiver’s arm,” said Serna.”

To read more and see the video, visit the Quartz site using the URL below.

In case it is not available in the original post, here is a transcript of captions and video descriptions, with time stamps, for Quartz’s “A Language for the DeafBlind” (compiled by Clifton Langdon):
0:00-0:05 Clifton Langdon & Oscar Serna facing each other. Oscar signs using PTASL. Text with an arrow above Oscar appears: “Oscar Serna.”
CC: Oscar Serna is speaking in a brand new language
0:05-0:11 Text appears: “I’m really stressed out” with “stressed out” in bold.
Oscar: “I’m really stressed out!”
0:12-0:16 Oscar standing, directly facing the camera.
CC: Oscar is both deaf and blind, or, “DeafBlind.”
0:17-0:19 Oscar and Clifton walk outside.
CC: He works at Gallaudet University, on a project tracking the evolution of a language for those who can’t see or hear.
0:20-0:22 Text on screen shows animation of “Pro-Tactile ASL”
CC: This new language is called pro-tactile ASL. The ASL stands for American Sign Language, which uses visual signs for words and phrases.
0:23-0:29 Clifton appears on screen and signs visual ASL version of what Oscar said about being stressed.
CC: The ASL stands for American Sign Language, which uses visual signs for words and phrases.
0:30-0:35 Oscar uses PTASL to talk about a car accident
CC: Pro-tactile ASL communicates entirely through touch.
0:36-0:43 Clifton uses visual ASL. Text appears: “I cut down a tree.”
CC: For example, here’s a sentence in ASL: Clifton: “I cut down a tree.”
0:43-0:50 Oscar uses PTASL. Text appears: “I cut down a tree.”
CC: Here’s how Oscar would say the same sentence in pro-tactile ASL: Oscar: “I cut down a tree.”
0:50-0:59 Three circles appear showing ASL, Fingerspelling and Braille with “ASL” “Fingerspelling” “Braille” written above each.
CC: Historically, DeafBlind people communicated through American Sign Language, Braille, and fingerspelling, where each letter of each word is signed into a person’s hand.
1:00-1:02 Helen Keller photo shown with circle drawn around her hand on another woman’s hand emphasizing how she communicated.
CC: Helen Keller, maybe the world’s most famous DeafBlind person, used fingerspelling.
1:02-1:14 Close-up shot of Oscar, Clifton, and his Ph.D. student Lauren Berger using PTASL together.
CC: But those are limiting, especially when DeafBlind people want to talk to each other.
CC: Pro-tactile ASL emerged in the early 2000s, as once-isolated DeafBlind people began to form communities.
1:14-1:19 Clifton, Oscar, and a CDI on screen. The CDI is interpreting to Oscar from person off screen.
CC: DeafBlind people have been adapting American Sign Language and adding gestures for things many languages don’t have words for.
1:19-1:23 slow-motion replay with circle drawn around Clifton’s hand on Oscar’s arm to emphasize that Clifton is tapping on Oscar’s hand.
CC: For example, this tap on the hand is like nodding.
1:23-1:28 Oscar and Clifton walk down the hall and chat.
CC: The language has been evolving ever since.
1:28-1:42 Clifton sits and signs using visual ASL. A title appears: “Clifton Langdon. Professor, Gallaudet University”
CC: Clifton: “Now what’s new in pro-tactile is that we’re seeing things that were used in visual sign language transition from the use of space to the use of the perceiver’s body.”
1:43-1:48 Two circles appear. The first contains cartoon eyes. The second contains a cartoon ear.
CC: Traditional theories of language defined it as something seen or heard.
1:48-1:49 A third circle appears containing a cartoon hand.
CC: But Pro-tactile ASL proves that language can also be communicated through touch.
1:49-1:54 Oscar talks to Clifton and Lauren outside.
CC: And for the people speaking it, it allows for a life with richer communication.
1:55-2:07 Oscar talks to Clifton. A title appears: “Oscar Serna. Research assistant, Gallaudet University”
CC: Oscar: “Since I became pro-tactile, all of the channels have opened up; information flows freely.”
“It’s like going from dial-up to broadband.”
2:07-2:11 Fade to black with credit to reporters: Nushmia Khan & Katherine Foley

Posted in Faculty, Linguistics, Uncategorized

“The Dept of Linguistics: How We Got Here” – Brown Bag by Professor Emeritus Robert E Johnson 11/4, 12-1 @SAC1011

topic: “The Department of Linguistics: How we got here.”
presenter: Robert E. Johnson, PhD
when: Friday, November 4th, 2016, 12 to 1 pm
where: SAC1011
Summary of presentation: The people and principles that drove the creation and growth of the department.

Bio: Robert E. (Bob) Johnson, PhD, is Professor Emeritus at Gallaudet University in Washington, D.C., where, until he retired in 2012, he was Professor of Linguistics and Assistant Dean of the Graduate School and Extended Learning. He holds a B.A. degree in psychology from Stanford University and a Ph.D. in anthropology from Washington State University. He is an anthropological linguist interested in the phonological and morphological structure of signed languages, their function in deaf communities, and their critical role in deaf education. He has examined the structures of a number of sign languages, including American Sign Language and the sign language of a Yucatec Maya community. He is co-author of the widely read monograph “Unlocking the Curriculum: Principles for Achieving Access in Deaf Education” and numerous papers on signed language structure and function. Much of his recent work has focused on the imperative of bilingualism in the education of deaf children and on the ways in which the educational and medical communities resist it.

Posted in Brown bag lunch presentations, Faculty, Linguistics