
Design and development of hand gesture recognition
system for speech impaired people

 

 

R. Ananthi
ME, Computer Science and Engineering
Dhanalakshmi Srinivasan Engineering College

 

 

Abstract— All over the world, deaf and mute people struggle to express their feelings to others. Speech and hearing impaired people face various challenges in expressing themselves to hearing people at public places. This paper addresses that problem through the use of Indian Sign Language symbols, which are common to all deaf and mute people in India. The gestures represented by the Indian Sign Language symbols are captured with the support of flex sensors and an accelerometer. The movements involved in gesture representation are rotation, tilt angle and direction changes. The flex sensors and the accelerometer are fitted over a data glove, on the fingers and wrist respectively, to acquire their dynamics. The resulting voltage signals are processed by a microcontroller and sent to a voice module, where recordings of the words are stored and played back according to each word's values, producing the appropriate spoken words through a speaker.

 

Keywords—Indian sign language, speech impaired,
flex sensors, accelerometer and voice module.

 

I.  INTRODUCTION

 

The language used by speech and hearing impaired people to express themselves is known as sign language. These languages vary from one country to another and are not common to all people. Some of the main challenges experienced by speech and hearing impaired people while communicating with hearing people are social interaction, communication disparity, education, behavioral problems, mental health, and safety concerns. As a result of these obstacles, deaf and mute people are discouraged from speaking out about themselves or their situations in public places, in emergencies, or in private conversation.

 

Moreover, the language diversity in India is vast, varying from place to place, hence a common mode of communication was needed for speech and hearing impaired people. This resulted in the usage of Indian Sign Language symbols among deaf and mute people to interact with one another, but these symbols cannot be understood by other people. In this paper, Indian Sign Language (ISL) has been used. ISL has its own specific syntax, grammar, alphabets, words and numerals. Hand gestures made using these symbols are an effective way for speech impaired people to express an idea or meaning. These gestures are made with finger, hand, wrist and elbow movements for different sequences of words. Two aspects are considered here: one with only finger positions, without changing hand position and orientation, and the other with changes in both finger and hand positions and orientations. The main need arises because these sign language symbols are not understood by most hearing people, as most have not studied ISL. With real-time image processing methods, only a single individual can benefit, by capturing his or her image and processing it into text or speech. In this paper, by contrast, the hand gesture movements of any speech impaired person can be captured by the flex sensors and accelerometer and produced as voice output through the voice module.

 

Research has been carried out for many years on hand gesture interpretation systems using various sign languages. As mentioned in [1], sign language gestures are converted into voice for a single alphabet or a complete string by concatenating every word to form full meaningful phrases, but this was done only for American and Pakistani Sign Languages. The method described in [2] aims to help patients with wrist impairments perform some of their daily exercises. In the research method of [3], American Sign Language was used: the boundary of the gesture image depicted by the speech impaired person was approximated into a polygon, and on further image processing with the Douglas-Peucker algorithm using Freeman chain code directions, the words were determined. ISL was also used in [4], where each set of signs was represented by the binary values of the `UP' and `DOWN' positions of the five fingers. The respective images of the symbols were dynamically loaded and converted into text. A material known as Velostat was used in [5] to make piezoresistive sensors, which were then used to detect bends in the fingers. This data was mapped to a character set by a minimum mean square error machine learning algorithm. The method used in [6] is a sensor glove for detecting hand gestures under the British Sign Language system. There, only general hand gestures were depicted; symbols pertaining to any particular country's sign language were not captured. The outputs were produced in text format on an LCD and in audio format from the flex sensor data. The work in [7] was a robust approach for recognizing bare-handed static American Sign Language using a novel combination of Local Binary Pattern histograms and linear Support Vector Machine (SVM) classifiers. In [8], a device that detects and tracks hand and finger motions was used for Arabic Sign Language, with the acquired data classified using multilayer perceptron networks and a Naive Bayes classifier. The approach discussed in [9] for American Sign Language uses a glove with six colored markers and two cameras to extract the coordinate points; the alphabets are detected by the Circle Hough Transform and backpropagation in an artificial neural network. One of the methods [10] for detecting American Sign Language was capable of recognizing hand gestures even when the forearm and its rotation were involved; it was implemented using principal component analysis to differentiate between two similar gestures.

 


 

Thus there are various limitations in previous research in the field of sign language interpretation systems. Some used image processing methods, which are restricted to individual captured images being processed, and so cannot readily adapt to different persons using them. Only finger gestures and alphabets were obtained from the sign language movements, and outputs were produced for other countries' sign languages, such as British, American and Pakistani. Also, the distance between the camera and the person may disturb the accuracy. Therefore, in this project, gestures for words in Indian Sign Language are used, and eight commonly used words are produced as voice outputs. The movements are captured with flex sensors and an accelerometer, and the system adapts dynamically to changes in person and hand orientation.

 

II. MATERIALS AND METHODOLOGY

The hand gesture recognition setup presented in this paper comprises a data glove, a sensory part (flex sensors and accelerometer), an amplifier, a PIC microcontroller, a voice module and a speaker. Fig. 1 shows the block diagram of the ISL hand gesture recognition system.

Fig. 1.  Block diagram of the ISL hand gesture recognition system

A. Data glove

 

A data glove is an associative device which facilitates tactile sensing and fine-motion control. It is specially used to capture the shape and dynamics of the hand in a more effective and direct manner. The flex sensors are fixed over each finger and the accelerometer over the wrist. These sensors are fixed onto the cloth data glove with cellophane tape or glue. Fig. 2 shows the data glove worn by the speech impaired person, which acquires the gestures with the aid of the flex sensors and accelerometer.

 

B. Sensory part

 

The sensory part consists of flex sensors for capturing the finger arrangements and an accelerometer for the wrist rotations. In the flex sensor, the resistance varies in proportion to the bending of the sensor. This resistance is then converted to a voltage by a voltage divider circuit using a 10 kΩ resistor. The same voltage conversion is done for each flex sensor situated over a finger. The accelerometer has three axes (x, y and z) and produces three different sets of values, corresponding to each axis, based on the wrist movement or orientation made in the hand gesture.
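To make the conversion concrete, the following minimal sketch in C computes the voltage read across a flex sensor in such a divider. The supply value follows the 5 V mentioned in Section III; the wiring assumption (flex sensor on the supply side, the 10 kΩ resistor to ground) and the sample resistance values are illustrative, not taken from this paper.

/* Minimal sketch of the flex-sensor voltage divider described above.
 * Assumed wiring (not stated in the paper): the flex sensor sits on
 * the supply side and the 10 kOhm resistor goes to ground, so the
 * voltage across the flex sensor rises as its resistance rises with
 * bending, matching the trend of the readings in TABLE I. */
#include <stdio.h>

#define V_SUPPLY 5.0      /* supply voltage (V), per Section III */
#define R_FIXED  10000.0  /* fixed divider resistor (ohms)       */

/* Voltage measured across the flex sensor for a given resistance. */
static double flex_voltage(double r_flex)
{
    return V_SUPPLY * r_flex / (r_flex + R_FIXED);
}

int main(void)
{
    /* Illustrative resistances only; real values depend on the sensor. */
    double r_straight = 23300.0;  /* gives roughly the 3.5 V no-bend level */
    double r_bent     = 34000.0;  /* a plausible middle-phalanx bend value */
    printf("straight: %.2f V, bent: %.2f V\n",
           flex_voltage(r_straight), flex_voltage(r_bent));
    return 0;
}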

 

 

Fig. 2.  Data glove fitted with sensors (flex sensors on the fingers, accelerometer on the wrist)

 


 

C. PIC Microcontroller

 

The microcontroller governs the handling of the signal values received from the sensors. The output voltages of the flex sensors and the accelerometer are given as inputs to ports A and E of the PIC microcontroller for further processing. The other end of each sensor is connected to a common ground. These signal values are converted to digital form by the ADC built into the microcontroller.
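As a rough illustration of this acquisition step, the sketch below digitizes the eight sensor channels and scales the counts back to volts. The 10-bit width, the channel numbering, and the adc_read() helper are assumptions standing in for the actual PIC ADC register sequence, which this paper does not detail.

/* Hedged sketch of the sampling loop: read the five flex channels and
 * the three accelerometer axes through the built-in ADC and convert
 * each count to a voltage.  adc_read() is a hypothetical stand-in for
 * the usual select-channel / start-conversion / wait / read sequence. */
#include <stdint.h>

#define ADC_MAX    1023u  /* full scale of an assumed 10-bit ADC    */
#define V_REF      5.0f   /* assumed ADC reference voltage (V)      */
#define N_CHANNELS 8      /* 5 flex sensors + 3 accelerometer axes  */

static uint16_t adc_read(uint8_t channel)
{
    (void)channel;
    /* Placeholder: real code would program the channel selection,
     * start the conversion, poll the done flag and return the result. */
    return 0;
}

/* volts[0..4]: flex sensors; volts[5..7]: accelerometer x, y, z. */
void sample_gesture(float volts[N_CHANNELS])
{
    for (uint8_t ch = 0; ch < N_CHANNELS; ch++)
        volts[ch] = (float)adc_read(ch) * V_REF / ADC_MAX;
}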

 

D. Voice Module

Signals from the microcontroller are then given to the voice module. The voice module consists of eight channels, in which eight words can be recorded. The voice module can be operated in various modes, such as parallel and serial modes. The voices were recorded while both the CE (reset sound track) and RE (record) signals were held low until the rising edge of the trigger. The same voice can then be played back when only RE is high and a high-to-low edge is applied as the trigger. The words can be heard loud and clear through the speaker. The setup also incorporates an LCD panel to display the flex sensor and accelerometer voltages.
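The control sequence above can be summarized in code. The pin identifiers and helper functions below are hypothetical stand-ins for the actual port writes; only the CE/RE/trigger logic follows the description in this section, and the timing would have to come from the voice module's datasheet.

/* Sketch of the record/playback handshake described above.  The pin
 * names and helpers are hypothetical; only the CE/RE/trigger edges
 * follow the behaviour described in this section. */
#include <stdbool.h>

enum pin { PIN_CE, PIN_RE, PIN_TRIG };

static void pin_write(enum pin p, bool level)
{
    (void)p; (void)level;
    /* Placeholder: real code would drive the PIC port latch bits. */
}

static void delay_ms(unsigned ms) { (void)ms; /* placeholder delay */ }

/* Record a word: hold CE (reset sound track) and RE (record) low,
 * then start recording on the rising edge of the trigger. */
void voice_record(unsigned duration_ms)
{
    pin_write(PIN_CE, false);
    pin_write(PIN_RE, false);
    pin_write(PIN_TRIG, false);
    pin_write(PIN_TRIG, true);   /* rising edge: recording begins     */
    delay_ms(duration_ms);       /* speak the word during this window */
}

/* Play a word back: RE high, then a high-to-low edge on the trigger. */
void voice_play(void)
{
    pin_write(PIN_RE, true);
    pin_write(PIN_TRIG, true);
    pin_write(PIN_TRIG, false);  /* falling edge: playback begins */
}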

 

 

III.  EXPERIMENTAL RESULTS

All the sensors on the data glove were first tested. The flex sensor and accelerometer readings were observed with variation in their position, rotation and bending over 5 trials.

 

The bending of the hand is determined at three bends of the bones of the hand, known as the distal, middle and proximal phalanges, as recorded in the flex sensor readings of TABLE I. The inference was drawn from the flex sensor readings in TABLE I and the accelerometer readings in TABLE II, taken at different positions and trials. For the flex sensor, the voltage across the finger when the sensor is straight is 3.5 V, for a power supply of 5 V. The voltage drop across the flex sensor was maximum at the middle phalanx bend and minimum at the proximal phalanx bend. For the accelerometer, the maximum values were observed on the X-axis when the hand turns right, on the Y-axis when the hand moves up, and on the Z-axis when the hand slants upward. The trial readings were taken, the final readings were derived from them, and they are given as mean ± standard deviation. The coefficient of variation, the ratio of the standard deviation to the mean, was calculated for each flex sensor and accelerometer reading; it is much less than one, which indicates that these values have good repeatability and reproducibility. Once all the flex sensors and the accelerometer were tested and gave repeatable readings, the experimental setup shown in Fig. 3 was arranged. The data glove fitted with sensors was connected to a PIC microcontroller, then to a voice module, speaker and LCD to hear the voice signals.
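For reference, the statistics quoted in TABLES I-III can be reproduced as below: the sample mean, the sample standard deviation over the five trials, and their ratio, the coefficient of variation. This is a generic sketch of the calculation, not the authors' code; the trial values in main() are illustrative.

/* Generic sketch of the mean ± stddev and coefficient-of-variation
 * calculation behind TABLES I-III.  A CV much smaller than one is the
 * repeatability criterion used in this section. */
#include <math.h>
#include <stdio.h>

#define N_TRIALS 5

static void summarize(const double v[N_TRIALS],
                      double *mean, double *stddev, double *cv)
{
    double sum = 0.0, sq = 0.0;
    for (int i = 0; i < N_TRIALS; i++)
        sum += v[i];
    *mean = sum / N_TRIALS;
    for (int i = 0; i < N_TRIALS; i++)
        sq += (v[i] - *mean) * (v[i] - *mean);
    *stddev = sqrt(sq / (N_TRIALS - 1));  /* sample standard deviation */
    *cv = *stddev / *mean;
}

int main(void)
{
    /* Illustrative no-bend trial voltages (V), not the raw data. */
    double trials[N_TRIALS] = {3.497, 3.498, 3.498, 3.498, 3.499};
    double m, s, cv;
    summarize(trials, &m, &s, &cv);
    printf("%.3f V +/- %.3e V, CV = %.2e\n", m, s, cv);
    return 0;
}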

 

Fig. 3.  Experimental Setup of the ISL
hand gesture recognition system

 

 

 

TABLE I.  FLEX SENSOR READINGS

Finger    No bend             Distal phalanx bend   Middle phalanx bend
Name      (Mean±Stddev) (V)   (Mean±Stddev) (V)     (Mean±Stddev) (V)

Thumb     3.498±8.944e-4      3.536±0.01073         3.662±3.57e-3
Index     3.504±1.788e-3      3.742±3.577e-3        3.872±8.944e-4
Middle    3.508±3.577e-3      3.786±2.68e-3         3.898±8.944e-4
Ring      3.506±2.68e-3       3.694±1.788e-3        3.842±3.577e-3
Little    3.502±8.944e-4      3.536±1.788e-3        3.694±1.788e-3

 

Subsequently, after setting up the full system plan and testing both sensors for repeatable values, the data glove, with flex sensors over each finger and the accelerometer over the wrist, was worn by the speech impaired person. Once they were ready with their gestures and started expressing with their hands, the voltage signals corresponding to the bend and rotation were simultaneously fed to the microcontroller. The flex sensor voltage of each finger movement was noted for each word's gesticulation, according to the bend involved in each word. The same was done for each finger's various bend angles. Likewise, for the accelerometer, all the x, y and z axis variations were recorded corresponding to each word's rotation, up and down positions. This scheme of measurements was repeated for different speech impaired people and for a number of trials. By this approach, the minimum and maximum thresholds of each finger and the wrist were calculated for the eight words.
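A plausible form for this threshold matching is sketched below: each word carries a per-channel [min, max] voltage window, and a gesture is assigned to the first word whose windows all contain the measured voltages. The data structure and the zero-initialized profile table are assumptions for illustration; the actual thresholds come from the calibration just described.

/* Hedged sketch of the per-word threshold matching implied above.
 * Each word has a [lo, hi] voltage window for every channel (5 flex
 * sensors + 3 accelerometer axes); a gesture matches a word when all
 * eight measured voltages fall inside that word's windows. */
#include <stdbool.h>

#define N_CHANNELS 8
#define N_WORDS    8

typedef struct {
    const char *word;
    float lo[N_CHANNELS];  /* per-channel minimum threshold (V) */
    float hi[N_CHANNELS];  /* per-channel maximum threshold (V) */
} word_profile;

/* Placeholder profiles; real entries come from the measured minima
 * and maxima for the eight words. */
static word_profile profiles[N_WORDS];

/* Return the index of the matched word (which selects the
 * voice-module channel), or -1 when no word's windows fit. */
int classify_gesture(const float volts[N_CHANNELS])
{
    for (int w = 0; w < N_WORDS; w++) {
        bool match = true;
        for (int ch = 0; ch < N_CHANNELS && match; ch++)
            match = (volts[ch] >= profiles[w].lo[ch]) &&
                    (volts[ch] <= profiles[w].hi[ch]);
        if (match)
            return w;
    }
    return -1;
}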

 

From these measurements the average values of the sensor readings were computed. The eight most commonly used words were listed, and each word's finger and wrist movements and their corresponding values were noted. The selected words included Monday, Tuesday, Thursday, What and Which. The Indian sign language symbols for these words are the gesture movements obtained from the speech impaired people after they wore the glove fitted with sensors. TABLE III shows the derived average values for one of the words represented through Indian sign language symbols. The graph in Fig. 4 represents the sensor readings for the single word Monday.

 

Similarly, the same procedure was repeated for all the other words by calculating the minimum and maximum voltages of their corresponding ISL gestures made by speech impaired people. These values are given to the voice module after being processed by the PIC microcontroller. In the voice module, the voltages received from the sensors and microcontroller select the appropriate word's sound output. If the word and the voltage readings match, the voice of the word is heard through the speaker. The same sequence of steps applies to all the other words' voltages, and their respective voice outputs can be heard.

 

IV. CONCLUSIONS

In this paper, the hand gestures made by speech and hearing impaired people have been successfully used to interpret their expression of words. The gesture for each word was acquired with the help of the flex sensors and accelerometer, and their corresponding distinct voltages were fed serially to the setup. The data, after processing by the microcontroller and voice module, generates the corresponding words, which can be heard by hearing people with the help of the speaker. Thus, the communication gap between hearing people and speech and hearing impaired people is reduced. Indian sign language has been discussed, and the symbols of eight commonly used words were captured and produced as voice output. Hence this research offers a remedy for the obstacles faced by speech impaired people; with it they can feel satisfied, motivated, and gain the self-confidence that their feelings will be understood by other people.

At present, only gestures made by a single hand are captured, but in future the work can be extended to symbols produced with both hands. Likewise, only eight words are currently produced by the voice module; this can be enhanced to a larger vocabulary of voice outputs.

 

 

 

 

TABLE II.  ACCELEROMETER READINGS

Axis     Up position        Down position      Tilt right         Tilt left          Slant position
         (Mean±Stddev) (V)  (Mean±Stddev) (V)  (Mean±Stddev) (V)  (Mean±Stddev) (V)  (Mean±Stddev) (V)

X axis   1.644±1.788e-3     1.688±3.577e-3     1.912±8.944e-4     1.358±8.944e-4     1.46±4.472e-3
Y axis   1.352±3.577e-3     1.858±3.577e-3     1.744±1.788e-3     1.698±8.944e-4     1.386±2.683e-3
Z axis   1.45±4.472e-3      1.394±2.68e-3      1.512±3.577e-3     1.442±3.577e-3     1.312±3.577e-3

TABLE III.  SENSOR READINGS FOR WORD "MONDAY"

Position        Voltage Values (Mean±Stddev) (V)

Thumb finger    4.68±0.014142136
Index finger    3.71±0.0083666
Middle finger   4.3±0.021679483
Ring finger     3.96±0.008944272
Little finger   4.37±0.010954451
X-axis          1.83±0.01
Y-axis          1.58±0.0083666
Z-axis          1.42±0.008944272

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Fig. 4.  Graphical representation of the sensor readings for the word Monday

 

V.  ACKNOWLEDGMENT

 

I would like to thank the Principal, teachers and students of the St. Louis Institute, Chennai, who gave me permission to trial this hand gesture system in their school. I also convey my regards to Mrs. Jayanthi, who assisted me in the techniques of learning Indian sign language.

REFERENCES

 

[1]  SanAntonio.R, Shadaram.M, Nehal.S, Virk.M.A, Ahmed, Ahmedani, and Khambaty.Y, "Cost effective portable system for sign language gesture recognition," IEEE International Conference on System of Systems Engineering, 2-4 June 2008, Farmingdale, USA.

[2]  Al-Osman.H, Gueaieb, El Saddik.A, and Karime.A, "E-Glove: An electronic glove with vibro-tactile feedback for wrist rehabilitation of post-stroke patients," IEEE International Conference on Multimedia and Expo, 11-15 July 2011, La Salle University, Spain.

[3]  Menon.R, Jayan.S, James.R, Janardhan, and Geetha.M, "Gesture recognition for American Sign Language with polygon approximation," IEEE International Conference on Technology for Education, 14-16 July 2011, Chennai.

[4]  Balakrishnan.G and Rajam.P.S, "Real time Indian Sign Language recognition system to aid deaf-dumb people," IEEE 13th International Conference on Communication Technology, 25-28 September 2011, pp. 737-742, Australia.

[5]  Ramakrishnan.G, Kumar.S, Tamse.A, Krishnapura.N, and Preetham.C, "Hand Talk: implementation of a gesture recognizing glove," Texas Instruments India Educators Conference, 4-6 April 2013, NIMHANS Convention Centre, Bangalore.

[6]  Vikram Sharma, M. Vinay Kumar, N. Masaguppi, S.C. Suma, and M.N. Ambika, "Virtual Talk for Deaf, Mute, Blind and Normal Humans," Texas Instruments India Educators Conference, 4-6 April 2013, IEEE Bangalore Section.

[7]  Kamrani.M.H and Weerasekera, "Robust ASL fingerspelling recognition using Local Binary Patterns and geometric features," International Conference on Digital Image Computing: Techniques and Applications (DICTA), 26-28 November 2013, Hobart, Australia.

[8]  Mohandes.M, Aliyu.S, and Deriche.M, "Arabic sign language recognition using the Leap Motion controller," IEEE 23rd International Symposium on Industrial Electronics (ISIE), 1-4 June 2014, Istanbul.

[9]  Tangsuksant.W, Adhan.S, and Pintavirooj.C, "American Sign Language recognition using 3D geometric invariant feature and ANN classification," Biomedical Engineering International Conference (BMEiCON), 26-28 November 2014, Fukuoka.

[10] Hussain.I, Talukdar.A.K, and Sarma.K.K, "Hand gesture recognition system with real time palm tracking," Annual IEEE India Conference (INDICON), 11-13 December 2014, Pune.

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 
