Abstract - Drones nowadays are widely used around the world for a variety of purposes including aerial videography, photography, and surveillance. A simple gesture controller can make the task of piloting much easier. In our implementation, the Leap Motion Controller is used for recognition of gestures, which are motions of the hand; as a result, we can control the motion of the drone with simple hand gestures. The motive of this project is to capture all points of the hand, recognize the gestures, and control the drone with those gestures. From a hardware perspective, the Leap Motion Controller is an eight-by-three-centimeter unit, which comprises two stereo cameras and three infrared LEDs. This paper proposes a hand gesture recognition scheme explicitly targeted to the Leap Motion camera.

The two stereo cameras and the three infrared LEDs track infrared light with a wavelength of about 850 nanometers, which is outside the visible light spectrum. Our implementation uses this motion controller to control the motion of a drone via simple human gestures. The main advantage of this system is that capturing all gestures allows the drone to be controlled without any remote.

Keywords: Leap Motion Controller, LEDs, Gesture recognition, Leap camera.

I. INTRODUCTION

Drones nowadays are widely used around the world for a variety of purposes including aerial videography, photography, and surveillance. A simple gesture controller can make the task of piloting much easier. In this study, we present our implementation of using a motion controller to control the motion of a drone via simple human gestures.

In our implementation, the Leap Motion Controller is used for recognition of gestures, which are motions of the hand; as a result, we can control the motion of the drone with simple hand gestures. In recent years, hand gesture recognition has attracted growing interest due to its applications in many different fields, such as human-computer interaction, robotics, computer gaming, and automatic sign-language interpretation. The problem was originally tackled by the computer vision community by means of images and video.

Dynamic hand gesture recognition is considered a problem of sequential modeling and classification. The recent introduction of the Leap Motion device has opened new opportunities for gesture recognition. Differently from the Kinect, this device is explicitly targeted at hand gesture recognition and directly computes the position of the fingertips and the hand orientation. The Leap Motion controller is a small USB peripheral device designed to be placed on a physical desktop, facing upward. It can also be mounted onto a virtual reality headset.

Using two monochromatic IR cameras and three infrared LEDs, the device observes a roughly hemispherical area, to a distance of about 1 meter. In our implementation, the Leap Motion Controller is used for recognition of hand gestures, and as a result we can control the motion of the drone with simple hand gestures. There are previous implementations for controlling a drone, but here we show how to capture all the points of the hand and control the drone with a given gesture.

II. PAST WORK

Ayanava Sarkar, Ketul Arvindbhai Patel, Geet Krishna Capoor, and Ganesh Ram R.K., in "Gesture Control of Drone Using a Motion Controller", examine that drones nowadays are widely used around the world for a variety of purposes including aerial videography, photography, and surveillance.

In many cases a skilled pilot is required to perform these tasks with the drone, which proves to be exorbitant. A simple gesture controller can make the task of piloting much easier. The authors present their implementation of using a motion controller to control the motion of a drone via simple human gestures, using the Leap as the motion controller and the Parrot AR.Drone 2.0, an off-the-shelf quadrotor with an on-board Wi-Fi system [1].

Giulio Marin, Fabio Dominio, and Pietro Zanuttigh, in "Hand Gesture Recognition with the Leap Motion and Kinect Devices", propose a novel hand gesture recognition scheme explicitly targeted to Leap Motion data. An ad-hoc feature set based on the positions and orientation of the fingertips is computed and fed into a multi-class SVM classifier in order to recognize the performed gestures. A set of features is also extracted from the depth map computed from the Kinect and combined with the Leap Motion features in order to improve the recognition performance.

The recent introduction of novel acquisition devices like the Leap Motion and the Kinect allows one to obtain a very informative description of the hand pose that can be exploited for accurate gesture recognition [2]. Wei Lu, Zheng Tong, and Jinghui Chu, in "Dynamic Hand Gesture Recognition with Leap Motion Controller", propose a novel feature vector suitable for representing dynamic hand gestures and present a satisfactory solution to recognizing dynamic hand gestures with a Leap Motion controller (LMC) only, which had not been reported in other papers. The feature vector with depth information is computed and fed into a Hidden Conditional Neural Field (HCNF) classifier to recognize dynamic hand gestures. The proposed feature vector, which consists of single-finger features and double-finger features, has two main benefits [3]. Bing-Yuh Lu, Chin-Yuan Lin, Shu-Kuang Chang, Yi-Yen Lin, Chun-Hsiang Huang, Hai-Wu Lee, and Ying-Pyng Lin, in "Bulbs Control in Virtual Reality by Using Leap Motion Somatosensory Controlled Switches", present Leap Motion somatosensory controlled switches.

The switches were implemented with relays. The "open" or "short" state of the switching circuit was controlled by the sensing of the Leap Motion somatosensory module. The virtual switches on the screen were designed as 5 circular buttons. The Leap Motion somatosensory controlled switches were implemented to aid persons whose hands have been damaged and who cannot operate physical switches well [4]. Kemal ERDOĞAN, Akif DURDU, and Nihat YILMAZ, in "Intention Recognition Using Leap Motion Controller and Artificial Neural Networks", note that intention recognition is an important topic in the field of Human Robot Interaction. If the robot is to make counter movements just in time according to the human's actions, the robotic system must recognize the intention of the human.

A method for a robotic system to estimate the human's intention is presented. In this method, the information is provided by the sensor called the Leap Motion controller. The decision about the tendency of the human's intention is made by an Artificial Neural Network. To obtain a satisfying result from the ANN classifier, all data sets are clustered, trained, and tested together with the k-fold cross-validation method with varied transfer functions, training algorithms, hidden layer numbers, and iteration numbers [5].

III. SYSTEM IMPLEMENTATION

A. Hardware

· Microcontroller: We use the AVR microcontroller ATMEGA32, a 40-pin IC with four ports and 32 programmable input/output lines. It operates on an 8 MHz crystal. The microcontroller has an 8-channel, 10-bit ADC and 3 on-chip timers. It also contains 1024 bytes of EEPROM and 2K bytes of internal SRAM. The PC is attached to the microcontroller through serial communication. The DAC converts the digital values from the sensors to analog, and the values are then given to the remote unit.

· PCF8591P: The PCF8591P is an 8-bit A/D and D/A converter in a 16-pin DIP package. It is a single-chip, single-supply, low-power 8-bit CMOS data acquisition device with four analogue inputs, one analogue output, and a serial I2C bus interface. Its functions include analogue input multiplexing, on-chip track-and-hold, 8-bit analogue-to-digital conversion, and 8-bit digital-to-analogue conversion. The maximum conversion rate is set by the maximum speed of the I2C bus. A sketch of talking to this converter over I2C is given after the software list below.

Software:

· mikroC PRO for AVR: used for coding the microcontroller in embedded C.
· AVRFLASH: used to burn the program onto the microcontroller.
· NetBeans IDE 7.1: used to create the GUI, registration, and user login forms for the server.
· Serialization: used to create the user database.
· ExpressPCB: used to design the PCB layout.
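As promised above, here is a minimal sketch of the PCF8591's I2C protocol: selecting an ADC channel via the control byte, reading a conversion, and driving the DAC. It assumes a Linux host with the smbus2 Python package and the chip at its default address 0x48 (A0-A2 tied low); our actual build drives the converter from the AVR in embedded C, so this is an illustration of the chip's protocol, not our firmware.

```python
# Sketch: PCF8591 over Linux I2C using smbus2. Bus number and the
# default address 0x48 (A0-A2 grounded) are assumptions; the paper's
# build talks to the chip from the AVR instead.
from smbus2 import SMBus

PCF8591_ADDR = 0x48       # 0b1001000 base address with A0-A2 low
ANALOG_OUT_ENABLE = 0x40  # control-byte flag that enables the DAC

def read_adc(bus, channel):
    """Read one of the four ADC channels (0-3)."""
    bus.write_byte(PCF8591_ADDR, channel & 0x03)  # select channel
    bus.read_byte(PCF8591_ADDR)    # first byte is the previous conversion
    return bus.read_byte(PCF8591_ADDR)            # 8-bit result, 0-255

def write_dac(bus, value):
    """Drive the analogue output with an 8-bit value."""
    bus.write_byte_data(PCF8591_ADDR, ANALOG_OUT_ENABLE, value & 0xFF)

with SMBus(1) as bus:
    print("AIN0 =", read_adc(bus, 0))
    write_dac(bus, 128)  # mid-scale output
```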

B. Operation

The Leap Motion Controller, shown in Fig. 1, is a gesture recognition device which uses advanced algorithms for such operations. From a hardware perspective, the Leap Motion Controller is an eight-by-three-centimeter unit, which comprises two stereo cameras and three infrared LEDs. The two stereo cameras and the three infrared LEDs track infrared light with a wavelength of about 850 nanometers, which is outside the visible light spectrum. When the palm is held completely parallel to the Leap Motion controller, the device is able to detect it; because the palm is parallel to the controller, it is recognized as a single finger, after which the drone can be controlled in the given direction. The Leap Motion Controller uses its two monochromatic infrared (IR) stereo cameras and three infrared LEDs to track any hand motion up to a distance of about 1 meter (about 3 feet) directly above it.

Hence, it forms a hemispherical area above itself with a radius of about 1 meter and recognizes any hand gesture occurring in that volume. In this system we first read all the points from the Leap sensor. The points taken from the Leap camera are represented as P1, P2, ..., PN, respectively. Scaling the points beforehand helps with the feature extraction used later in the application; a sketch of capturing and scaling the fingertip points is given below. We then calculate the features of all points and compute the distance factor.
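The following sketch polls one frame from the legacy Leap Motion SDK v2 Python bindings (the Leap module shipped with the SDK), collects the fingertip positions, and scales them into palm-centered coordinates. The particular scaling scheme, dividing by the largest palm-to-tip distance, is one plausible choice for illustration, not necessarily the one used in our build.

```python
# Sketch: grab fingertip points from the legacy Leap Motion SDK v2
# Python bindings and scale them into palm-centered coordinates.
# The unit-scaling scheme is an illustrative assumption.
import Leap

def capture_scaled_points(controller):
    frame = controller.frame()  # most recent tracking frame
    points = []
    for hand in frame.hands:
        palm = hand.palm_position      # Leap.Vector, millimeters
        for finger in hand.fingers:
            tip = finger.tip_position  # Leap.Vector, millimeters
            # Translate each fingertip into palm-centered coordinates.
            points.append((tip.x - palm.x, tip.y - palm.y, tip.z - palm.z))
    if not points:
        return []
    # Scale so the farthest fingertip lies at distance 1 from the palm.
    span = max((x * x + y * y + z * z) ** 0.5 for x, y, z in points) or 1.0
    return [(x / span, y / span, z / span) for x, y, z in points]

controller = Leap.Controller()
# In practice you would poll in a loop or register a Leap.Listener;
# here we simply read whatever frame is current.
print(capture_scaled_points(controller))
```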

The distance vector formula then yields the points D1, D2, ..., D16, which are used as the feature descriptor. The gestures stored in the database are compared against this distance vector, and if a match is found, the resulting gesture is given to the hardware to control the drone. Using the cosine similarity algorithm, we calculate the angle values for the respective points, sort the values, find the maximum among them, and select the gesture corresponding to that maximum value, as sketched below.
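A minimal sketch of this matching step follows, assuming the 16-element distance vector has already been computed. The template dictionary, threshold value, and function names are hypothetical, chosen only to illustrate picking the stored gesture with the highest cosine similarity.

```python
# Sketch: match a 16-element distance vector against stored gesture
# templates by cosine similarity. Names and the 0.95 threshold are
# illustrative assumptions, not values from the paper.
import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_gesture(distance_vector, templates, threshold=0.95):
    """Return the best-matching gesture name, or None if nothing is close."""
    scores = {name: cosine_similarity(distance_vector, vec)
              for name, vec in templates.items()}
    best = max(scores, key=scores.get)  # gesture with the maximum score
    return best if scores[best] >= threshold else None

# Hypothetical database of stored 16-point distance vectors.
templates = {
    "takeoff": np.random.rand(16),  # placeholders for recorded gestures
    "land": np.random.rand(16),
}
print(match_gesture(np.random.rand(16), templates))
```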

At the end, the command is given to the hardware, i.e., the drone, to control it; a sketch of sending such a command over the serial link follows. The main objective of this project is to develop an application using a 3D camera, i.e., the Leap sensor, to control a drone. In this paper we implement the code to capture the hand gestures seen by the Leap. This work detects and calculates a total of 16 points of the hand, which is helpful for detecting any gesture.
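The sketch below shows one way the recognized gesture could be forwarded to the microcontroller over the serial link described in the hardware section. It uses the pyserial package; the port name, baud rate, and single-byte command encoding are assumptions for illustration, not the protocol of our build.

```python
# Sketch: forward a recognized gesture over serial using pyserial.
# Port, baud rate, and the one-byte command encoding are assumptions.
import serial

COMMANDS = {"takeoff": b"T", "land": b"L", "left": b"<", "right": b">"}

def send_gesture(port, gesture):
    if gesture in COMMANDS:
        port.write(COMMANDS[gesture])

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    send_gesture(port, "takeoff")
```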

The drone responds to any hand gesture and moves accordingly. The points are determined using the cosine similarity algorithm, and if the gesture matches a stored gesture, the drone is controlled accordingly.

IV. FUTURE SCOPE

The system illustrates finding all the points of the hand to form a gesture and control the drone. Creating gestures and storing them in a database makes it possible to look up each recorded gesture; if it matches the stored database, it is given as output to the hardware, which then moves or controls the drone.

The Leap Motion Controller uses its two monochromatic infrared (IR) cameras and three infrared LEDs to track any hand motion up to a distance of about 1 meter (about 3 feet) directly above it. The hand gestures relayed are converted to linear and angular displacements and stored in a database. The proposed feature vector, which consists of single-finger features and double-finger features, has two main benefits.

V. CONCLUSION AND DISCUSSION

With the help of the Leap Motion Controller, we have been able to move the drone using hand motion. The drone responds to any hand gesture and moves accordingly. The controller forms a hemispherical area above itself, with a radius of about 1 meter, and recognizes any hand motion occurring in that volume.

The hand gestures relayed are converted to linear and angular displacements and stored in an array. Hence, it can be concluded that with the help of the Leap Motion Controller, we can use the drone to perform various tasks such as aerial videography and acrobatic maneuvers, to name a few. This project concludes with detecting all 16 points of the hand and controlling the drone accordingly for further applications.

References:

[1] Ayanava Sarkar, Ketul Arvindbhai Patel, Geet Krishna Capoor, Ganesh Ram R.K., "Gesture Control of Drone Using a Motion Controller", ©2016 IEEE.
[2] Giulio Marin, Fabio Dominio, Pietro Zanuttigh, "Hand Gesture Recognition with the Leap Motion and Kinect Devices", Department of Information Engineering, University of Padova, ICIP 2014, ©2014 IEEE.
[3] Wei Lu, Zheng Tong, Jinghui Chu, "Dynamic Hand Gesture Recognition with Leap Motion Controller", IEEE Signal Processing Letters, vol. 23, no. 9, September 2016.
[4] Bing-Yuh Lu, Chin-Yuan Lin, Shu-Kuang Chang, Yi-Yen Lin, Chun-Hsiang Huang, Hai-Wu Lee, Ying-Pyng Lin, "Bulbs Control in Virtual Reality by Using Leap Motion Somatosensory Controlled Switches", ICACT 2017, February 19-22, 2017.
[5] Kemal ERDOĞAN, Akif DURDU, Nihat YILMAZ, "Intention Recognition Using Leap Motion Controller and Artificial Neural Networks", ©2016 IEEE.
[6] Guanglong Du, Ping Zhang, Xin Liu, "Markerless Human-Manipulator Interface Using Leap Motion With Interval Kalman Filter and Improved Particle Filter", IEEE Transactions on Industrial Informatics, vol. 12, no. 2, April 2016.
