Hindawi Journal of Sensors
Volume 2018, Article ID 6073786, 15 pages
https://doi.org/10.1155/2018/6073786

Research Article
Structural Motion Grammar for Universal Use of Leap Motion: Amusement and Functional Contents Focused

Byungseok Lee, Donghwe Lee, and Seongah Chin
Division of Media Software, Sungkyul University, Anyang City, Republic of Korea
Correspondence should be addressed to Seongah Chin; [email protected]

Received 19 May 2017; Revised 17 November 2017; Accepted 27 November 2017; Published 9 January 2018
Academic Editor: Stefano Stassi

Copyright © 2018 Byungseok Lee et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Motions using the Leap Motion controller are not standardized, even though its use is spreading in media contents. Each content defines its own motions, thereby creating confusion for users. Therefore, to alleviate user inconvenience, this study categorized the motions commonly used in Amusement and Functional Contents and, based on this classification, defined a Structural Motion Grammar that can be used universally. To this end, the Motion Lexicon, a fundamental motion vocabulary, was defined, and an algorithm that enables real-time recognition of the Structural Motion Grammar was developed. Moreover, the proposed method was verified by user evaluation and quantitative comparison tests.

1. Introduction

Interface technology, which supports the interaction between content and users, is continuously being developed. Recently, the technology is transforming into a natural user interface (NUI) method that provides users with a greater sense of reality than the conventional method, which focuses on the use of mouse and keyboard. NUI is an up-to-date means of interacting with computers that has gradually drawn more interest in human-computer interaction (HCI). NUI comprises voice, sensory, touch, and gesture interfaces. Leap Motion is a device supporting a finger gesture interface [1, 2]. The infrared cameras attached to the Leap Motion controller capture and analyse the hand gesture, and the content recognizes the motion. The Leap Motion controller introduces a novel gesture and position tracking system with submillimeter accuracy. Its motion sensing precision is unmatched by any depth camera currently available. It can track all 10 of the human fingers simultaneously. As stated by the manufacturer, the accuracy in the detection of each fingertip position is approximately 0.01 mm, with a frame rate of up to 300 fps.

For these benefits, the Leap Motion controller is widely used in various applications such as games [3], sign languages [4], musical instruments [5], mixed reality [6], and rehabilitation and medical applications [7].

In particular, Leap Motion gesture recognition in Amusement (game) Contents plays a crucial role in keeping the player engrossed in the game. It also increases the immersive sense of the Amusement Content because Leap Motion uses the player's gestures, without any controllers, in real time as the player interacts with the content. Games that use gesture recognition can capture the player's attention easily through the progress of the game [8].

Research on the recognition of Leap Motion has been carried out in technical studies. Some studies on the use of SVM were reported in [9], and studies using HMM were investigated in [10–12]. However, these studies use machine learning, which requires feature extraction, normalization, and time-consuming training procedures.

As described above, we realise that the use of Leap Motion in contents is expanding and that the recognition technology involves cumbersome preprocessing. Although many studies investigated movement recognition through Leap Motion and content application, the authors have not found any literature reporting a standardized motion grammar. This study therefore targets Leap Motion gestures that have been used in games, since game users are inconvenienced by having to learn different motions for each content because every content defines its own motions. A preliminary conference paper is shown in [13].

To this end, this study defined the Motion Lexicon (ML), which can be used universally in Amusement and Functional Contents, and designed the Structural Motion Grammar (SMG) composed of combinations of ML. Then, the tree of SMG was recognized in real time through coupling a motion API, without complex procedures such as the feature extraction and training steps of a machine learning algorithm. The defined motions were then tested for verification.

2. Related Work

Researchers have studied the accuracy and robustness of Leap Motion [14, 15]. Weichert et al. [14] analysed the accuracy and robustness of Leap Motion and applied the research to industrial robots. Guna et al. [15] conducted research on the accuracy and reliability of the static and dynamic movements of Leap Motion.

The Leap Motion's movement recognition has also been investigated [16–21]. Marin et al. [16, 17] conducted research on a multiclass classifier by coupling Leap Motion with a Kinect and depth camera, while Vikram et al. [18] studied the recognition of handwritten characters. Lu et al. [19] proposed the Hidden Conditional Neural Field (HCNF) classifier to recognize moving gestures. Boyali et al. [20] researched robotic wheelchair control, applying block sparse, sparse representative classification. Seixas et al. [21] compared the screen tap and selected gestures of both hands.

The use of Leap Motion for sign language is also being investigated [22–24]. Chuan et al. [22] investigated the recognition of English sign language using the 3D motion sensor, while Khan et al. [23] researched a prototype that can convert sign language to text. Mohandes et al. [24] investigated Arabic sign language recognition.

Researchers also investigated content using Leap Motion [25–27]. The research in [25] evaluated 3D pointing tasks using the Leap Motion sensor to support 3D object manipulation through controlled experiments, including exposing test subjects to pointing tasks and object deformation and measuring the time taken to perform mesh extrusion and object translation. Sutton [26] presented an air painting method using Leap Motion that could serve as input data to the Corel Painter Freestyle application; the painting process through gestures was implemented. A study about sound synthesis and interactive live performance using Leap Motion was also reported [27]. The study implemented a 5-grain granular synthesizer making users trigger individual grains.

There were also studies about various contents and techniques using Leap Motion, aforementioned in the Introduction [3, 6, 7]. Lee et al. [3] studied a game model using Leap Motion that combined gesture-dependent impact levels with the physical characteristics of players. A game was realised in which the gesture was recognized and associated with a gesture-specific weight. Davis et al. [6] proposed work to establish new modalities in interactions for architecture students in mixed reality environments. The menu interface design supported the real-time design of large interior architectural spaces experienced in mixed reality environments. Iosa et al. [7] conducted a study to test the feasibility, the compliance, and the potential efficacy of using a Leap Motion controller-based system to progress the recovery of elderly patients with ischemic stroke.

3. Methods

To accomplish the proposed method, the common motions used in Amusement and Functional Contents were first classified. Based on the classifications, the ML that can be used universally was defined. Then, the SMG was defined through the combination of ML. Also, the recognition step is provided.

Figure 1 shows the overall flow of the proposed method as an example. Leap Motion, which is a form of NUI, enables the free movement of hands and its recognition. To define the ML, the contents were divided into Amusement and Functional Contents, and the representative motions were selected. Then, we defined the Structural Motion Grammar (SMG) composed of combinations of ML. Every motion can be represented in the SMG, which is visualized in a tree structure. Prior to defining the selected motions, the features of the Leap Motion API were analysed to define the ML. The Leap Motion API translates the orders by specifying them from the top to the bottom classes based on the top-down method. When the first condition, identifying static or dynamic movement, is applied, the motion can be classified as static or dynamic. When the second condition, the hand API, is applied, the information on hands can be classified. When the last condition, the finger API, is applied, the information on fingers can be classified. Based on this information, the differentiated motions can be defined and laid out in diverse forms of SMG. More comprehensive gestures will be defined in the following sections.

Figure 1: Schematic of the proposed method. (The schematic shows content classification into Amusement and Functional Content; motion recognition via static/dynamic, left/right hand, and finger distinction; and the Structural Motion Grammar with its ML, AML, CML, and ACML categories, e.g., Go, Stop, Go + Left Direction, Go + Shot, and Go + Left Direction + Shot.)

3.1. Content Classification. To define the universal motions that use Leap Motion, the representative motions need to be extracted for each content classification. The digital contents where Leap Motion is applicable can be classified into Amusement Content and Functional Content based on their purposes. Both types of contents have subgenres, and commonly used motions were extracted through the classification and analysis of the genres.

3.1.1. Amusement Content. Amusement Content is also known as game content. This content can be classified into the following subgenres based on the motions: Action, FPS (First Person Shooting), Simulation-Racing/Flight, Arcade, Sports, and Role-playing. Of the six genres, Sports and Role-playing were excluded because they did not fit the current study. Sports games were not fit for Leap Motion usage because multiple players need to be controlled simultaneously. For Role-playing games, which have a high level of freedom, defining the motion has limitations because the interface and the number of possibilities are very complex and diverse.

To this end, the four genres, namely, FPS, Action, Simulation-Racing/Flight, and Arcade, were analysed and common motions were extracted. Table 1 shows the representative motions by game genres. Within the FPS genre of Amusement Content, "Sudden Attack (Nexon Co.)" is a representative game, and its motions are "Move," "Jump," "Run," "Sit," "Shot," "Reload," and "Weapon Change." In Table 1, the three games (Sudden Attack, King of Fighters, and Cookie Run) are just representative examples of the many games that we looked into to define the common motions.

The motions can be comprehensively categorized into movement and action. In this study, the ML is defined based on the framework that the left hand is the movement while the right hand is the action.

Table 1: Motion category by Amusement Content genres.
Game genre | Motion | Game example
FPS | Move, Jump, Run, Sit, Shot, Reload, Weapon Change | Sudden Attack (Nexon Co.)
Action | Move, Jump, Run, Attack (Skill) | King of Fighters (SNK)
Arcade | Move, Jump, Function | Cookie Run (Devsisters)

3.1.2. Functional Content. Functional Content was classified into Experience and Creation Content and Teaching and Learning Content. With the recent expansion of the virtual reality market, numerous Experience contents and disaster reaction training contents use NUI. A representative example of lecture content is e-Learning, a form of Teaching and Learning Content that provides lecture videos online to overcome the drawbacks of offline education, such as being closed and collective. Table 2 shows the representative motions used by each Functional Content. The VR Museum of Fine Art (Steam VR Co.) is a representative example of Experience and Creation Content, and its motions are "Zoom In," "Zoom Out," "Using Tool," and "Rotation."

Table 2: Motion category by Functional Content genres.
Functional Content | Motion | Example
Experience and Creation | Zoom In, Zoom Out, Using Tool (Drawing, Attaching, Cutting, and so forth), Rotation, and so forth | VR Museum of Fine Art (Steam VR Co.)
Teaching and Learning | Play, Fast Play, Stop, Rewind | ICT in Education (KRCS Group)

3.2. Motion Lexicon. The Motion Lexicon (ML) consists of the motions that have been analysed within the Amusement Content and Functional Content using the hand and finger API. To define the ML, the hand and finger API reflecting the features of the genres have been analysed. Tables 3 and 4 show the defined ML by contents. More specifically, Table 3 defines the motions for both left and right hands to be used for FPS, Action, Simulation-Racing/Flight, Arcade, Sports, and Role-playing games. For Action, Simulation-Racing/Flight, and Arcade games, the left hand was defined for movement and the right hand for action, because both motions occur simultaneously. Table 3 shows the details of each ML, its image, and its motion principle. "Go" for the left hand was denoted by having all fingers straight, whereas "Stop" was denoted by having all fingers folded. When defining the ML, the "Jump" and "Sit" motions were linked to the up and down directions. The "Shot," "Reload," and "Weapon Change" motions for the right hand were also defined by linking them with the actual motion.

For Functional Content, "Zoom In," "Rotation," "Play," "Pause," and "Rewind" were representative motions. Given a very wide range of motions, not all of them can be defined; therefore, motions that were commonly used have been defined.

Table 4 shows the Experience and Creation Content ML and explains each ML, its image, and its motion principle. Here, the ML comprises "Zoom In," "Zoom Out," and "Rotation." For "Zoom In" and "Zoom Out," the Vector3 coordinate was applied to both hands, and the movement on the x-axis was recognized as the distance: "Zoom Out" was defined as having both hands together, and "Zoom In" as having both hands apart. For "Rotation," the horizontal and vertical condition was applied to the hand to identify whether the hand was horizontally positioned. The counterclockwise rotation motion was defined when the left hand moved towards the right side of the x-axis, and the clockwise rotation motion was defined when the right hand moved to the left side of the x-axis.

Table 3: Amusement Content Motion Lexicon.
ML | Motion principle (images omitted)
Left hand:
Go (G) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition
Stop (ST) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition
Left Direction (LD) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) degree condition
Right Direction (RD) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) degree condition
Jump (J) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Sit (S) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Roll (R) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Right hand:
Shot (sh) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition; (iv) degree condition
Reload (r) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition
Weapon Change (ch) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) horizontal/vertical condition
Kick (k) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Punch (p) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Function1 (F1) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition
Function2 (F2) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition
Drift (D) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) horizontal/vertical condition; (iv) number-of-fingers condition
Booster (B) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition

Table 4 shows the ML defined in relation to the Teaching and Learning Content motions. Here, the ML comprises "Play," "Fast Play," "Rewind," and "Pause." The horizontal and vertical condition and the Vector3 coordinate condition were applied to both hands, and "Play" was defined when the distance between the two hands on the x-axis was 0; specifically, this is the same motion as clapping. For "Fast Play" and "Rewind," the movement was the same, but with different hands and directions: for "Fast Play," the left hand was moved to the right side of the x-axis, while for "Rewind," the right hand was moved to the left side of the x-axis. For "Pause," the number-of-fingers condition was applied to identify whether two left fingers are extended.

3.3. Structural Motion Grammar. The Structural Motion Grammar is a combination and grammaticalization of the ML defined above. It consists of ML (Motion Lexicon), AML (Adverb and ML), CML (Compound ML), and ACML (Adverb and Compound ML). Figure 2 is a schematic tree of the classification and coupling of the defined ML.

An ML can be an SMG by itself, such as the "Rotation" motion of the Experience and Creation Content. The SMG is connected directly to the ML, and the process of the "Rotation" motion is identified with arrows within the schematic tree.

AML is a combination of ML and Adverb, where the Adverb is used as a part of speech that supports the ML. For instance, for the left-hand motion responsible for movement, the ML of "Go" is recognized and, at the same time, the SMG of "Right Direction + Go" is expressed through coupling with the Adverb "Right Direction." Within the schematic tree, the SMG leads to AML, which then leads to ML/Adverb. The process of the "Right Direction + Go" motion is identified with arrows on the schematic tree.

CML is used when two types of motions are executed, combining ML and ML. For example, the left hand responsible for movement recognizes the ML of "Go," and at the same time the right hand expresses the "Shot" motion with the integration of ML. On the schematic tree, the SMG leads to CML, which then leads to ML/ML. The process of the "Go + Shot" motion is identified with arrows on the schematic tree.

ACML is a combination of ML, ML, and Adverb and is used when three motions are executed. For instance, the left hand responsible for movement recognizes the ML of "Go" and simultaneously recognizes the Adverb "Left Direction," while the right hand expresses "Shot" with the integration of ML. On the schematic tree, the SMG leads to ACML, which then leads to ML/ML/Adverb. The process of "Left Direction + Go + Shot" is identified with arrows on the schematic tree.

In this study, the vocabulary combinations based on the aforementioned schematic tree have been used to define the SMG. The red dotted arrows indicate the recognition procedures that satisfy the SMG. For example, in Figure 2, "Go + Shot" literally means that a game player wants to make a tank go forward and shoot at the same time; thus, this SMG is classified as a CML broken down into ML (Go) and ML (Shot).

A formal representation of the SMG is a context-free grammar (CFG), since the SMG can be broken down into a set of production rules. The SMG covers all possible motions among the given formal motions. We define the SMG formally as follows:

SMG := AML ∥ CML ∥ ACML ∥ ML
AML := ML + Adverb
CML := ML + ML
ACML := ML + ML + Adverb
ML := G ∥ ST ∥ LD ∥ RD ∥ J ∥ S ∥ R ∥ sh ∥ r ∥ ch ∥ k ∥ p ∥ F1 ∥ F2 ∥ D ∥ B ∥ ZI ∥ ZO ∥ RO ∥ fp ∥ rw ∥ PA
Adverb := LD ∥ RD

3.4. Motion Recognition. Given that an SMG is a combination of ML representing a motion using either one hand or two hands, the SMG is decomposed into its four children (ML, AML, CML, or ACML); then the recognition steps of ML are carried out. Recognition refers to the conditions that can explain the recognizable API on the Leap Motion device and define the motions. Leap Motion, which is a form of NUI, provides various APIs [2].
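The production rules above are simple enough to check mechanically. The following is a minimal Python sketch of an SMG classifier (the paper's recognition module was written in C#; the function name `classify` and the tuple-of-symbols input encoding are assumptions for illustration, not part of the paper's implementation):

```python
# Sketch of the SMG production rules of Section 3.3 (hypothetical encoding):
# a recognized motion arrives as a tuple of ML symbols, where LD/RD act as
# Adverbs whenever they accompany another ML.

ADVERB = {"LD", "RD"}
ML = {"G", "ST", "LD", "RD", "J", "S", "R", "sh", "r", "ch", "k", "p",
      "F1", "F2", "D", "B", "ZI", "ZO", "RO", "fp", "rw", "PA"}

def classify(tokens):
    """Return the SMG category (ML, AML, CML, or ACML) of a token tuple,
    or None if the tuple matches no production rule."""
    if any(t not in ML for t in tokens):
        return None
    adverbs = [t for t in tokens if t in ADVERB]
    lexemes = [t for t in tokens if t not in ADVERB]
    if len(tokens) == 1:
        return "ML"      # e.g. ("RO",): Rotation alone is an SMG by itself
    if len(tokens) == 2 and len(adverbs) == 1:
        return "AML"     # ML + Adverb, e.g. Right Direction + Go
    if len(tokens) == 2 and len(lexemes) == 2:
        return "CML"     # ML + ML, e.g. Go + Shot
    if len(tokens) == 3 and len(adverbs) == 1 and len(lexemes) == 2:
        return "ACML"    # ML + ML + Adverb, e.g. Left Direction + Go + Shot
    return None

print(classify(("RD", "G")))        # AML
print(classify(("G", "sh")))        # CML
print(classify(("LD", "G", "sh")))  # ACML
```

Because the grammar is context-free with a fixed depth of three, category membership reduces to counting Adverbs and plain lexemes, which is why no training or feature extraction step is needed.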
Among the numerous APIs, most of the contents on the market use the hand and finger API. These contents receive their data from the upper-most frame, where the hand is recognized to track and collect information. The hand API that has received the data can recognize the existence of the left or right hand and distinguish the left from the right hand. In addition, the API can identify the speed, location, and degree of the hand. The finger API can distinguish each finger and identify the speed, location, and degree of the fingers. While the data on speed and location are continuously updated, the former data is compared with the current data by tracking the hands and fingers. These comparison results help distinguish whether an ML is dynamic or static.

Table 4: Functional Content Motion Lexicon.
Functional Content | ML | Motion principle (images omitted)
Experience and Creation:
Zoom In (ZI) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Zoom Out (ZO) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Rotation (RO) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) horizontal/vertical condition; (iv) Vector3 coordinate condition
Teaching and Learning:
Play (p) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) horizontal/vertical condition; (iv) Vector3 coordinate condition
Fast Play (fp) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Rewind (rw) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) Vector3 coordinate condition
Pause (PA) | (i) dynamic/static classification condition; (ii) left/right hand classification condition; (iii) number-of-fingers condition

Figure 2: Structure of the motion grammar tree. (The tree decomposes the SMG into AML, e.g., Right Direction + Go (RD + G); CML, e.g., Go + Shot (G + sh); ACML, e.g., Left Direction + Go + Shot (LD + G + sh); and single ML, e.g., Rotation (RO).)

The algorithms mr_SMG (motion recognition of SMG), mr_ML and mr_ML_Adverb (motion recognition of ML or ML_Adverb), HC (Hand_Count), HF (Hand_Feature), FC (Finger_Count), and FF (Finger_Feature) are defined as shown in Figure 3. Supposing that the shooting motion has been defined within the FPS content, the first step is to apply the dynamic and static classification conditions to the shooting motion. Then, using the data of the hand API, the classification conditions for the left and right hands are applied. Finally, using the data of the finger API, the conditions on the number of fingers of the right hand, as well as the degrees of the fingers, are applied. When two fingers of the right hand are used, the API identifies whether the fingers are the thumb and forefinger and distinguishes the x-z axis degree of the thumb. The shooting motion is recognized only when all of the aforementioned conditions are met. Given that the shooting motion has been defined only for the right hand, the direction and movement motions are defined for the left hand, enabling the use of both hands for manipulation.

Algorithm mr_SMG (motion recognition of SMG)
  Input: Motion Lexicon (ML), Adverb ML (AML), Compound ML (CML), Adverb Compound ML (ACML), Structural Motion Grammar (SMG)
  Output: SMG recognition
  switch (SMG)
    case ML:   call mr_ML()
    case AML:  call mr_ML() + mr_ML_Adverb()
    case CML:  call mr_ML() + mr_ML()
    case ACML: call mr_ML_Adverb() + mr_ML() + mr_ML()
  end switch

Algorithm mr_ML / mr_ML_Adverb (motion recognition of ML or ML_Adverb)
  Input: Motion Lexicon (ML)
  Output: ML recognition
  for each ML do
    call HC(); call HF(); call FC(); call FF()
  end for

Algorithm HC (Hand_Count)
  Input: Left Hand (LH), Right Hand (RH)
  Output: the number of hands (NH)
  if one hand is present then
    use either LH or RH; NH = 1
  else
    use both LH and RH; NH = 2
  end if

Algorithm HF (Hand_Feature)
  Input: NH, LH, RH, H
  Output: hand feature vector
  for each of the NH hands do
    use the direction, speed, location, and normal vector of H
  end for

Algorithm FC (Finger_Count)
  Input: Finger (F)
  Output: the number of fingers (NF)
  for each F do
    increase the number of fingers (NF)
  end for

Algorithm FF (Finger_Feature)
  Input: NF, F
  Output: finger feature vector
  for each of the NF fingers do
    use the direction, touch, speed, and location of F
  end for

Figure 3: Motion recognition algorithm.
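As a concrete illustration of this top-down cascade, the following Python sketch applies the four conditions for the "Shot" (sh) lexeme in order. The paper's module was written in C# against the Leap Motion API; the data classes, field names, and thresholds below are illustrative assumptions, not the SDK's actual types.

```python
# Hedged sketch of the 'Shot' recognition cascade of Section 3.4:
# dynamic/static -> left/right hand -> finger count -> thumb/forefinger degree.
# All field names and thresholds are assumed for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Finger:
    name: str          # e.g. "thumb", "index"
    extended: bool
    xz_degree: float   # finger angle in the x-z plane (assumed units)

@dataclass
class Hand:
    is_right: bool
    palm_speed: float  # palm speed in mm/s (assumed)
    fingers: List[Finger] = field(default_factory=list)

STATIC_SPEED = 50.0    # below this the hand counts as static (assumed)
THUMB_DEGREE = 60.0    # minimum thumb angle for a raised hammer (assumed)

def recognize_shot(hand: Hand) -> bool:
    # (1) Dynamic/static classification: 'Shot' is treated as a static gesture.
    if hand.palm_speed > STATIC_SPEED:
        return False
    # (2) Left/right hand classification: 'Shot' is defined for the right hand only.
    if not hand.is_right:
        return False
    # (3) Number-of-fingers condition: exactly two extended fingers.
    extended = [f for f in hand.fingers if f.extended]
    if len(extended) != 2:
        return False
    # (4) Degree condition: the two fingers must be thumb and forefinger,
    # with the thumb raised in the x-z plane.
    if {f.name for f in extended} != {"thumb", "index"}:
        return False
    thumb = next(f for f in extended if f.name == "thumb")
    return thumb.xz_degree >= THUMB_DEGREE

gun = Hand(is_right=True, palm_speed=12.0, fingers=[
    Finger("thumb", True, 75.0), Finger("index", True, 5.0),
    Finger("middle", False, 0.0)])
print(recognize_shot(gun))  # True
```

Because the conditions are checked from the coarsest (static/dynamic) to the finest (per-finger degree), most non-matching frames are rejected after one or two comparisons, which is what allows the recognizer to run per frame without a trained model.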
4. Experiments

4.1. SMG Recognition Rate Comparison Test. The following experimental environment was set up to evaluate the SMG suggested in this study. The desktop used for the simulation ran the Windows 7 64-bit OS, with a GeForce GTX 770 as the graphics card for the hardware. For the software, Unity 5.3.1f1 was installed, and Leap Motion was established. The motion recognition module was developed using C#.

For the test method, the Amusement and Functional Contents motions defined in this study and established into a grammar (ours) were compared with the Leap Motion SVM [28] method through a quantitative evaluation. Each motion was tested 20 times as the number of inputs. The inputs are composed of the features of each gesture that