PERCEPTION, LEARNING AND USE OF TOOL AFFORDANCES ON HUMANOID ROBOTS

A thesis submitted to the Department of Computer Engineering and the Graduate School of Engineering and Science of Bilkent University in partial fulfillment of the requirements for the degree of Master of Science

By Yiğit Çalışkan
May, 2013

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.
Assist. Prof. Dr. Pınar Duygulu Şahin (Advisor)

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.
Assoc. Prof. Dr. Erol Şahin (Co-Advisor)

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.
Assist. Prof. Dr. Sinan Kalkan

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.
Assist. Prof. Dr. Öznur Taştan

Approved for the Graduate School of Engineering and Science:
Prof. Dr. Levent Onural
Director of the Graduate School

ABSTRACT

PERCEPTION, LEARNING AND USE OF TOOL AFFORDANCES ON HUMANOID ROBOTS

Yiğit Çalışkan
M.S. in Computer Engineering
Supervisor: Assist. Prof. Dr. Pınar Duygulu Şahin
Co-Supervisor: Assoc. Prof. Dr. Erol Şahin
May, 2013

Humans and some animals use different tools for different aims, such as extending reach, amplifying mechanical force, creating or augmenting the signal value of social displays, camouflage, bodily comfort, and the effective control of fluids. In robotics, tools are mostly used to extend a robot's reach. Toward this aim, the question "Which kind of tool is better in which situation?" is central, and it is this question that makes the concept of affordance important: different tools afford different capabilities depending on the target object. To learn tool affordances, a robot should experience the effects of applying behaviors to different objects. In this study, our goal is to teach the humanoid robot iCub the affordances of tools by having it apply different behaviors to a variety of objects and observe the effects of these interactions. Using the eye camera and a Kinect, tool and object features are obtained for each interaction to construct the training data. The success of a behavior depends on the tool features, the position and properties of the object, and the hand with which the robot uses the tool. After each behavior is trained, the robot successfully predicts the effects of different behaviors and infers the affordances when it is given a tool and shown an object. When an affordance is requested, the robot can apply the appropriate behavior given a tool and an object, and it can select the best tool among several when a specific affordance is requested and an object is shown. This study also demonstrates how different positions and properties of objects affect the affordance and behavior results, and how these results change when a part of a tool is removed or modified, or a new part is added.

Keywords: Affordance, Tool Affordance, Humanoid Robot, Tool Use.
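The prediction-and-selection pipeline the abstract describes (per-interaction feature vectors combining tool features, object features, and the hand used; one model trained per behavior; predictions used both to infer affordances and to pick the best tool) can be illustrated with a brief sketch. This is a minimal illustration under assumptions of our own: features are taken to be fixed-length numeric vectors, a generic scikit-learn SVM stands in for the classifier, and the names make_sample, train_behavior_models, predict_effects, and select_tool are hypothetical, not taken from the thesis.

# Minimal sketch: one effect classifier per behavior (illustrative only;
# the feature layout and the SVM choice are assumptions, not the thesis' method).
import numpy as np
from sklearn.svm import SVC

def make_sample(tool_feats, obj_feats, hand):
    # Concatenate tool features, object features, and the hand used (0=left, 1=right).
    return np.concatenate([tool_feats, obj_feats, [hand]])

def train_behavior_models(interactions):
    # interactions: {behavior: [(tool_feats, obj_feats, hand, effect_label), ...]}
    # Returns one classifier per behavior, each predicting the observed effect.
    models = {}
    for behavior, records in interactions.items():
        X = np.array([make_sample(t, o, h) for t, o, h, _ in records])
        y = np.array([e for _, _, _, e in records])
        models[behavior] = SVC(probability=True).fit(X, y)
    return models

def predict_effects(models, tool_feats, obj_feats, hand):
    # Predicted effect of every behavior for one tool/object/hand combination;
    # the behaviors whose predicted effect matches a desired one give the affordances.
    x = make_sample(tool_feats, obj_feats, hand).reshape(1, -1)
    return {b: m.predict(x)[0] for b, m in models.items()}

def select_tool(models, behavior, desired_effect, candidate_tools, obj_feats, hand):
    # Rank candidate tools by the predicted probability of the desired effect.
    def score(tool_feats):
        m = models[behavior]
        x = make_sample(tool_feats, obj_feats, hand).reshape(1, -1)
        return m.predict_proba(x)[0, list(m.classes_).index(desired_effect)]
    return max(candidate_tools, key=score)

Training one model per behavior mirrors the setup the abstract describes, where each behavior's success is learned separately from the tool features, the object's position and properties, and the hand used.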
ÖZET

THE USE, PERCEPTION AND LEARNING OF THE CONCEPT OF TOOL AFFORDANCE ON HUMANOID ROBOTS

Yiğit Çalışkan
M.S. in Computer Engineering
Supervisor: Assist. Prof. Dr. Pınar Duygulu Şahin
Co-Supervisor: Assoc. Prof. Dr. Erol Şahin
May, 2013

Humans and some animals use tools for purposes such as extending their reach, amplifying their mechanical force, creating and increasing social value, camouflage, bodily comfort, and the control of fluids. In robotics, tool use mostly serves to extend the robot's reach. Toward this aim, the question "Which tool suits which situation?" is very important; the significance of the concept of affordance emerges with this question, because different tools can have different capabilities on target objects. To learn tool affordances, robots must try different behaviors on different objects and observe the resulting effects. The aim of this study is to teach the humanoid robot iCub tool affordances by having it apply different behaviors to objects and observe the resulting effects. Using the eye camera and a Kinect, tool and object features are obtained for each interaction to construct the training data. The success of a behavior depends on the tool features, the position and properties of the object, and the hand the robot uses. After the behavior training, given a tool and an object, the robot can predict the effects of different behaviors and infer the affordances. When an affordance is requested, the robot can apply the appropriate behavior according to the tool and object given to it; when an object is shown, the robot can choose the most suitable tool among different tools. This study also shows how the different positions and properties of objects affect the affordance and behavior results, and how the affordances and behavior results are affected when any part of a tool is removed, changed, or a new part is added.

Keywords: Affordance, Tool Affordance, Humanoid Robot, Tool Use.

Acknowledgement

First of all, I would like to thank my advisor Pınar Duygulu Şahin and my co-advisor Erol Şahin for giving me the chance to work with them and for providing a great workplace full of brilliant people. Thank you for supporting and guiding me through this process. I also want to thank Sinan Kalkan for his guidance and ideas in hard times. I would like to thank Hande for always being open to helping even though she has lots of work to do; she really helped me a lot in the last stage of my thesis. Thanks to Güner for always driving me crazy while working; this work would have finished earlier if Güner had not been there, but it would have finished in worse quality without Güner's countless helps. Thanks for all your help. Thanks to Sertaç for always creating trouble between me and Güner; good luck with Güner after I leave the laboratory, and thanks for your support and for listening to my complaints anytime. (Yes, you will be able to take my computer soon.) Thanks to Fatih for being a good companion at night; I am sure you will be one of the best faculty members at SDÜ. Thanks to Asil for being a great companion. Thanks to Kadir for his help in urgent situations. You all always supported me and were great friends; I will miss working with you all. And thanks to all the other Kovan members: Doruk, İlkay, Gaye, Akif, Buğra, ... I also would like to thank Mustafa, my frisbee partner and a great friend, who is now in India. It would have been better if you had not left Turkey; I hope you find what you want there.
I also want to thank all my other friends who supported me, especially Saygın. My biggest thanks go to my family. They have always supported me at this stage of my life, and I know they will continue their support afterwards. I owe them a lot. I could not have done this without the sandwiches my mom prepared for the dinners at the laboratory.

I would like to acknowledge the support of TÜBİTAK (The Scientific and Technological Research Council of Turkey); this thesis was supported by TÜBİTAK under project 109E033.

Contents

1 Introduction
  1.1 Tool Use in Animals
  1.2 Tool Use in Early Stages of Humans
  1.3 Affordances
  1.4 Tool Affordances in Robots
  1.5 Contribution of the Thesis
2 Related Studies
  2.1 Studies on Tool Use
  2.2 Studies on Functionalities of the Tools
  2.3 Studies on Tool Affordances
3 Experimental Setup
  3.1 Reference Frame of the Robot
  3.2 Available Space for Hands of iCub
  3.3 Perception Hardware
    3.3.1 Visualeyez 3D Motion Tracking System
    3.3.2 iCub Eye Camera
    3.3.3 Kinect RGB-D Camera
  3.4 Construction of Tools
  3.5 Software Development Tools, Platforms and Libraries
4 Perception
  4.1 Tool Perception
    4.1.1 Tool Perception From Point Clouds
    4.1.2 Tool Perception From Visual Images
  4.2 Object Perception
5 Behaviors
  5.1 Behaviors Without Tools
  5.2 Behaviors With Tools
    5.2.1 Effects of Tool Behaviors
    5.2.2 Tool Behaviors
6 Training of the System
  6.1 Dataset
    6.1.1 Tools and Objects Used For Training
  6.2 Training Phase and Results
    6.2.1 Dataset Construction
    6.2.2 Combination of Tool and Object Features
    6.2.3 Feature Selection and Feature Elimination on Combined Dataset
    6.2.4 Performance of Each Behavior After Feature Selection and Elimination
    6.2.5 Analysis of Each Behavior
7 Experiments
  7.1 Novel Tools
  7.2 A Specific Affordance is Requested
  7.3 All Affordances are Requested
  7.4 Similarity Between Tools
  7.5 Demonstration on iCub
8 Conclusion
  8.1 Limitations of the System
  8.2 Advantages of the System
A Algorithms and Techniques
  A.1 Details of the Skeletonization
    A.1.1 Conditions for Removing a Contour Point
  A.2 ReliefF

List of Figures

1.1 Developmental Stages of Tool Use (taken from Guerin et al. [1])
1.2 Affordances between different subjects (image taken from http://www.macs-eu.org/images/affordance-animals.jpg)
1.3 Visualization of Affordance Formalization
1.4 Pull Back an Object
1.5 Push Forward a Far Object
3.1 The iCub Humanoid Robot
3.2 Sensors in the Palm
3.3 Reference Frame of the iCub
3.4 Available Space for Both Left and Right Hands on the Table
3.5 Visualeyez 3D Motion Tracking System
3.6 Kinect RGB-D Camera
4.1 Tool Perception Steps Using Point Cloud From the Kinect
4.2 First Steps of the Tool Processing with Kinect
4.3 Sample Hold Regions and Parts of the Tools
4.4 Kinect Tool Features
4.5 Tool Perception Steps Using Visual Image From the Eye Camera
4.6 Processing Steps Before the Skeletonization
4.7 Skeletonized Tools
4.8 Red Dots Denote Middle Points, Blue Dots Denote End Points
4.9 Parts of the Tools
4.10 Eye Tool Features
4.11 Object Perception Steps Using Point Cloud From the Kinect
4.12 Segmented Point Clouds From Sample Objects
4.13 Features Obtained From Segmented Object From Kinect Range Data
5.1 Tuck Arms Position
5.2 Grasping the Tool
5.3 Behavior Categories, Tool Behaviors and Their Possible Effects
5.4 Push Left From Right
5.5 Push Left From Top
5.6 Push Forward Using Main Part
5.7 Push Forward Using Left Part
5.8 Pull Backward From Top Using Right Part
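Appendix A.2 above lists ReliefF, the feature-weighting method applied during feature selection on the combined dataset (Section 6.2.3). As a rough illustration of the idea, here is a minimal sketch of a standard ReliefF weight update. It follows the common multi-class formulation (k nearest hits, plus prior-weighted nearest misses from each other class) and is an assumption of ours, not necessarily the exact variant or implementation used in the thesis.

# Minimal ReliefF sketch (illustrative only; assumes numeric features and
# that every class has more than k members).
import numpy as np

def relieff(X, y, m=100, k=5, seed=None):
    # Returns one relevance weight per feature; higher means more relevant.
    rng = np.random.default_rng(seed)
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)   # per-feature range for normalization
    span[span == 0] = 1.0                  # guard against constant features
    classes, counts = np.unique(y, return_counts=True)
    prior = dict(zip(classes, counts / n))
    W = np.zeros(d)
    for idx in rng.choice(n, size=m):      # m randomly sampled instances
        R, cls = X[idx], y[idx]
        diffs = np.abs(X - R) / span       # normalized per-feature differences
        dist = diffs.sum(axis=1)           # Manhattan distance to every instance
        dist[idx] = np.inf                 # never pick the instance itself
        # k nearest hits (same class): near hits with large diffs penalize a feature.
        hits = np.argsort(np.where(y == cls, dist, np.inf))[:k]
        W -= diffs[hits].sum(axis=0) / (m * k)
        # k nearest misses from each other class, weighted by that class's prior:
        # features that separate R from nearby misses are rewarded.
        for c in classes:
            if c == cls:
                continue
            misses = np.argsort(np.where(y == c, dist, np.inf))[:k]
            W += (prior[c] / (1 - prior[cls])) * diffs[misses].sum(axis=0) / (m * k)
    return W

Feature selection then amounts to keeping the highest-weighted features (e.g., the top-ranked indices of W), which is the role the chapter outline suggests ReliefF plays in Section 6.2.3.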
