Model-free Approaches to Robotic Manipulation Via Tactile Perception and Tension-driven Control

Author: Kenneth Gutierrez

Published: 2021

Total Pages: 119

To execute manipulation tasks in unstructured environments, robots use computer vision and a priori information to locate and grasp objects of interest. However, once an object has been grasped, cameras cannot perceive tactile- or force-based information about finger-object interactions. To address this, tactile and proprioceptive data are used to develop novel methodologies that aid in robotic manipulation after an object has been grasped. In the first study, a method was developed for the perception of tactile directionality using convolutional neural networks (CNNs). The deformation of a tactile sensor is used to perceive the direction of a tangential stimulus acting on the fingerpad. A primary CNN estimates the direction of perturbations applied to a grasped object, and a secondary CNN provides a measure of uncertainty in the form of confidence intervals. Our CNN models perceived tactile directionality on par with humans, outperformed a state-of-the-art force-estimator network, and were demonstrated in real time. In the second study, novel controllers were developed for model-free, tension-driven manipulation of deformable linear objects (DLOs) using force-based data. Prior works on DLO manipulation have focused on geometric or topological state and used complex modeling and computer vision approaches. In tasks such as wrapping a DLO around a structure, DLO tension must be carefully controlled, and such tension control cannot be achieved using vision alone once the DLO becomes taut. Two controllers were designed to regulate the tension of a DLO and to precede traditional motion controllers; they can be used for tasks in which maintaining DLO tension takes priority over exact DLO configuration. We evaluated and demonstrated the controllers in real time on physical robots for two utilitarian tasks: circular wrapping around a horizontal post and figure-eight wrapping around a boat cleat. In summary, methods were developed to effectively manipulate objects using tactile- and force-based information. The model-free nature of the approaches allows the techniques to be used without exact knowledge of object properties. Our methods, which leverage tactile sensation and proprioception for object manipulation, can serve as a foundation for further enhancement with complementary sensory feedback such as computer vision.
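
As a rough illustration of the two-network idea described in the first study (one estimate of tangential direction plus a separate measure of uncertainty), the sketch below folds both roles into a single small network with two output heads. The input shape, layer sizes, and Gaussian-style confidence interval are assumptions made for illustration, not the thesis's actual architecture.

```python
# Illustrative sketch only: a small CNN that regresses the direction of a
# tangential stimulus from a tactile image, plus a second head that emits a
# spread parameter usable for confidence intervals. Input shape, layer sizes,
# and the Gaussian-style uncertainty scheme are assumptions, not the thesis model.
import torch
import torch.nn as nn

class DirectionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
        )
        self.head_dir = nn.Linear(32 * 8 * 8, 2)     # (sin, cos) of direction
        self.head_logvar = nn.Linear(32 * 8 * 8, 1)  # log-variance for intervals

    def forward(self, x):
        z = self.features(x)
        return torch.tanh(self.head_dir(z)), self.head_logvar(z)

model = DirectionCNN()
tactile = torch.randn(4, 1, 32, 32)              # batch of 32x32 tactile images
sincos, logvar = model(tactile)
angle = torch.atan2(sincos[:, 0], sincos[:, 1])  # direction estimate, radians
ci_halfwidth = 1.96 * torch.exp(0.5 * logvar).squeeze(1)  # ~95% interval (assumed Gaussian)
```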


Data-driven Robotic Manipulation of Deformable Objects Using Tactile Feedback

Author: Yi Zheng

Published: 2023

Perceiving and manipulating deformable objects with the sense of touch are essential skills in everyday life. However, it remains difficult for robots to autonomously manipulate deformable objects using tactile sensing because of numerous perception, modeling, planning, and control challenges. We believe this is partially due to two fundamental challenges: (1) establishing a physics-based model describing the physical interactions between deformable tactile sensors and deformable objects is difficult; and (2) modern tactile sensors provide high-dimensional data, which is beneficial for perception but impedes the development of practical planning and control strategies. To address these challenges, we developed systematic frameworks for the tactile-driven manipulation of deformable objects that integrate state-of-the-art tactile sensing with well-established tools used by other robotics communities. In Study #1, we showed how a robot can learn to manipulate a deformable, thin-shell object via tactile sensor feedback using model-free reinforcement learning methods. A page-flipping task was learned on a real robot using a two-stage approach. First, we learned nominal page-flipping trajectories by constructing a reward function that quantifies functional task performance from the perspective of tactile sensing. Second, we learned adapted trajectories using tactile-driven perceptual coupling, with the intuitive assumption that, while the functional page-flipping trajectories for different task contexts (page sizes) might differ, similar tactile feedback should be expected. In Study #2, we showed how a robot can use tactile sensor feedback to control the pose and tension of a deformable linear object (an elastic cable). For a cable manipulation task, low-dimensional latent-space features were extracted from high-dimensional raw tactile sensor data using unsupervised learning methods, and a dynamics model was constructed in the latent space using supervised learning methods. The dynamics model was integrated with an optimization-based model predictive controller for end-to-end, tactile-driven motion planning and control on a real robot. In summary, we developed frameworks for the tactile-driven manipulation of deformable objects that either circumvent sensor-modeling difficulties or construct a dynamics model directly from tactile feedback and use it for planning and control. This work provides a foundation for the further development of systematic frameworks that can address complex, tactile-driven manipulation problems.
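
The Study #2 pipeline shape (unsupervised latent features, a supervised latent dynamics model, and optimization-based MPC) can be sketched as below. PCA, ridge regression, and random-shooting MPC are stand-ins chosen for brevity; the thesis's actual learning methods, cost function, and controller are not specified here.

```python
# Hedged sketch: compress raw tactile frames to a low-dimensional latent state,
# fit a one-step dynamics model there, and plan with sampling-based MPC.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
frames = rng.standard_normal((500, 1024))   # placeholder raw tactile data (flattened)
actions = rng.standard_normal((500, 3))     # placeholder end-effector commands

pca = PCA(n_components=8).fit(frames)       # unsupervised latent features
Z = pca.transform(frames)

# Supervised one-step latent dynamics: z_{t+1} ~ f([z_t, u_t])
X = np.hstack([Z[:-1], actions[:-1]])
dyn = Ridge(alpha=1.0).fit(X, Z[1:])

def mpc_action(z, z_goal, horizon=5, samples=256):
    """Random-shooting MPC in latent space: return the first action of the
    sampled sequence whose rollout ends closest to the goal latent state."""
    best_u, best_cost = None, np.inf
    for _ in range(samples):
        u_seq = rng.standard_normal((horizon, 3))
        z_t = z
        for u in u_seq:
            z_t = dyn.predict(np.hstack([z_t, u])[None])[0]
        cost = np.linalg.norm(z_t - z_goal)
        if cost < best_cost:
            best_cost, best_u = cost, u_seq[0]
    return best_u

u0 = mpc_action(Z[0], Z[100])
```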


Tactile Sensing, Skill Learning, and Robotic Dexterous Manipulation

Author: Qiang Li

Publisher: Academic Press

Published: 2022-04-02

Total Pages: 374

ISBN-13: 0323904173

Tactile Sensing, Skill Learning and Robotic Dexterous Manipulation focuses on cross-disciplinary research and groundbreaking ideas along three lines: tactile sensing, skill learning, and dexterous control. The book introduces recent work on human dexterous-skill representation and learning, along with discussions of tactile sensing and its application to recognizing and reconstructing the properties of unknown objects. Sections also introduce adaptive control schemes and their learning by imitation and exploration. Other chapters describe the fundamentals of the relevant research, paying attention to the connections among different fields and showing the state of the art in related branches. The book summarizes the different approaches and discusses the pros and cons of each. Chapters not only describe the research but also include the basic knowledge needed to understand the proposed work, making the book an excellent resource for researchers and professionals who work in the robotics industry, haptics, and machine learning.

- Provides a review of tactile perception and the latest advances in the use of robotic dexterous manipulation
- Presents the most detailed work on synthesizing intelligent tactile perception, skill learning, and adaptive control
- Introduces recent work on human dexterous-skill representation and learning, and on adaptive control schemes learned by imitation and exploration
- Reveals and illustrates how robots can improve dexterity through modern tactile sensing, interactive perception, learning, and adaptive control approaches


Robotic Tactile Perception and Understanding

Author: Huaping Liu

Publisher: Springer

Published: 2018-03-20

Total Pages: 220

ISBN-13: 9811061718

This book introduces the challenges of robotic tactile perception and task understanding, and describes an advanced approach based on machine learning and sparse-coding techniques. A set of structured sparse-coding models is developed to address the issues of dynamic tactile sensing. The book then shows that the proposed framework is effective in solving the problems of multi-finger tactile object recognition, multi-label tactile adjective recognition, and multi-category material analysis, all of which are challenging practical problems in robotics and automation. The proposed sparse-coding model can also be used to tackle the challenging visual-tactile fusion recognition problem, and the book develops a series of efficient optimization algorithms to implement the model. It is suitable as a reference book for graduate students with a basic knowledge of machine learning, as well as for professional researchers interested in robotic tactile perception, understanding, and machine learning.
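
For readers unfamiliar with the core mechanism, here is a minimal sparse-coding pipeline in the spirit of the book: learn a dictionary over tactile frames, encode each frame as a sparse code, and classify the codes. sklearn's DictionaryLearning and a linear SVM stand in for the structured sparse-coding models and recognition tasks the book actually develops.

```python
# Minimal sparse-coding sketch: dictionary learning over tactile frames,
# sparse codes as features, and a linear classifier on the codes.
# All data below is synthetic placeholder data.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
tactile = rng.standard_normal((200, 64))   # placeholder flattened tactile frames
labels = rng.integers(0, 3, size=200)      # placeholder object classes

dico = DictionaryLearning(n_components=32, transform_algorithm="lasso_lars",
                          transform_alpha=0.1, random_state=1).fit(tactile)
codes = dico.transform(tactile)            # sparse code per frame

clf = LinearSVC().fit(codes, labels)       # recognize objects from sparse codes
print(clf.score(codes, labels))
```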


Mobile Manipulation in Unstructured Environments with Haptic Sensing and Compliant Joints

Author: Advait Jain

Published: 2012

We make two main contributions in this thesis. First, we present our approach to robot manipulation, which emphasizes the benefits of making contact with the world across all the surfaces of a manipulator using whole-arm tactile sensing and compliant actuation at the joints. In contrast, many current approaches to mobile manipulation treat most contact as a failure of the system, restrict contact to well-modeled end effectors, and use stiff, precise control to avoid contact.

We develop a controller that enables robots with whole-arm tactile sensing and compliant joints to reach locations in high clutter while regulating contact forces. We assume that low contact forces are benign, and our controller places no penalty on contact forces below a threshold. The controller requires only haptic sensing, handles multiple contacts across the surface of the manipulator, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one and a linear quasi-static mechanical model that it constructs at each time step.

We show that our controller enables both real and simulated robots to reach goal locations in high clutter with low contact forces. While doing so, the robots bend, compress, slide, and pivot around objects. To enable experiments on real robots, we also developed an inexpensive, flexible, and stretchable tactile sensor and covered large surfaces of two robot arms with these sensors. With an informal experiment, we show that our controller and sensor have the potential to enable robots to manipulate in close proximity to, and in contact with, humans while keeping contact forces low.

Second, we present an approach that gives robots common sense about everyday forces in the form of probabilistic, data-driven, object-centric models of haptic interactions. These models can be shared by different robots for improved manipulation performance. We use pulling open doors, an important task for service robots, as an example to demonstrate our approach.

Specifically, we capture and model the statistics of forces while pulling open doors and drawers. Using a portable custom force and motion capture system, we create a database of forces as human operators pull open doors and drawers in six homes and one office. We then build data-driven models of the expected forces while opening a mechanism, given knowledge of either its class (e.g., refrigerator) or the mechanism's identity (e.g., a particular cabinet in Advait's kitchen). We demonstrate that these models enable robots to detect anomalous conditions, such as a locked door or a collision between the door and the environment, faster and with lower excess force applied to the door than methods that do not use a database of forces.
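
The first contribution's controller (model predictive control with a horizon of one and a per-step linear quasi-static model, penalizing only contact forces above a benign threshold) can be sketched roughly as follows. The Jacobian, contact linearizations, and penalty form below are all illustrative assumptions, not the thesis's formulation.

```python
# One-step sketch: at each time step, build a local linear quasi-static model
# of how joint motion changes contact forces, then choose the step that moves
# the end effector toward the goal while penalizing only forces above a
# benign threshold. Model matrices and penalty form are assumed.
import numpy as np
from scipy.optimize import minimize

def one_step_controller(J_ee, x_err, contacts, f_thresh=2.0, weight=100.0):
    """J_ee: end-effector Jacobian (3 x n); x_err: goal minus current position;
    contacts: list of (force_now, stiffness_row) linearizing each contact."""
    n = J_ee.shape[1]

    def cost(dq):
        c = np.linalg.norm(J_ee @ dq - x_err) ** 2     # progress toward goal
        for f_now, k_row in contacts:                  # predicted contact forces
            f_pred = f_now + k_row @ dq
            c += weight * max(0.0, f_pred - f_thresh) ** 2  # no penalty below threshold
        return c

    return minimize(cost, np.zeros(n), method="Nelder-Mead").x

dq = one_step_controller(np.eye(3), np.array([0.01, 0.0, 0.0]),
                         contacts=[(1.5, np.array([50.0, 0.0, 0.0]))])
```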


Visuo-tactile Perception for Dexterous Robotic Manipulation

Author: Maria Bauza Villalonga

Published: 2022

In this thesis, we develop visuo-tactile perception to enable general and precise robotic manipulation. In particular, we study how to effectively process visual and tactile information to allow robots to expand their capabilities while remaining accurate and reliable. We begin by developing tools for tactile perception. For the task of grasping, we use tactile observations to assess and improve grasp stability. Tactile information also allows extracting geometric information from contacts, a task-independent feature. By learning to map tactile observations to contact shapes, we show that robots can reconstruct accurate 3D models of objects, which can later be used for pose estimation. We build on the idea of using geometric information from contacts by developing tools that accurately render contact geometry in simulation. This enables a probabilistic approach to pose estimation for novel objects based on matching real visuo-tactile observations to a set of simulated ones. As a result, our method does not rely on real data and yields accurate pose distributions. Finally, we demonstrate how this approach to perception enables precise manipulation. In particular, we consider the task of precise pick-and-place of novel objects. Combining perception with task-aware planning, we build a robotic system that identifies in simulation which object grasps will facilitate grasping, planning, and perception, and selects the best one during execution. Our approach adapts to new objects by learning object-dependent models purely in simulation, allowing a robot to manipulate new objects successfully and perform highly accurate placements.
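
The matching step described above (turning similarities between a real visuo-tactile observation and a set of simulated ones into a pose distribution) might look like the following sketch. The L2 similarity score and softmax temperature are illustrative choices, not the thesis's actual metric.

```python
# Sketch: score simulated contact observations (one per candidate pose)
# against the real observation and normalize into P(pose | observation).
import numpy as np

def pose_posterior(real_obs, sim_obs_by_pose, temperature=0.1):
    """real_obs: observed contact patch (H x W); sim_obs_by_pose: dict mapping
    candidate pose -> simulated contact patch. Returns a pose distribution."""
    poses = list(sim_obs_by_pose)
    scores = np.array([-np.linalg.norm(real_obs - sim_obs_by_pose[p])
                       for p in poses])           # higher = better match
    logits = scores / temperature
    probs = np.exp(logits - logits.max())         # numerically stable softmax
    probs /= probs.sum()
    return dict(zip(poses, probs))

rng = np.random.default_rng(2)
real = rng.random((16, 16))                       # placeholder observation
sims = {(0.0, 0.0, ang): rng.random((16, 16)) for ang in (0.0, 0.5, 1.0)}
print(pose_posterior(real, sims))
```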


Reactive Manipulation with Contact Models and Tactile Feedback

Author: Francois R. Hogan

Published: 2020

Total Pages: 120

This thesis focuses on closing the loop in robotic manipulation, moving towards robots that can better perceive their environment and react to unforeseen situations. Humans effectively process and react to information from visual and tactile sensing; however, robots often remain programmed in an open-loop fashion and struggle to correct their motion based on detected errors. We begin by developing full-state feedback controllers for dynamical systems involving frictional contact interactions. Hybridness and underactuation are key characteristics of these systems that complicate the design of feedback controllers. We design and experimentally validate the controllers on a planar manipulation system in which the goal is to control the motion of a sliding object on a flat surface using a point robotic pusher. The pusher-slider is a simple dynamical system that retains many of the challenges typical of robotic manipulation tasks. We extend this work to partially observable systems by developing closed-loop tactile controllers for dexterous manipulation with dual-arm robotic palms. We introduce Tactile Dexterity, an approach to dexterous manipulation that plans for robot/object interactions that render interpretable tactile information for control. Key to this formulation is the decomposition of manipulation plans into sequences of manipulation primitives with simple mechanics and efficient planners.
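
As a loose sketch of full-state feedback for a pusher-slider-like system, one could linearize the dynamics around a nominal push and apply discrete-time LQR, as below. The A and B matrices are placeholders, and this deliberately ignores the hybridness (sticking/sliding mode switches) and underactuation that the thesis's controllers are specifically designed to handle.

```python
# Placeholder linearized model with discrete-time LQR feedback around a
# nominal trajectory. Not the actual pusher-slider model, which is hybrid.
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.1, 0.0],
              [0.0, 1.0, 0.1],
              [0.0, 0.0, 1.0]])      # placeholder linearized slider state map
B = np.array([[0.0], [0.0], [0.1]])  # placeholder pusher input map
Q, R = np.eye(3), np.eye(1)

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # LQR gain

x = np.array([0.02, -0.01, 0.1])     # deviation from the nominal trajectory
u = -K @ x                           # corrective pusher command
```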


Human Inspired Dexterity in Robotic Manipulation

Author: Tetsuyou Watanabe

Publisher: Academic Press

Published: 2018-06-29

ISBN-13: 9780128133859

Human Inspired Dexterity in Robotic Manipulation provides up-to-date research and information on how to imitate human dexterity and realize it in robotic manipulation. Approaches from both software and hardware viewpoints are presented, with sections discussing and highlighting case studies that demonstrate how human manipulation techniques or skills can be transferred to robotic manipulation. From the hardware viewpoint, the book discusses important structures of the human hand that are key for robotic hand design and how they should be embedded for dexterous manipulation. This book is ideal for research communities in robotics, mechatronics, and automation.


Haptic Perception, Decision-making, and Learning for Manipulation with Artificial Hands

Author: Randall Blake Hellman

Published: 2016

Total Pages: 166

Robotic systems are outmatched by the abilities of the human hand to perceive and manipulate the world. Human hands can physically interact with the world to perceive, learn, and act to accomplish tasks, and the limited ability of robotic systems to do the same diminishes their usefulness. In order to advance robot end effectors, specifically artificial hands, rich multimodal tactile sensing is needed. In this work, a multi-articulating, anthropomorphic robot testbed was developed for investigating tactile sensory stimuli during finger-object interactions. The artificial finger is controlled by a tendon-driven remote actuation system that allows for modular control of any tendon-driven end effector and provides both speed and strength. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. Next, attention was focused on real-time artificial perception for decision-making. A robotic system needs to perceive its environment in order to make decisions, and specific actions such as "exploratory procedures" can be employed to classify and characterize object features. Prior work on offline perception was extended to develop an anytime predictive model that returns the probability of having touched a specific feature of an object based on minimally processed sensor data. Developing models for anytime classification of features facilitates real-time action-perception loops. Finally, by combining real-time action-perception with reinforcement learning, a policy was learned to complete a functional contour-following task: closing a deformable ziplock bag. The approach relies only on proprioceptive and localized tactile data. A Contextual Multi-Armed Bandit (C-MAB) reinforcement learning algorithm was implemented to maximize cumulative rewards within a finite time period by balancing exploration and exploitation of the action space. Performance of the C-MAB learner was compared to that of a benchmark Q-learner that eventually returns the optimal policy. To assess robustness and generalizability, the learned policy was tested on variations of the original contour-following task. The work presented contributes to the full range of tools necessary to advance the abilities of artificial hands with respect to dexterity, perception, decision-making, and learning.
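
A minimal contextual epsilon-greedy bandit in the spirit of the C-MAB learner might look like the sketch below, with running value estimates per (context, action) pair. The context strings, action names, and reward are placeholders for the discretized tactile/proprioceptive states and motion primitives used in the actual task.

```python
# Minimal contextual epsilon-greedy bandit: explore with probability eps,
# otherwise exploit the best running value estimate for the current context.
import random
from collections import defaultdict

class ContextualBandit:
    def __init__(self, actions, eps=0.2):
        self.actions = actions
        self.eps = eps
        self.value = defaultdict(float)   # (context, action) -> value estimate
        self.count = defaultdict(int)

    def act(self, context):
        if random.random() < self.eps:    # explore
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.value[(context, a)])

    def update(self, context, action, reward):
        key = (context, action)
        self.count[key] += 1              # incremental running-mean update
        self.value[key] += (reward - self.value[key]) / self.count[key]

bandit = ContextualBandit(actions=["pinch_forward", "rotate", "slide"])
a = bandit.act(context="seal_detected")   # placeholder context and reward
bandit.update("seal_detected", a, reward=1.0)
```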


Touch Driven Dexterous Robot Arm Control

Author: Zhanat Kappassov

Published: 2017

Robots have improved industry processes, most recognizably in conveyor-belt assembly systems, and have the potential to bring even more benefits to our society in transportation, exploration of dangerous zones, the deep sea, or even other planets, health care, and our everyday life. A major barrier to their escape from fenced industrial areas to environments co-shared with humans is their poor skill in physical interaction tasks, including the manipulation of objects. While dexterity in manipulation is not affected by blindness in humans, it dramatically decreases in robots: with no visual perception, robot operations are limited to static environments, whereas the real world is a highly variant environment.

In this thesis, we propose a different approach that considers controlling contact between a robot and the environment during physical interactions. However, current physical interaction control approaches are poor in terms of the range of tasks that can be performed. To allow robots to perform more tasks, we derive tactile features representing deformations of the mechanically compliant sensing surface of a tactile sensor and incorporate these features into a robot controller via touch-dependent and task-dependent tactile feature mapping matrices.

As a first contribution, we show how image processing algorithms can be used to discover the underlying three-dimensional structure of a contact frame between an object and an array of pressure-sensing elements with a mechanically compliant surface, attached to a robot arm's end-effector interacting with this object. These algorithms produce the so-called tactile features as outputs. As a second contribution, we design a tactile servoing controller that combines these tactile features with a position/torque controller of the robot arm. It allows the end-effector of the arm to steer the contact frame in a desired manner by regulating errors in these features. Finally, as a last contribution, we extend this controller by adding a task description layer to address four common issues in robotics: exploration, manipulation, recognition, and co-manipulation of objects.

Throughout this thesis, we emphasize developing algorithms that work not only with simulated robots but also with real ones. Thus, all of these contributions have been evaluated in experiments conducted with at least one real robot. In general, this work aims to provide the robotics community with a unified framework that will allow robot arms to be more dexterous and autonomous. Preliminary works are proposed for extending this framework to tasks that involve multicontact control with multifingered robot hands.
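
The feature-extraction and servoing steps described above can be sketched by treating the pressure array as an image and using image moments to recover a contact centroid, total force, and orientation, then forming a proportional controller on feature errors. The feature set, gain, and desired values below are illustrative, not the thesis's mapping matrices.

```python
# Sketch: image moments over a pressure array yield tactile features
# (centroid, total force, orientation); a proportional law regulates them.
import numpy as np

def tactile_features(pressure):
    """pressure: 2D array from the tactile sensor. Returns (cx, cy, total, angle)."""
    total = pressure.sum()
    ys, xs = np.mgrid[:pressure.shape[0], :pressure.shape[1]]
    cx, cy = (xs * pressure).sum() / total, (ys * pressure).sum() / total
    mu20 = ((xs - cx) ** 2 * pressure).sum()
    mu02 = ((ys - cy) ** 2 * pressure).sum()
    mu11 = ((xs - cx) * (ys - cy) * pressure).sum()
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # contact orientation
    return np.array([cx, cy, total, angle])

def tactile_servo(pressure, desired, gain=0.1):
    """Proportional control on tactile-feature error; in a full system this
    would feed the arm's position/torque controller via a task-dependent mapping."""
    return gain * (desired - tactile_features(pressure))

p = np.zeros((16, 16)); p[6:10, 4:12] = 1.0        # synthetic contact patch
cmd = tactile_servo(p, desired=np.array([8.0, 8.0, 40.0, 0.0]))
```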