Recent research shows a growing interest in adopting touch interaction for robot learning, yet it remains challenging to efficiently acquire high-quality, structured tactile data at a low cost. In this study, we propose the design of vision-based soft robotic tongs to generate reproducible and shareable tactile interaction data for learning. We further developed a web-based platform for convenient data collection and a portable assembly that can be deployed within minutes. We trained a simple network to infer the 6D force and torque from the relative pose of markers on the fingers and reached reasonably high accuracy at a cost of only 50 dollars per set. The recorded tactile data is downloadable for robot learning. We further demonstrated the system for interacting with robotic arms in manipulation learning and remote control. We have open-sourced the whole system on GitHub with further information.

All of the components fit in a foldable fanny pack: a monocular camera, a camera pedestal, a pad with markers, meta-fingers, and the tongs. Shareability and reproducibility were the primary design considerations while developing this low-cost tactile data collection system for robot learning. The overall hardware design is mainly mechanical, with a single monocular camera as the sensing unit.

Besides directly interacting with objects by hand, humans are also skilled at operating tools to manipulate objects. To make the system comparable with the standard parallel two-finger grippers used in industry, we leveraged the existing design of typical kitchen tongs to systematically reduce the dimensionality of multi-fingered hand motion to that of two-fingered tongs. We used a soft, 3D meta-finger structure as the physical interface for interacting with objects, which can be conveniently fixed either at the tip of the tongs or on the gripper. The soft structure takes a unique design as a meta-material capable of passive geometric adaptation to object contact in any direction. In this way, we can ensure a transferable interaction between the tongs, or the gripper, and the objects. The soft meta-finger passively deforms to the object geometry, which enhances its grasping performance while providing a visible distortion that can be captured by a camera.

In this study, we use the soft, 3D meta-finger design with two extra marker plates fixed at the backside. These fingers can be rapidly prototyped using 3D printing, including the whole soft structure, the detachable plates for the ArUco markers, and the base mount connecting the finger to the tongs. One can easily redesign the base mount to fix the finger to different grippers to introduce omni-directional passive adaptive compliance.

The overall idea is to use the relative pose visually tracked via the markers on the soft meta-finger's backside to infer the 6D force and torque during tactile contact, against baseline data collected from a high-performing force/torque (FT) sensor at the finger's base. The soft meta-finger is mounted on top of an ATI FT sensor through a simple cube adaptor. The test rig can provide the loading flange with linear motions along the horizontal and vertical axes as well as rotary motion about the vertical axis. We adjust the camera viewing angle to face the soft meta-finger fixed on the test rig and use the web-based user interface to collect the relative pose of the two markers fixed at the finger's backside.
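To make the tracking step concrete, the following is a minimal sketch of recovering the relative pose between the two back-side markers with OpenCV's ArUco module (OpenCV 4.7 or later); the camera intrinsics, marker dictionary, marker size, and marker IDs are placeholder assumptions to be replaced with values from one's own calibration and setup, not the actual system's parameters.

```python
import cv2
import numpy as np

# Placeholder intrinsics; replace with your own camera calibration.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
EDGE = 0.02  # assumed marker edge length in metres

# Marker corners in the marker's own frame, in the order ArUco reports them
# (top-left, top-right, bottom-right, bottom-left).
OBJ_PTS = 0.5 * EDGE * np.array(
    [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def marker_pose(img_pts):
    """Camera-frame pose of one marker as a 4x4 homogeneous transform."""
    _, rvec, tvec = cv2.solvePnP(OBJ_PTS, img_pts, K, dist,
                                 flags=cv2.SOLVEPNP_IPPE_SQUARE)
    T = np.eye(4)
    T[:3, :3] = cv2.Rodrigues(rvec)[0]
    T[:3, 3] = tvec.ravel()
    return T

def relative_pose(frame, id_a=0, id_b=1):
    """Pose of marker id_b expressed in marker id_a's frame, or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    poses = {int(i): marker_pose(c.reshape(4, 2))
             for i, c in zip(ids.ravel(), corners)}
    if id_a not in poses or id_b not in poses:
        return None
    # The relative transform cancels the camera pose: only the finger's
    # deformation changes it.
    return np.linalg.inv(poses[id_a]) @ poses[id_b]
```

Note that using the relative pose between the two markers, rather than either marker's camera-frame pose, makes the measurement largely insensitive to where the camera is placed: only the finger's deformation changes it.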
Each time, the push rod was fixed at a different height; the meta-finger was then moved linearly toward the rod and rotated by 10 or 20 degrees to mimic real grasping situations. Force and torque readings from the ATI sensor were recorded as labels for training. While one might argue that the spatial distortion of the soft meta-finger at different contact states may not map perfectly to the 6D force and torque readings at the finger's base, we intended to build a simple neural network as a low-cost test of the idea (a minimal sketch of such a network appears at the end of this section).

The user interface consists of a navigation bar on the left and four data screens stacked on the right: camera view, visualization screen, data recording, and control panel. When markers are present within the camera's view, the user can label each marker in the control panel to assign it a physical meaning. For example, one can stick a marker plate to a YCB object and assign it using the control panel, after which the object's model is displayed on the visualization screen. When the physical object moves, the 3D model on the visualization screen moves accordingly, and its motion data is displayed on the data recording screen. One can also assign multiple markers to represent more complex physical meanings, such as the soft meta-finger's distortion represented by the relative pose of its two markers. Accordingly, we prepared a simplified open chain to represent the tongs' motion on the visualization and data recording screens.

The data is recorded at 60 Hz in a time-series format. One can export the recorded pose tracking data, labeled with the predefined physical meanings, for convenient processing in robot learning. The recorded data is stored locally in the browser, not on the server. The network used in training and the results are described in the paper.

In addition, we have made a number of examples to demonstrate the remote manipulation of a robotic arm using the soft fingers, and we also demonstrated applying the collected data for Gaussian learning. In one experiment, we aimed to realize cooperation between the human and the robotic arm to complete a dual-arm manipulation task. We symmetrically map the pose information of the tongs to the robot's motion coordinate system: taking the central line as the axis of symmetry, the robotic arm reproduces the movement of the tongs in mirror image (see the sketch below), thereby realizing tasks such as carrying a bag with two arms and opening, closing, and rotating it, which holds great promise for the development of human-machine cooperation.
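As a concrete illustration of this mirrored mapping, the following is a minimal sketch that assumes the "central line" is the x-z plane of the robot's base frame; both the frame convention and the function name are assumptions for illustration, not the system's actual implementation.

```python
import numpy as np

# Reflection across the robot base frame's x-z plane (y = 0); choosing this
# plane as the "central line" is an assumption for illustration.
M = np.diag([1.0, -1.0, 1.0])

def mirror_pose(T_tong):
    """Map a 4x4 tong pose to the mirrored target pose for the second arm."""
    T = np.eye(4)
    # Conjugating the rotation by the reflection keeps det(R) = +1,
    # so the result is still a valid rotation matrix.
    T[:3, :3] = M @ T_tong[:3, :3] @ M
    T[:3, 3] = M @ T_tong[:3, 3]
    return T
```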
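Likewise, the simple network mentioned earlier for inferring the 6D force and torque from the markers' relative pose could look like the following PyTorch sketch; the input encoding (translation plus axis-angle rotation), layer widths, and optimizer settings are illustrative assumptions rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn

class WrenchNet(nn.Module):
    """Maps a 6-D relative-pose feature (translation + axis-angle rotation
    between the two back-side markers) to a 6-D force/torque estimate."""

    def __init__(self, in_dim=6, hidden=64, out_dim=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, x):
        return self.net(x)

model = WrenchNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(rel_pose, wrench):
    """One gradient step on a batch of (relative pose, ATI label) pairs."""
    optimizer.zero_grad()
    loss = loss_fn(model(rel_pose), wrench)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Here, rel_pose would be a (batch, 6) tensor built from the exported 60 Hz time series, and wrench the synchronized ATI readings; the actual network and results are reported in the paper.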
We also conducted a pilot program using the proposed system to teach a university-level robot learning course during the spring semester of 2022. Due to COVID-19, students were advised to attend the course remotely when the semester began. While the lab session of this course had previously been conducted on-site, the teaching team prepared the proposed design so that it could be fabricated at low cost, operated with little engineering overhead, and provide sufficiently rich data for students to train models of their own. Although students returned to campus during the second half of the semester, we kept using the proposed design for teaching purposes and integrated four experiment sessions in which students practiced collecting tactile data for robot learning. By the end of the course, we conducted a preliminary survey to collect feedback.

We adopted the Real-Win-Worth (RWW) framework for user evaluation. The RWW framework was previously used by 3M for screening product concept ideation and was later adapted for evaluating crowdfunding products. While the positive feedback shows strong interest from the students in the proposed design, its usefulness and functionality still need further development to contribute to research and teaching in robot-learning-related subjects.