Title: Fully automatic data collection for neuro-symbolic task planning for mobile robot navigation
Abstract: In this paper, we present an automatic image-data collection method for neuro-symbolic task planning for robot navigation. Collecting images for robot task planning often involves laborious chores: operating the robot, changing the environment, and repeatedly capturing quality-assured images. We propose a method that uses a robotic simulator to perform this series of repetitive data-collection processes automatically. It generates (i) a random instance of the navigation problem, (ii) a simulation environment that depicts the instance, (iii) a planning problem instance described in a classical planning language, (iv) a task plan that solves the planning problem, (v) control inputs for the robot to execute the task plan, and (vi) a sequence of cropped images capturing the evolving states of the robot and the world while the robot performs the plan. We use one of the state-of-the-art neuro-symbolic planning models to validate our method. In the evaluation, the model achieves a success rate of up to 92.6% in generating task plans from only a pair of images showing the initial and the desired states.
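The six-step pipeline described in the abstract can be sketched as a single collection loop. The sketch below is purely illustrative: every function name and data shape is an assumption standing in for the authors' actual simulator, PDDL generator, planner, and image capture, which the abstract does not detail. A grid position takes the place of a rendered image frame.

```python
import random

# Hypothetical sketch of the automated collection loop from the abstract.
# All names and data shapes are illustrative assumptions, not the
# authors' implementation; grid coordinates stand in for images.

def generate_problem(seed):
    """(i) Sample a random navigation problem instance on a 10x10 grid."""
    rng = random.Random(seed)
    return {"start": (rng.randint(0, 9), rng.randint(0, 9)),
            "goal": (rng.randint(0, 9), rng.randint(0, 9))}

def build_environment(problem):
    """(ii) Stand-in for constructing a simulator scene of the instance."""
    return {"robot_at": problem["start"]}

def to_planning_language(problem):
    """(iii) Describe the instance in a classical planning language (PDDL-like string)."""
    return (f"(:init (at robot {problem['start']})) "
            f"(:goal (at robot {problem['goal']}))")

def solve(problem):
    """(iv) Trivial stand-in planner: move along x, then along y."""
    (sx, sy), (gx, gy) = problem["start"], problem["goal"]
    plan = []
    step = 1 if gx >= sx else -1
    plan += [("move-x", step)] * abs(gx - sx)
    step = 1 if gy >= sy else -1
    plan += [("move-y", step)] * abs(gy - sy)
    return plan

def execute_and_capture(env, plan):
    """(v)+(vi) Apply control inputs and record one 'frame' per state."""
    x, y = env["robot_at"]
    frames = [(x, y)]
    for action, d in plan:
        if action == "move-x":
            x += d
        else:
            y += d
        frames.append((x, y))
    return frames

def collect_episode(seed):
    """Run steps (i)-(vi) once and return one training record."""
    problem = generate_problem(seed)
    env = build_environment(problem)
    pddl = to_planning_language(problem)
    plan = solve(problem)
    frames = execute_and_capture(env, plan)
    return {"pddl": pddl, "plan": plan, "frames": frames}

episode = collect_episode(0)
```

In this sketch the first and last entries of `frames` correspond to the initial and desired states, i.e. the image pair the neuro-symbolic model consumes.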
Publication Year: 2021
Publication Date: 2021-10-17
Language: en
Type: article
Indexed In: Crossref
Access and Citation
Cited By Count: 2