Team 1 Restaurant Helper Bot
Harry and Ben
On demo day, our project will, at the very least, be able to take input from the user, either typed commands (MVP) or voice commands (best-case scenario), e.g. "get me a slice of pepperoni pizza"; the robot will then determine where to get the pizza, drive to that spot, and return. The robot will use fiducials to localize itself within the map, and some combination of lidar and the map to plan its path. We found a dataset of pizza and beverage orders that we will use to train our natural language understanding model. We also plan to use fiducials to represent the different pizza types and to designate an area on the map where the food and beverages will be kept. The robot will know to go to that general area and use the appropriate fiducial to pick out the right pizza. If possible, we want to attach a tray to the robot. Beyond this, we may attempt to run multiple robots that can each take commands.
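As a rough illustration of the MVP flow (typed command in, navigation target out), here is a minimal Python sketch. The fiducial IDs, coordinates, and names below are made-up placeholders, and the naive keyword matcher only stands in for the trained NLU model:

# Minimal sketch of the command-to-goal pipeline described above.
# Fiducial IDs, coordinates, and names are hypothetical placeholders.

# Each pizza type is labeled by its own fiducial marker.
PIZZA_FIDUCIALS = {
    "pepperoni": 101,
    "cheese": 102,
    "veggie": 103,
}

# Approximate (x, y) map coordinates of the food/beverage area.
FOOD_AREA_WAYPOINT = (2.5, 1.0)

def parse_order(command):
    """Naive keyword matcher standing in for the trained NLU model."""
    for pizza in PIZZA_FIDUCIALS:
        if pizza in command.lower():
            return pizza
    return None

def plan_goal(command):
    """Turn a typed command into a waypoint plus a fiducial to look for."""
    pizza = parse_order(command)
    if pizza is None:
        raise ValueError("No known pizza type in: " + command)
    return FOOD_AREA_WAYPOINT, PIZZA_FIDUCIALS[pizza]

waypoint, fiducial = plan_goal("get me a slice of pepperoni pizza")
print("Drive to", waypoint, "then look for fiducial", fiducial)

The point of the sketch is the division of labor: language understanding picks the pizza type, the map supplies the general food area, and the fiducial disambiguates the exact pickup spot.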
We hope to learn more about using maps and fiducials for navigation. As CL students, we hope to learn more about how to build a working natural language processing pipeline for robots. Neither of us has experience with automated speech recognition, so we hope this project will serve as an entry point for that. Lastly, we are curious whether a map traditionally used for navigation in robotics can also be used in natural language processing to encode world knowledge.
In terms of evaluation, on the robotics side our project involves SLAM and should at the very least be able to navigate between points on the map. On the NLP side, the project admits varying levels of difficulty. The main goal is to integrate a robotic system with an NLP system; ASR input is more of a user-experience concern.
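For the navigate-between-points baseline, here is a minimal sketch of sending a single navigation goal, assuming a standard ROS move_base setup; the node name, waypoint, and orientation are assumptions for illustration, not our actual configuration:

#!/usr/bin/env python
# Sketch of sending one navigation goal via the standard ROS
# move_base action interface; coordinates are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def go_to(x, y):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # face along the map x-axis

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('pizza_runner')
    go_to(2.5, 1.0)  # hypothetical food-area waypoint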