Natural Language Interaction with Systems and Agents: Acquiring Knowledge, Understanding Text, Reasoning and Responding

Project: Research project

Description

Goal: The long-term goal of our research is to develop programs that can understand natural language (starting with English) so that they can interact with various systems and agents in natural language. For example, consider a human-robot team and the interaction between its members. One well-studied kind of interaction is the human directing the robot in natural language to perform certain tasks. This interaction can be made bi-directional by having the robot give feedback to the human and ask the human, at confusing junctures, for clarification about what actions to take.

A more nuanced interaction is one where the human can teach the robot new actions, both basic and complex. For example, when installing new hardware (a sensor such as a camera, or an actuator such as a gripper), the human may be able to teach the robot how that particular hardware can be used: the human may explain in English the various actions the robot can perform with the new actuator and the impact each action would have on the world. Alternatively, after plugging in the new actuator, the human may simply feed the robot the part of the actuator's manual (written in English) that describes what can be done with it and how that would affect the world. In both cases we need a program that can convert English text about the new actions (of a new actuator), the conditions under which they can be executed, and the changes they would cause in the world into a formal representation that the robot can use to make plans for its goals. In this case, understanding corresponds to the ability just described.

Similarly, the human may need to teach the robot (in English) complex actions, such as "go around the corridor", "fetch me the box", or "keep trying", in terms of simpler actions that the robot already knows. Complex actions may also be associated with a new piece of hardware that the robot is given to use.
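To make the translation target concrete, the following is a minimal sketch of the kind of formal action representation (a STRIPS-style action with preconditions and add/delete effects) that English text about a new actuator might be converted into. The `Action` class, the `pick_up` action, and all fluent names are illustrative assumptions, not artifacts of the project itself.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A STRIPS-style action: what must hold before, and what changes after."""
    name: str
    preconditions: frozenset  # fluents that must hold for the action to execute
    add_effects: frozenset    # fluents the action makes true
    del_effects: frozenset    # fluents the action makes false

    def applicable(self, state: frozenset) -> bool:
        # The action can run only if all its preconditions hold in the state.
        return self.preconditions <= state

    def apply(self, state: frozenset) -> frozenset:
        # Successor state: drop deleted fluents, then add the new ones.
        return (state - self.del_effects) | self.add_effects

# English (hypothetical manual sentence): "The gripper can pick up an object
# if the object is within reach and the gripper is empty; afterwards the
# robot holds the object."
pick_up = Action(
    name="pick_up(obj)",
    preconditions=frozenset({"within_reach(obj)", "gripper_empty"}),
    add_effects=frozenset({"holding(obj)"}),
    del_effects=frozenset({"gripper_empty"}),
)

state = frozenset({"within_reach(obj)", "gripper_empty"})
new_state = pick_up.apply(state)
# new_state contains "holding(obj)" and no longer contains "gripper_empty"
```

A planner can chain such actions by checking `applicable` and calling `apply`, which is what makes this representation usable for making plans toward the robot's goals.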
In that case the manual of that hardware or tool may describe such complex actions in English. Our aim then is to develop a program that can take such input in English and convert it into a formal representation that the robot can use in fine-tuning its task. Here again, understanding corresponds to the ability just described. The above are but two examples of understanding natural language in order to interact with systems and agents. Other examples include: posing natural-language queries to a database; interacting with a system (say, a command and control system) in English; intelligent tutoring and training systems that can not only communicate with students and trainees in English but can also read and understand the manuals and texts about which the tutoring or training is being done; and systems that can understand the text available to an intelligence officer or a doctor and guide her toward diagnosis or hypothesis formation.
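A complex action taught in English, such as "fetch me the box", would be represented as a procedure composed from simpler actions the robot already knows. The sketch below illustrates this idea under assumed primitives (`go_to`, `grasp`) and an assumed state encoding; none of these names come from the project.

```python
# Assumed primitive actions the robot already knows, over a simple dict state.
def go_to(state, place):
    """Move the robot to a place."""
    return {**state, "at": place}

def grasp(state, obj):
    """Pick up an object, which succeeds only if the robot is co-located with it."""
    if state["at"] == state["location"][obj]:
        return {**state, "holding": obj}
    return state  # grasp has no effect if the object is out of reach

# Complex action taught in English: "fetch me the box" -- go to the object,
# grasp it, and bring it back to the requester.
def fetch(state, obj, requester_loc):
    state = go_to(state, state["location"][obj])
    state = grasp(state, obj)
    state = go_to(state, requester_loc)
    return state

state = {"at": "hall", "holding": None, "location": {"box": "storage"}}
result = fetch(state, "box", "hall")
# result: robot back at "hall", now holding "box"
```

Once a complex action is captured in this compositional form, the robot can invoke it as a single unit while still planning with, and reasoning about, the simpler actions it is built from.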
Status: Finished
Effective start/end date: 1/1/13 – 12/31/15

Funding

  • DOD-NAVY: Office of Naval Research (ONR): $324,763.00

Fingerprint

  • Robots
  • Actuators
  • Hardware
  • Command and control systems
  • Query languages
  • Tuning
  • Cameras
  • Students
  • Feedback
  • Sensors