Edinburgh Research Archive

Learning structured task related abstractions

dc.contributor.advisor
Ramamoorthy, Subramanian
dc.contributor.advisor
Nuthmann, Antje
dc.contributor.author
Penkov, Svetlin Valentinov
dc.contributor.sponsor
Engineering and Physical Sciences Research Council (EPSRC)
dc.date.accessioned
2019-07-25T14:54:46Z
dc.date.available
2019-07-25T14:54:46Z
dc.date.issued
2019-07-01
dc.description.abstract
As robots and autonomous agents come to assist people with more tasks across various domains, they need the ability to quickly gain contextual awareness in unseen environments and to learn new tasks. Current state-of-the-art methods rely predominantly on statistical learning techniques, which tend to overfit to sensory signals and often fail to extract structured task-related abstractions. The resulting environment and task models are typically represented as black-box objects that cannot easily be updated or inspected and that provide limited generalisation capabilities. We address these shortcomings by explicitly studying the problem of learning structured task-related abstractions. In particular, we are interested in extracting symbolic representations of the environment from sensory signals and in encoding the task to be executed as a computer program. We consider the standard problem of learning to solve a task by mapping sensory signals to actions and propose decomposing such a mapping into two stages: i) perceiving symbols from sensory data, and ii) using a program to manipulate those symbols in order to make decisions. This thesis studies the bidirectional interactions between the agent’s ability to perceive symbols and the programs it can execute in order to solve a task. In the first part of the thesis we demonstrate that access to a programmatic description of the task provides a strong inductive bias which facilitates the learning of structured task-related representations of the environment. To this end, we first consider a collaborative human-robot interaction setup and propose a framework for Grounding and Learning Instances through Demonstration and Eye tracking (GLIDE), which enables robots to learn symbolic representations of the environment from a few demonstrations.
In order to relax the constraints which GLIDE places on the task-encoding program, we introduce the perceptor gradients algorithm and prove that it can be applied with any task-encoding program. In the second part of the thesis we investigate the complementary problem of inducing task-encoding programs, assuming that a symbolic representation of the environment is available. To this end, we propose the π-machine, a novel program induction framework which combines standard enumerative search techniques with a stochastic gradient descent optimiser in order to obtain an efficient program synthesiser. We show that the induction of task-encoding programs is applicable to various problems such as learning physics laws, inspecting neural networks, and learning in human-robot interaction setups.
dc.identifier.uri
http://hdl.handle.net/1842/35875
dc.language.iso
en
dc.publisher
The University of Edinburgh
dc.relation.hasversion
S. Penkov, A. Bordallo, S. Ramamoorthy. Inverse Eye Tracking for Intention Inference and Symbol Grounding in Human-Robot Collaboration. In Proc. Robotics: Science and Systems Workshop on Planning for Human-Robot Interaction (RSS PHRI), 2016.
dc.relation.hasversion
S. Penkov, A. Bordallo, S. Ramamoorthy. Physical Symbol Grounding and Instance Learning through Demonstration and Eye Tracking. In Proc. IEEE International Conference on Robotics and Automation (ICRA), 2017.
dc.relation.hasversion
S. Penkov, S. Ramamoorthy. Using Program Induction to Interpret Transition System Dynamics. In Proc. International Conference on Machine Learning Workshop on Human Interpretability in Machine Learning (ICML WHI), 2017.
dc.relation.hasversion
Y. Hristov, S. Penkov, A. Lascarides, S. Ramamoorthy. Grounding Symbols in Multi-Modal Instructions. In Proc. Association for Computational Linguistics Workshop on Language Grounding for Robotics (ACL LGR), 2017.
dc.relation.hasversion
S. Penkov, S. Ramamoorthy. Learning Programmatically Structured Representations with Perceptor Gradients. International Conference on Learning Representations (ICLR), 2019.
dc.subject
neural networks
dc.subject
learning
dc.subject
collaborative human-robot interaction
dc.subject
decision making processes
dc.subject
learning programs
dc.subject
black-box models
dc.subject
Grounding and Learning Instances through Demonstration and Eye tracking
dc.subject
GLIDE
dc.title
Learning structured task related abstractions
dc.type
Thesis or Dissertation
dc.type.qualificationlevel
Doctoral
dc.type.qualificationname
PhD Doctor of Philosophy

Files

Original bundle

Name:
Penkov2019.pdf
Size:
8.91 MB
Format:
Adobe Portable Document Format
