Angelique Taylor
Welcome to My Homepage



ABOUT ME
 


I am a PhD student in the Healthcare Robotics Lab in the Computer Science and Engineering department at the University of California San Diego. I work under the direction of Dr. Laurel Riek.

My research lies at the intersection of computer vision, robotics, and artificial intelligence. My work aims to design algorithms that enable robots to interact and work with groups of people in real-world environments. I am also a National Science Foundation GRFP Fellow, Arthur J. Schmitt Presidential Fellow, GEM Fellow, Google Anita Borg Memorial Scholar, National Center for Women in Information Technology (NCWIT) Collegiate Award recipient, and Grace Hopper Celebration of Women in Computing (GHC) Scholar.

I received my BS in Electrical Engineering and Computer Engineering from the University of Missouri-Columbia in 2015 and my AS in Engineering Science from Saint Louis Community College in 2012.

 
RESEARCH CONCENTRATION

Artificial Intelligence

Autonomous Systems

Human Group Perception

Robot Vision

RECENT NEWS


[June 2019]: Received the Microsoft Dissertation Grant!
[April 2019]: Presented my research at the 2019 SoCal Symposium
[April 2019]: Received the National Center for Women in IT Collegiate Award.
[June 2018]: Received the ROSCon Diversity Scholarship.
[June 2018]: Received the UC San Diego Computer Science and Engineering Doctoral Award for Excellence in Service/Leadership.
[June 2018]: Elected President of UC San Diego Graduate Robotics Association.
[Mar. 2018]: Presented "Robot-Centric Human Group Detection" at the Social Robots in the Wild Workshop at HRI.
[July 2017]: Presented "Robot Perception of Social Engagement Using Group Joint Action" at the Joint Action Meeting Conference.
[Feb. 2017]: Presented "Robot Affiliation Perception for Social Interaction" at the Robots in Groups and Teams Workshop at the CSCW Conference.
[Nov. 2016]: Presented "Robot Perception of Human Groups in the Real World: State of the Art" at the AAAI AI-HRI Symposium.
PUBLICATIONS
 

PROJECTS
 
Robot-Centric Human Group Detection Using Real World Data

As robots enter human-occupied environments, it is important that they work effectively with groups of people. To achieve this goal, robots need the ability to detect groups. This requires ego-centric (robot-centric) perception, because placing external sensors in the environment is impractical. Robots also need learning algorithms that do not require extensive training, as a priori knowledge of an environment is difficult to acquire. We introduce a new algorithm that addresses these needs: it detects moving groups in real-world, ego-centric RGB-D data from a mobile robot, and it uses unsupervised learning that leverages the underlying structure of the data. This work will enable robots to work with human teams in the general public using their own onboard sensors, with minimal training.
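As a rough illustration of the detection idea (not the project's actual algorithm), here is a minimal Python sketch that clusters person positions into candidate groups. DBSCAN stands in for the unsupervised learning step, and the positions, thresholds, and function names are assumptions for illustration only:

# Illustrative sketch: detect human groups by clustering person positions.
# Assumes person detections were already extracted from the robot's
# ego-centric RGB-D stream; DBSCAN is a stand-in for the unsupervised step.
import numpy as np
from sklearn.cluster import DBSCAN

def detect_groups(person_positions, max_gap=1.2, min_group=2):
    # person_positions: (N, 2) ground-plane (x, z) positions in meters,
    # in the robot's frame. max_gap is the largest inter-person distance
    # still read as "standing together"; min_group is the smallest group.
    labels = DBSCAN(eps=max_gap, min_samples=min_group).fit_predict(person_positions)
    groups = {}
    for idx, label in enumerate(labels):
        if label == -1:          # -1 marks people not in any group
            continue
        groups.setdefault(label, []).append(idx)
    return list(groups.values())

# Example: two pairs standing apart, plus one lone person.
positions = np.array([[0.0, 2.0], [0.5, 2.2],   # pair 1
                      [4.0, 3.0], [4.4, 3.1],   # pair 2
                      [8.0, 1.0]])              # alone
print(detect_groups(positions))  # -> [[0, 1], [2, 3]]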

RoboCup@Home 2017

As robots enter people's homes, it is important that they can effectively accomplish the tasks people perform every day. For the RoboCup@Home 2017 challenge, we designed an algorithm that enabled a Toyota Human Support Robot (HSR) to transport groceries from a table to a cupboard. Each shelf of the cupboard held a different category of objects (e.g., bottles, cans); we therefore had to ensure that each object was placed on the correct shelf. We used Simultaneous Localization and Mapping (SLAM) to enable the HSR to autonomously navigate from the table to the cupboard, and a state-of-the-art object detection algorithm, YOLO, to detect the different types of objects.
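The shelf-sorting logic can be sketched briefly. In this hedged Python illustration, detected YOLO class labels are routed to shelf indices by category; the class names, categories, and shelf layout are invented for illustration and are not the actual competition configuration:

# Illustrative sketch: route YOLO detections to cupboard shelves by category.
# Labels, categories, and the shelf layout are assumptions, not the real setup.
from typing import Optional

SHELF_FOR_CATEGORY = {"bottle": 0, "can": 1, "box": 2}

CATEGORY_OF_CLASS = {           # hypothetical YOLO class -> category mapping
    "water_bottle": "bottle",
    "soda_can": "can",
    "cereal_box": "box",
}

def target_shelf(yolo_class: str) -> Optional[int]:
    # Return the shelf index for a detected object, or None if unknown.
    category = CATEGORY_OF_CLASS.get(yolo_class)
    return SHELF_FOR_CATEGORY.get(category)

for detection in ["soda_can", "water_bottle", "mystery_item"]:
    print(detection, "->", target_shelf(detection))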

Autonomous Robot Estimation of Affiliation in Human Groups

As robots become more integrated into human spaces, it is important that they move around autonomously during face-to-face interaction with people, as this facilitates fluent interaction. Humans naturally and simultaneously interact with one another and can join different conversational groups in a social environment, but robots cannot yet do this. This motivated us to design algorithms that allow a robot to autonomously join human groups and adapt its behavior to the group.

A Biologically Inspired Salient Region Filter for High-Fidelity Robotic Perception

High-fidelity robotic visual perception is on the brink of realization, but is not yet fully attainable due to the computational resource constraints of current mobile computing platforms. Increased resolution would enhance robot vision with the ability to detect small and faraway objects, allowing robots to sense the world in greater detail, as a human would. We argue that high-fidelity perception is necessary for robots to dynamically adapt to naturalistic environments. In this work, we introduce a method that borrows ideas from human visual perception to give robots a better sense of where to look, alleviating hardware resource constraints. Our method is designed as an abstraction that works with any object or pedestrian detection algorithm with little modification to the original algorithm, provided the input is an RGB-D image. We compare our method to a HOG-based pedestrian detector on a high-definition dataset and show that our algorithm achieves up to 100% faster computation time without sacrificing significant detection accuracy.
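The compute-saving idea admits a short sketch: a cheap saliency pass proposes image windows, and the expensive detector runs only inside them. In this minimal Python illustration, a depth-based nearness rule is a stand-in for the biologically inspired filter described above, and every name and threshold is an assumption:

# Illustrative sketch: run an expensive detector only on regions a cheap
# saliency pass flags, instead of on the full high-resolution frame.
import numpy as np

def salient_windows(depth, window=128, stride=128, near_m=4.0):
    # Yield (x, y) corners of windows whose median depth is within near_m
    # meters; depth is an (H, W) array of per-pixel depth from an RGB-D camera.
    h, w = depth.shape
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            patch = depth[y:y + window, x:x + window]
            valid = patch[patch > 0]            # ignore missing depth readings
            if valid.size and np.median(valid) < near_m:
                yield x, y

def detect_in_salient_regions(rgb, depth, detector, window=128):
    # Run `detector` (any callable mapping an image crop to detections)
    # only on the salient crops, offsetting results to full-frame coordinates.
    results = []
    for x, y in salient_windows(depth, window):
        crop = rgb[y:y + window, x:x + window]
        results.extend((x, y, det) for det in detector(crop))
    return results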

ACKNOWLEDGEMENTS
 

CONTACT
 
You can find me literally anywhere, just push a button and I'm there.

