TY - JOUR
T1 - Low to High Dimensional Modality Hallucination Using Aggregated Fields of View
AU - Gunasekar, Kausic
AU - Qiu, Qiang
AU - Yang, Yezhou
N1 - Funding Information:
Manuscript received September 10, 2019; accepted January 8, 2020. Date of publication January 31, 2020; date of current version February 12, 2020. This letter was recommended for publication by Associate Editors L. Liu and C. Cadena Lerma upon evaluation of the reviewers’ comments. This work was supported in part by the National Science Foundation Robust Intelligence program under Grant #1750082, in part by the National Robotics Initiative program under Grant #1925403, in part by an AWS Machine Learning Research Award, and in part by GPU donations from NVIDIA. (Corresponding author: Yezhou Yang.) K. Gunasekar and Y. Yang are with Arizona State University, Tempe, AZ 85281 USA (e-mail: kgunase3@asu.edu; yz.yang@asu.edu).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/4
Y1 - 2020/4
N2 - Real-world robotics systems deal with data from a multitude of modalities, especially for tasks such as navigation and recognition. The performance of those systems can drastically degrade when one or more modalities become inaccessible due to factors such as sensor malfunctions or adverse environments. Here, we argue that modality hallucination is one effective way to ensure consistent modality availability and thereby reduce unfavorable consequences. While hallucinating data from a modality with richer information, e.g., RGB to depth, has been researched extensively, we investigate the more challenging low-to-high modality hallucination, with interesting use cases in robotics and autonomous systems. We present a novel hallucination architecture that aggregates information from multiple fields of view of the local neighborhood to recover the lost information from the extant modality. The process is implemented by capturing a non-linear mapping between the data modalities, and the learned mapping is used to aid the extant modality and mitigate the risk posed to the system in adverse scenarios that involve modality loss. We also conduct extensive classification and segmentation experiments on the UWRGBD and NYUD datasets and demonstrate that hallucination allays the negative effects of modality loss. Implementation and models: https://github.com/kausic94/Hallucination.
AB - Real-world robotics systems deal with data from a multitude of modalities, especially for tasks such as navigation and recognition. The performance of those systems can drastically degrade when one or more modalities become inaccessible due to factors such as sensor malfunctions or adverse environments. Here, we argue that modality hallucination is one effective way to ensure consistent modality availability and thereby reduce unfavorable consequences. While hallucinating data from a modality with richer information, e.g., RGB to depth, has been researched extensively, we investigate the more challenging low-to-high modality hallucination, with interesting use cases in robotics and autonomous systems. We present a novel hallucination architecture that aggregates information from multiple fields of view of the local neighborhood to recover the lost information from the extant modality. The process is implemented by capturing a non-linear mapping between the data modalities, and the learned mapping is used to aid the extant modality and mitigate the risk posed to the system in adverse scenarios that involve modality loss. We also conduct extensive classification and segmentation experiments on the UWRGBD and NYUD datasets and demonstrate that hallucination allays the negative effects of modality loss. Implementation and models: https://github.com/kausic94/Hallucination.
KW - Deep learning in robotics and automation
KW - computer vision for other robotic applications
KW - modality hallucination
KW - robot safety
UR - http://www.scopus.com/inward/record.url?scp=85079797595&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85079797595&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.2970679
DO - 10.1109/LRA.2020.2970679
M3 - Article
AN - SCOPUS:85079797595
SN - 2377-3766
VL - 5
SP - 1983
EP - 1990
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
M1 - 8977350
ER -