GLEAM: An illumination estimation framework for real-time photorealistic augmented reality on mobile devices

Siddhant Prakash, Alireza Bahremand, Linda D. Nguyen, Robert LiKamWa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Mixed reality mobile platforms attempt to co-locate virtual scenes with physical environments, towards creating immersive user experiences. However, to create visual harmony between virtual and physical spaces, the virtual scene must be accurately illuminated with realistic lighting that matches the physical environment. To this end, we design GLEAM, a framework that provides robust illumination estimation in real time by integrating physical light-probe estimation with current mobile AR systems. GLEAM visually observes reflective objects to compose a realistic estimation of physical lighting. Optionally, GLEAM can network multiple devices to sense illumination from different viewpoints and compose a richer estimation to enhance realism and fidelity. Using GLEAM, AR developers gain the freedom to use a wide range of materials, which is currently limited by the unrealistic appearance of materials that need accurate illumination, such as liquids, glass, and smooth metals. Our controlled-environment user studies across 30 participants reveal the effectiveness of GLEAM in providing robust and adaptive illumination estimation over commercial status quo solutions, such as pre-baked directional lighting and ARKit 2.0 illumination estimation. Our benchmarks reveal the need for situation-driven tradeoffs to optimize for quality factors in situations requiring freshness over quality and vice versa. Optimizing for different quality factors in different situations, GLEAM can update scene illumination as fast as 30 ms by sacrificing richness and fidelity in highly dynamic scenes, or prioritize quality by allowing an update interval as high as 400 ms in scenes that require high-fidelity estimation.
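To make the light-probe idea in the abstract concrete, the sketch below shows one common way to turn a camera crop of a mirrored-sphere probe into a lat-long environment map that a renderer could sample for image-based lighting: reflect the view ray off the sphere at each pixel and scatter the observed colors into the map. This is a minimal illustration only, not GLEAM's implementation; the function name, resolution defaults, and the single-view orthographic-camera assumption are introduced here and are not taken from the paper.

# Minimal, illustrative sketch (not GLEAM's actual code): convert a square
# camera crop of a mirrored-sphere light probe into a lat-long environment map.
# Assumes an orthographic view of the sphere along -Z; all names and defaults
# here are assumptions for illustration.
import numpy as np

def probe_to_latlong(probe_rgb, out_h=64, out_w=128):
    """probe_rgb: (N, N, 3) float array, a square crop of a mirrored sphere
    viewed along -Z. Returns an (out_h, out_w, 3) lat-long environment map."""
    n = probe_rgb.shape[0]
    # Pixel coordinates mapped to [-1, 1] across the sphere crop.
    ys, xs = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
    r2 = xs ** 2 + ys ** 2
    on_sphere = r2 <= 1.0                      # ignore corner pixels off the sphere
    nz = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))  # sphere normal z (orthographic view)
    normals = np.stack([xs, ys, nz], axis=-1)
    # Reflect the view direction v = (0, 0, -1) about each normal: d = v - 2(v.n)n
    view = np.array([0.0, 0.0, -1.0])
    d = view - 2.0 * (normals @ view)[..., None] * normals
    # Reflected directions -> lat-long (equirectangular) texel coordinates.
    theta = np.arccos(np.clip(d[..., 1], -1.0, 1.0))   # polar angle from +Y (up)
    phi = np.arctan2(d[..., 0], -d[..., 2])            # azimuth around +Y
    u = ((phi + np.pi) / (2.0 * np.pi) * (out_w - 1)).astype(int)
    v = (theta / np.pi * (out_h - 1)).astype(int)
    # Scatter probe pixels into the environment map and average collisions.
    env = np.zeros((out_h, out_w, 3))
    hits = np.zeros((out_h, out_w, 1))
    np.add.at(env, (v[on_sphere], u[on_sphere]), probe_rgb[on_sphere])
    np.add.at(hits, (v[on_sphere], u[on_sphere]), 1.0)
    return env / np.maximum(hits, 1.0)          # unobserved texels stay black

# Usage sketch: a renderer would sample this map as the scene's ambient and
# specular environment, refreshing it whenever a new probe frame arrives.
# env_map = probe_to_latlong(np.asarray(frame_crop, dtype=float) / 255.0)

In this framing, the abstract's freshness-versus-quality tradeoff corresponds to how often and at what resolution such a map is regenerated: a small, single-view map can be refreshed on the order of tens of milliseconds, while a richer multi-view composite takes longer to assemble.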

Original language: English (US)
Title of host publication: MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services
Publisher: Association for Computing Machinery, Inc
Pages: 142-154
Number of pages: 13
ISBN (Electronic): 9781450366618
DOIs: https://doi.org/10.1145/3307334.3326098
State: Published - Jun 12 2019
Event: 17th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2019 - Seoul, Korea, Republic of
Duration: Jun 17 2019 - Jun 21 2019

Publication series

Name: MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services

Conference

Conference: 17th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2019
Country: Korea, Republic of
City: Seoul
Period: 6/17/19 - 6/21/19

Fingerprint

  • Augmented reality
  • Mobile devices
  • Lighting
  • Glass
  • Liquids

Keywords

  • Augmented reality
  • Geometry
  • Image processing
  • Image-based lighting
  • Light estimation
  • Light probe
  • Lighting models

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications

Cite this

Prakash, S., Bahremand, A., Nguyen, L. D., & LiKamWa, R. (2019). GLEAM: An illumination estimation framework for real-time photorealistic augmented reality on mobile devices. In MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services (pp. 142-154). (MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services). Association for Computing Machinery, Inc. https://doi.org/10.1145/3307334.3326098

@inproceedings{8ce2f2cccd1b4ce7913591aae83d845e,
title = "GleaM: An illumination estimation framework for real-time photorealistic augmented reality on mobile devices",
abstract = "Mixed reality mobile platforms attempt to co-locate virtual scenes with physical environments, towards creating immersive user experiences. However, to create visual harmony between virtual and physical spaces, the virtual scene must be accurately illuminated with realistic lighting that matches the physical environment. To this end, we design GLEAM, a framework that provides robust illumination estimation in real-time by integrating physical light-probe estimation with current mobile AR systems. GLEAM visually observes reflective objects to compose a realistic estimation of physical lighting. Optionally, GLEAM can network multiple devices to sense illumination from different viewpoints and compose a richer estimation to enhance realism and fidelity. Using GLEAM, AR developers gain the freedom to use a wide range of materials, which is currently limited by the unrealistic appearance of materials that need accurate illumination, such as liquids, glass, and smooth metals. Our controlled environment user studies across 30 participants reveal the effectiveness of GLEAM in providing robust and adaptive illumination estimation over commercial status quo solutions, such as pre-baked directional lighting and ARKit 2.0 illumination estimation. Our benchmarks reveal the need for situation driven tradeoffs to optimize for quality factors in situations requiring freshness over quality and vice-versa. Optimizing for different quality factors in different situations, GLEAM can update scene illumination as fast as 30 ms by sacrificing richness and fidelity in highly dynamic scenes, or prioritize quality by allowing an update interval as high as 400 ms in scenes that require high-fidelity estimation.",
keywords = "Augmented reality, Geometry, Image processing, Image-based lighting, Light estimation, Light probe, Lighting models",
author = "Siddhant Prakash and Alireza Bahremand and Nguyen, {Linda D.} and Robert LiKamWa",
year = "2019",
month = "6",
day = "12",
doi = "10.1145/3307334.3326098",
language = "English (US)",
series = "MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services",
publisher = "Association for Computing Machinery, Inc",
pages = "142--154",
booktitle = "MobiSys 2019 - Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services",

}
