Part of the book series: Lecture Notes in Computer Science (LNCS, volume 10945)
Abstract
In this paper we set out to find a new technical and commercial solution for easily acquiring a virtual model of existing machinery for visualisation in a VR environment. To this end we introduce an image-based scanning approach with an initial focus on a monocular (handheld) capturing device such as a portable camera. Camera poses are estimated with a Simultaneous Localisation and Mapping (SLAM) technique. Depending on the required quality, offline calibration is incorporated by means of ArUco markers placed within the captured scene. Once the images are captured, they are compressed in a format that allows rapid, low-latency streaming and decoding on the GPU. Finally, when the model is viewed in a VR environment, an optical flow method interpolates between the triangulation of the captured viewpoints to deliver a smooth VR experience. We believe our tool will facilitate the capture of machinery into VR, providing benefits ranging from marketing to offsite assistance and remote maintenance.
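The view-interpolation step described above can be sketched in miniature: when the viewer's position falls inside a triangle of captured viewpoints, blending weights can be derived from barycentric coordinates. This is an illustrative NumPy sketch only, not the paper's implementation; the function names are hypothetical, and the actual method warps images via optical flow before blending rather than blending them directly.

```python
import numpy as np

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of 2D point p inside triangle (a, b, c)."""
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - v - w, v, w])

def blend_views(images, weights):
    """Weighted blend of the three vertex images.

    Stand-in for the paper's flow-warped blending: in the real pipeline each
    image would first be warped toward the query viewpoint via optical flow.
    """
    return sum(w * img for w, img in zip(weights, images))
```

At a captured viewpoint the weights collapse to a single image, so the interpolation reproduces the original capture exactly; in between, the contribution of each viewpoint falls off smoothly.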
Acknowledgements
This research was partially supported by Flanders Make, the strategic research centre for the manufacturing industry, within the Flanders Make FLEXAS_VR project.
We also gratefully acknowledge the European Fund for Regional Development (ERDF) and the Flemish Government, which fund part of the research at the Expertise Centre for Digital Media.
Author information
Authors and Affiliations
Expertise Centre for Digital Media, Hasselt University - tUL - Flanders Make, Wetenschapspark 2, 3590, Diepenbeek, Belgium
Jeroen Put, Nick Michiels, Fabian Di Fiore & Frank Van Reeth
Corresponding author
Correspondence to Fabian Di Fiore.
Editor information
Editors and Affiliations
UIB – Universitat de les Illes Balears, Palma de Mallorca, Spain
Francisco José Perales
University of Surrey, Guildford, United Kingdom
Josef Kittler
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this paper
Cite this paper
Put, J., Michiels, N., Di Fiore, F., Van Reeth, F. (2018). Capturing Industrial Machinery into Virtual Reality. In: Perales, F., Kittler, J. (eds) Articulated Motion and Deformable Objects. AMDO 2018. Lecture Notes in Computer Science, vol. 10945. Springer, Cham. https://doi.org/10.1007/978-3-319-94544-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-94543-9
Online ISBN: 978-3-319-94544-6
eBook Packages: Computer Science