
Postgraduate thesis: Mixed reality for interactive modeling, fabrication and visualization

Title: Mixed reality for interactive modeling, fabrication and visualization
Author: Yue, Yating [岳雅婷]
Advisor(s): Wang, WP
Issue Date: 2018
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Yue, Y. [岳雅婷]. (2018). Mixed reality for interactive modeling, fabrication and visualization. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract: Mixed reality is the blending of the physical and digital worlds. Advances in sensors and related techniques open up innovative interaction possibilities among humans, computers, and the environment. Thanks to developments in 3D sensing and modeling, state-of-the-art mixed reality devices can digitize the physical world. This unique capability bridges the gap between virtuality and reality and greatly improves the user experience. This thesis presents three novel interactive systems that explore the contact and interaction among human, computer, and environment in mixed reality.

For merging virtual objects into reality, current solutions perform well only when the virtual content complements the real scene. Visual artifacts easily arise when reality must be modified to accommodate virtuality (e.g., removing real objects to make room for virtual models), a common requirement in mixed reality applications such as room redecoration and environment design. We therefore present a novel system called `SceneCtrl' that lets the user interactively edit the real scene sensed by a HoloLens, so that reality can be adapted to suit virtuality. Our proof-of-concept prototype employs scene reconstruction and understanding to enable efficient editing, such as deleting, moving, and copying real objects in the scene. We also demonstrate SceneCtrl on a number of example scenarios in mixed reality, verifying that the enhanced experience resolves conflicts between virtuality and reality.

Next, we propose `WireDraw', an immersive guidance system for 3D wire sculpting. Whereas SceneCtrl edits reality only visually, WireDraw guides the user in physically producing real artwork under mixed reality guidance. The availability of commodity 3D extruder pens allows novice users to draw 3D wire sculptures directly, enabling many novel applications such as intuitive spatial-intelligence development for school students. However, the lack of spatial and structural cues among individual pen strokes makes the 3D drawing process challenging, often leading to highly distorted or even incomplete wire sculptures. In WireDraw, we mount two web cameras on an Oculus headset to create a video see-through device for mixed reality guidance. On-the-fly edits to unsatisfactory strokes are also supported for creative design. We demonstrate the effectiveness of the system by testing it on a variety of wire models and conducting a user study; the results show that the visual guidance our system provides is extremely helpful for drawing high-quality wire sculptures.

Beyond guidance specific to 3D drawing, we present a context-aware structural data visualization system that offers more general guidance and assistance. Data visualization is indispensable for understanding complex data. Because most data are abstracted from real things, we display virtual data based on the physical environment rather than merely showing 2D or 3D graphics on a screen. This context-aware visualization system combines physical information with data structure, providing active assistance and guidance to users based on their location. A user study is also designed to evaluate the different visualization approaches and user behaviors.
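As a rough illustration of the scene-editing operations the abstract attributes to SceneCtrl (delete, move, copy of recognized real objects), the sketch below models a reconstructed, labeled scene as a plain object list. This is a hypothetical toy, not the thesis implementation: the names `SceneObject` and `EditableScene` are invented, and real scene editing would also involve geometry inpainting and re-rendering, which are omitted here.

```python
from dataclasses import dataclass, replace
from typing import List, Tuple

@dataclass(frozen=True)
class SceneObject:
    label: str
    position: Tuple[float, float, float]  # object centroid in room coordinates

class EditableScene:
    """Toy stand-in for a reconstructed, semantically labeled scene."""

    def __init__(self, objects: List[SceneObject]):
        self.objects = list(objects)

    def delete(self, label: str) -> None:
        # Deleting a real object would also require inpainting the
        # revealed geometry and texture; omitted in this sketch.
        self.objects = [o for o in self.objects if o.label != label]

    def move(self, label: str, offset: Tuple[float, float, float]) -> None:
        dx, dy, dz = offset
        self.objects = [
            replace(o, position=(o.position[0] + dx,
                                 o.position[1] + dy,
                                 o.position[2] + dz))
            if o.label == label else o
            for o in self.objects
        ]

    def copy(self, label: str, offset: Tuple[float, float, float]) -> None:
        dx, dy, dz = offset
        clones = [replace(o, position=(o.position[0] + dx,
                                       o.position[1] + dy,
                                       o.position[2] + dz))
                  for o in self.objects if o.label == label]
        self.objects.extend(clones)

scene = EditableScene([SceneObject("chair", (1.0, 0.0, 2.0)),
                       SceneObject("table", (0.0, 0.0, 0.0))])
scene.copy("chair", (1.5, 0.0, 0.0))     # duplicate the chair alongside the original
scene.delete("table")                    # free floor space for virtual content
print([o.label for o in scene.objects])  # → ['chair', 'chair']
```

The point of the sketch is only that, once the sensed scene is reconstructed and segmented into labeled objects, "editing reality" reduces to ordinary list operations on those objects plus re-rendering the modified scene over the live view.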
Degree: Doctor of Philosophy
Subject: Virtual reality
Augmented reality
Human-computer interaction
Production engineering
Dept/Program: Computer Science
Persistent Identifier: http://hdl.handle.net/10722/263157

 

DC Field / Value
dc.contributor.advisor: Wang, WP
dc.contributor.author: Yue, Yating
dc.contributor.author: 岳雅婷
dc.date.accessioned: 2018-10-16T07:34:47Z
dc.date.available: 2018-10-16T07:34:47Z
dc.date.issued: 2018
dc.identifier.citation: Yue, Y. [岳雅婷]. (2018). Mixed reality for interactive modeling, fabrication and visualization. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
dc.identifier.uri: http://hdl.handle.net/10722/263157
dc.language: eng
dc.publisher: The University of Hong Kong (Pokfulam, Hong Kong)
dc.relation.ispartof: HKU Theses Online (HKUTO)
dc.rights: The author retains all proprietary rights (such as patent rights) and the right to use in future works.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject.lcsh: Virtual reality
dc.subject.lcsh: Augmented reality
dc.subject.lcsh: Human-computer interaction
dc.subject.lcsh: Production engineering
dc.title: Mixed reality for interactive modeling, fabrication and visualization
dc.type: PG_Thesis
dc.description.thesisname: Doctor of Philosophy
dc.description.thesislevel: Doctoral
dc.description.thesisdiscipline: Computer Science
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.5353/th_991044046694503414
dc.date.hkucongregation: 2018
dc.identifier.mmsid: 991044046694503414
