
Postgraduate thesis: Workflow-assisted digital sculpting

Title: Workflow-assisted digital sculpting
Authors: Peng, Mengqi (彭梦琪)
Advisors: Cheng, CKR; Wang, WP
Issue Date: 2020
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Peng, M. [彭梦琪]. (2020). Workflow-assisted digital sculpting. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract: Digital content creation plays a significant role in computer graphics and human-computer interaction, yet it remains a challenging task. Recent digital advances have alleviated the difficulties for certain forms of content, especially text document editing. However, authoring more artistic content, such as sketches, models, and animations, still demands significant expertise and effort. The main challenge is to design user-friendly interfaces backed by general and effective methods. This thesis focuses on freeform 3D digital sculpting and designs interactive systems that assist users in creating diverse sculpted content. Our main idea is to leverage workflows, i.e., the users' intermediate content authoring processes. The rich information contained in workflow histories can provide cues for future editing, which we exploit to build novel interface features. In this thesis, we introduce two novel workflow-driven systems that analyze and synthesize sculpting workflows. First, we present a 3D sculpting system that utilizes recorded brushing workflows to assist interactive sculpting via online suggestions. Through an interactive sculpting interface, our system silently records and analyzes users' workflows, including brush strokes and camera movements, and predicts what they might do next. Users can accept, partially accept, or ignore the suggestions and thus retain full control and their individual styles. Our key idea is to consider how a model is authored via dynamic workflows in addition to what is shaped in the static geometry. 3D sculpting dynamically deforms the base object shape: later strokes can overlap and alter earlier ones, affecting both the visible/explicit mesh shape and the invisible/implicit stroke relationships. We analyze how the existing geometry and the dynamic workflows relate to one another in order to predict future workflows. This allows our method to analyze user intentions more accurately and to synthesize shape structures more generally, so as to author diverse sculpted objects. Second, we extend the static sculpting system to the temporal dimension to create keyframe-based sculpting animations, with the capability to autocomplete user edits across frames. Specifically, we extend the prior workflow analysis to capture both local and global context and structure across frames. Our interface functions like traditional desktop and immersive VR keyframe-based brushing systems, with which users can freely brush spatial structures and their temporal changes. Meanwhile, our system analyzes their workflows and predicts what they might do next via a suite of interactive autocomplete tools, both spatial and temporal. As before, users maintain full control over the suggestions. Our method can match strokes that differ in apparent properties, such as length or temporal order, yet reflect similar user intentions across frames. Our system presents a new possibility for keyframe sculpting authoring within a unified interface. It supports various shape and motion styles, including those difficult to achieve with existing animation systems, such as topological changes that cannot be accomplished via simple rig-based deformations. Our sculpting systems are the first with interactive autocomplete features, and they can inspire future research on further understanding and assisting the challenging task of 3D sculpting content creation.
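To make the workflow-recording-and-suggestion idea in the abstract concrete, the following is a minimal, hypothetical Python sketch. The names (Stroke, WorkflowRecorder, suggest_next_stroke), the recorded fields, and the simple repeat-and-offset heuristic are illustrative assumptions, not the thesis's actual data structures or prediction algorithm.

```python
# Hypothetical sketch of workflow recording and online stroke suggestion.
# All names and the extrapolation heuristic are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np


@dataclass
class Stroke:
    """One recorded sculpting brush stroke."""
    samples: np.ndarray      # (n, 3) sample positions along the stroke
    radius: float            # brush radius
    pressure: float          # average stylus pressure
    camera_pose: np.ndarray  # 4x4 view matrix active while brushing


@dataclass
class WorkflowRecorder:
    """Silently accumulates the user's editing history."""
    strokes: List[Stroke] = field(default_factory=list)

    def record(self, stroke: Stroke) -> None:
        self.strokes.append(stroke)

    def suggest_next_stroke(self) -> Optional[Stroke]:
        """Propose a follow-up stroke by extrapolating the latest repetition.

        If the last two strokes look similar (comparable length and radius),
        assume the user is repeating a pattern and offset the last stroke by
        the displacement between the two stroke centroids.
        """
        if len(self.strokes) < 2:
            return None
        prev, last = self.strokes[-2], self.strokes[-1]
        len_prev = np.linalg.norm(np.diff(prev.samples, axis=0), axis=1).sum()
        len_last = np.linalg.norm(np.diff(last.samples, axis=0), axis=1).sum()
        similar = (abs(len_prev - len_last) < 0.2 * max(len_prev, len_last)
                   and abs(prev.radius - last.radius) < 0.2 * last.radius)
        if not similar:
            return None
        offset = last.samples.mean(axis=0) - prev.samples.mean(axis=0)
        return Stroke(samples=last.samples + offset,
                      radius=last.radius,
                      pressure=last.pressure,
                      camera_pose=last.camera_pose)
```

In the systems described by the thesis, the analysis is far richer than this sketch: it also considers camera movements, stroke relationships on the evolving mesh, and, for the animation system, matching of strokes across keyframes. Suggestions are then shown in the interface, where users can accept, partially accept, or ignore them.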
Degree: Doctor of Philosophy
Subjects: Computer graphics; Three-dimensional display systems
Dept/Program: Computer Science
Persistent Identifier: http://hdl.handle.net/10722/286788

 

DC Field | Value | Language
dc.contributor.advisor | Cheng, CKR | -
dc.contributor.advisor | Wang, WP | -
dc.contributor.author | Peng, Mengqi | -
dc.contributor.author | 彭梦琪 | -
dc.date.accessioned | 2020-09-05T01:20:56Z | -
dc.date.available | 2020-09-05T01:20:56Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | Peng, M. [彭梦琪]. (2020). Workflow-assisted digital sculpting. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. | -
dc.identifier.uri | http://hdl.handle.net/10722/286788 | -
dc.description.abstract | (abstract as above) | -
dc.language | eng | -
dc.publisher | The University of Hong Kong (Pokfulam, Hong Kong) | -
dc.relation.ispartof | HKU Theses Online (HKUTO) | -
dc.rights | The author retains all proprietary rights (such as patent rights) and the right to use in future works. | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | -
dc.subject.lcsh | Computer graphics | -
dc.subject.lcsh | Three-dimensional display systems | -
dc.title | Workflow-assisted digital sculpting | -
dc.type | PG_Thesis | -
dc.description.thesisname | Doctor of Philosophy | -
dc.description.thesislevel | Doctoral | -
dc.description.thesisdiscipline | Computer Science | -
dc.description.nature | published_or_final_version | -
dc.date.hkucongregation | 2020 | -
dc.identifier.mmsid | 991044268206903414 | -
