Postgraduate thesis: Tensor-train-based methods : applications in image completion and probabilistic models

Title: Tensor-train-based methods : applications in image completion and probabilistic models
Authors: Ko, Ching-yun (柯璟芸)
Advisors: Wong, N
Issue Date: 2019
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Ko, C. [柯璟芸]. (2019). Tensor-train-based methods : applications in image completion and probabilistic models. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract: Tensors are a higher-order generalization of vectors and matrices, constituting a natural representation for many real-life data that are intrinsically multi-way. In analogy to the significance of matrix QR factorization and singular value decomposition in matrix preconditioning and principal component analysis, tensor decomposition concepts have been deployed in modern engineering topics. In this thesis, we focus on tensor-train-based methods and their applications in image completion and probabilistic models. Specifically, we generalize tensor-train updating schemes from the literature to tensor completion tasks and density estimation. The first part of the thesis proposes a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, whose cores are updated using the known tensor entries and their coordinates. A novel tensor train initialization procedure is proposed specifically for image and video completion, and is demonstrated to ensure fast convergence of the completion algorithm. The tensor train framework is also shown to easily accommodate Total Variation and Tikhonov regularization, owing to their low-rank tensor train representations. Image and video inpainting experiments verify the superiority of the proposed scheme in terms of both speed and scalability, with speedups of up to 155× over state-of-the-art tensor completion methods at similar accuracy. Moreover, we demonstrate that the proposed scheme is especially advantageous over existing algorithms when only a tiny portion (say, 1%) of the to-be-completed images/videos is known. The second part of the thesis introduces sum-product networks (SPNs), an emerging class of neural networks with clear probabilistic semantics and faster inference than other graphical models. We reveal an important connection between SPNs and tensor trains, leading to a new canonical form which we call tensor SPNs (tSPNs), and demonstrate the intimate relationship between a valid SPN and a tensor train. For the first time, by mapping an SPN onto a tSPN and employing specially customized optimization techniques, we demonstrate improvements of up to 100× in both model compression and inference speed on various datasets, with negligible loss in accuracy.
Degree: Master of Philosophy
Subjects: Tensor products; Image processing; Graphical modeling (Statistics)
Dept/Program: Electrical and Electronic Engineering
Persistent Identifier: http://hdl.handle.net/10722/279283
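As a concrete illustration of the tensor-train (TT) format the abstract builds on, the minimal NumPy sketch below shows how a d-way tensor can be stored as a chain of small three-way cores and how a single entry is evaluated from them. This is an illustrative sketch only, not code from the thesis: the function names (tt_random_cores, tt_entry, tt_full), the shapes, and the rank values are assumptions chosen for the example.

```python
# Minimal sketch (not the thesis code) of the tensor-train (TT) format:
# a d-way tensor is stored as a chain of small 3-way cores, and any single
# entry is recovered by a short sequence of small matrix products, which is
# what makes entry-wise evaluation from a few known pixels cheap.
import numpy as np

def tt_random_cores(dims, ranks, rng=None):
    """Random TT cores: core k has shape (ranks[k], dims[k], ranks[k+1]),
    with boundary ranks ranks[0] = ranks[-1] = 1."""
    rng = np.random.default_rng(rng)
    assert len(ranks) == len(dims) + 1 and ranks[0] == ranks[-1] == 1
    return [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
            for k in range(len(dims))]

def tt_entry(cores, index):
    """Evaluate one tensor entry T[i_1, ..., i_d] from its TT cores:
    a product of d small matrices instead of a lookup in the full tensor."""
    acc = cores[0][:, index[0], :]            # shape (1, r_1)
    for core, i in zip(cores[1:], index[1:]):
        acc = acc @ core[:, i, :]             # (1, r_k) @ (r_k, r_{k+1})
    return acc.item()                          # final (1, 1) -> scalar

def tt_full(cores):
    """Reconstruct the full tensor (only feasible for tiny examples)."""
    dims = [c.shape[1] for c in cores]
    out = np.empty(dims)
    for index in np.ndindex(*dims):
        out[index] = tt_entry(cores, index)
    return out

# Tiny usage example: a 4x5x6 tensor with TT ranks (1, 2, 3, 1) needs only
# 1*4*2 + 2*5*3 + 3*6*1 = 56 parameters instead of 4*5*6 = 120 entries.
cores = tt_random_cores(dims=[4, 5, 6], ranks=[1, 2, 3, 1], rng=0)
print(tt_entry(cores, (1, 2, 3)), tt_full(cores)[1, 2, 3])
```

In this toy case the TT format stores 56 numbers instead of 120; the gap grows rapidly with tensor size, which is the basic reason low-rank TT models scale well for the image/video completion and the tSPN evaluations described in the abstract.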

 

DC Field: Value
dc.contributor.advisor: Wong, N
dc.contributor.author: Ko, Ching-yun
dc.contributor.author: 柯璟芸
dc.date.accessioned: 2019-10-24T08:28:44Z
dc.date.available: 2019-10-24T08:28:44Z
dc.date.issued: 2019
dc.identifier.citation: Ko, C. [柯璟芸]. (2019). Tensor-train-based methods : applications in image completion and probabilistic models. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
dc.identifier.uri: http://hdl.handle.net/10722/279283
dc.language: eng
dc.publisher: The University of Hong Kong (Pokfulam, Hong Kong)
dc.relation.ispartof: HKU Theses Online (HKUTO)
dc.rights: The author retains all proprietary rights (such as patent rights) and the right to use in future works.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject.lcsh: Tensor products
dc.subject.lcsh: Image processing
dc.subject.lcsh: Graphical modeling (Statistics)
dc.title: Tensor-train-based methods : applications in image completion and probabilistic models
dc.type: PG_Thesis
dc.description.thesisname: Master of Philosophy
dc.description.thesislevel: Master
dc.description.thesisdiscipline: Electrical and Electronic Engineering
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.5353/th_991044158737903414
dc.date.hkucongregation: 2019
dc.identifier.mmsid: 991044158737903414
