Appears in Collections: postgraduate thesis: Tensor-train-based methods : applications in image completion and probabilistic models
Field | Value
---|---
Title | Tensor-train-based methods : applications in image completion and probabilistic models
Authors | Ko, Ching-yun (柯璟芸)
Advisors | Wong, N
Issue Date | 2019
Publisher | The University of Hong Kong (Pokfulam, Hong Kong)
Citation | Ko, C. [柯璟芸]. (2019). Tensor-train-based methods : applications in image completion and probabilistic models. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR.
Abstract | Tensors are a higher-order generalization of vectors and matrices, constituting a natural representation for many real-life data that are intrinsically multi-way. In analogy to the significance of matrix QR factorization and singular value decomposition in matrix preconditioning and principal component analysis, tensor decomposition concepts have been deployed in modern engineering topics. In this thesis, we turn our attention to tensor-train-based methods and their applications in image completion and probabilistic models. Specifically, we generalize the tensor-train updating schemes in the literature to tensor completion tasks and density estimation. The first part of the thesis proposes a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, and the known tensor entries together with their coordinates are used to update the tensor train. A novel tensor train initialization procedure is proposed specifically for image and video completion, and is demonstrated to ensure fast convergence of the completion algorithm. The tensor train framework is also shown to easily accommodate Total Variation and Tikhonov regularization thanks to their low-rank tensor train representations. Image and video inpainting experiments verify the superiority of the proposed scheme in terms of both speed and scalability, with a speedup of up to 155× observed over state-of-the-art tensor completion methods at similar accuracy. Moreover, we demonstrate that the proposed scheme is especially advantageous over existing algorithms when only a tiny portion (say, 1%) of the to-be-completed images/videos is known. The second part of the thesis introduces sum-product networks (SPNs), an emerging class of neural networks with clear probabilistic semantics and superior inference speed over other graphical models. We reveal an important connection between SPNs and tensor trains, leading to a new canonical form which we call tensor SPNs (tSPNs), and demonstrate the intimate relationship between a valid SPN and a tensor train. For the first time, through mapping an SPN onto a tSPN and employing specially customized optimization techniques, we demonstrate improvements of up to a factor of 100 in both model compression and inference speed on various datasets with negligible loss in accuracy.
Degree | Master of Philosophy
Subject | Tensor products; Image processing; Graphical modeling (Statistics)
Dept/Program | Electrical and Electronic Engineering
Persistent Identifier | http://hdl.handle.net/10722/279283
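The completion scheme summarized in the abstract represents the unknown tensor as a low-rank tensor train and fits the train's cores to the observed entries. As a minimal, purely illustrative sketch (not the thesis's actual algorithm), the snippet below shows the basic TT operation such a scheme relies on: reading a single entry off the cores as a product of small matrix slices. The `tt_entry` helper, the core shapes, and the random cores are all hypothetical.

```python
import numpy as np

def tt_entry(cores, index):
    """Evaluate one tensor entry from its tensor-train (TT) cores.

    cores[k] has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1, so
    T[i_1, ..., i_d] is the product of the selected core slices.
    """
    vec = np.ones((1, 1))
    for core, i in zip(cores, index):
        vec = vec @ core[:, i, :]  # (1, r_{k-1}) @ (r_{k-1}, r_k)
    return vec[0, 0]

# Hypothetical toy setup: a rank-2 TT representing a 4 x 4 x 4 tensor.
rng = np.random.default_rng(0)
cores = [rng.standard_normal(s) for s in [(1, 4, 2), (2, 4, 2), (2, 4, 1)]]
print(tt_entry(cores, (1, 3, 2)))
```

A completion method in this spirit repeatedly compares such reconstructed entries against the known pixels and adjusts the cores, so storage and per-entry cost scale with the TT ranks rather than with the full tensor size.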
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Wong, N | - |
dc.contributor.author | Ko, Ching-yun | - |
dc.contributor.author | 柯璟芸 | - |
dc.date.accessioned | 2019-10-24T08:28:44Z | - |
dc.date.available | 2019-10-24T08:28:44Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Ko, C. [柯璟芸]. (2019). Tensor-train-based methods : applications in image completion and probabilistic models. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. | - |
dc.identifier.uri | http://hdl.handle.net/10722/279283 | - |
dc.description.abstract | Tensors are a higher-order generalization of vectors and matrices, constituting a natural representation for many real-life data that are intrinsically multi-way. In analogy to the significance of matrix QR factorization and singular value decomposition in matrix preconditioning and principal component analysis, tensor decomposition concepts have been deployed in modern engineering topics. In this thesis, we turn our attention to tensor-train-based methods and their applications in image completion and probabilistic models. Specifically, we generalize the tensor-train updating schemes in the literature to tensor completion tasks and density estimation. The first part of the thesis proposes a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, and the known tensor entries together with their coordinates are used to update the tensor train. A novel tensor train initialization procedure is proposed specifically for image and video completion, and is demonstrated to ensure fast convergence of the completion algorithm. The tensor train framework is also shown to easily accommodate Total Variation and Tikhonov regularization thanks to their low-rank tensor train representations. Image and video inpainting experiments verify the superiority of the proposed scheme in terms of both speed and scalability, with a speedup of up to 155× observed over state-of-the-art tensor completion methods at similar accuracy. Moreover, we demonstrate that the proposed scheme is especially advantageous over existing algorithms when only a tiny portion (say, 1%) of the to-be-completed images/videos is known. The second part of the thesis introduces sum-product networks (SPNs), an emerging class of neural networks with clear probabilistic semantics and superior inference speed over other graphical models. We reveal an important connection between SPNs and tensor trains, leading to a new canonical form which we call tensor SPNs (tSPNs), and demonstrate the intimate relationship between a valid SPN and a tensor train. For the first time, through mapping an SPN onto a tSPN and employing specially customized optimization techniques, we demonstrate improvements of up to a factor of 100 in both model compression and inference speed on various datasets with negligible loss in accuracy. | -
dc.language | eng | - |
dc.publisher | The University of Hong Kong (Pokfulam, Hong Kong) | - |
dc.relation.ispartof | HKU Theses Online (HKUTO) | - |
dc.rights | The author retains all proprietary rights (such as patent rights) and the right to use in future works. | -
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject.lcsh | Tensor products | - |
dc.subject.lcsh | Image processing | - |
dc.subject.lcsh | Graphical modeling (Statistics) | - |
dc.title | Tensor-train-based methods : applications in image completion and probabilistic models | - |
dc.type | PG_Thesis | - |
dc.description.thesisname | Master of Philosophy | - |
dc.description.thesislevel | Master | - |
dc.description.thesisdiscipline | Electrical and Electronic Engineering | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.5353/th_991044158737903414 | - |
dc.date.hkucongregation | 2019 | - |
dc.identifier.mmsid | 991044158737903414 | - |
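For the probabilistic-model part of the abstract, the SPN-to-tensor-train connection can be illustrated in the same spirit: a tensor train with nonnegative cores encodes an (unnormalized) joint distribution over discrete variables, with per-variable sums playing the role of SPN sum nodes and the chained core contractions playing the role of product nodes. The sketch below is a generic illustration under that assumption, not the thesis's tSPN construction; `tt_prob`, `tt_partition`, and the toy cores are hypothetical.

```python
import numpy as np

def tt_prob(cores, assignment):
    """Unnormalized probability of a full assignment under a nonnegative TT."""
    vec = np.ones((1, 1))
    for core, x in zip(cores, assignment):
        vec = vec @ core[:, x, :]  # select the observed state of each variable
    return vec[0, 0]

def tt_partition(cores):
    """Normalizing constant: marginalize each variable, then contract the chain."""
    vec = np.ones((1, 1))
    for core in cores:
        vec = vec @ core.sum(axis=1)  # sum over the k-th variable's states
    return vec[0, 0]

# Hypothetical toy model: three binary variables, TT rank 2, nonnegative cores.
rng = np.random.default_rng(1)
cores = [rng.random((1, 2, 2)), rng.random((2, 2, 2)), rng.random((2, 2, 1))]
print(tt_prob(cores, (0, 1, 1)) / tt_partition(cores))
```

Both quantities reduce to a chain of small matrix products, which hints at why mapping an SPN onto such a compact form can yield the kind of compression and inference speedups reported in the abstract.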