Conference Paper: Revisiting Event-Based Video Frame Interpolation

Title: Revisiting Event-Based Video Frame Interpolation
Authors: Chen, Jiaben; Zhu, Yichen; Lian, Dongze; Yang, Jiaqi; Wang, Yifu; Zhang, Renrui; Liu, Xinhang; Qian, Shenhan; Kneip, Laurent; Gao, Shenghua
Issue Date: 2023
Citation: IEEE International Conference on Intelligent Robots and Systems, 2023, p. 1292-1299
Abstract: Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation. Existing state-of-the-art methods follow the paradigm of combining both synthesis-based and warping networks. However, few of those methods fully respect the intrinsic characteristics of event streams. Given that event cameras only encode intensity changes and polarity rather than color intensities, estimating optical flow from events is arguably more difficult than from RGB information. We therefore propose to incorporate RGB information in an event-guided optical flow refinement strategy. Moreover, in light of the quasi-continuous nature of the time signals provided by event cameras, we propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages rather than in a single, long stage. Extensive experiments on both synthetic and real-world datasets show that these modifications lead to more reliable and realistic intermediate frame results than previous video frame interpolation methods. Our findings underline that a careful consideration of event characteristics such as high temporal density and elevated noise benefits interpolation accuracy.
Persistent Identifier: http://hdl.handle.net/10722/345373
ISSN: 2153-0858
2023 SCImago Journal Rankings: 1.094

DC Field: Value
dc.contributor.author: Chen, Jiaben
dc.contributor.author: Zhu, Yichen
dc.contributor.author: Lian, Dongze
dc.contributor.author: Yang, Jiaqi
dc.contributor.author: Wang, Yifu
dc.contributor.author: Zhang, Renrui
dc.contributor.author: Liu, Xinhang
dc.contributor.author: Qian, Shenhan
dc.contributor.author: Kneip, Laurent
dc.contributor.author: Gao, Shenghua
dc.date.accessioned: 2024-08-15T09:26:56Z
dc.date.available: 2024-08-15T09:26:56Z
dc.date.issued: 2023
dc.identifier.citation: IEEE International Conference on Intelligent Robots and Systems, 2023, p. 1292-1299
dc.identifier.issn: 2153-0858
dc.identifier.uri: http://hdl.handle.net/10722/345373
dc.description.abstract: Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation. Existing state-of-the-art methods follow the paradigm of combining both synthesis-based and warping networks. However, few of those methods fully respect the intrinsic characteristics of event streams. Given that event cameras only encode intensity changes and polarity rather than color intensities, estimating optical flow from events is arguably more difficult than from RGB information. We therefore propose to incorporate RGB information in an event-guided optical flow refinement strategy. Moreover, in light of the quasi-continuous nature of the time signals provided by event cameras, we propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages rather than in a single, long stage. Extensive experiments on both synthetic and real-world datasets show that these modifications lead to more reliable and realistic intermediate frame results than previous video frame interpolation methods. Our findings underline that a careful consideration of event characteristics such as high temporal density and elevated noise benefits interpolation accuracy.
dc.language: eng
dc.relation.ispartof: IEEE International Conference on Intelligent Robots and Systems
dc.title: Revisiting Event-Based Video Frame Interpolation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/IROS55552.2023.10341804
dc.identifier.scopus: eid_2-s2.0-85182523276
dc.identifier.spage: 1292
dc.identifier.epage: 1299
dc.identifier.eissn: 2153-0866
