Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/IROS55552.2023.10341804
- Scopus: eid_2-s2.0-85182523276

Citations:
- Scopus: 0
Conference Paper: Revisiting Event-Based Video Frame Interpolation
Title | Revisiting Event-Based Video Frame Interpolation |
---|---|
Authors | Chen, Jiaben; Zhu, Yichen; Lian, Dongze; Yang, Jiaqi; Wang, Yifu; Zhang, Renrui; Liu, Xinhang; Qian, Shenhan; Kneip, Laurent; Gao, Shenghua |
Issue Date | 2023 |
Citation | IEEE International Conference on Intelligent Robots and Systems, 2023, p. 1292-1299 |
Abstract | Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation. Existing state-of-the-art methods follow the paradigm of combining both synthesis-based and warping networks. However, few of those methods fully respect the intrinsic characteristics of event streams. Given that event cameras only encode intensity changes and polarity rather than color intensities, estimating optical flow from events is arguably more difficult than from RGB information. We therefore propose to incorporate RGB information in an event-guided optical flow refinement strategy. Moreover, in light of the quasi-continuous nature of the time signals provided by event cameras, we propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages rather than in a single, long stage. Extensive experiments on both synthetic and real-world datasets show that these modifications lead to more reliable and realistic intermediate frame results than previous video frame interpolation methods. Our findings underline that a careful consideration of event characteristics such as high temporal density and elevated noise benefits interpolation accuracy. |
Persistent Identifier | http://hdl.handle.net/10722/345373 |
ISSN | 2153-0858 |
2023 SCImago Journal Rankings | 1.094 |
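The abstract's divide-and-conquer strategy — synthesizing intermediate frames incrementally over recursively halved time spans rather than in one long stage — can be sketched as recursive midpoint synthesis. This is a hypothetical illustration of the scheduling idea only, not the paper's learned networks; `synthesize_midpoint` stands in for the actual event-based synthesis model.

```python
def synthesize_midpoint(frame_a, frame_b, events):
    """Placeholder for the learned synthesis network (assumption):
    here we simply average the boundary frames and ignore the events."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]


def interpolate(frame_a, frame_b, events, t0, t1, depth):
    """Recursively insert intermediate frames between times t0 and t1.

    Each stage bridges only half the previous time span, so the event
    volume handled per stage stays small -- the simplification the
    abstract attributes to multi-stage synthesis. Events are given as
    (timestamp, x, y, polarity) tuples; boundary frames are excluded
    from the returned list of (timestamp, frame) pairs.
    """
    if depth == 0:
        return []
    tm = (t0 + t1) / 2
    # Split the event stream at the midpoint so each sub-stage only
    # sees the events inside its own (short) time interval.
    left_ev = [e for e in events if e[0] < tm]
    right_ev = [e for e in events if e[0] >= tm]
    frame_m = synthesize_midpoint(frame_a, frame_b, events)
    return (interpolate(frame_a, frame_m, left_ev, t0, tm, depth - 1)
            + [(tm, frame_m)]
            + interpolate(frame_m, frame_b, right_ev, tm, t1, depth - 1))
```

With `depth=2` this yields three intermediate frames at the quarter points of the interval, each produced by a short stage rather than one long warp across the full span.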
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, Jiaben | - |
dc.contributor.author | Zhu, Yichen | - |
dc.contributor.author | Lian, Dongze | - |
dc.contributor.author | Yang, Jiaqi | - |
dc.contributor.author | Wang, Yifu | - |
dc.contributor.author | Zhang, Renrui | - |
dc.contributor.author | Liu, Xinhang | - |
dc.contributor.author | Qian, Shenhan | - |
dc.contributor.author | Kneip, Laurent | - |
dc.contributor.author | Gao, Shenghua | - |
dc.date.accessioned | 2024-08-15T09:26:56Z | - |
dc.date.available | 2024-08-15T09:26:56Z | - |
dc.date.issued | 2023 | - |
dc.identifier.citation | IEEE International Conference on Intelligent Robots and Systems, 2023, p. 1292-1299 | - |
dc.identifier.issn | 2153-0858 | - |
dc.identifier.uri | http://hdl.handle.net/10722/345373 | - |
dc.description.abstract | Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation. Existing state-of-the-art methods follow the paradigm of combining both synthesis-based and warping networks. However, few of those methods fully respect the intrinsic characteristics of event streams. Given that event cameras only encode intensity changes and polarity rather than color intensities, estimating optical flow from events is arguably more difficult than from RGB information. We therefore propose to incorporate RGB information in an event-guided optical flow refinement strategy. Moreover, in light of the quasi-continuous nature of the time signals provided by event cameras, we propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages rather than in a single, long stage. Extensive experiments on both synthetic and real-world datasets show that these modifications lead to more reliable and realistic intermediate frame results than previous video frame interpolation methods. Our findings underline that a careful consideration of event characteristics such as high temporal density and elevated noise benefits interpolation accuracy. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE International Conference on Intelligent Robots and Systems | - |
dc.title | Revisiting Event-Based Video Frame Interpolation | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/IROS55552.2023.10341804 | - |
dc.identifier.scopus | eid_2-s2.0-85182523276 | - |
dc.identifier.spage | 1292 | - |
dc.identifier.epage | 1299 | - |
dc.identifier.eissn | 2153-0866 | - |