File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TMI.2025.3545434
- Scopus: eid_2-s2.0-85218896178
- PMID: 40031758

Article: VibNet: Vibration-Boosted Needle Detection in Ultrasound Images
| Title | VibNet: Vibration-Boosted Needle Detection in Ultrasound Images |
|---|---|
| Authors | Huang, Dianye; Li, Chenyang; Karlas, Angelos; Chu, Xiangyu; Samuel Au, K. W.; Navab, Nassir; Jiang, Zhongliang |
| Keywords | image-guided surgery; instrument segmentation; Needle detection; ultrasound image analysis |
| Issue Date | 2025 |
| Citation | IEEE Transactions on Medical Imaging, 2025, v. 44, n. 6, p. 2696-2708 |
| Abstract | Precise percutaneous needle detection is crucial for ultrasound (US)-guided interventions. However, inherent limitations such as speckles, needle-like artifacts, and low resolution make it challenging to robustly detect needles, especially when their visibility is reduced or imperceptible. To address this challenge, we propose VibNet, a learning-based framework designed to enhance the robustness and accuracy of needle detection in US images by leveraging periodic vibration applied externally to the needle shafts. VibNet integrates neural Short-Time Fourier Transform and Hough Transform modules to achieve successive sub-goals, including motion feature extraction in the spatiotemporal space, frequency feature aggregation, and needle detection in the Hough space. Due to the periodic subtle vibration, the features are more robust in the frequency domain than in the image intensity domain, making VibNet more effective than traditional intensity-based methods. To demonstrate the effectiveness of VibNet, we conducted experiments on distinct ex vivo porcine and bovine tissue samples. The results obtained on porcine samples demonstrate that VibNet effectively detects needles even when their visibility is severely reduced, with a tip error of 1.61 ± 1.56 mm compared to 8.15 ± 9.98 mm for UNet and 6.63 ± 7.58 mm for WNet, and a needle direction error of 1.64 ± 1.86° compared to 9.29 ± 15.30° for UNet and 8.54 ± 17.92° for WNet. |
| Persistent Identifier | http://hdl.handle.net/10722/365357 |
| ISSN | 0278-0062 (2023 Impact Factor: 8.9; 2023 SCImago Journal Rankings: 3.703) |
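
The abstract above attributes VibNet's robustness to the frequency-domain signature created by the periodic vibration applied to the needle shaft. As a rough, non-authoritative illustration of that idea (not the paper's neural Short-Time Fourier Transform module), the plain-NumPy sketch below scores each pixel by how strongly its intensity oscillates at an assumed vibration frequency over a short stack of frames; the function name, frame rate, and vibration frequency are hypothetical placeholders.

```python
# Plain-NumPy sketch (not the authors' neural STFT module): given a short stack
# of consecutive B-mode frames, measure how strongly each pixel's intensity
# oscillates at an assumed external vibration frequency. Pixels on the vibrating
# needle shaft should show a peak at that frequency even when the shaft is
# barely visible in any single frame.
import numpy as np

def vibration_energy_map(frames: np.ndarray, fps: float, vib_hz: float) -> np.ndarray:
    """frames: (T, H, W) stack of grayscale ultrasound frames.
    Returns an (H, W) map of spectral magnitude at the vibration frequency."""
    T = frames.shape[0]
    # Remove the static background so only temporal intensity changes remain.
    dynamic = frames - frames.mean(axis=0, keepdims=True)
    # A Hann window reduces spectral leakage over the short time window.
    window = np.hanning(T)[:, None, None]
    spectrum = np.fft.rfft(dynamic * window, axis=0)   # shape: (T//2 + 1, H, W)
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    k = int(np.argmin(np.abs(freqs - vib_hz)))         # bin nearest the vibration frequency
    return np.abs(spectrum[k])                         # high where the needle vibrates

# Hypothetical usage with assumed acquisition parameters:
# energy = vibration_energy_map(frames, fps=30.0, vib_hz=5.0)
```

Pixels along the vibrating shaft accumulate energy at the drive frequency, which is one intuition for why frequency-domain features can stay informative even when the shaft is nearly invisible in any individual frame.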
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Huang, Dianye | - |
| dc.contributor.author | Li, Chenyang | - |
| dc.contributor.author | Karlas, Angelos | - |
| dc.contributor.author | Chu, Xiangyu | - |
| dc.contributor.author | Samuel Au, K. W. | - |
| dc.contributor.author | Navab, Nassir | - |
| dc.contributor.author | Jiang, Zhongliang | - |
| dc.date.accessioned | 2025-11-05T06:55:37Z | - |
| dc.date.available | 2025-11-05T06:55:37Z | - |
| dc.date.issued | 2025 | - |
| dc.identifier.citation | IEEE Transactions on Medical Imaging, 2025, v. 44, n. 6, p. 2696-2708 | - |
| dc.identifier.issn | 0278-0062 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/365357 | - |
| dc.description.abstract | Precise percutaneous needle detection is crucial for ultrasound (US)-guided interventions. However, inherent limitations such as speckles, needle-like artifacts, and low resolution make it challenging to robustly detect needles, especially when their visibility is reduced or imperceptible. To address this challenge, we propose VibNet, a learning-based framework designed to enhance the robustness and accuracy of needle detection in US images by leveraging periodic vibration applied externally to the needle shafts. VibNet integrates neural Short-Time Fourier Transform and Hough Transform modules to achieve successive sub-goals, including motion feature extraction in the spatiotemporal space, frequency feature aggregation, and needle detection in the Hough space. Due to the periodic subtle vibration, the features are more robust in the frequency domain than in the image intensity domain, making VibNet more effective than traditional intensity-based methods. To demonstrate the effectiveness of VibNet, we conducted experiments on distinct ex vivo porcine and bovine tissue samples. The results obtained on porcine samples demonstrate that VibNet effectively detects needles even when their visibility is severely reduced, with a tip error of 1.61 ± 1.56 mm compared to 8.15 ± 9.98 mm for UNet and 6.63 ± 7.58 mm for WNet, and a needle direction error of 1.64 ± 1.86° compared to 9.29 ± 15.30° for UNet and 8.54 ± 17.92° for WNet. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE Transactions on Medical Imaging | - |
| dc.subject | image-guided surgery | - |
| dc.subject | instrument segmentation | - |
| dc.subject | Needle detection | - |
| dc.subject | ultrasound image analysis | - |
| dc.title | VibNet: Vibration-Boosted Needle Detection in Ultrasound Images | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/TMI.2025.3545434 | - |
| dc.identifier.pmid | 40031758 | - |
| dc.identifier.scopus | eid_2-s2.0-85218896178 | - |
| dc.identifier.volume | 44 | - |
| dc.identifier.issue | 6 | - |
| dc.identifier.spage | 2696 | - |
| dc.identifier.epage | 2708 | - |
| dc.identifier.eissn | 1558-254X | - |
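
The record above also notes that VibNet performs the final detection in Hough space. The sketch below is only an assumed classical stand-in for the paper's learned (differentiable) Hough module: it votes the brightest pixels of a vibration-energy map into a (theta, rho) accumulator and returns the dominant straight line as a needle-shaft candidate. `dominant_line` and its parameters are illustrative, not taken from the paper.

```python
# Classical Hough-space line fit (a stand-in for VibNet's learned Hough module).
# Votes the highest-energy pixels into a (theta, rho) accumulator and returns
# the strongest straight line, i.e. a needle-shaft candidate.
import numpy as np

def dominant_line(energy: np.ndarray, keep_fraction: float = 0.01,
                  n_theta: int = 180, n_rho: int = 400):
    """energy: (H, W) map such as the one from vibration_energy_map.
    Returns (theta, rho) of the strongest line through the brightest pixels."""
    H, W = energy.shape
    thresh = np.quantile(energy, 1.0 - keep_fraction)   # keep only the brightest pixels
    ys, xs = np.nonzero(energy >= thresh)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = np.hypot(H, W)
    rho_edges = np.linspace(-diag, diag, n_rho + 1)
    acc = np.zeros((n_theta, n_rho), dtype=np.int64)
    for i, t in enumerate(thetas):
        # Normal-form distance of each voting pixel for this angle.
        rho = xs * np.cos(t) + ys * np.sin(t)
        hist, _ = np.histogram(rho, bins=rho_edges)
        acc[i] += hist
    ti, ri = np.unravel_index(np.argmax(acc), acc.shape)
    rho_centers = 0.5 * (rho_edges[:-1] + rho_edges[1:])
    return thetas[ti], rho_centers[ri]                   # line: x*cos(theta) + y*sin(theta) = rho
```

Given an energy map, `theta, rho = dominant_line(energy)` yields a line in normal form, from which a shaft orientation estimate can be read off; the paper's actual module instead learns this mapping end to end together with the frequency features.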
