Real-time tracking and inpainting network with joint learning iterative modules for AR-based DALK surgical navigation

  • Weimin Liu
  • Junjun Pan*
  • Liyun Jia
  • Sijing Rao
  • Jie Zang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Background and Objective: Deep anterior lamellar keratoplasty (DALK) is a widely used treatment for eye diseases and requires accurate and evenly spaced stitch positions during suturing. Augmented reality (AR) navigation systems show promising potential for enhancing the stitching process, and a clear, unoccluded view of the corneal region helps surgeons better plan stitch positions.

Methods: In this work, we present a joint-learning, iterative network for AR-based suturing navigation. The network improves inpainting performance under severe occlusion during suturing, and it provides both the original instrument masks and the inpainted corneal masks along with the inpainted frames. It is built on feature reuse, iterative modules, and mask propagation structures to greatly reduce computational cost. To enable end-to-end training, we also propose a novel dataset synthesis method that constructs a dataset of occluded and unoccluded image pairs with mask and optical flow annotations. We further develop a novel pipeline based on the grid propagation method and the inpainted optical flow outputs to produce clear and stable inpainted frames.

Results: On the synthetic datasets, compared with recent state-of-the-art inpainting networks, our framework reaches a better trade-off between performance and computational efficiency. Our Iter-S model achieves a mean endpoint error (mEPE) of 1.69, a peak signal-to-noise ratio (PSNR) of 36.86 dB, and a structural similarity index measure (SSIM) of 0.976, with a low inpainting inference time of 16.26 ms. Based on Iter-S, we construct a novel AR navigation system running at about 35.14 ms per frame (28 FPS) on average.
Conclusions: The iterative modules progressively refine the outputs while offering a favorable trade-off between visual quality and real-time computational efficiency through the choice of iteration count. Our AR navigation framework provides stable and accurate tracking with well-inpainted results in real time under severe occlusion, demonstrating its benefit in guiding surgeons' stitching operations in corneal surgeries.
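For readers unfamiliar with the reported image-quality metric, PSNR is defined as 10·log10(MAX² / MSE) between a reconstructed frame and its ground truth. Below is a minimal illustrative sketch in pure Python (not code from the paper; the example images are arbitrary 8-bit values chosen for demonstration):

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two equally sized images.

    ref, test: flat sequences of pixel intensities (e.g. 8-bit, 0-255).
    Returns PSNR in decibels; identical inputs yield infinity.
    """
    if len(ref) != len(test):
        raise ValueError("images must have the same size")
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # no distortion at all
    return 10.0 * math.log10(max_val ** 2 / mse)

# Tiny 4-pixel "image" and a slightly noisy reconstruction of it.
reference = [10, 20, 30, 40]
restored = [12, 19, 31, 40]
print(round(psnr(reference, restored), 2))  # → 46.37
```

Higher values indicate a reconstruction closer to the ground truth; the 36.86 dB reported for Iter-S is computed over full inpainted frames rather than a toy array like this one.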

Original language: English
Article number: 109068
Journal: Computer Methods and Programs in Biomedicine
Volume: 272
DOIs
State: Published - Dec 2025

Keywords

  • AR-based surgical navigation
  • DALK
  • Inpainting
  • Joint learning
  • Optical flow
  • Semantic segmentation
