Dual-modality endoscopic probe for tissue surface shape reconstruction and hyperspectral imaging enabled by deep neural networks.

Surgical guidance and decision-making could be improved by accurate, real-time measurement of intra-operative data, including the shape and spectral information of the tissue surface. In this work, a dual-modality endoscopic system has been proposed to enable tissue surface shape reconstruction and hyperspectral imaging (HSI). The system is built around a probe comprising an incoherent fiber bundle, whose fiber arrangement differs at the two ends, and miniature imaging optics. For 3D reconstruction with structured light (SL), a light pattern composed of randomly distributed spots of different colors is projected onto the tissue surface, creating artificial texture. Pattern decoding with a Convolutional Neural Network (CNN) model and a customized feature descriptor enables real-time 3D surface reconstruction at approximately 12 frames per second (FPS). In HSI mode, spatially sparse hyperspectral signals from the tissue surface can be captured with a slit hyperspectral imager in a single snapshot. A CNN-based super-resolution model, namely the "super-spectral-resolution" network (SSRNet), has also been developed to estimate pixel-level dense hypercubes from the endoscope camera's standard RGB images and the sparse hyperspectral signals, at approximately 2 FPS. The probe, with a 2.1 mm diameter, enables the system to be used through endoscope working channels. Furthermore, since data acquisition in both modes can be accomplished in one snapshot, operation of the system in clinical applications is minimally affected by tissue surface movement and deformation. The whole apparatus has been validated on phantoms and tissue (ex vivo and in vivo), while initial measurements on patients during laryngeal surgery show its potential in real-world clinical applications.
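The abstract does not give implementation details of SSRNet, so the sketch below is only a rough illustration of how such a fusion could be structured: a small CNN maps an RGB frame plus a sparse set of hyperspectral samples (with a mask marking measured pixels) to a dense hypercube. The number of spectral bands, the layer sizes, and the mask-based fusion strategy are assumptions made for illustration, not details taken from the paper.

# Hypothetical sketch of a super-spectral-resolution style network (not the authors' SSRNet).
# Inputs:  an RGB image, a sparse hyperspectral map that is nonzero only at measured pixels,
#          and a binary mask of those measured locations.
# Output:  a dense hypercube with one spectral estimate per pixel per band.
import torch
import torch.nn as nn

NUM_BANDS = 48  # assumed band count; not stated in the abstract

class SpectralSuperResolutionNet(nn.Module):
    def __init__(self, num_bands: int = NUM_BANDS):
        super().__init__()
        # Concatenated input channels: RGB (3) + sparse spectra (num_bands) + mask (1)
        in_ch = 3 + num_bands + 1
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, num_bands, kernel_size=3, padding=1),
        )

    def forward(self, rgb, sparse_spectra, mask):
        # rgb:            (B, 3, H, W) standard camera image
        # sparse_spectra: (B, num_bands, H, W), nonzero only where spectra were measured
        # mask:           (B, 1, H, W), 1 at measured pixels, 0 elsewhere
        x = torch.cat([rgb, sparse_spectra, mask], dim=1)
        return self.net(x)  # (B, num_bands, H, W) dense hypercube estimate

if __name__ == "__main__":
    model = SpectralSuperResolutionNet()
    rgb = torch.rand(1, 3, 256, 256)
    sparse = torch.zeros(1, NUM_BANDS, 256, 256)
    mask = torch.zeros(1, 1, 256, 256)
    cube = model(rgb, sparse, mask)
    print(cube.shape)  # torch.Size([1, 48, 256, 256])

In practice, such a model would be trained against ground-truth hypercubes so that the RGB image supplies spatial detail while the sparse measurements anchor the spectral estimates; the actual loss functions and training data used for SSRNet are described in the full paper, not here.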
