Automated 3D Segmentation of Ultrasound Images for Prostate Cancer Assessment


An automatic 3D segmentation method based on image registration and filters that determines the edges and textures of prostate images, for use in the diagnosis and assessment of prostate cancer.

Key Benefits

  • Provides an automatic segmentation method that does not require initialization, unlike currently available methods.
  • Uses an image database and filters to train a statistical classification machine to segment novel images of prostate tissue.
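The training approach in the second benefit can be illustrated with a minimal sketch. This is not the patented method: the actual filter bank and classifier are not described here, so the code below substitutes crude stand-ins (box smoothing and gradient magnitude as "filters", a nearest-centroid rule as the "statistical classification machine") purely to show the train-once, label-new-images-automatically workflow.

```python
import numpy as np

def feature_stack(vol, scales=(1, 2)):
    """Per-voxel features: raw intensity plus, at each scale, a box-smoothed
    intensity (texture stand-in) and a gradient magnitude (edge stand-in).
    These are illustrative substitutes for the method's actual filters."""
    feats = [vol]
    for s in scales:
        k = 2 * s + 1
        sm = vol.copy()
        for ax in range(vol.ndim):  # separable box smoothing, one axis at a time
            sm = np.apply_along_axis(
                lambda a: np.convolve(a, np.ones(k) / k, mode="same"), ax, sm)
        feats.append(sm)
        g = np.gradient(sm)
        feats.append(np.sqrt(sum(gi ** 2 for gi in g)))
    return np.stack(feats, axis=-1)  # shape (..., n_features)

def train_centroids(vols, masks):
    """Learn one mean feature vector per class (0 = background, 1 = prostate)
    from a database of volumes with known labels."""
    X, y = [], []
    for v, m in zip(vols, masks):
        f = feature_stack(v)
        X.append(f.reshape(-1, f.shape[-1]))
        y.append(m.reshape(-1))
    X, y = np.concatenate(X), np.concatenate(y)
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def segment(vol, centroids):
    """Label every voxel of a novel volume by its nearest class centroid --
    no manual initialization is needed once the model is trained."""
    f = feature_stack(vol).reshape(-1, centroids.shape[1])
    d = ((f[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1).reshape(vol.shape)
```

In the same spirit as the patented approach, all manual effort is spent once, on the labeled training database; segmenting a new volume is then a single automatic pass.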

Market Summary

Millions of prostate biopsies are performed on men each year. To date, 2D transrectal ultrasound (TRUS) image segmentation has been the clinical standard for prostate cancer diagnosis. 2D image partitioning is limited in its ability to extract texture features of the geometrical shape and boundaries of the prostate. Using the 2D image from the TRUS, the physician must estimate the location of the biopsy needle based on the 2D information alone. A conventional biopsy can miss up to 30% of cancers, likely because of this "blind" sampling. Compared to 2D prostate segmentation, 3D image segmentation and clustering would create more meaningful images that are easier to combine with other images, analyze, and use for targeted biopsies. The enhanced image quality and visual information from this technology will lead to more accurate biopsies, decreasing the risk of missing the cancer during the biopsy, increasing the rate of early detection, and dramatically improving patient outcomes.

Technical Summary

Because ultrasound images have a low signal-to-noise ratio, automatic segmentation of the prostate is difficult. In addition, manual segmentation during biopsy or treatment can be time consuming and can introduce human error. Previous segmentation methods for TRUS images are semi-automatic and require initialization by the user. The segmentation method in this technology does not require initialization. Instead, it uses a database of 3D TRUS images and applies filters that provide a hierarchical representation of image features, as well as accurate edge detection, to train the system as a whole. The pre-trained machine can then be used to label new images of prostate tissue automatically and accurately. In a test of this method, the average surface distance between manual segmentation and this automatic segmentation was approximately 0.3 millimeters, with a maximum distance of 2.38 millimeters, indicating that the method is robust and can be used for prostate cancer diagnosis and 3D ultrasound-guided prostate biopsy.
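The reported accuracy figures are surface distances between the automatic and manual segmentations. A common way to compute them, sketched below under the assumption that both segmentations are binary voxel masks with known voxel spacing, is to take the symmetric distances between the two surface point sets (the maximum is the Hausdorff surface distance). The exact metric definition used in the study is not specified here, so this is an illustrative computation, not a reproduction of it.

```python
import numpy as np

def surface_points(mask):
    """Voxels of a binary mask that touch the background (6-connectivity):
    the mask minus its one-voxel erosion."""
    m = mask.astype(bool)
    eroded = m.copy()
    for ax in range(m.ndim):
        eroded &= np.roll(m, 1, ax) & np.roll(m, -1, ax)
    return np.argwhere(m & ~eroded)

def surface_distances(mask_a, mask_b, spacing=(1.0, 1.0, 1.0)):
    """For every surface voxel of A, the distance (in mm, via the voxel
    spacing) to the nearest surface voxel of B."""
    pa = surface_points(mask_a) * np.asarray(spacing)
    pb = surface_points(mask_b) * np.asarray(spacing)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1)

def mean_and_max_surface_distance(auto, manual, spacing=(1.0, 1.0, 1.0)):
    """Symmetric mean surface distance and maximum (Hausdorff) distance."""
    d_ab = surface_distances(auto, manual, spacing)
    d_ba = surface_distances(manual, auto, spacing)
    mean = (d_ab.sum() + d_ba.sum()) / (len(d_ab) + len(d_ba))
    return mean, max(d_ab.max(), d_ba.max())
```

Evaluating both directions (automatic-to-manual and manual-to-automatic) is what makes the metric symmetric; reporting only one direction can hide regions where one contour bulges past the other.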

Developmental Stage

The segmentation method was tested on images from 10 patients.

Patent Information

Tech ID: 10131
Published: 5/29/2012