U of T graduate students use AI to improve imaging tool used during breast cancer surgery

As part of an internship with a medical imaging firm, Bryant Bak-Yin Lim and Ali Yassine developed algorithms for a next-gen imaging system

Bryant Bak-Yin Lim, left, and Ali Yassine simulate reviewing a breast cancer tissue scan. As MITACS interns at Perimeter Medical Imaging, Lim and Yassine developed new AI algorithms for breast cancer imaging (photo by Neil Ta)

Bryant Bak-Yin Lim and Ali Yassine got the chance to make a difference in the lives of patients last summer by improving how breast cancer surgery is performed.

The two researchers at the University of Toronto were participating in internships with Perimeter Medical Imaging, a company with offices in Toronto and Dallas, organized through MITACS. There, they developed artificial intelligence (AI) algorithms for the next generation of an imaging system that helps surgeons visualize tissue microstructures during a lumpectomy to determine whether they have excised all the cancerous tissue.

Their algorithms prioritize suspicious images, making it easier for surgeons to parse the results and reducing time spent in the operating room.

The imaging device is about the size and shape of a small photocopier, says Lim, an MD student in the Temerty Faculty of Medicine who is also completing a program in the Faculty of Applied Science & Engineering.

Situated within the operating area, the device employs a technology called optical coherence tomography (OCT), which is similar to ultrasound technology but uses light instead of sound to generate images, resulting in an image resolution 10 times greater than ultrasound.

OCT has been widely used in clinical settings, including ophthalmology, dermatology and interventional cardiology, but Perimeter's device is the first to bring wide-field OCT imaging into the OR, Lim says.

"The tissue removed from the patient is put in a plastic bag and placed on a glass imaging plate on the device, using mild suction to hold it in place," Lim says. "Light shoots up from the optical imaging system below, penetrates the tissue and reflects back into the device, which then displays results as a digital image on the monitor."

Surgeons are looking for any suspicious features in what's called the "margin," striving for about a two-millimetre rim of healthy tissue along the outer edges of the excised tissue.

"Currently, to assess a margin, specimens are sent out to a pathologist. That process usually takes days," says Yassine, who recently graduated with a master's degree in electrical and computer engineering. "If there's cancerous tissue left, patients sometimes have to go back for another procedure, with all the risks and resource costs that come with it.

"The type of deep learning algorithm that I trained, called a convolutional neural network, can analyze the tissue image and identify whether the material is suspicious or non-suspicious with a very high accuracy rate."
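In spirit, that suspicious/non-suspicious call is a binary image classifier. The minimal sketch below uses a single hand-written convolution filter, a ReLU, global average pooling and a logistic output to stand in for a trained convolutional network; the kernel, weights and threshold are illustrative assumptions, not Perimeter's actual model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with a small kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def classify_patch(patch, kernel, weight, bias, threshold=0.5):
    """Convolve, apply ReLU, global-average-pool, then a logistic output.

    Returns (is_suspicious, score): score is a probability-like value in (0, 1).
    """
    feature_map = np.maximum(conv2d(patch, kernel), 0.0)      # ReLU non-linearity
    pooled = feature_map.mean()                               # global average pooling
    score = 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))   # sigmoid output
    return bool(score >= threshold), float(score)

# Illustrative use: a Laplacian-style texture filter fires on a highly
# textured patch but stays silent on uniform tissue.
laplacian = np.array([[0., 1., 0.],
                      [1., -4., 1.],
                      [0., 1., 0.]])
checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)  # textured patch
uniform = np.ones((8, 8))                                     # featureless patch
```

A real network learns many such filters in stacked layers from labelled scans, but the pipeline per layer (convolve, non-linearity, pool, score) is the same.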

The challenge then is to display this analysis for the surgeon so they can make a timely, informed decision on whether they need to return to the operating table and remove more tissue from the patient, who is still under anesthesia.

Lim was tasked with building an efficient user interface to guide the surgeon.

"This device typically outputs hundreds of images, and it's challenging for a surgeon in the OR to read through all of them and make a decision on the spot," he says.

"I developed an algorithm that clustered images together based on certain parameters and then displayed only the most representative one."

The algorithm reduced the hundreds of images to a more manageable number of thumbnails that account for all the information gathered from the tissue scan. The surgeon can also manipulate the digital images to see the tissue from different perspectives.
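A generic version of that cluster-then-pick-a-representative step can be sketched as follows: group per-image feature vectors with k-means and surface only the image nearest each cluster centre (the medoid-like representative). The feature vectors and the choice of k here are assumptions for illustration; the article does not describe the actual parameters used.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Plain k-means on row-vector features; returns (labels, centroids)."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every feature vector to every centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(k):
            members = features[labels == c]
            if len(members):                      # keep old centroid if cluster empties
                centroids[c] = members.mean(axis=0)
    return labels, centroids

def representative_indices(features, k):
    """For each cluster, return the index of the image closest to its centroid,
    i.e. the 'most representative' thumbnail to display."""
    labels, centroids = kmeans(features, k)
    reps = []
    for c in range(k):
        idx = np.flatnonzero(labels == c)
        if idx.size == 0:
            continue
        d = np.linalg.norm(features[idx] - centroids[c], axis=1)
        reps.append(int(idx[d.argmin()]))
    return sorted(reps)

# Illustrative use: two well-separated groups of image features collapse
# to one representative thumbnail each.
feats = np.vstack([np.zeros((5, 2)), np.full((5, 2), 10.0)])
```

The payoff is the reduction the article describes: hundreds of raw images shrink to one thumbnail per cluster, while every cluster of the scan is still represented.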

There is great potential for AI-enhanced tools to make the medical professional's work, and the patient's experience, smoother, says Ervin Sejdić, a professor in the Edward S. Rogers Sr. department of electrical and computer engineering who supervised both students.

"The Perimeter device that Bryant and Ali worked on is part of a wave of new tools that do the grunt work of sorting through and repackaging the copious amounts of data necessary for complex procedures or diagnoses," says Sejdić.

"This helps doctors sharpen their focus on the treatment."

For his part, Yassine didn't expect to be so interested in medicine before undertaking this internship. He is finishing up his master's project, a multi-class labeller algorithm for Perimeter that identifies specific tissues in breast cancer samples, and is planning to continue his career in medical technology.

"I had my own personal health challenges a while back, and that has motivated me to work in this field," he says. "It's nice to help people through technology."

Lim, who has two years left to complete his medical degree, says, "I hope to combine parts of AI and medicine and apply that to my future practice, whether in industry research or other collaborations. That's where I want to take my career."

"We are growing our MEng program in part because there are so many exciting possibilities out there for graduates," says Professor Deepa Kundur, chair of the department of electrical and computer engineering.

"Lim and Yassine's internships at Perimeter demonstrate how quickly hands-on training can translate into real-world results."

Note: Technologies referenced in this article are currently not available for sale in the United States and have not been evaluated by the FDA.
