Outperforming Human Pathologists: New Harvard-Developed AI Tool Predicts Survival and Treatment Response in Colon Cancer

Colon cancer illustration

Colon cancer is a type of cancer that affects the large intestine (colon). It is the third most common cancer worldwide and the second leading cause of cancer-related deaths in the United States. Symptoms of colon cancer can include abdominal pain, changes in bowel habits, and rectal bleeding.

The model provides actionable insights for physicians and could improve clinical decision-making in resource-limited regions.

Researchers at Harvard Medical School and National Cheng Kung University in Taiwan have created a new AI model that could help doctors make more informed decisions about treatment and prognosis for patients with colorectal cancer, the second leading cause of cancer deaths worldwide.

By analyzing only images of tumor samples, which are microscopic depictions of cancer cells, the new tool can accurately predict how aggressive a colorectal tumor is, the patient's likelihood of survival with or without disease recurrence, and the optimal treatment for them.

Having a tool that answers such questions could help clinicians and patients navigate this wily disease, which often behaves differently even among people with similar disease profiles who receive the same treatment, and could ultimately save some of the 1 million lives that colorectal cancer claims every year.

A report on the team's work was recently published in the journal Nature Communications.

“Our model performs tasks that human pathologists cannot do based on image viewing alone,” said study co-senior author Kun-Hsing Yu, assistant professor of biomedical informatics in the Blavatnik Institute at HMS. Yu led an international team of pathologists, oncologists, biomedical informaticians, and computer scientists.

“What we anticipate is not a replacement of human pathology expertise, but the augmentation of what human pathologists can do,” Yu added. “We fully expect that this approach will augment the current clinical practice of cancer management.”

The researchers caution that any individual patient’s prognosis depends on multiple factors and that no model can perfectly predict any given patient’s survival. However, they add, the new model could be useful in guiding clinicians to follow up more closely, consider more aggressive treatments, or recommend clinical trials testing experimental therapies if their patients have worse predicted prognoses based on the tool’s assessment.

The tool could be particularly useful in resource-limited areas both in this country and around the world where advanced pathology and tumor genetic sequencing may not be readily available, the researchers noted.

The new tool goes beyond many current AI tools, which primarily perform tasks that replicate or optimize human expertise. The new tool, by comparison, detects and interprets visual patterns on microscopy images that are indiscernible to the human eye.

The tool, called MOMA (Multi-omics Multi-cohort Assessment), is freely available to researchers and clinicians.

Extensive training and testing

The model was trained on information obtained from nearly 2,000 patients with colorectal cancer from diverse national patient cohorts that together include more than 450,000 participants — the Health Professionals Follow-up Study, the Nurses’ Health Study, the Cancer Genome Atlas Program, and the NIH’s PLCO (Prostate, Lung, Colorectal, and Ovarian) Cancer Screening Trial.

During the training phase, the researchers fed the model information about the patients’ age, sex, cancer stage, and outcomes. They also gave it information about the tumors’ genomic, epigenetic, protein, and metabolic profiles.

Then the researchers showed the model pathology images of tumor samples and asked it to look for visual markers related to tumor types, genetic mutations, epigenetic alterations, disease progression, and patient survival.
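The training setup described above pairs shared image-derived features with several prediction targets at once. The paper's actual architecture is not detailed here, so the following is only a minimal, self-contained sketch of the multi-task idea: a single shared linear layer with two heads (a hypothetical "mutation present" task and a hypothetical "recurrence" task), trained jointly on synthetic data by gradient descent on summed cross-entropy losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each tumor image is summarized as a feature
# vector (e.g. from a vision backbone); several prediction heads
# share those features, as in multi-task learning.
n_samples, n_features = 200, 16
X = rng.normal(size=(n_samples, n_features))

# Illustrative labels, synthetic here: task 0 = "mutation present",
# task 1 = "recurrence within 5 years".
w_true = rng.normal(size=(n_features, 2))
Y = (X @ w_true + rng.normal(scale=0.5, size=(n_samples, 2)) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shared features feed two task heads (columns of W), trained jointly:
# the gradient of the summed cross-entropy losses decomposes per head.
W = np.zeros((n_features, 2))
lr = 0.1
for _ in range(500):
    P = sigmoid(X @ W)            # predicted probabilities, one column per task
    grad = X.T @ (P - Y) / n_samples
    W -= lr * grad

acc = ((sigmoid(X @ W) > 0.5) == Y).mean(axis=0)
print(acc)  # training accuracy per task
```

The point of the sketch is the shape of the problem, not the model class: real systems replace the linear layer with a deep image encoder, but the joint training over multiple labels works the same way.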

The researchers then tested how the model might perform in “the real world” by feeding it a set of images it had not seen before of tumor samples from different patients. They compared its performance with the actual patient outcomes and other available clinical information.

The model accurately predicted the patients’ overall survival following diagnosis, as well as how many of those years would be cancer-free.
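Survival predictions like these are commonly scored with Harrell's concordance index, which handles censored patients (those still alive at last follow-up). The article does not state which metric the team used, so this is an illustrative sketch with made-up held-out data, not the study's evaluation code.

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's C-index: the fraction of usable patient pairs in which
    the patient with the shorter observed survival time also has the
    higher predicted risk. 0.5 = chance, 1.0 = perfect ranking."""
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        # Order the pair so i has the earlier observed time.
        if times[i] > times[j]:
            i, j = j, i
        # A pair is usable only if the earlier time is an observed
        # event (not censored) and the times are not tied.
        if times[i] == times[j] or not events[i]:
            continue
        usable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1
        elif risk_scores[i] == risk_scores[j]:
            concordant += 0.5
    return concordant / usable

# Toy held-out set: survival times in years, event indicator
# (1 = death observed, 0 = censored), and model risk scores.
times  = [2.0, 5.0, 3.5, 8.0, 1.0]
events = [1,   0,   1,   1,   1]
risks  = [0.9, 0.2, 0.6, 0.7, 0.95]

print(round(concordance_index(times, events, risks), 3))  # → 0.889
```

Here 9 of the 10 pairs are usable (one is excluded because the earlier time is censored), and the model ranks 8 of them correctly, giving C = 8/9 ≈ 0.889.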

The tool also accurately predicted how an individual patient might respond to different therapies, based on whether the patient’s tumor harbored specific genetic mutations that rendered the cancer more or less prone to progression or spread.

In both of those areas, the tool outperformed human pathologists as well as current AI models.

The researchers said the model will undergo periodic upgrading as science evolves and new data emerge.

“It is critical that with any AI model, we continuously monitor its behavior and performance because we may see shifts in the distributions of disease burden or new environmental toxins that contribute to cancer development,” Yu said. “It’s important to augment the model with new and more data as they come along so that its performance never lags behind.”

Discerning telltale patterns

The new model takes advantage of recent advances in tumor imaging techniques that offer unprecedented levels of detail, which nonetheless remain indiscernible to human evaluators. Based on these details, the model successfully identified indicators of how aggressive a tumor was and how it was likely to respond to a particular treatment.

Based on an image alone, the model also pinpointed characteristics associated with the presence or absence of specific genetic mutations — something that typically requires genomic sequencing of the tumor. Sequencing can be time-consuming and costly, particularly for hospitals where such services are not routinely available.

It is precisely in such situations that the model could provide timely decision support for treatment choice in resource-limited settings or in situations where there is no tumor tissue available for genetic sequencing, the researchers said.

The researchers said that before deploying the model for use in clinics and hospitals, it should be tested in a prospective, randomized trial that assesses the tool’s performance in actual patients over time after initial diagnosis. Such a study would provide the gold-standard demonstration of the model’s capabilities, Yu said, by directly comparing the tool’s real-life performance using images alone with that of human clinicians who use knowledge and test results that the model does not have access to.

Another strength of the model, the researchers said, is its transparent reasoning. If a clinician using the model asks why it made a given prediction, the tool would be able to explain its reasoning and the variables it used.

This feature is important for increasing clinicians’ confidence in the AI models they use, Yu said.

Gauging disease progression, optimal treatment

The model accurately pinpointed image characteristics related to differences in survival.

For example, it identified three image features that portended worse outcomes:

  • Greater cell density within a tumor.
  • The presence of supportive connective tissue around tumor cells, known as the stroma.
  • Interactions of tumor cells with smooth muscle cells.

The model also identified patterns within the tumor stroma that indicated which patients were more likely to live longer without cancer recurrence.

The tool also accurately predicted which patients would benefit from a class of cancer treatments known as immune checkpoint inhibitors. While these therapies work in many patients with colon cancer, some experience no measurable benefit and have serious side effects. The model could thus help clinicians tailor treatment and spare patients who wouldn’t benefit, Yu said.

The model also successfully detected epigenetic changes associated with colorectal cancer. These changes — which occur when molecules known as methyl groups attach to DNA and alter how that DNA behaves — are known to silence genes that suppress tumors, causing the cancers to grow rapidly. The model’s ability to identify these changes marks another way it can inform treatment choice and prognosis.

Reference: “Histopathology images predict multi-omics aberrations and prognoses in colorectal cancer patients” by Pei-Chen Tsai, Tsung-Hua Lee, Kun-Chi Kuo, Fang-Yi Su, Tsung-Lu Michael Lee, Eliana Marostica, Tomotaka Ugai, Melissa Zhao, Mai Chan Lau, Juha P. Väyrynen, Marios Giannakis, Yasutoshi Takashima, Seyed Mousavi Kahaki, Kana Wu, Mingyang Song, Jeffrey A. Meyerhardt, Andrew T. Chan, Jung-Hsien Chiang, Jonathan Nowak, Shuji Ogino and Kun-Hsing Yu, 13 April 2023, Nature Communications.
DOI: 10.1038/s41467-023-37179-4

Other institutions involved in the research included Harvard T.H. Chan School of Public Health, MIT, Dana-Farber Cancer Institute, Massachusetts General Hospital, Brigham and Women’s Hospital, Southern Taiwan University of Science and Technology, and Oulu University Hospital in Finland.

The study was funded by the National Institute of General Medical Sciences, the Google Research Scholar Award, and the Blavatnik Center for Computational Biomedicine Award. Computational support was provided through Microsoft Azure for Research Award, the NVIDIA GPU Grant Program, and Extreme Science and Engineering Discovery Environment (XSEDE) at the Pittsburgh Supercomputing Center (allocation TG-BCS180016).

Yu is an inventor of US 16/179,101 assigned to Harvard University. Yu was a consultant of Curatio DL. Wu is currently a stakeholder and employee of Vertex Pharmaceuticals, which did not contribute funding to the study.
