Review Article

The role of artificial intelligence in nasopharyngeal carcinoma radiotherapy

Xue-Song Sun1,2, Lin-Quan Tang1,2, Qiu-Yan Chen1,2, Ying Sun1,3, Hai-Qiang Mai1,2

1Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Guangdong Key Laboratory of Nasopharyngeal Carcinoma Diagnosis and Therapy, Guangzhou 510060, China; 2Department of Nasopharyngeal Carcinoma, Sun Yat-sen University Cancer Center, Guangzhou 510060, China; 3Department of Radiation Oncology, Sun Yat-sen University Cancer Center, Guangzhou 510060, China

Contributions: (I) Conception and design: QY Chen, Y Sun, HQ Mai; (II) Administrative support: QY Chen, Y Sun, HQ Mai; (III) Provision of study materials or patients: XS Sun, LQ Tang; (IV) Collection and assembly of data: XS Sun, LQ Tang; (V) Data analysis and interpretation: XS Sun, LQ Tang; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.

Correspondence to: Hai-Qiang Mai, MD, PhD. Department of Nasopharyngeal Carcinoma, Sun Yat-sen University Cancer Center, 651 Dongfeng Road East, Guangzhou 510060, China. Email: maihq@mail.sysu.edu.cn.

Abstract: Artificial intelligence (AI) is a technology that aims to simulate and extend human intelligence. With the continuous development of science and technology, AI has penetrated all aspects of tumor radiotherapy (RT), greatly reducing the workload of doctors and physicists and improving work efficiency. Nasopharyngeal carcinoma (NPC) is a malignant tumor with a distinct geographical distribution, concentrated especially in East and Southeast Asia. RT is the main treatment for NPC. In this review, we discuss the current state and anticipated future impact of AI, focusing on common methods in medical imaging and data analysis and on possible applications in NPC treatment.

Keywords: Nasopharyngeal carcinoma (NPC); artificial intelligence (AI); automatic delineation


Received: 02 March 2020; Accepted: 20 March 2020; Published: 30 June 2020.

doi: 10.21037/anpc.2020.03.02


Introduction

The term artificial intelligence (AI) was first introduced by John McCarthy in 1956 and refers to "the science and engineering of making intelligent machines, especially intelligent computer programs". According to the United States National Cancer Institute (NCI), AI is the use of a computer to perform tasks commonly associated with human intelligence. Machine learning (ML) is a type of AI that is not explicitly programmed to perform a specific task but rather learns iteratively to make predictions or decisions. Deep learning (DL) is a subset of ML that uses artificial neural networks, modeled on the way the human brain learns, to acquire information from huge amounts of data. Recent research demonstrates that AI based on DL outperforms humans in many areas, such as visual tasks, target recognition, and biomedical image recognition (1-3). In the medical imaging field, AI shows satisfactory results for detecting, characterizing, and monitoring disease, and it affects several aspects of radiation oncology, such as target delineation (4).

Nasopharyngeal carcinoma (NPC) is a malignant tumor originating from the nasopharyngeal mucosa as a consequence of genetic variation, environmental factors, and EBV infection (5). Owing to its radiosensitivity and deep anatomic location, radiotherapy (RT) has been the main treatment for NPC since 1965. In 2018, there were 129,000 new cases of NPC worldwide, and the geographical distribution is extremely unbalanced, with 70% of cases occurring in East and Southeast Asia (6). Given the serious shortage of RT staff in endemic areas, the diagnosis and treatment of NPC impose a huge burden on clinicians (7). Using AI to assist clinicians in the treatment of NPC is expected to improve work efficiency and accuracy and to save human resource costs. In this review, we discuss the current state and anticipated future impact of AI, focusing on common methods in medical imaging and data analysis and on possible applications in NPC treatment.


Automatic target delineation

There are two main principles of RT for malignancies, including NPC: the radiation dose to the tumor must be sufficiently intensive, while the dose to the surrounding normal tissues and organs should be kept to a minimum. Therefore, accurate delineation of the tumor target and organs at risk (OARs) is the premise of and guarantee for successful treatment (8). However, because of clinicians' subjectivity and variable experience, it is difficult to ensure consistency of contours among clinicians. Additionally, because of the complexity of the nasopharynx and its adjacent structures, delineating the target area and normal tissues for NPC is time-consuming (9). Clinicians need to delineate tumor lesions and OARs layer by layer, which compromises work efficiency. Taking this into consideration, applying AI to achieve automatic target delineation has become a research hotspot, aiming to reduce inter-clinician variation, ensure accuracy, and save the time needed for target design. At present, the two most common approaches to automatic contouring are Atlas-based and DL-based methods.

Atlas based automatic delineation

By building a reference image database, Atlas-based methods use rigid and deformable registration to realize automatic delineation of the tumor target and the OARs (10-13). Pinnacle was the first to achieve automatic contouring of regions of interest (ROIs) with the help of an Atlas. On this foundation, Google developed an AI system that can automatically contour head and neck tumor lesions through ML (14). To further evaluate the value of this method, Sims et al. used an Atlas to automatically delineate the brain stem, parotid gland, and mandible, with manual delineations considered the gold standard. The Dice similarity coefficient (DSC) and ROC curves were used to evaluate the automatic delineations in terms of volume, sensitivity, and specificity. Encouragingly, Atlas-based automatic segmentation exhibited satisfactory sensitivity and specificity for the OARs studied (15). In terms of work efficiency, Atlas-based automatic delineation was also confirmed to save time compared with manual delineation for head and neck cancer (HNC) (16).
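The essence of the Atlas workflow, registering a labeled reference scan to the new patient and propagating the contours through the recovered transform, can be sketched in a few lines. The snippet below is a minimal illustration using the open-source SimpleITK library; the file names, the rigid-only registration stage, and the parameter values are assumptions made for illustration and are not the configurations used in the cited studies.

```python
import SimpleITK as sitk

# Hypothetical file names; a real atlas library would contain many labeled reference scans.
atlas_ct = sitk.ReadImage("atlas_ct.nii.gz", sitk.sitkFloat32)
atlas_oars = sitk.ReadImage("atlas_oar_labels.nii.gz", sitk.sitkUInt8)
patient_ct = sitk.ReadImage("patient_ct.nii.gz", sitk.sitkFloat32)

# Step 1: rough rigid alignment of the atlas CT to the patient geometry.
initial_tx = sitk.CenteredTransformInitializer(
    patient_ct, atlas_ct, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

registration = sitk.ImageRegistrationMethod()
registration.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
registration.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
registration.SetOptimizerScalesFromPhysicalShift()
registration.SetInitialTransform(initial_tx, inPlace=False)
registration.SetInterpolator(sitk.sitkLinear)
rigid_tx = registration.Execute(patient_ct, atlas_ct)

# Step 2: propagate the atlas OAR labels onto the patient CT with the recovered
# transform (nearest-neighbour interpolation keeps the labels discrete). A real
# pipeline would add a deformable (e.g., B-spline) stage after the rigid one.
auto_contours = sitk.Resample(atlas_oars, patient_ct, rigid_tx,
                              sitk.sitkNearestNeighbor, 0, sitk.sitkUInt8)
sitk.WriteImage(auto_contours, "auto_oar_labels.nii.gz")
```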

However, Atlas-based AI maps the target area by extrapolating from a pre-constructed library, so it has obvious limitations in dealing with anatomical variation among patients. For example, Xu et al. found that Atlas-based automatic segmentation performed poorly for small structures such as the coronary artery or brachial plexus, requiring additional manual correction (17).

DL based automatic delineation

Unlike the Atlas method, DL algorithms based on neural networks are trained on standard datasets and realize automatic delineation through learned algorithmic logic, which improves performance. To compare the two methods, Zhang et al. selected CT images from 40 patients (10 each with head and neck, chest, abdominal, and pelvic tumors) and used Atlas- and DL-based software to delineate the OARs, with manual delineation by senior RT physicians regarded as the gold standard. The results showed that DL-based automatic delineation achieved better accuracy than the Atlas-based method (18). Similarly, Yang et al. provided a platform for evaluating the performance of auto-segmentation methods for the OARs in thoracic CT images; it revealed that the lungs and heart could be segmented fairly accurately by various methods, while deep-learning methods performed better on the esophagus (19). These studies demonstrated that DL-based automatic delineation was superior to the Atlas-based method.

A convolutional neural network (CNN) is a kind of feedforward neural network with outstanding performance in processing large-scale images, and it has been widely used in image classification. Compared with other neural network architectures, a CNN needs relatively few parameters, which facilitates its application in many fields. Recently, with ML-based technology applied to target delineation, many scholars have proposed CNN-based automatic segmentation, including deep convolutional neural networks (DCNNs, i.e., DL methods using convolutional neural networks) and fully convolutional neural networks (FCNNs) (20-23). The biggest difference between an FCNN and a traditional CNN is that the FCNN's improved architecture is not limited by high computational redundancy, so it can capture the whole image rather than only local features (22). In HNC, an innovative automated segmentation method combining an FCNN with a shape representation model (SRM) was proposed. The new method better handles substantial inter-patient anatomical variation and low CT soft-tissue contrast, and its segmentation performance was consistently superior to the Atlas-based and traditional DL models (22).
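To make the distinction concrete, the sketch below shows a minimal fully convolutional encoder-decoder network in PyTorch: because every layer is convolutional, the network accepts a whole CT volume and returns a per-voxel label map rather than a single class. The channel sizes, two-level depth, and class count are illustrative assumptions and are far smaller than the published FCNN/SRM architectures.

```python
import torch
import torch.nn as nn

class TinyFCN3D(nn.Module):
    """Minimal 3D fully convolutional encoder-decoder for voxel-wise OAR labelling."""
    def __init__(self, n_classes: int):
        super().__init__()
        # Encoder: a strided convolution downsamples while extracting features.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Decoder: a transposed convolution restores the original resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(32, 16, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.Conv3d(16, n_classes, kernel_size=1),  # per-voxel class scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# A whole (batch, channel, depth, height, width) CT patch goes in, a label map comes out.
model = TinyFCN3D(n_classes=10)       # e.g., 9 OARs plus background (assumed)
ct = torch.randn(1, 1, 32, 128, 128)  # synthetic volume for illustration
logits = model(ct)                    # shape: (1, 10, 32, 128, 128)
```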

Although a multi-OAR segmentation model based on the whole CT image can delineate multiple organs at the same time, its performance for each individual organ is not always satisfactory. Because the target organs fill only a small part of the input image, the model is easily confused by the complex and varied content of the background, and some target OARs are misjudged. Therefore, some scholars have put forward organ-specific segmentation models, designing a dedicated segmentation network for each target organ according to its properties and the regional image information of that organ (20,24,25). However, it should be noted that there are more than ten OARs for NPC, so many independent models would be needed for automatic segmentation; the training time is therefore prolonged and the generalizability of the models weakened. These disadvantages limit their value in clinical application, and further optimization is still needed to improve the delineation of the OARs for NPC.
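The organ-specific strategy can be pictured as a two-stage loop: a coarse localizer proposes a region around each OAR, the image is cropped to that region, and a dedicated model trained only for that organ segments the crop. The following is a schematic sketch under that assumption; `locate_roi` and `dummy_organ_net` are hypothetical placeholders standing in for the localization step and the per-organ networks, respectively.

```python
import numpy as np

def locate_roi(ct: np.ndarray, organ: str) -> tuple:
    """Hypothetical coarse localizer: returns a (z, y, x) box around the organ.
    Here it simply returns a fixed central crop for illustration."""
    d, h, w = ct.shape
    return (slice(d // 4, 3 * d // 4), slice(h // 4, 3 * h // 4), slice(w // 4, 3 * w // 4))

def dummy_organ_net(crop: np.ndarray) -> np.ndarray:
    """Stand-in for a dedicated per-organ CNN; thresholds intensity as a placeholder."""
    return (crop > crop.mean()).astype(np.uint8)

# One dedicated model per OAR -- the defining feature of the organ-specific approach.
organ_models = {"brainstem": dummy_organ_net, "parotid_left": dummy_organ_net}

def segment_patient(ct: np.ndarray) -> dict:
    masks = {}
    for organ, net in organ_models.items():
        box = locate_roi(ct, organ)    # 1) coarse localization
        crop_mask = net(ct[box])       # 2) organ-specific segmentation on the crop
        full = np.zeros_like(ct, dtype=np.uint8)
        full[box] = crop_mask          # 3) paste the crop back into patient space
        masks[organ] = full
    return masks

masks = segment_patient(np.random.rand(64, 256, 256).astype(np.float32))
```

The drawback noted above is visible even in this toy form: every additional OAR adds another entry (and another trained model) to `organ_models`, which is what prolongs training and weakens generalizability for a site like NPC with more than ten OARs.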

In terms of the tumor target in NPC, Lin et al. made a breakthrough in the clinical application of automatic delineation (26). A total of 1,021 patients were included in the study. Specifically, an FCNN architecture composed of encoder and decoder paths was designed to perform the segmentation task. The AI-generated contours achieved a high level of accuracy compared with the ground-truth contours in the testing cohorts (DSC, 0.79). In the multicenter evaluation, AI assistance improved contouring accuracy (five of eight oncologists had a higher median DSC with AI assistance; average median DSC, 0.74 vs. 0.78, P<0.001). In addition, AI assistance reduced contouring time by 39.4% and reduced intra- and inter-observer variation by 36.4% and 54.5%, respectively. These results suggest that AI assistance can effectively improve contouring accuracy and reduce intra-/inter-observer variation and contouring time, advantages that should positively affect tumor control and patient prognosis. Long-term follow-up is needed to verify a survival benefit.
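The DSC values quoted above measure the voxel overlap between the AI-generated and reference contours. As a quick reference, the metric can be computed from two binary masks as follows; this is a minimal NumPy sketch with a synthetic toy example, not the evaluation code of the cited study.

```python
import numpy as np

def dice_similarity(auto_mask: np.ndarray, reference_mask: np.ndarray) -> float:
    """DSC = 2|A intersect B| / (|A| + |B|); 1.0 is perfect overlap, 0.0 is none."""
    auto = auto_mask.astype(bool)
    ref = reference_mask.astype(bool)
    denominator = auto.sum() + ref.sum()
    if denominator == 0:
        return 1.0  # both contours empty: treat as perfect agreement
    return float(2.0 * np.logical_and(auto, ref).sum() / denominator)

# Toy example: two overlapping spheres stand in for an AI and a manual contour.
zz, yy, xx = np.mgrid[:64, :64, :64]
ai_contour = (zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2
manual_contour = (zz - 34) ** 2 + (yy - 32) ** 2 + (xx - 30) ** 2 < 15 ** 2
print(f"DSC = {dice_similarity(ai_contour, manual_contour):.2f}")
```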

In conclusion, automatic delineation greatly shortens the working time of radiation oncologists while ensuring accuracy. It also reduces discrepancies among clinicians and will become an important trend in tumor RT. Moreover, a new AI-based auto-contouring method, built on how the human brain approaches manual contouring, has been proposed for abdominal MRI-based adaptive radiotherapy (ART) (27). Further research is needed to establish the role of this method in other tumors.


Automatic dose calculation and optimization

Another application of AI in tumor RT is the optimization and calculation of the target dose. Currently, two mature tools are used in clinical work: the RapidPlan (Eclipse) and AutoPlan (Pinnacle) platforms.

In HNC research, the advantages of automatic planning based on these tools have been confirmed (28-30). One study compared automatic plans generated with Pinnacle software against manually created plans (28). The results demonstrated that automatic planning had advantages over manual planning, achieving a lower dose to the OARs with a similar dose to the tumor target. Similarly, Fogliata et al. used the RT plans of 80 HNC patients treated with intensity-modulated radiation therapy (IMRT) to train a RapidPlan model and selected 20 patients to validate it (29). Comparing the automatic and manual plans, the RapidPlan-based automatic plans performed better, with the average dose to the parotid gland, oral cavity, and throat decreased by 2, 5, and 10 Gy, respectively.

In NPC, the role of AI in automatic planning has also been affirmed. A retrospective study included 97 NPC patients treated with IMRT and generated both manual and automatic plans on the Pinnacle treatment planning system (31). Planning target volume (PTV) coverage and homogeneity were not significantly different between the two plans, while the automatic plans protected the OARs more effectively, decreasing the mean dose by 270-1,870 cGy.
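The end points compared in these planning studies, PTV coverage, dose homogeneity, and mean OAR dose, are simple summary statistics of the 3D dose grid within each structure. The sketch below shows one common way to compute them from a dose array and binary structure masks; it is a minimal NumPy illustration, and the homogeneity index definition used here, (D2% − D98%)/D50%, is one of several in use and an assumption rather than the formula applied in the cited studies.

```python
import numpy as np

def dose_statistics(dose: np.ndarray, ptv: np.ndarray, oar: np.ndarray,
                    prescription: float) -> dict:
    """Summarize a plan: PTV coverage, homogeneity index, and mean OAR dose."""
    ptv_dose = dose[ptv.astype(bool)]
    oar_dose = dose[oar.astype(bool)]
    # D2% (near-maximum) is the 98th dose percentile; D98% (near-minimum) is the 2nd.
    d2, d50, d98 = np.percentile(ptv_dose, [98, 50, 2])
    return {
        "ptv_coverage": float((ptv_dose >= prescription).mean()),  # fraction of PTV at Rx dose
        "homogeneity_index": float((d2 - d98) / d50),              # lower = more homogeneous
        "oar_mean_dose_cGy": float(oar_dose.mean()),
    }

# Toy example with a synthetic dose grid (values in cGy) and cuboid structures.
dose = np.random.normal(7000, 150, size=(64, 64, 64))
ptv = np.zeros(dose.shape, dtype=bool); ptv[20:44, 20:44, 20:44] = True
oar = np.zeros(dose.shape, dtype=bool); oar[5:15, 5:15, 5:15] = True
print(dose_statistics(dose, ptv, oar, prescription=6900.0))
```

Comparing such statistics between an automatic and a manual plan for the same patient is, in essence, how the dose differences reported above are derived.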

In conclusion, AI-based automatic RT planning greatly improves the quality and consistency of treatment plans and saves physicists' working time, which is of vital significance in clinical practice.


Other applications

In addition to automatic target delineation and automatic planning, other applications of AI are being developed. For RT plan evaluation, Zhu et al. used AI to assess the quality of 212 IMRT plans for prostate cancer, achieving a predictive accuracy of 80% (32). In the diagnostic phase, DL-based AI tools have been developed to detect nasopharyngeal malignancies in endoscopic or pathological images, and they outperformed oncologists in classifying nasopharyngeal masses as benign or malignant (33,34). For risk stratification, Li et al. established a radiomics model combined with ML to facilitate early salvage treatment for NPC patients at risk of in-field recurrence (35). In addition, a dataset-based study verified that DL can identify extranodal extension in patients with HNC and has the potential to be integrated into the clinical decision-making process (36). For predicting efficacy and toxicity after RT, Ertiaei et al. created artificial neural networks to predict the clinical outcomes of trigeminal neuralgia patients treated with gamma knife radiosurgery on the basis of preoperative clinical factors and found that the networks predicted patients' outcomes with a high level of accuracy (37).
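Most of these applications share the same supervised-learning backbone: extract quantitative features (radiomic, dosimetric, or clinical), train a classifier on labelled outcomes, and validate on held-out data. The sketch below illustrates that generic workflow with scikit-learn on synthetic data; the feature set, model choice, and cohort size are assumptions for illustration only and do not reproduce the pipelines of the cited studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for a feature table: rows = patients or plans, columns = radiomic
# or dosimetric features; the label could be recurrence, extranodal extension, or
# "acceptable plan quality" depending on the application.
n_patients, n_features = 200, 30
X = rng.normal(size=(n_patients, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n_patients) > 0).astype(int)

model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=200, random_state=0))

# Cross-validated AUC guards against overfitting in the small cohorts typical of these studies.
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"5-fold AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```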


Expectation

With the continuous development of science and technology, AI has penetrated all aspects of tumor RT, greatly reducing the workload of doctors and physicists and improving work efficiency. At present, an intelligent RT system based on a cloud platform is being developed; a remote RT system operating in an "AI + RT" mode would provide important quality assurance for primary hospitals carrying out standardized RT. In addition, AI may well develop significantly beyond its current capabilities. It might once have been assumed that AI would remain less accurate than humans in medical decision-making; however, whether the performance of AI should be judged only against human work is itself worth discussing. For example, the contours of an experienced clinician are generally considered the gold standard, but that does not mean there is no room for further improvement (4). The future of AI in the field of RT is therefore promising.

Current research on AI still has limitations. Primarily, its internal operating processes and principles are not fully understood: even if it can behave similarly to human beings, its perception and processing remain different. Conversely, the thinking of clinicians is also difficult to imitate fully. Therefore, AI cannot completely replace the work of doctors and physicists, at least for now, although the trends in big data analysis and ML may eventually make that possible. The development of more precise AI algorithms and models, making RT more accurate and effective, is an exciting prospect.


Acknowledgments

Funding: None.


Footnote

Provenance and Peer Review: This article was commissioned by the editorial office, Annals of Nasopharynx Cancer for the series “Precision Radiotherapy in Nasopharyngeal Carcinoma”. The article has undergone external peer review.

Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at http://dx.doi.org/10.21037/anpc.2020.03.02). The series “Precision Radiotherapy in Nasopharyngeal Carcinoma” was commissioned by the editorial office without any funding or sponsorship. HQM serves as an unpaid editorial board member of Annals of Nasopharynx Cancer from May 2017 to Dec 2020. The other authors have no conflicts of interest to declare.

Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.


References

  1. Mnih V, Kavukcuoglu K, Silver D, et al. Human-level control through deep reinforcement learning. Nature 2015;518:529-33. [Crossref] [PubMed]
  2. Gulshan V, Peng L, Coram M, et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA 2016;316:2402-10. [Crossref] [PubMed]
  3. Lakhani P, Sundaram B. Deep learning at chest radiography: automated classification of pulmonary tuberculosis by using convolutional neural networks. Radiology 2017;284:574-82. [Crossref] [PubMed]
  4. Thompson RF, Valdes G, Fuller CD, et al. Artificial Intelligence in Radiation Oncology Imaging. Int J Radiat Oncol Biol Phys 2018;102:1159-61. [Crossref] [PubMed]
  5. Wei WI, Sham JS. Nasopharyngeal carcinoma. Lancet 2005;365:2041-54. [Crossref] [PubMed]
  6. Bray F, Ferlay J, Soerjomataram I, et al. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin 2018;68:394-424. [Crossref] [PubMed]
  7. Lang J, Wang P, Dake WU, et al. An investigation of the basic situation of radiotherapy in mainland China in 2015. Chinese Journal of Radiation Oncology 2016;25:541-5.
  8. Lee JG, Jun S, Cho YW, et al. Deep learning in medical imaging: general overview. Korean J Radiol 2017;18:570-84. [Crossref] [PubMed]
  9. Sun XS, Li XY, Chen QY, et al. Future of radiotherapy in nasopharyngeal carcinoma. Br J Radiol 2019;92:20190209 [Crossref] [PubMed]
  10. Eldesoky AR, Yates ES, Nyeng TB, et al. Internal and external validation of an ESTRO delineation guideline - dependent automated segmentation tool for loco-regional radiation therapy of early breast cancer. Radiother Oncol 2016;121:424-30. [Crossref] [PubMed]
  11. Wang J, Chen W, Studenski M, et al. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance. Phys Med Biol 2013;58:N181-7.
  12. La Macchia M, Fellin F, Amichetti M, et al. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer. Radiat Oncol 2012;7:160. [Crossref] [PubMed]
  13. Nie K, Pouliot J, Smith E, et al. Performance variations among clinically available deformable image registration tools in adaptive radiotherapy - how should we evaluate and interpret the result? J Appl Clin Med Phys 2016;17:328-40. [Crossref] [PubMed]
  14. Powles J, Hodson H. Google DeepMind and healthcare in an age of algorithms. Health Technol (Berl) 2017;7:351-67. [Crossref] [PubMed]
  15. Sims R, Isambert A, Grégoire V, et al. A pre-clinical assessment of an atlas-based automatic segmentation tool for the head and neck. Radiother Oncol 2009;93:474-8. [Crossref] [PubMed]
  16. Lim JY, Leech M. Use of auto-segmentation in the delineation of target volumes and organs at risk in head and neck. Acta Oncol 2016;55:799-806. [Crossref] [PubMed]
  17. Xu H, Arsene Henry A, Robillard M, et al. The use of new delineation tool "MIRADA" at the level of regional lymph nodes, step-by-step development and first results for early-stage breast cancer patients. Br J Radiol 2018;91:20180095 [Crossref] [PubMed]
  18. Zhang FL, Cui DQ, Wang QS, et al. Comparative study of deep learning- versus Atlas-based auto-segmentation of organs-at-risk in tumor radiotherapy. Chinese Journal of Medical Physics 2019;36:1486-90.
  19. Yang J, Veeraraghavan H, Armato SG 3rd, et al. Autosegmentation for thoracic radiation treatment planning: A grand challenge at AAPM 2017. Med Phys 2018;45:4568-81. [Crossref] [PubMed]
  20. Ibragimov B, Xing L. Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks. Med Phys 2017;44:547-57. [Crossref] [PubMed]
  21. Feng X, Qing K, Tustison NJ, et al. Deep convolutional neural network for segmentation of thoracic organs-at-risk using cropped 3D images. Med Phys 2019;46:2169-80. [Crossref] [PubMed]
  22. Tong N, Gou S, Yang S, et al. Fully automatic multi-organ segmentation for head and neck cancer radiotherapy using shape representation model constrained fully convolutional neural networks. Med Phys 2018;45:4558-67. [Crossref] [PubMed]
  23. Zhu W, Huang Y, Zeng L, et al. AnatomyNet: deep learning for fast and fully automated whole-volume segmentation of head and neck anatomy. Med Phys 2019;46:576-89. [Crossref] [PubMed]
  24. Ren X, Xiang L, Nie D, et al. Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images. Med Phys 2018;45:2063-75. [Crossref] [PubMed]
  25. Men K, Geng H, Cheng C, et al. Technical Note: more accurate and efficient segmentation of organs-at-risk in radiotherapy with convolutional neural networks cascades. Med Phys 2019;46:286-92. [PubMed]
  26. Lin L, Dou Q, Jin YM, et al. Deep learning for automated contouring of primary tumor volumes by MRI for nasopharyngeal carcinoma. Radiology 2019;291:677-86. [Crossref] [PubMed]
  27. Liang F, Qian P, Su KH, et al. Abdominal, multi-organ, auto-contouring method for online adaptive magnetic resonance guided radiotherapy: an intelligent, multi-level fusion approach. Artif Intell Med 2018;90:34-41. [Crossref] [PubMed]
  28. Hansen CR, Bertelsen A, Hazell I, et al. Automatic treatment planning improves the clinical quality of head and neck cancer treatment plans. Clin Transl Radiat Oncol 2016;1:2-8. [Crossref] [PubMed]
  29. Fogliata A, Reggiori G, Stravato A, et al. RapidPlan head and neck model: the objectives and possible clinical benefit. Radiat Oncol 2017;12:73. [Crossref] [PubMed]
  30. Krayenbuehl J, Norton I, Studer G, et al. Evaluation of an automated knowledge based treatment planning system for head and neck. Radiat Oncol 2015;10:226. [Crossref] [PubMed]
  31. Xin X, Churong LI, Jie LI, et al. Comparative study of automatic and manual plans of intensity-modulated radiation therapy for nasopharyngeal carcinoma. Chinese Journal of Radiation Oncology 2018;27:1072-7.
  32. Zhu X, Ge Y, Li T, et al. A planning quality evaluation tool for prostate adaptive IMRT based on machine learning. Med Phys 2011;38:719-26. [Crossref] [PubMed]
  33. Chuang WY, Chang SH, Yu WH, et al. Successful identification of nasopharyngeal carcinoma in nasopharyngeal biopsies using deep learning. Cancers (Basel) 2020; [Crossref] [PubMed]
  34. Li C, Jing B, Ke L, et al. Development and validation of an endoscopic images-based deep learning model for detection with nasopharyngeal malignancies. Cancer Commun (Lond) 2018;38:59. [Crossref] [PubMed]
  35. Li S, Wang K, Hou Z, et al. Use of radiomics combined with machine learning method in the recurrence patterns after intensity-modulated radiotherapy for nasopharyngeal carcinoma: a preliminary study. Front Oncol 2018;8:648. [Crossref] [PubMed]
  36. Kann BH, Hicks DF, Payabvash S, et al. Multi-institutional validation of deep learning for pretreatment identification of extranodal extension in head and neck squamous cell carcinoma. J Clin Oncol 2020;38:1304-11. [PubMed]
  37. Ertiaei A, Ataeinezhad Z, Bitaraf M, et al. Application of an artificial neural network model for early outcome prediction of gamma knife radiosurgery in patients with trigeminal neuralgia and determining the relative importance of risk factors. Clin Neurol Neurosurg 2019;179:47-52. [Crossref] [PubMed]
Cite this article as: Sun XS, Tang LQ, Chen QY, Sun Y, Mai HQ. The role of artificial intelligence in nasopharyngeal carcinoma radiotherapy. Ann Nasopharynx Cancer 2020;4:2.
