J Gastric Cancer. 2023 Jul;23(3):388-399. English.
Published online Jul 31, 2023.
Copyright © 2023. Korean Gastric Cancer Association
Review

Artificial Intelligence in Gastric Cancer Imaging With Emphasis on Diagnostic Imaging and Body Morphometry

Kyung Won Kim,1,* Jimi Huh,2,* Bushra Urooj,3 Jeongjin Lee,4 Jinseok Lee,5 In-Seob Lee,6 Hyesun Park,7 Seongwon Na,3 and Yousun Ko1,3
    • 1Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.
    • 2Department of Radiology, Ajou University School of Medicine, Suwon, Korea.
    • 3Biomedical Research Center, Asan Institute for Life Sciences, Asan Medical Center, Seoul, Korea.
    • 4School of Computer Science and Engineering, Soongsil University, Seoul, Korea.
    • 5Department of Biomedical Engineering, College of Electronics and Information, Kyung Hee University, Yongin, Korea.
    • 6Department of Surgery, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.
    • 7Body Imaging Department of Radiology, Lahey Hospital and Medical Center, Burlington, MA, USA.
Received July 20, 2023; Revised July 28, 2023; Accepted July 28, 2023.

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0) which permits unrestricted noncommercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Gastric cancer remains a significant global health concern, underscoring the need for advances in imaging techniques to ensure accurate diagnosis and effective treatment planning. Artificial intelligence (AI) has emerged as a potent tool for gastric cancer imaging, particularly for diagnostic imaging and body morphometry. This review article offers a comprehensive overview of the recent developments and applications of AI in gastric cancer imaging. We investigated the role of AI imaging in gastric cancer diagnosis and staging, showcasing its potential to enhance the accuracy and efficiency of these crucial aspects of patient management. Additionally, we explored the application of AI body morphometry specifically for assessing the clinical impact of gastrectomy. This aspect of AI utilization holds significant promise for understanding postoperative changes and optimizing patient outcomes. Furthermore, we examine the current state of AI techniques for the prognosis of patients with gastric cancer. These prognostic models leverage AI algorithms to predict long-term survival outcomes and assist clinicians in making informed treatment decisions. However, the implementation of AI techniques for gastric cancer imaging still faces several limitations. As AI continues to evolve, we hope to witness the translation of cutting-edge technologies into routine clinical practice, ultimately improving patient care and outcomes in the fight against gastric cancer.

Keywords
Artificial intelligence; Deep learning; Gastric cancer; Diagnostic imaging; Sarcopenia

INTRODUCTION

Gastric cancer is a life-threatening malignancy that is prevalent worldwide. Gastrectomy is an essential treatment for patients with gastric cancer [1]. Early diagnosis, appropriate surgical planning, and tailored patient management based on the prognostic factors are key to achieving better patient survival. Diagnostic imaging, particularly computed tomography (CT), is the primary modality used for the diagnosis, staging, treatment planning, and prognostication [2]. The rapid advancement of artificial intelligence (AI) techniques has spurred numerous research endeavors aimed at utilizing AI and radiomics models to enhance the diagnosis, staging, and prognosis of patients with gastric cancer undergoing gastrectomy. However, despite these promising research-level models, they have not yet been implemented in routine clinical practice. To achieve widespread adoption, complete automation, standardization, and rigorous validation are essential requirements that need to be addressed.

In addition, AI techniques have been developed for body morphometry, which can segment and measure body muscles and fat [3]. Sarcopenia, a condition characterized by the loss of muscle mass and function, is highly prevalent among surgical patients with gastric cancer [4]. Most patients experience worsened nutritional status, weight loss, and muscle mass decline after gastrectomy [5, 6]. These postgastrectomy changes are important prognostic factors that should be managed through personalized care to improve overall survival [7]. Recently, multimodal AI models that use both CT imaging and electronic health records (EHRs) have gained emphasis as a means of enhancing the accuracy of AI prognostic models [8].

This review provides an overview of the recent advancements and applications of AI in CT imaging for gastric cancer diagnosis, treatment planning, sarcopenia evaluation, and prognostication (Table 1). Additionally, we explore the limitations, potential future directions, and clinical impact of AI in CT imaging for gastric cancer.

Table 1
Summary of the artificial intelligence techniques for CT images utilized for patients with gastric cancer treated with gastrectomy

AI FOR DIAGNOSIS

Endoscopy plays a crucial role in detecting gastric cancer and distinguishing between early gastric cancer (EGC) and advanced gastric cancer (AGC). Thus, AI techniques for endoscopy have been developed, focusing on the detection of EGC, differential diagnosis of gastric lesions, and optimization of narrow-band imaging endoscopy [27]. In general, CT scans are crucial for determining the stage of cancer and devising suitable treatment plans [28]. CT often plays a significant role in the differential diagnosis of gastric masses, prompting the development of radiomics and AI models in this field.

Ba-Ssalamah et al. [11] used radiomic texture analysis to classify adenocarcinomas, lymphomas, and gastrointestinal stromal tumors on contrast-enhanced CT scans. Their study reported misclassification rates ranging from 0 to 10% [11]. Recently, Feng et al. [12] focused on differentiating Borrmann type IV gastric cancer from primary gastric lymphoma using contrast-enhanced CT scans. By combining a deep learning model for tumor segmentation with a model based on subjective radiologic findings, they found that a highly enhanced serosa and the heterogeneity signature were significant factors for differentiating the 2 diseases [12].
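As an illustration of the texture-analysis idea underlying such studies, the following Python sketch extracts a few gray-level co-occurrence matrix (GLCM) features from a tumor region of interest (ROI). The soft-tissue window, 32-level binning, and feature set are assumptions for demonstration and do not reproduce the published pipelines; a recent scikit-image is assumed for graycomatrix/graycoprops.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(ct_roi_hu: np.ndarray) -> dict:
    """Compute simple GLCM texture features from a 2D tumor ROI in HU."""
    # Clip to a soft-tissue window and rebin to 32 gray levels.
    windowed = np.clip(ct_roi_hu, -100, 200)
    binned = ((windowed + 100) / 300 * 31).astype(np.uint8)
    glcm = graycomatrix(binned, distances=[1], angles=[0, np.pi / 2],
                        levels=32, symmetric=True, normed=True)
    return {
        "contrast": graycoprops(glcm, "contrast").mean(),
        "homogeneity": graycoprops(glcm, "homogeneity").mean(),
        "energy": graycoprops(glcm, "energy").mean(),
        "correlation": graycoprops(glcm, "correlation").mean(),
    }

# Example: a random patch stands in for a segmented tumor ROI.
roi = np.random.randint(-100, 200, size=(64, 64))
print(texture_features(roi))
```

In practice, such features are computed for each tumor and fed to a classifier to separate adenocarcinoma, lymphoma, and gastrointestinal stromal tumors.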

AI FOR STAGING

The TNM classification is an internationally recognized standard staging system for gastric cancer. The staging accuracy of CT scans ranges from 67.1% to 89.1% for T staging, and from 49.3% to 79.5% for N staging [29]. The diagnostic accuracy of CT for detecting peritoneal metastases is low, ranging from 25.0% to 90.0% (median, 57.6%) [29]. This limited accuracy has prompted significant research efforts to develop AI techniques that enhance the TNM staging accuracy using CT scans.

T-staging

For T-staging of gastric cancer, radiologists assess the presence of focal or diffuse wall thickening or mass-like lesions with densities different from those of the normal stomach wall on contrast-enhanced CT images. The detailed criteria for CT staging of the T stage are summarized in Fig. 1 [30]. Even experienced radiologists often encounter challenges in accurately distinguishing between EGC (T1 cancer) and T2 stage gastric cancer on CT scans, with accuracies ranging from 51.6% to 91.5%. Discriminating between T1a and T1b stage cancers is even more challenging, with accuracies ranging from 62.5% to 69.2% [31]. Zeng et al. [9] reported that AI classifier models based on ResNet101 demonstrated high accuracy in distinguishing between EGC (T1 cancer) and T2 cancer, with accuracies ranging from 91.4% to 94.6%. Furthermore, their model discriminated between T1a and T1b cancers with accuracies ranging from 62.3% to 88.6% [9]. Despite these promising results, these AI models have not yet been implemented in clinical practice owing to their insufficient diagnostic accuracy compared with endoscopic ultrasonography, which remains the preferred method for evaluating the depth of invasion.

Fig. 1
Criteria for computed tomography staging of gastric cancer with sample cases.
LN = lymph node.
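To make the classification step concrete, the sketch below builds a ResNet101-based binary classifier in PyTorch in the spirit of the model reported by Zeng et al. [9]. The single-channel input adaptation, patch size, and two-class head (EGC vs. T2) are illustrative assumptions rather than the published architecture, and a recent torchvision (weights API) is assumed.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet101

class TStageClassifier(nn.Module):
    """ResNet101 backbone adapted to single-channel CT patches with a 2-class head."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.backbone = resnet101(weights=None)
        # CT slices are single-channel, unlike the RGB images ResNet expects.
        self.backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                        padding=3, bias=False)
        # Two-class head: EGC (T1) vs. T2.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)

model = TStageClassifier()
ct_patches = torch.randn(4, 1, 224, 224)  # batch of cropped tumor patches
logits = model(ct_patches)                # shape: (4, 2)
```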

Another challenge in T-staging using CT is the evaluation of serosal invasion in gastric cancer. The accuracy of CT in identifying serosal invasion has been reported to be 76.6% (sensitivity, 96%; specificity, 72.4%) [10]. As T4a stage cancer with serosal invasion is considered to increase the risk of peritoneal metastases, identifying serosal invasion is important for treatment planning. Recently, Sun et al. [10] reported that a CT-based deep learning radiomics model could accurately evaluate serosal invasion in AGC, with area under the curve (AUC) values ranging from 0.76 to 0.90. However, these AI radiomics models require manual segmentation of tumors, which is time-consuming. In addition, these models have not been validated in multiple external cohorts; thus, they have not yet been applied clinically.

N-staging

Lymph node (LN) metastasis is a significant prognostic factor for gastric cancer. The accurate assessment of LN metastasis before surgery is crucial for making treatment decisions, such as determining the suitability of endoscopic mucosal resection, neoadjuvant chemotherapy, or gastrectomy with or without lymphadenectomy. Currently, the commonly used criteria for N-staging on contrast-enhanced CT are as follows: 1) LNs with a short-axis diameter ≥8 mm; 2) a cluster of 3 or more peritumoral LNs regardless of size; 3) LNs showing strong enhancement >100 HU; or 4) LNs with central necrosis and perinodal infiltration regardless of size [32]. However, N-staging on contrast-enhanced CT scans remains unsatisfactory owing to its low diagnostic accuracy, which has led to the development of several AI techniques, with or without radiomics, for evaluating LN metastasis [19, 20, 21, 22]. These studies have utilized both radiomics analyses of handcrafted features from primary gastric cancer lesions or metastatic LNs and deep learning models.
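The 4 CT criteria listed above lend themselves to a simple rule check, sketched below. The LymphNode fields are hypothetical names for measurements that a radiologist or a detection model would supply; the thresholds follow the criteria stated in the text.

```python
from dataclasses import dataclass

@dataclass
class LymphNode:
    short_axis_mm: float
    enhancement_hu: float
    central_necrosis: bool
    perinodal_infiltration: bool

def is_suspicious(node: LymphNode, peritumoral_cluster_size: int = 0) -> bool:
    """Return True if any of the 4 CT N-staging criteria is met."""
    return (
        node.short_axis_mm >= 8.0                                   # criterion 1
        or peritumoral_cluster_size >= 3                            # criterion 2
        or node.enhancement_hu > 100.0                              # criterion 3
        or (node.central_necrosis and node.perinodal_infiltration)  # criterion 4
    )

# A 6.5 mm node with strong enhancement is flagged by criterion 3.
print(is_suspicious(LymphNode(6.5, 120.0, False, False)))  # True
```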

Most studies have focused on distinguishing between the N+ (LN-positive) and N- (LN-negative) statuses [19, 21, 22]. However, Dong et al. [20] aimed to develop a deep learning radiomic nomogram to predict the number of LN metastases. Their study was a well-validated international multicenter collaboration in which the model was tested in three external validation cohorts and one international validation cohort, ensuring the reliability and reproducibility of the results across centers. The deep learning radiomic nomogram achieved good discrimination in predicting the number of metastatic LNs across all cohorts, with C-indices ranging from 0.797 to 0.821 [20]. However, these advanced models require manual tumor segmentation by an experienced radiologist, which limits their practical utility in daily clinical practice.
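For readers unfamiliar with the C-index reported in such studies, the brief sketch below shows how discrimination between predicted and observed numbers of metastatic LNs can be quantified; the counts are fabricated for illustration, and the lifelines package is assumed to be available.

```python
from lifelines.utils import concordance_index

observed_ln_count = [0, 2, 5, 1, 9, 3]               # pathology-confirmed counts
predicted_ln_count = [0.4, 1.8, 4.1, 1.2, 7.5, 2.6]  # hypothetical model outputs

cindex = concordance_index(observed_ln_count, predicted_ln_count)
print(f"C-index: {cindex:.3f}")  # 1.000 here, as the ranking is perfectly concordant
```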

M-staging

The evaluation of peritoneal metastasis is the most crucial aspect of M-staging in gastric cancer. Typically, CT images show specific characteristics associated with peritoneal metastasis, including peritoneal fat stranding, omental cakes, ascites, parietal peritoneal thickening, and nodules/masses. However, when these features are subtle, radiologists may overlook them, reducing the sensitivity and diagnostic accuracy for peritoneal metastases. In addition, occult peritoneal metastasis (OPM) refers to cases in which the initial CT scan is negative for peritoneal metastasis but subsequent laparoscopy or surgery reveals its presence. Owing to the inherent nature of OPM, CT scans have limited sensitivity and diagnostic accuracy in patients with OPM [23]. Consequently, extensive research efforts have focused on developing radiomics or AI models to enhance the detection rates of peritoneal metastasis and OPM on CT scans [23, 24, 25, 26].

Previous studies have predominantly employed radiomics analyses based on manually crafted features from primary gastric cancer or potential peritoneal lesions, yielding AUC values ranging from 0.724 to 0.836 [24, 25, 26]. However, one study utilized a deep learning model that achieved an improved AUC of 0.900, surpassing that of the conventional clinical model (AUC of 0.670) [23]. Nevertheless, a challenge shared by the M-staging studies, as well as those on T- and N-staging, is the need for manual tumor segmentation, which significantly hampers their practical application in routine clinical practice.
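A minimal sketch of the typical radiomics workflow behind such AUC figures is shown below: handcrafted features are standardized, fed to a penalized logistic regression, and discrimination is summarized by cross-validated AUC. The synthetic data, model choice, and cross-validation scheme are assumptions for illustration, not a published pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))     # 200 patients x 30 radiomic features
y = rng.integers(0, 2, size=200)   # 1 = occult peritoneal metastasis at surgery

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l2", max_iter=1000))
auc_scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc_scores.mean():.3f} ± {auc_scores.std():.3f}")
```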

AI FOR BODY MORPHOMETRY

Body morphometry refers to the quantification of body fat and muscle mass, which are commonly assessed using cross-sectional imaging, such as abdominal CT or magnetic resonance imaging (MRI). Body morphometry is based on imaging segmentation techniques. Traditionally, manual or semiautomated segmentation methods involving labor-intensive processes have been employed to measure the muscle and fat areas in cross-sectional images. However, these approaches are not feasible for large datasets owing to the substantial time and human resources needed. To overcome these limitations, fully automated segmentation methods utilizing deep learning techniques have been developed [4].

Gastrectomy induces profound physiological and nutritional changes, which can be critical determinants of the long-term survival of patients [7]. Body morphometric changes, such as sarcopenia, myosteatosis, and osteoporosis, have gained attention as important adverse events following gastrectomy that can influence prognosis. These changes can be evaluated on abdominal CT imaging, which is routinely performed to evaluate the primary disease [4]. When such routine CT scans are analyzed for body morphometric changes beyond the primary disease, they are referred to as opportunistic CT scans. Recently, several AI techniques have been developed for the body morphometric analysis of opportunistic CT scans.

Algorithms for sarcopenia evaluation

According to the revised European Working Group on Sarcopenia in Older People (EWGSOP2) consensus, the muscle area measured at the third lumbar vertebral (L3) level on CT is used as a representative value because it reflects overall body muscle mass [33]. Deep learning techniques enable automatic segmentation of the muscle and fat areas. The authors developed a fully automated AI technique for body morphometry, consisting of 2 AI algorithms: 1) automatic selection of the L3 vertebral level and 2) automated segmentation of the muscle and fat areas [3, 13]. The overall framework of the fully automated body morphometry is illustrated in Fig. 2. This deep learning model offers fully automated selection of the axial CT slice at the L3 vertebral level and accurate segmentation of the abdominal muscle area, irrespective of anatomical variations. The average cross-sectional area errors of the deep learning model were 2.22% in the normal anatomy group and 2.37%–4.06% in the anatomic variation group [13]. Recently, this AI body morphometry solution (AID-U™; iIAD Inc., Seoul, Korea) obtained regulatory approval, enabling its utilization in clinical practice.

Fig. 2
Framework of the artificial intelligence techniques for fully automated body morphometry.
CT = computed tomography; L3 = the third lumbar spine vertebra.
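Once the L3 slice has been selected and the muscle mask predicted, the downstream quantification is straightforward, as the sketch below illustrates: the cross-sectional muscle area follows from the pixel count and pixel spacing, and the skeletal muscle index (SMI) normalizes it by height squared. The mask, pixel spacing, and height are hypothetical inputs, and the SMI definition reflects common practice rather than the specific software described above.

```python
import numpy as np

def muscle_area_cm2(mask: np.ndarray, pixel_spacing_mm: tuple[float, float]) -> float:
    """Muscle area (cm^2) from a binary L3-level muscle mask and pixel spacing (mm)."""
    pixel_area_mm2 = pixel_spacing_mm[0] * pixel_spacing_mm[1]
    return mask.sum() * pixel_area_mm2 / 100.0  # mm^2 -> cm^2

mask = np.zeros((512, 512), dtype=np.uint8)
mask[200:300, 150:400] = 1                      # stand-in for a predicted muscle mask
area = muscle_area_cm2(mask, (0.78, 0.78))
smi = area / (1.70 ** 2)                        # cm^2/m^2 for a 1.70 m patient
print(f"L3 muscle area: {area:.1f} cm^2, SMI: {smi:.1f} cm^2/m^2")
```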

Algorithms for myosteatosis evaluation

Recently, there has been a growing interest in evaluating myosteatosis, which refers to excessive fat infiltration within skeletal muscles, as part of the body morphometry analysis [34]. With advancements in deep learning and imaging post-processing techniques, it is now possible to assess both the muscle area and myosteatosis. A web-based toolkit was developed for the automatic analysis of myosteatosis on abdominal CT scans, as depicted in Fig. 3 [14]. Additionally, this toolkit enables the evaluation of visceral and subcutaneous fat areas. This toolkit utilizes deep-learning algorithms to accurately quantify and evaluate the degree of myosteatosis, thereby providing a valuable tool for comprehensive body morphometry analysis.

Fig. 3
Web-based toolkit for myosteatosis evaluation.
Sfat = subcutaneous fat area; Vfat = visceral fat area; IMA = intermuscular adipose tissue content; LAMA = low-attenuation muscle area; NAMA = normal-attenuation muscle area.
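The muscle quality map idea can be sketched as a partition of the muscle mask by attenuation, as below. The HU ranges used here (NAMA 30–150 HU, LAMA −29 to 29 HU, IMA −190 to −30 HU) are commonly cited thresholds and are stated as assumptions rather than the toolkit's exact settings.

```python
import numpy as np

def muscle_quality_areas(ct_hu: np.ndarray, muscle_mask: np.ndarray,
                         pixel_area_cm2: float) -> dict:
    """Partition muscle pixels by attenuation into NAMA, LAMA, and IMA areas (cm^2)."""
    hu = ct_hu[muscle_mask > 0]
    return {
        "NAMA_cm2": np.sum((hu >= 30) & (hu <= 150)) * pixel_area_cm2,
        "LAMA_cm2": np.sum((hu >= -29) & (hu < 30)) * pixel_area_cm2,
        "IMA_cm2": np.sum((hu >= -190) & (hu < -29)) * pixel_area_cm2,
    }

ct = np.random.randint(-200, 160, size=(512, 512))   # stand-in CT slice (HU)
mask = np.zeros_like(ct)
mask[200:300, 150:400] = 1                            # stand-in muscle mask
print(muscle_quality_areas(ct, mask, pixel_area_cm2=0.78 * 0.78 / 100))
```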

Evaluation of the clinical impact following gastrectomy

Gastrectomy impairs one of the most crucial human functions, eating, through early satiety, loss of appetite, dumping syndrome, reflux gastritis/esophagitis, and a variety of other adverse events [7]. Consequently, poor eating habits can cause considerable weight loss, nutritional deficiencies, and decreased physical activity. These changes can lead to sarcopenia, frailty, and mortality [6]. Furthermore, individuals undergoing subtotal and total gastrectomies exhibit substantial variations in weight change and body morphometric alterations, with remnant stomach volume (RSV) being a key contributing factor [35]. We used AI body morphometry techniques to evaluate the clinical impact of gastrectomies.

First, a volumetric measurement method for RSV was developed using postgastrectomy CT gastrography [35]. In patients who have undergone gastrectomy, CT gastrography with oral administration of effervescent granules is performed to adequately distend the remnant stomach for volume measurement. At our institution, a three-dimensional (3D) volumetry technique has been established that generates volume-rendering images using semiautomatic segmentation software to delineate the air-filled luminal space of the remnant stomach (Fig. 4).
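The volumetric step itself reduces to counting voxels, as the short sketch below shows for an assumed binary mask of the air-filled lumen and a hypothetical voxel spacing; this is not the institutional software, only the underlying arithmetic.

```python
import numpy as np

def remnant_stomach_volume_ml(lumen_mask: np.ndarray,
                              spacing_mm: tuple[float, float, float]) -> float:
    """RSV in mL from a 3D binary lumen mask and voxel spacing (z, y, x) in mm."""
    voxel_volume_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return lumen_mask.sum() * voxel_volume_mm3 / 1000.0  # mm^3 -> mL

mask = np.zeros((120, 512, 512), dtype=np.uint8)
mask[40:80, 200:280, 180:260] = 1                 # stand-in for the segmented lumen
print(f"RSV: {remnant_stomach_volume_ml(mask, (3.0, 0.7, 0.7)):.0f} mL")
```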

Fig. 4
Combined use of three-dimensional volumetry of RSV and artificial intelligence body morphometry.
RSV = remnant stomach volume.

By employing AI-based body morphometry techniques to measure the muscle mass and utilizing 3D volumetry for RSV, Lee et al. [6] investigated the impact of RSV on nutrition, anemia, and sarcopenia. During the first year after gastrectomy, all the patients experienced a decrease in the visceral fat area on CT, leading to significant weight loss. The extent of this decrease was influenced by the RSV, which was the highest in the total gastrectomy group without RSV and lowest in the large RSV group [6]. Kim et al. [15] reported that a larger RSV was associated with better preservation of muscle and fat after distal gastrectomy, as illustrated in Fig. 4.

These findings suggest the potential for a paradigm shift in the guidelines for gastrectomy with regard to preserving the RSV [6, 15]. Currently, gastrectomy for gastric cancer involves the removal of at least two-thirds of the stomach to minimize the risk of tumor recurrence [29]. However, based on the results of these studies, modifications to the guidelines could be considered to prioritize the preservation of the RSV and optimize postoperative outcomes, nutritional status, and body composition. Further research and clinical trials are warranted to validate these findings and support potential guideline changes for gastrectomy procedures.

AI FOR PROGNOSTICATION

Accurate prognostic prediction of gastric cancer is of significant importance to clinicians and patients alike. This information can aid clinicians in making informed decisions and improving patient management strategies [7]. Following curative-intent gastrectomy, two primary prognostic outcomes are assessed: overall survival and disease-free survival, also known as recurrence-free survival. These outcomes provide critical information regarding the long-term prognosis and recurrence risk of patients who have undergone gastrectomy for gastric cancer [1]. In recent years, a surge in research focused on the development of prognostic models using AI and radiomics techniques has been observed [36].

Radiomics-based approaches

Radiomics approaches primarily utilize radiomic features extracted from medical images, such as CT or MRI, to predict the prognosis of patients with gastric cancer. These approaches often focus on disease-free survival with the aim of evaluating the likelihood of recurrence following gastrectomy. By analyzing and quantifying a wide range of radiomic features, radiomic models offer the potential to enhance prognostic assessments, enabling clinicians to identify patients at higher risk of recurrence and tailor treatment strategies accordingly [16, 17]. However, a notable challenge with these radiomics approaches is the need for manual segmentation of tumors, which poses a significant hurdle to their integration into routine clinical practice.

Multi-modal approach

Overall survival after gastrectomy is influenced by a multitude of factors: 1) tumor factors, including TNM staging or histology; 2) demographic factors, such as age, sex, performance status, and socioeconomic status; 3) pathophysiological factors, such as neutrophil/lymphocyte ratio, nutritional status, and body morphometric changes after gastrectomy; and 4) treatment factors, such as adjuvant chemotherapy and surgical procedures [5]. Considering the multifactorial nature of overall survival, it is essential to comprehensively assess these factors when determining the prognostic outcomes and developing personalized treatment strategies for patients who have undergone gastrectomy for gastric cancer [8].

Traditionally, statistical prognostic models, such as Cox proportional hazards regression models or nomogram models, have been employed to predict overall survival. However, these models are limited by the number of variables that can be included. Consequently, AI-based prediction models have garnered increasing attention, particularly owing to the advantages of deep learning in handling large clinical datasets with nonlinear effects and interactions between variables. As a result, there is growing emphasis on multimodal AI models that integrate both CT imaging and EHRs to enhance the accuracy of prognostic models [18].
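For contrast with the multimodal deep learning approach, the sketch below fits a traditional Cox proportional hazards model on a handful of tabular variables using the lifelines package; the variable names and synthetic data are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "age": rng.normal(62, 10, 300),
    "tnm_stage": rng.integers(1, 4, 300),
    "smi_change_pct": rng.normal(-5, 4, 300),   # 1-year skeletal muscle change (%)
    "survival_months": rng.exponential(40, 300),
    "event": rng.integers(0, 2, 300),           # 1 = death observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="event")
cph.print_summary()  # hazard ratios and confidence intervals per covariate
```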

Recently, Chung et al. [8] developed an AI prognostic model for predicting 5-year survival using a substantial dataset of over 4,000 cases with 63 variables. These variables include nutrition, skeletal muscle mass, visceral/subcutaneous fat, sarcopenia, obesity, comorbidities, interval changes before and after surgery, and cancer-related variables [8]. In the external validation, they achieved promising results, with an AUC of 0.8903, sensitivity of 86.96%, and specificity of 74.60% for predicting 5-year survival. Despite these encouraging findings, the authors emphasize the need for further validation in larger populations to confirm the robustness and generalizability of the model.
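As a reminder of how such external-validation figures are computed, the sketch below derives the AUC, sensitivity, and specificity from predicted survival probabilities; the labels and probabilities are fabricated for illustration, and a 0.5 decision threshold is assumed.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])                  # 1 = alive at 5 years
y_prob = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3])  # model probabilities
y_pred = (y_prob >= 0.5).astype(int)

auc = roc_auc_score(y_true, y_prob)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUC {auc:.4f}, sensitivity {sensitivity:.2%}, specificity {specificity:.2%}")
```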

CHALLENGES AND FUTURE DIRECTION

Although AI and radiomics have shown promising results in diagnosis, staging, body morphometry, and prognosis prediction, several barriers should be addressed before their widespread adoption in clinical practice [36]. First, although AI algorithms may perform well in research settings, their effectiveness and safety should be rigorously validated in real-world clinical scenarios. Robust clinical trials and validation studies are essential for demonstrating the utility and impact of AI in improving patient outcomes. Second, seamless integration of AI and radiomics solutions into existing clinical workflows and healthcare systems is essential. Thus, AI or radiomics techniques that require manual or semiautomatic segmentation of tumors should evolve into fully automatic, seamless solutions [27]. Moreover, several issues, including source data quality, privacy concerns, interpretability, and explainability, should be overcome.

CONCLUSIONS

In this review, we discuss the current status of AI imaging research in patients with gastric cancer treated with gastrectomy. Although several useful results have been reported, there is still room for further development. Solutions for diagnosis, staging, and prognostication remain in the research phase, whereas AI body morphometry has already been commercialized and is used in clinical settings. By evolving AI and radiomics techniques toward fully automatic solutions, we can enhance their clinical utility, reduce potential human errors, and promote their wider adoption in routine clinical practice. This evolution would accelerate the integration of these powerful technologies into the healthcare system, ultimately benefiting patients by enabling more accurate and efficient diagnosis, staging, and prognostication of various diseases, including gastric cancer.

Notes

Funding: This study was supported by a grant of the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health & Welfare, Republic of Korea (HI18C1216).

Conflict of Interest: No potential conflict of interest relevant to this article was reported.

Author Contributions:

  • Conceptualization: Kim KW, Jimi Huh, Ko Y.

  • Data curation: Kim KW, Lee IS, Park H, Lee J, Lee J.

  • Funding acquisition: Kim KW.

  • Supervision: Ko Y.

  • Visualization: Kim KW, Ko Y.

  • Writing – original draft: Kim KW, Jimi Huh, Urooj B.

  • Writing – review & editing: Kim KW, Jimi Huh, Urooj B, Lee J, Lee J, Lee IS, Park H, Na S, Ko Y.

References

    1. Eom SS, Choi W, Eom BW, Park SH, Kim SJ, Kim YI, et al. A comprehensive and comparative review of global gastric cancer treatment guidelines. J Gastric Cancer 2022;22:3–23.
    2. Kim JW, Shin SS, Heo SH, Lim HS, Lim NY, Park YK, et al. The role of three-dimensional multidetector CT gastrography in the preoperative imaging of stomach cancer: emphasis on detection and localization of the tumor. Korean J Radiol 2015;16:80–89.
    3. Park HJ, Shin Y, Park J, Kim H, Lee IS, Seo DW, et al. Development and validation of a deep learning system for segmentation of abdominal muscle and fat on computed tomography. Korean J Radiol 2020;21:88–100.
    4. Lee K, Shin Y, Huh J, Sung YS, Lee IS, Yoon KH, et al. Recent issues on body composition imaging for sarcopenia evaluation. Korean J Radiol 2019;20:205–217.
    5. Kim KW, Lee K, Lee JB, Park T, Khang S, Jeong H, et al. Preoperative nutritional risk index and postoperative one-year skeletal muscle loss can predict the prognosis of patients with gastric adenocarcinoma: a registry-based study. BMC Cancer 2021;21:157.
    6. Lee K, Kim KW, Lee JB, Shin Y, Jang JK, Yook JH, et al. Impact of remnant stomach volume and anastomosis on nutrition and body composition in gastric cancer patients. Surg Oncol 2019;31:75–82.
    7. Park JH, Lee HJ, Oh SY, Park SH, Berlth F, Son YG, et al. Prediction of postoperative mortality in patients with organ failure after gastric cancer surgery. World J Surg 2020;44:1569–1577.
    8. Chung H, Ko Y, Lee IS, Hur H, Huh J, Han SU, et al. Prognostic artificial intelligence model to predict 5 year survival at 1 year after gastric cancer surgery based on nutrition and body morphometry. J Cachexia Sarcopenia Muscle 2023;14:847–859.
    9. Zeng Q, Feng Z, Zhu Y, Zhang Y, Shu X, Wu A, et al. Deep learning model for diagnosing early gastric cancer using preoperative computed tomography images. Front Oncol 2022;12:1065934.
    10. Sun RJ, Fang MJ, Tang L, Li XT, Lu QY, Dong D, et al. CT-based deep learning radiomics analysis for evaluation of serosa invasion in advanced gastric cancer. Eur J Radiol 2020;132:109277.
    11. Ba-Ssalamah A, Muin D, Schernthaner R, Kulinna-Cosentini C, Bastati N, Stift J, et al. Texture-based classification of different gastric tumors at contrast-enhanced CT. Eur J Radiol 2013;82:e537–e543.
    12. Feng B, Huang L, Li C, Quan Y, Chen Y, Xue H, et al. A heterogeneity radiomic nomogram for preoperative differentiation of primary gastric lymphoma from Borrmann type IV gastric cancer. J Comput Assist Tomogr 2021;45:191–202.
    13. Ha J, Park T, Kim HK, Shin Y, Ko Y, Kim DW, et al. Development of a fully automatic deep learning system for L3 selection and body composition assessment on computed tomography. Sci Rep 2021;11:21656.
    14. Kim DW, Kim KW, Ko Y, Park T, Khang S, Jeong H, et al. Assessment of myosteatosis on computed tomography by automatic generation of a muscle quality map using a web-based toolkit: feasibility study. JMIR Med Inform 2020;8:e23049.
    15. Kim A, Lee JB, Ko Y, Park T, Jo H, Jang JK, et al. Larger remaining stomach volume is associated with better nutrition and muscle preservation in patients with gastric cancer receiving distal gastrectomy with gastroduodenostomy. J Gastric Cancer 2022;22:145–155.
    16. Shi S, Miao Z, Zhou Y, Xu C, Zhang X. Radiomics signature for predicting postoperative disease-free survival of patients with gastric cancer: development and validation of a predictive nomogram. Diagn Interv Radiol 2022;28:441–449.
    17. Zhang W, Fang M, Dong D, Wang X, Ke X, Zhang L, et al. Development and validation of a CT-based radiomic nomogram for preoperative prediction of early recurrence in advanced gastric cancer. Radiother Oncol 2020;145:13–20.
    18. Li Z, Wu X, Gao X, Shan F, Ying X, Zhang Y, et al. Development and validation of an artificial neural network prognostic model after gastrectomy for gastric carcinoma: an international multicenter cohort study. Cancer Med 2020;9:6205–6215.
    19. Gao Y, Zhang ZD, Li S, Guo YT, Wu QY, Liu SH, et al. Deep neural network-assisted computed tomography diagnosis of metastatic lymph nodes from gastric cancer. Chin Med J (Engl) 2019;132:2804–2811.
    20. Dong D, Fang MJ, Tang L, Shan XH, Gao JB, Giganti F, et al. Deep learning radiomic nomogram can predict the number of lymph node metastasis in locally advanced gastric cancer: an international multicenter study. Ann Oncol 2020;31:912–920.
    21. Li J, Dong D, Fang M, Wang R, Tian J, Li H, et al. Dual-energy CT-based deep learning radiomics can improve lymph node metastasis risk prediction for gastric cancer. Eur Radiol 2020;30:2324–2333.
    22. Jiang Y, Wang W, Chen C, Zhang X, Zha X, Lv W, et al. Radiomics signature on computed tomography imaging: association with lymph node metastasis in patients with gastric cancer. Front Oncol 2019;9:340.
    23. Huang Z, Liu D, Chen X, He D, Yu P, Liu B, et al. Deep convolutional neural network based on computed tomography images for the preoperative diagnosis of occult peritoneal metastasis in advanced gastric cancer. Front Oncol 2020;10:601869.
    24. Liu S, He J, Liu S, Ji C, Guan W, Chen L, et al. Radiomics analysis using contrast-enhanced CT for preoperative prediction of occult peritoneal metastasis in advanced gastric cancer. Eur Radiol 2020;30:239–246.
    25. Liu P, Ding P, Wu H, Wu J, Yang P, Tian Y, et al. Prediction of occult peritoneal metastases or positive cytology using CT in gastric cancer. Eur Radiol 2023.
    26. Wu A, Wu C, Zeng Q, Cao Y, Shu X, Luo L, et al. Development and validation of a CT radiomics and clinical feature model to predict omental metastases for locally advanced gastric cancer. Sci Rep 2023;13:8442.
    27. Niu PH, Zhao LL, Wu HL, Zhao DB, Chen YT. Artificial intelligence in gastric cancer: application and future perspectives. World J Gastroenterol 2020;26:5408–5419.
    28. Jeong SH, Seo KW, Min JS. Intraoperative tumor localization of early gastric cancers. J Gastric Cancer 2021;21:4–15.
    29. Kim TH, Kim IH, Kang SJ, Choi M, Kim BH, Eom BW, et al. Korean practice guidelines for gastric cancer 2022: an evidence-based, multidisciplinary approach. J Gastric Cancer 2023;23:3–106.
    30. Kim DJ, Hyung WJ, Park YK, Lee HJ, An JY, Kim HI, et al. Accuracy of preoperative clinical staging for locally advanced gastric cancer in KLASS-02 randomized clinical trial. Front Surg 2022;9:1001245.
    31. Wang ZL, Li YL, Tang L, Li XT, Bu ZD, Sun YS. Utility of the gastric window in computed tomography for differentiation of early gastric cancer (T1 stage) from muscularis involvement (T2 stage). Abdom Radiol (NY) 2021;46:1478–1486.
    32. Kim SH, Kim JJ, Lee JS, Kim SH, Kim BS, Maeng YH, et al. Preoperative N staging of gastric cancer by stomach protocol computed tomography. J Gastric Cancer 2013;13:149–156.
    33. Cruz-Jentoft AJ, Bahat G, Bauer J, Boirie Y, Bruyère O, Cederholm T, et al. Sarcopenia: revised European consensus on definition and diagnosis. Age Ageing 2019;48:16–31.
    34. Ahn H, Kim DW, Ko Y, Ha J, Shin YB, Lee J, et al. Updated systematic review and meta-analysis on diagnostic issues and the prognostic impact of myosteatosis: a new paradigm beyond sarcopenia. Ageing Res Rev 2021;70:101398.
    35. Huh J, Lee IS, Kim KW, Park J, Kim AY, Lee JS, et al. CT gastrography for volumetric measurement of remnant stomach after distal gastrectomy: a feasibility study. Abdom Radiol (NY) 2016;41:1899–1905.
    36. Jin P, Ji X, Kang W, Li Y, Liu H, Ma F, et al. Artificial intelligence in gastric cancer: a systematic review. J Cancer Res Clin Oncol 2020;146:2339–2350.
