An efficient Hadoop-based brain tumor detection framework using big data analytic
Software: Practice and Experience (IF 2.6), Pub Date: 2020-09-14, DOI: 10.1002/spe.2899
Prabhjot Kaur Chahal, Shreelekha Pandey

The exponential increase of brain MR image data in the medical imaging field requires faster and more accurate segmentation of tumors. Computer-aided detection systems, acting as a second opinion for experts, radiologists, and surgeons, need to be swift enough to exploit parallelism. However, handling massive MR data for segmentation with high accuracy and low processing time is a significant concern for any framework. In this article, a distributed platform for brain tumor segmentation is proposed, based on a hybrid weighted fuzzy approach integrated with the MATLAB Distributed Computing Server and Hadoop. The approach relies on fuzzification of the pixel values, grouping large data into similar clusters to obtain more meaningful clusters. The article focuses on analyzing the performance of hybrid fuzzy clustering in MapReduce on Hadoop for data sets of varying size, so that huge brain MR data can be handled across clusters of commodity computers. For the experiments, DICOM data sets of varying size are processed on clusters with different numbers of nodes to compare the read, write, and processing time on each node. The read and write times increase as larger data sets are distributed across multiple nodes. However, the processing time of the proposed approach is 35 min on a single node, whereas a 3-node cluster processes the same data set (215 MB) in 3.4 min. Furthermore, when the data set is increased to 7.3 GB, the 3-node cluster finishes in 235.4 min, greatly reduced from the single-node processing time of 2085.2 min.
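The fuzzification step described above can be illustrated with a plain fuzzy c-means pass over the pixel intensities of an MR slice. The sketch below is a minimal, generic Python/NumPy implementation, not the authors' hybrid weighted variant; the `pydicom` loading lines, the file name, and the cluster count are assumptions for illustration, and in the paper's setup the equivalent computation would run inside Hadoop map tasks rather than on a single machine.

```python
import numpy as np

def fuzzy_c_means(pixels, n_clusters=4, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means over a 1-D array of pixel intensities.

    Returns the membership matrix (N x n_clusters) and the cluster centers.
    Generic FCM only, used to illustrate fuzzification of pixel values;
    it is not the paper's hybrid weighted formulation.
    """
    rng = np.random.default_rng(seed)
    x = pixels.astype(np.float64).reshape(-1, 1)           # (N, 1) intensities
    u = rng.random((x.shape[0], n_clusters))                # random memberships
    u /= u.sum(axis=1, keepdims=True)                       # rows sum to 1
    for _ in range(max_iter):
        um = u ** m                                          # fuzzified memberships
        centers = (um.T @ x) / um.sum(axis=0)[:, None]       # (n_clusters, 1)
        dist = np.abs(x - centers.T) + 1e-12                 # (N, n_clusters)
        u_new = dist ** (-2.0 / (m - 1.0))                   # standard FCM update
        u_new /= u_new.sum(axis=1, keepdims=True)            # renormalize memberships
        if np.linalg.norm(u_new - u) < tol:                  # converged
            u = u_new
            break
        u = u_new
    return u, centers.ravel()

# Hypothetical usage on one DICOM slice (file name is illustrative only):
# import pydicom
# slice_px = pydicom.dcmread("brain_slice.dcm").pixel_array
# memberships, centers = fuzzy_c_means(slice_px.ravel(), n_clusters=4)
# labels = memberships.argmax(axis=1).reshape(slice_px.shape)  # hard segmentation map
```

Working on flattened 1-D intensities keeps the example short; a per-slice call like this is the kind of unit of work that a MapReduce job could distribute across the commodity nodes discussed in the abstract.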

Updated: 2020-09-14