
A Hadoop Based Framework for Secure Medical Datasets - Harinder Singh - Book

  • Language: English
  • ISBN: 9798224231140
  • Binding: Paperback
  • Pages: 124
  • Published: 10 February 2024
  • Dimensions: 216x8x280 mm
  • Weight: 334 g


Description of A Hadoop Based Framework for Secure Medical Datasets

Developments in the medical field produce massive amounts of medical data. In 2002, the Department of Radiology of a hospital in Geneva was already producing more than 12,000 images a day. These medical datasets are available for further exploration and research, with far-reaching impact on the design and execution of health programs. The information obtained by exploring medical datasets paves the way for health administration, e-health diagnosis and therapy, so there is an urgent need to intensify research on medical data.

Medical data is a large and growing resource, with datasets commonly measured in terabytes. Such big data poses many challenges and issues owing to its volume, variety, velocity, value and variability. Moreover, traditional file management systems struggle because they cannot handle unstructured, variable and complex big data; managing it with them is a cumbersome and time-consuming task that calls for new computing techniques. The exponential growth of medical data has therefore necessitated a paradigm shift in how the data is managed and processed. Recent technological advancements have changed the way big data is stored and processed, which motivated us to seek new solutions for managing volumetric medical datasets and extracting valuable information efficiently.

Hadoop is a top-level Apache project written in Java. It was developed by Doug Cutting as a collection of open-source projects and is now widely used on massive amounts of unstructured data. With Hadoop, data that was previously difficult to analyze can be harnessed, since it can process extremely large datasets with changing structure. Hadoop comprises modules such as HBase, Pig, HCatalog, Hive, ZooKeeper, Oozie and Kafka, but its core paradigms for big data are the Hadoop Distributed File System (HDFS) and MapReduce.
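The MapReduce paradigm mentioned above can be sketched without Hadoop itself: a map phase emits key-value pairs, a shuffle groups values by key, and a reduce phase aggregates each group. The sketch below is plain Python rather than Hadoop's Java API, and the record format (patient ID plus imaging modality) is a hypothetical illustration, not the book's actual dataset:

```python
from collections import defaultdict

# Conceptual sketch of the MapReduce model that Hadoop implements.
# Plain Python, no Hadoop dependency; the records are made up for illustration.

def map_phase(records):
    """Map: emit a (modality, 1) pair for each imaging record."""
    for patient_id, modality in records:
        yield modality, 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as Hadoop does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate the values collected for each key."""
    return {key: sum(values) for key, values in groups.items()}

records = [
    ("p1", "X-ray"), ("p2", "MRI"), ("p3", "X-ray"),
    ("p4", "CT"), ("p5", "MRI"), ("p6", "X-ray"),
]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts)  # {'X-ray': 3, 'MRI': 2, 'CT': 1}
```

In a real Hadoop job the map and reduce functions run in parallel across cluster nodes, and HDFS supplies the input splits; the data flow, however, is exactly the one shown here.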


