Machine Learning Methods for Engineering Application Development

High-Performance Computing for Satellite Image Processing Using Apache Spark

Author(s): Pallavi Hiwarkar* and Mangala S. Madankar

Pp: 76-91 (16)

DOI: 10.2174/9879815079180122010009


Abstract

High-Performance Computing (HPC) aggregates computing power to solve problems that are too large or too time-consuming for conventional computers. It is well suited to processing satellite images and analysing the massive data sets they produce quickly and efficiently, and both parallel processing and distributed computing are essential to this task. In parallel computing, multiple processors on a single machine execute tasks simultaneously over shared memory, so a satellite image can be processed in parallel on one computer. In distributed computing, multiple systems cooperate to process satellite images; in this work, VMware is used to provision worker machines running different operating systems (e.g., Linux and Windows). These nodes are joined into a cluster that connects a master to its slaves, with Apache Spark as the central framework: Spark's Resilient Distributed Datasets (RDDs) are used to divide the dataset across the different nodes of the cluster.
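To illustrate the approach the abstract describes, the following is a minimal PySpark sketch of RDD-based distribution of satellite image work across a cluster. The master URL spark://master:7077, the images/ directory, the partition count, and the tile_stats helper (using Pillow to compute mean brightness) are illustrative assumptions, not details given in the chapter.

```python
# Minimal sketch: distribute per-image processing over a Spark cluster.
# Assumptions: image files sit under "images/", every worker has Pillow
# installed, and a standalone master runs at spark://master:7077.
import io
from pyspark import SparkConf, SparkContext
from PIL import Image

conf = (SparkConf()
        .setAppName("SatelliteImageProcessing")
        .setMaster("spark://master:7077"))  # use "local[*]" for single-machine parallel runs
sc = SparkContext(conf=conf)

# binaryFiles reads each image as a (path, bytes) pair; the resulting RDD
# is split into partitions that Spark schedules on different worker nodes.
images = sc.binaryFiles("images/").repartition(8)

def tile_stats(record):
    """Hypothetical per-image task: decode the bytes and compute mean brightness."""
    path, data = record
    img = Image.open(io.BytesIO(data)).convert("L")  # grayscale
    pixels = list(img.getdata())
    return (path, sum(pixels) / len(pixels))

# map() applies tile_stats to every partition in parallel across the cluster;
# collect() gathers the small per-image results back on the master.
for path, mean in images.map(tile_stats).collect():
    print(path, mean)

sc.stop()
```

The same script also demonstrates the single-machine parallel case: setting the master to "local[*]" makes Spark run the partitions on all local cores over shared memory instead of dispatching them to remote workers.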
