Abstract
Background: Proteomics was built around two-dimensional (2D) gel electrophoresis. Accurately analyzing the images generated by 2D gel electrophoresis for spot detection is a time-consuming process, especially for large, high-resolution images.
Objective: In this paper, we present an accurate GPU-accelerated software tool for the detection and quantification of protein spots in 2D gel electrophoresis images.
Method: We adopt a pixel-based approach that employs wavelet relational fuzzy C-means clustering and the distance transform to detect and quantify protein spots. This pixel-based spot detection approach is more accurate than contour-based approaches; however, it is compute-intensive. Therefore, along with algorithmic optimizations, we present the mapping and optimization of the pixel-based spot detection algorithm onto graphics processing units (GPUs), including NVIDIA and AMD devices.
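To make the pipeline concrete, the sketch below outlines a pixel-based spot detection flow of the kind described above: wavelet denoising, fuzzy C-means clustering of pixel intensities, and a distance transform to separate touching spots. It is a minimal illustration, not the authors' implementation; it uses standard FCM rather than the relational variant, and the library calls (PyWavelets, SciPy) and parameter values are assumptions for demonstration only.

```python
import numpy as np
import pywt
from scipy import ndimage as ndi

def detect_spots(img, n_clusters=2, m=2.0, n_iter=50):
    """Illustrative pixel-based spot detection on a grayscale gel image in [0, 1]."""
    # 1) Wavelet denoising: soft-threshold the detail coefficients
    #    using a universal threshold estimated from the finest diagonal band.
    coeffs = pywt.wavedec2(img, 'db4', level=2)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))
    coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode='soft') for d in lvl)
        for lvl in coeffs[1:]
    ]
    den = pywt.waverec2(coeffs, 'db4')[:img.shape[0], :img.shape[1]]

    # 2) Fuzzy C-means on pixel intensities (plain FCM, not the
    #    relational variant used in the paper).
    x = den.ravel()
    centers = np.linspace(x.min(), x.max(), n_clusters)
    for _ in range(n_iter):
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (dist ** (2.0 / (m - 1.0)))       # membership degrees
        u /= u.sum(axis=1, keepdims=True)
        centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)

    # Spots are dark on a light background, so the darkest cluster is the spot class.
    labels = u.argmax(axis=1).reshape(den.shape)
    spot_mask = labels == centers.argmin()

    # 3) Distance transform, then label peak regions to split touching spots.
    dist_map = ndi.distance_transform_edt(spot_mask)
    markers, n_spots = ndi.label(dist_map > 0.5 * dist_map.max())
    return spot_mask, dist_map, markers, n_spots
```

Each stage is independently parallelizable over pixels, which is what makes the GPU mapping discussed in the paper attractive: the membership update and the distance transform are the dominant per-pixel costs.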
Results: Quantitative comparisons show that this approach detects spots more accurately than commercial software tools. Specifically, it achieves an average improvement in F-measure of 21.237% and 11.716% over Delta2D and Melanie, respectively. We carry out experiments on large, high-resolution images of healthy and diseased samples. Our implementation achieves a speedup of up to five orders of magnitude over the single-threaded MATLAB implementation.
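For reference, the F-measure cited above is the standard harmonic mean of spot-detection precision and recall (the spot-matching criterion used to compute precision and recall is defined in the paper, not here):

$$F = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$$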
Conclusion: We propose an accurate and efficient tool for detecting and quantifying protein spots in 2D gel electrophoresis images. Our tool outperforms commercial software tools in spot detection accuracy while achieving significantly better performance than the single-threaded MATLAB implementation by exploiting GPU accelerators.
Keywords: Image processing, 2D gel electrophoresis, heterogeneous computing, parallel computing, GPU, OpenCL.
Graphical Abstract