
Thursday, February 4, 2016

GPU Computing: Massive Inexpensive Parallel Processing with GPUs Challenge Standard CPUs

GPUs were originally designed for data-intensive graphics workloads, such as rendering complex images for games and multimedia, hence the name Graphics Processing Unit (GPU).  What makes GPUs so powerful is the massive number of processing cores in each device, compared with standard multi-core CPUs that have only a handful of internal cores.  Individual GPU cores are typically slower than standard CPU cores, but because a GPU can run a huge number of threads at once, it can accelerate suitable software to over 100 times the speed of a standard CPU.
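To make the thread model concrete, here is a minimal sketch in plain Python of the data-parallel style GPUs use (a hypothetical illustration; real GPU code would be written in a framework such as CUDA or OpenCL). On a GPU, each of thousands of threads runs the same small "kernel" function on one element of the data; here a sequential loop stands in for the parallel launch.

```python
# Data-parallel "kernel" model, simulated sequentially in Python.
# On a GPU, one thread would execute the kernel body for each index i
# simultaneously; this sketch runs the same per-element logic in a loop.

def saxpy_kernel(i, a, x, y):
    # The body one GPU thread would run for its assigned index i.
    return a * x[i] + y[i]

def saxpy(a, x, y):
    # A GPU would launch len(x) threads at once; we iterate instead.
    return [saxpy_kernel(i, a, x, y) for i in range(len(x))]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(result)  # [12.0, 24.0, 36.0]
```

The point of the sketch is that the per-element kernel has no dependencies between iterations, which is exactly what lets a GPU spread the work across all of its cores.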

The challenge with GPUs and parallel processing is that, to take full advantage of the GPU's processing power, software must be written to effectively distribute and manage work in parallel across all of the cores.  It is a hard problem, but there has been real progress in developing software that controls the GPU cores effectively.
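The basic task of dividing work across many cores can be sketched on the CPU side with Python's standard library (a hypothetical analogy, not GPU code): split the data into chunks, hand each chunk to a worker, then merge the results.

```python
# CPU-side analogy for distributing work across parallel workers,
# using only the Python standard library.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Each worker handles one slice of the data independently.
    return [v * v for v in chunk]

def parallel_square(data, workers=4):
    # Split the input into roughly one chunk per worker,
    # run the chunks in parallel, then flatten the results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_chunk, chunks)
    return [v for chunk in results for v in chunk]

print(parallel_square([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

On a real GPU the scheduling is far more involved (thread blocks, memory transfers between host and device, synchronization), which is why the tooling progress mentioned above matters.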
GPUs are now mass-produced, relatively inexpensive, and increasingly used in areas such as scientific computing, signal processing, medical imaging, virus pattern matching, deep learning with neural networks, life sciences, big data, and fluid dynamics.
One recent example application of GPUs is in the area of bioinformatics.  Dr. Alex Zhavoronkov, CEO of Insilico, said: "When you're using deep learning in bioinformatics, your only option today is GPU computing.  Deep neural networks are evolving and revolutionizing many aspects of our daily lives – in pictures, in videos, in voice. GPU computing is becoming much more available, and more databases, with millions of samples, are also becoming available. So success in deep learning is primarily centered around two factors: being able to utilize the full power of GPU computing, and access to huge databases."
Keywords: Bioinformatics; GPU Computing; Deep Learning.

