Algorithm Speeds GPU-Based AI Training 10x on Big Data Sets


Researchers at IBM Zurich in Switzerland claim they have developed a generic artificial-intelligence preprocessing module that can speed up big-data machine-learning algorithms at least 10-fold over existing methods, using mathematical duality to pick out only those items in a big data stream that will actually make a difference to the result.

"Our motivation was how to use hardware accelerators, such as GPUs and FPGAs, when they do not have enough memory to hold all the data points" for big data machine learning, says IBM Zurich's Celestine Dunner. The method hinges on preprocessing each data point to determine if it is the mathematical dual of a point already processed; if so, the algorithm skips it, and this occurs more often as the dataset is processed. "If you can fit your problem in the memory space of the accelerator, then running in-memory will achieve even better results"

IBM Zurich's Thomas Parnell.
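
To make the duality-based filtering idea concrete, here is a minimal sketch in Python/NumPy. It is not IBM's actual code: the model (an L2-regularized hinge-loss SVM trained in the dual), the SDCA-style per-point gap formula, and names like `per_point_gaps` and `select_batch` are all illustrative assumptions. The sketch scores each point by its duality-gap contribution and keeps only the highest-scoring points, i.e., the ones that can still change the model, for transfer into limited accelerator memory.

```python
# Hedged sketch: duality-gap-based selection of "important" points, in the
# spirit of the approach described above. NOT IBM's code; the model and all
# names are illustrative assumptions.
import numpy as np

def per_point_gaps(X, y, alpha, lam):
    """Per-point duality-gap contributions for the SVM dual (SDCA-style).

    With w = X^T (alpha * y) / (lam * n), point i contributes
        gap_i = max(0, 1 - m_i) + alpha_i * (m_i - 1),  where m_i = y_i w^T x_i.
    gap_i ~ 0 means point i is already consistent with the current model,
    so processing it again cannot improve the objective.
    """
    n = len(y)
    w = X.T @ (alpha * y) / (lam * n)
    margins = y * (X @ w)
    return np.maximum(0.0, 1.0 - margins) + alpha * (margins - 1.0)

def select_batch(X, y, alpha, lam, k):
    """Indices of the k points with the largest gap contributions: only
    these need to be copied into the accelerator's limited memory."""
    gaps = per_point_gaps(X, y, alpha, lam)
    return np.argsort(gaps)[-k:]

# Toy usage: as training progresses and alpha converges, ever more points
# have near-zero gap and are skipped, matching the article's observation
# that skipping becomes more frequent as the data set is worked through.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))
y = np.sign(X @ rng.standard_normal(20))
alpha = np.zeros(1000)          # dual variables, all zero at the start
batch = select_batch(X, y, alpha, lam=1.0, k=64)
print(batch.shape)              # (64,)
```

The design point is that the gap computation is cheap relative to full training, so it can run on the host as a preprocessing pass while only the selected batch occupies the GPU or FPGA memory.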

Read the full article at EETimes.
