Thursday, January 24, 2019

Deep learning has become popular mainly because of the impressive results it has attained across several learning problems. Unlike traditional machine learning algorithms, whose performance seems to plateau beyond a certain point regardless of how much data is available, deep learning models generally keep improving with more data. The factors behind the meteoric rise of deep learning are summarized below:

Data: The availability of data, driven in large part by the internet, has led to more data being collected in the past few years than at any point in human history. The amount of data being stored is gigantic, and this has led to a wave of deep learning applications. A good example is the task of image classification. With billions of natural images now available on the internet, deep learning models have been trained on millions of well-labelled examples to the extent that they now outperform humans at the task. The same is true across other learning problems where big data is now available to train large models.

Hardware: The processing speed of computers has grown exponentially over several decades in line with Moore's law. This has enabled larger computations to run on more powerful processors in a shorter period of time. The adoption of Graphics Processing Units (GPUs), which are generally around 50 times faster than Central Processing Units (CPUs) at deep learning tasks, made it possible to train larger models on bigger datasets. GPUs, originally designed to render graphics for the gaming industry, have proven to be especially well suited to the parallel computation that dominates deep learning, where feature learning is typically broken into small chunks and computed concurrently across many cores or machines.
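
To make the parallelism point concrete, here is a minimal sketch (not from the original post) that times a large matrix multiplication, the dense workload at the heart of deep learning, on a CPU and, if one is present, on a GPU. It assumes PyTorch is installed and a CUDA-capable GPU is available; the exact speed-up will vary with hardware.

```python
import time
import torch

# A large matrix multiplication: the kind of dense, highly parallel
# workload that dominates deep learning training and inference.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# Time the operation on the CPU.
start = time.time()
for _ in range(10):
    _ = a @ b
cpu_time = time.time() - start

if torch.cuda.is_available():
    # Move the same tensors to the GPU and repeat the measurement.
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()   # wait for the transfer to finish
    start = time.time()
    for _ in range(10):
        _ = a_gpu @ b_gpu
    torch.cuda.synchronize()   # wait for the GPU kernels to finish
    gpu_time = time.time() - start
    print(f"CPU: {cpu_time:.2f}s   GPU: {gpu_time:.2f}s")
else:
    print(f"CPU: {cpu_time:.2f}s   (no GPU available)")
```

On typical hardware the GPU finishes the same work many times faster, because the thousands of GPU cores each handle a small chunk of the multiplication at once.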