Deep Learning

Deep neural network: starting with raw pixels, each layer learns abstract features such as edges, shapes, object parts and complete objects.

Deep learning is a state-of-the-art area of machine learning research inspired by the way the human brain works. Deep learning trains layer after layer of artificial neurons, successively learning more abstract features in a manner similar to the way our visual system works.

Deep learning has outperformed other machine learning methods in computer vision, natural language processing and autonomous driving, among other fields.

At BIGLAB we use advanced deep learning methods to speed up biomechanical simulations, investigate neural motor control, map plant phenotype to genotype, and model shapes, among other applications.

A deep autoencoder is trained to learn low-dimensional features that are used to map initial and final hand positions to muscle activations.
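As a rough illustration of this kind of setup, the sketch below builds a small deep autoencoder in TensorFlow and trains it to reconstruct its input, so that the encoder's bottleneck can serve as a low-dimensional feature space. The layer sizes, input dimensionality and training data are hypothetical placeholders, not the model used in the lab.

import tensorflow as tf

n_inputs = 64   # hypothetical dimensionality of the raw input
n_latent = 8    # hypothetical size of the low-dimensional feature space

# Encoder compresses the input down to the latent features.
encoder = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(n_inputs,)),
    tf.keras.layers.Dense(n_latent, activation="relu"),
])

# Decoder reconstructs the input from the latent features.
decoder = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(n_latent,)),
    tf.keras.layers.Dense(n_inputs, activation="linear"),
])

autoencoder = tf.keras.Sequential([encoder, decoder])

# The autoencoder is trained to reproduce its own input; the encoder output
# then acts as the compact feature representation described above.
autoencoder.compile(optimizer="adam", loss="mse")
# autoencoder.fit(X, X, epochs=100)   # X: training data, not shown here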

Areas of research

Deep neural networks are expressive machine learning models capable of representing many complex phenomena. However, with great expressive power comes great vulnerability to overfitting. At BIGLAB, we are investigating novel techniques to overcome overfitting in resource-limited problem domains.
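For context, the snippet below shows two standard overfitting countermeasures in TensorFlow, dropout and L2 weight decay; these are baseline techniques given for illustration only, not the novel methods under investigation.

import tensorflow as tf

# A small classifier with two common forms of regularization. Layer sizes
# and the 10-class output are hypothetical.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        128, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 weight decay
    tf.keras.layers.Dropout(0.5),  # randomly drop half the units during training
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")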

Most research in the deep learning community is directed toward classification problems, where first-order methods with well-established training criteria are used. At BIGLAB, we investigate optimization methods, learning criteria and optimal architectures for control problems, which are usually formulated as regression problems.
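As a minimal sketch of such a regression formulation, the code below defines a network with continuous outputs (for example, muscle activations) trained under a mean-squared-error criterion instead of the cross-entropy loss typical of classification; the state and control dimensions are hypothetical.

import tensorflow as tf

n_state, n_controls = 6, 12   # hypothetical state and control dimensions

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="tanh", input_shape=(n_state,)),
    tf.keras.layers.Dense(64, activation="tanh"),
    tf.keras.layers.Dense(n_controls, activation="linear"),  # continuous outputs
])

# Mean squared error replaces the classification criterion; the optimizer and
# learning rate here are placeholders, not a recommendation.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")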

Parallel Computing for Deep Learning

Deep learning involves training huge models with tens of thousands of training iterations and large datasets. BIGLAB researchers have access to high performance computing resources including a cluster of NVIDIA K40 GPUs, Canada’s national computing grid WestGrid and high-end workstations.

BIGLAB researchers use parallel numerical libraries with reverse-mode auto-differentiation capabilities, such as Google's TensorFlow and Caffe, among others.
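The toy example below shows what reverse-mode auto-differentiation looks like in TensorFlow using tf.GradientTape, applied to a simple function rather than a full model.

import tensorflow as tf

x = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape() as tape:
    y = tf.reduce_sum(x ** 2)   # y = x1^2 + x2^2 + x3^2

# One backward pass yields dy/dx for every element of x.
grad = tape.gradient(y, x)      # expected: [2.0, 4.0, 6.0]
print(grad.numpy())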


Participants

Yuli Chen
Ian Stavness

Publications