MATLAB Targets Big Data, says Dr Shure

9 March, 2015

Dr Loren Shure from MathWorks: “The MATLAB environment, computational speed, and integration for GPU acceleration make it a good environment for deep learning”



This month, Dr Loren Shure from MathWorks will participate in the IMVC 2015 machine vision conference in Tel Aviv, where she will discuss recent MATLAB features for problems such as training image recognition algorithms and processing large volumes of medical images.

Dr. Shure was one of the first employees of MathWorks and has worked there for over 28 years. During that time she has co-authored several MathWorks products, contributed core functionality to MATLAB, and helped design the MATLAB language. She agreed to an email interview with Techtime.

You were one of the first employees at MathWorks. What are the major advancements that have taken place in Image Processing & Computer Vision since you started?

“Large Data – A decade ago, a 2000×3000 image was HUGE. Now it’s quite common, and it’s also common to have thousands of images. Performance – Almost all of our Image Processing Toolbox is performance-optimized. This means C/C++ code under the hood with multithreading, Intel instruction optimizations, and world-class algorithm implementations. Additionally, we’ve got over 50 algorithms that can run on NVIDIA GPU cards. For image processing, MATLAB is pretty fast now.

“Algorithms – The world keeps discovering new ways to solve problems, and we make sure that the best techniques make their way into our tools. We’ve made major advancements in the areas of image registration and image segmentation. Computer Vision – A decade ago, this was still a relatively young field. It has matured quickly, and so has the MathWorks solution. CVST has a very compelling set of capabilities with easy-to-use tools like the Camera Calibrator.”

Deep Learning has become an important ingredient of computer vision. Does MATLAB support it?

“MATLAB is being used with two of the popular deep learning frameworks, Caffe and Torch – computer vision is a primary application area. The MATLAB environment, computational speed (e.g., MKL integration), and integration for GPU acceleration make it a good environment for deep learning. Today deep learning is being used by relatively few researchers; they often use multiple techniques to get optimal results. Tools like feature detection in the Computer Vision System Toolbox, classification in the Statistics and Machine Learning Toolbox, and the Neural Network Toolbox are also used for these types of problems.

“Last year at IMVC, a computer vision professor who works with deep learning (Lior Wolf) professed his love for MATLAB because it made setting up his problems so much easier. There’s a lot of data preparation involved in using deep learning algorithms, and MATLAB is really great at that.

“Deep Learning works well for users with large sets of labelled data and significant computing resources; this explains why most deep learning research is performed by companies like Google and Facebook. Our tools for computer vision and machine learning help those without that computing power or data to solve these problems. Additionally, the MATLAB community has created several third-party toolboxes that perform deep learning as well.”

How does MATLAB support GPU computing for computer vision applications?

“We support about 50 image processing algorithms for use on NVIDIA GPUs. Also, users can integrate CUDA kernels into their MATLAB workflow with the Parallel Computing Toolbox (PCT).”
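For readers unfamiliar with the workflow Dr Shure describes, here is a minimal sketch of GPU-accelerated image filtering. It assumes the Parallel Computing Toolbox, the Image Processing Toolbox, and a CUDA-capable NVIDIA GPU; the image and filter parameters are illustrative:

```matlab
% Minimal GPU image-processing sketch (assumes PCT, IPT, and an NVIDIA GPU)
img  = imread('peppers.png');        % demo image shipped with MATLAB
gImg = gpuArray(img);                % move the image into GPU memory
h    = fspecial('gaussian', 7, 2);   % 7x7 Gaussian kernel, sigma = 2
gOut = imfilter(gImg, h);            % gpuArray input, so this runs on the GPU
out  = gather(gOut);                 % copy the result back to host memory
```

The only change from ordinary CPU code is wrapping the input in `gpuArray` and calling `gather` at the end, which is the "very few code changes" pattern the interview emphasizes.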

What can we expect in the near future in the Computer Vision System Toolbox?

“As of R2015a, we just finished a bunch of work on making recognition frameworks easier to use. For example, check out the new ‘Bag of Visual Words’ framework. Moving forward, we’re focused on stereo vision and point cloud processing. We’ve released some capabilities and there’s more to come.”
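The ‘Bag of Visual Words’ framework she mentions can be sketched roughly as follows, assuming images organized one category per subfolder of `imageFolder` (an illustrative path):

```matlab
% Rough sketch of the Bag of Visual Words recognition workflow (CVST)
% 'imageFolder' is an illustrative path: one subfolder per image category.
imgSets = imageSet('imageFolder', 'recursive');        % one imageSet per subfolder
bag     = bagOfFeatures(imgSets);                      % build the visual vocabulary
model   = trainImageCategoryClassifier(imgSets, bag);  % train a category classifier
[labelIdx, score] = predict(model, read(imgSets(1), 1)); % classify one image
```

The framework hides the feature extraction, vocabulary clustering, and classifier training steps behind these few calls, which is the ease-of-use point being made.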

How does MATLAB support big data problems?

“Our support for big data is designed to help users scale their algorithms with very few code changes (e.g., parfor, batch, imageSet). They can start trying out their ideas with imageSet or datastore in MATLAB, and continue their analysis using MapReduce. With MATLAB MapReduce they can explore and analyze big data sets on their workstations using the MapReduce programming technique that’s fundamental to Hadoop. They can create applications based on MATLAB MapReduce to work with their ‘uncomfortably large’ data on workstations, and then deploy these same applications within production instances of Hadoop using MATLAB Compiler.

“This is important because the majority of companies don’t have production Hadoop systems today, though many are considering them for the near future. By supporting MATLAB MapReduce now, companies can begin building programs and become familiar with how easy it is to take models and integrate them into their IT systems. We see many companies taking the next steps to build more sophisticated analytics.”
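As one hedged illustration of the datastore-to-MapReduce path described above: the CSV file name and variable names are hypothetical, but the mapper/reducer pair follows MATLAB's `mapreduce` signatures. It computes a mean delay per carrier:

```matlab
% Illustrative datastore + mapreduce sketch: mean delay per carrier.
% 'flightdata.csv' and its column names are hypothetical.
ds = tabularTextDatastore('flightdata.csv', ...
         'SelectedVariableNames', {'Carrier', 'ArrDelay'});
outds   = mapreduce(ds, @delayMapper, @delayReducer);
results = readall(outds);            % table of carriers and mean delays

function delayMapper(data, ~, intermKVStore)
    % Emit a [sum, count] pair per carrier for this chunk of the datastore
    carriers = unique(data.Carrier);
    for k = 1:numel(carriers)
        d = data.ArrDelay(strcmp(data.Carrier, carriers{k}));
        add(intermKVStore, carriers{k}, [sum(d, 'omitnan'), numel(d)]);
    end
end

function delayReducer(key, intermValIter, outKVStore)
    % Combine the partial sums into one mean per carrier
    total = 0; count = 0;
    while hasnext(intermValIter)
        v = getnext(intermValIter);
        total = total + v(1);
        count = count + v(2);
    end
    add(outKVStore, key, total / count);
end
```

The same mapper and reducer can later be pointed at data in Hadoop, which is the workstation-to-production path the answer describes.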

