Putting AI to Work – an Associate’s reflections on attending a cutting-edge AI conference in New York

By Ajibola Obayemi

Data and Knowledge Systems Developer – KTP Associate

The O’Reilly AI Conference, held in June in Manhattan, New York, brought together industry pioneers, university experts, and thought leaders to debate, discuss and move thinking forward in one of the most cutting-edge areas of computing: Artificial Intelligence (AI). Attending this conference and undertaking the training have deepened my knowledge, inspired my thinking and widened my network.

The conference comprised two days of training and tutorials, followed by two days of talks, workshops and seminars on applied AI in business and its use cases across different industries. As a Data and Knowledge Systems Developer with BCMY Ltd and the University of Brighton, I have been tasked with building intelligent systems, optimizing workflow and using technology to facilitate business growth. The conference and the training provided just the right mix of learning, networking and understanding what other businesses are doing, the tools they are using to do it, and how those choices are affecting their businesses for better or worse.

The training

There were four different training sessions: Deep Learning with TensorFlow; the NVIDIA Deep Learning Institute bootcamp; Natural Language Processing with Deep Learning; and Neural Networks for Time Series Analysis using DeepLearning4j.

The training was hands-on: we worked with several deep learning frameworks (Caffe2, TensorFlow, Theano and NVIDIA DIGITS) and the Keras library. For the most part, we used Convolutional Neural Networks (CNNs) to solve image classification, image segmentation and object detection problems, and Recurrent Neural Networks (RNNs) to model time series. Using transfer learning, we adapted a model to solve a similar problem on a new dataset it had never been trained on. This is interesting because it means that, by removing the output layer and making a few other changes, we can reuse pre-trained models on new datasets, saving a significant amount of time and compute.
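As a rough sketch of that idea (not the exact exercise from the training, and assuming a recent TensorFlow/Keras installation; the 10-class task and placeholder data are hypothetical), a pre-trained CNN can be reused by dropping its original output layer and attaching a new one:

```python
# Minimal transfer-learning sketch in Keras (illustrative only):
# reuse an ImageNet-pretrained CNN for a hypothetical 10-class task.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Load a pre-trained CNN without its original classification head.
base = tf.keras.applications.VGG16(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pre-trained feature extractor

# Attach a fresh output layer for the new task (10 classes assumed).
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train only the new head on the new dataset (placeholder data here).
X_new = np.random.rand(8, 224, 224, 3).astype("float32")
y_new = np.random.randint(0, 10, size=8)
model.fit(X_new, y_new, epochs=1)
```

Because only the small new head is trained, this needs far less data and compute than training the whole network from scratch.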

One of the most cogent points for me was the clear distinction between training, validation and test datasets. The terms “validation set” and “test set” are often used interchangeably in books and papers, but each has its own purpose and should be treated differently: the validation set guides model selection and tuning, while the test set gives a final, unbiased estimate of performance. Hyperparameter optimization was another key topic, since choices such as the learning rate, loss function, momentum and number of training iterations all strongly affect how well a model trains.
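As a rough illustration of both points (hypothetical data and scikit-learn assumed; this was not a tool from the training itself): hold out separate validation and test sets, tune a hyperparameter such as the learning rate against the validation set only, and touch the test set exactly once.

```python
# Sketch: three-way split plus learning-rate tuning on the validation set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import SGDClassifier

X = np.random.rand(1000, 20)            # 1,000 fake samples, 20 features
y = np.random.randint(0, 2, size=1000)  # fake binary labels

# 70% train, then split the remainder into 15% validation / 15% test.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.30, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.50, random_state=0)

best_lr, best_acc = None, 0.0
for lr in [0.0001, 0.001, 0.01, 0.1]:   # candidate learning rates
    clf = SGDClassifier(learning_rate="constant", eta0=lr, random_state=0)
    clf.fit(X_train, y_train)
    acc = clf.score(X_val, y_val)       # validate here, never on test
    if acc > best_acc:
        best_lr, best_acc = lr, acc

final = SGDClassifier(learning_rate="constant", eta0=best_lr, random_state=0)
final.fit(X_train, y_train)
print("test accuracy:", final.score(X_test, y_test))  # reported once
```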


The Conference

After two days of training, the full conference got underway with some great keynotes from industry pioneers and experts leading significant projects and research at companies such as Google (Google Brain), IBM (IBM Watson), Facebook, NVIDIA, Intel (Intel Nervana) and Salesforce, and at universities such as MIT, UC Berkeley, Johns Hopkins University and Carnegie Mellon University. The O’Reilly AI Conference is definitely a key place to network with industry experts. There was also a speed-networking event, which set the stage for introductions, along with informal events held after hours that let attendees bond outside the conference sessions.

In several sessions, experts showed how they have applied AI and machine learning to a wide variety of problems. Use cases highlighted included discovering new drugs; detecting cancerous cells; diagnosing eye conditions; predicting machine faults before they happen, enabling cost-effective preventive maintenance for industries that cannot afford any downtime; cognitive mobile healthcare for patients and physicians; detecting financial fraud with machine learning; and combating child pornography and human trafficking with AI. Seeing first-hand the diversity and impact of AI applications across such a range of sectors was inspiring.

Some of the more technical sessions included deploying AI systems in edge and cloud environments; running TensorFlow at scale in the cloud; software architectures for building enterprise AI; integrating deep learning libraries with Apache Spark; recommending products for 1.91 billion people on Facebook; and the AI-powered newsroom.

Certainly, there was a lot to take away from the conference, and the blend of experiences from the training sessions, seminars and workshops has ignited for me a new way of thinking about which problems can be solved with these artificial intelligence and machine learning techniques.

My project

For my project, the team and I have built a model for dynamic product pricing based on historical prices and product performance, and I am currently rounding off work on a classifier for customer segmentation. With this new skillset, I will be optimizing the product pricing model and predicting demand, using the predictions to generate demand curves that can be clustered, so that the effective pricing for each cluster can be applied to products that have never been sold before (a rough sketch of this clustering step follows). I will also be using sentiment analysis to shorten the sales cycle and improve the average negotiation turnaround time. Specifically, deep learning will help improve operational efficiency at BCMY Ltd by solving some computer vision tasks and ultimately removing certain constraints we currently face. It is certainly an exciting time for BCMY Ltd as technology continues to play an important role in the service delivery pipeline.
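As a purely hypothetical sketch of that clustering step (the data shapes, cluster count and library choice here are illustrative assumptions, not BCMY Ltd’s actual pipeline), predicted demand curves could be grouped with k-means, and a never-sold product assigned to its nearest cluster for pricing:

```python
# Sketch: cluster products by the shape of their predicted demand curves,
# then apply each cluster's pricing strategy to unseen products.
import numpy as np
from sklearn.cluster import KMeans

# Fake data: 200 products, each with predicted demand at 10 price points.
demand_curves = np.random.rand(200, 10)

kmeans = KMeans(n_clusters=4, random_state=0, n_init=10)
labels = kmeans.fit_predict(demand_curves)  # one cluster label per product

# A new, never-sold product is priced via its nearest cluster.
new_product_curve = np.random.rand(1, 10)
cluster = kmeans.predict(new_product_curve)[0]
print("apply pricing strategy for cluster", cluster)
```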

I have the support of an effective team at BCMY Ltd and the University of Brighton, and I look forward to the coming months and to seeing how these implementations deliver value to both organisations.

Lastly, a word to take home: “As an engineer, your focus should be on building your network, increasing inference accuracy and ensuring your model does not mimic human bias.”

Ajibola Obayemi

Data and Knowledge Systems Developer – KTP Associate

BCMY Ltd and University of Brighton
