Microsoft, Adobe and SAP understand that the customer experience is no longer a sales management conversation. CEOs are breaking down the silos of the status quo so they can get everyone inside their companies focused on serving the people outside their companies. With the Open Data Initiative, we will help businesses run with a true single view of the customer.
Learn about the basics of feature selection and how to implement and investigate various feature selection techniques in Python
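One of the techniques such a tutorial typically covers is univariate feature selection. A minimal sketch with scikit-learn, assuming the iris dataset and `k=2` purely for illustration:

```python
# Univariate feature selection: keep the k features with the highest
# ANOVA F-scores. The dataset and k=2 are illustrative choices.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

selector = SelectKBest(score_func=f_classif, k=2)
X_selected = selector.fit_transform(X, y)

print(X.shape, "->", X_selected.shape)  # (150, 4) -> (150, 2)
print("kept feature indices:", selector.get_support(indices=True))
```

Swapping `f_classif` for `mutual_info_classif` or wrapping an estimator in `RFE` gives the other common selection strategies with the same fit/transform interface.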
In this tutorial, you will learn what data engineering entails, along with an overview of our future data engineering course offerings.
Prophet is a forecasting tool developed by Facebook to quickly forecast time series data, available in R and Python. In this post, I’ll walk you through a quick example of forecasting U.S. candy sales using Prophet and Python.
If software ate the world, models will run it. But are we ready to be controlled by black-box intelligent software? Probably not, and that is fair. We, as humans, need to understand how AI works – especially when it drives our behaviours or businesses. That's why, in a previous post, we identified machine learning transparency as one of the hottest AI trends. Let us walk through a brief history of machine learning model explainability – illustrated by real examples from our AI Claim Management solution for insurers.
As a research scientist at IBM, Malioutov spends part of his time building machine learning systems that solve difficult problems faced by IBM's corporate clients. One such program was meant for a large insurance corporation. It was a challenging assignment, requiring a sophisticated algorithm. When it came time to describe the results to his client, though, there was a wrinkle. ‘We couldn't explain the model to them because they didn't have the training in machine learning.’ In fact, it may not have helped even if they were machine learning experts. That's because the model was an artificial neural network, a program that takes in a given type of data – in this case, the insurance company's customer records – and finds patterns in them. These networks have been in practical use for over half a century, but lately they've seen a resurgence, powering breakthroughs in everything from speech recognition and language translation to Go-playing robots and self-driving cars.
Automated machine learning is a rapidly developing segment of artificial intelligence – it's time to define what an AutoML product is so end-users can compare product capabilities intelligently.
A step-by-step guide with suggestions on how to preprocess data and derive features from it. The article also contains links to additional resources about machine learning methods and other examples.
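Two steps such guides usually walk through are imputing missing values plus scaling, and deriving a new feature from existing columns. A small sketch with pandas and scikit-learn; the column names (`height_cm`, `weight_kg`, `bmi`) are illustrative, not from the article:

```python
# Illustrative preprocessing: impute a missing value, derive a feature,
# and standardize. Column names here are hypothetical examples.
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "height_cm": [170.0, 180.0, None, 165.0],
    "weight_kg": [70.0, 85.0, 60.0, 55.0],
})

# Fill the missing height with the column median
df["height_cm"] = df["height_cm"].fillna(df["height_cm"].median())

# Derive a new feature from the existing columns: body-mass index
df["bmi"] = df["weight_kg"] / (df["height_cm"] / 100) ** 2

# Standardize the numeric columns to zero mean / unit variance
scaled = StandardScaler().fit_transform(df[["height_cm", "weight_kg", "bmi"]])
print(scaled.mean(axis=0).round(6))  # each column now centered at ~0
```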
In the second part of this blog series, I showed how to compute spatial kernel density estimates based on area-level data. The Kernelheaping package also supports boundary-corrected kernel density estimation, which allows us to exclude certain areas where we know the density must be zero. One example is estimating population density, where we would like to exclude uninhabited areas such as lakes, forests, parks, etc. The Kernelheaping package employs a boundary correction method in which each single kernel is restricted to the area of interest.
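Kernelheaping itself is an R package, but the boundary-correction idea it implements – restrict each kernel to the allowed region and renormalize it so no density mass leaks outside – can be sketched in one dimension with numpy and scipy (the data and bandwidth here are arbitrary illustrations):

```python
# 1-D sketch of boundary-corrected KDE: each Gaussian kernel is truncated
# to the region [a, b] and renormalized by its mass inside the region.
import numpy as np
from scipy.stats import norm

def boundary_corrected_kde(data, grid, a, b, h):
    """KDE on [a, b] with per-kernel boundary correction."""
    dens = np.zeros_like(grid, dtype=float)
    for x in data:
        # Mass of this kernel that falls inside the region
        mass = norm.cdf(b, loc=x, scale=h) - norm.cdf(a, loc=x, scale=h)
        dens += norm.pdf(grid, loc=x, scale=h) / mass
    dens /= len(data)
    dens[(grid < a) | (grid > b)] = 0.0  # density is zero outside the region
    return dens

rng = np.random.default_rng(0)
data = rng.uniform(0, 1, size=200)
grid = np.linspace(0, 1, 501)
dens = boundary_corrected_kde(data, grid, a=0.0, b=1.0, h=0.1)
# Riemann-sum check: the corrected density integrates to ~1 on [0, 1]
print(dens.sum() * (grid[1] - grid[0]))
```

Without the division by `mass`, kernels near the edges would spill probability outside [0, 1] and the estimate would be biased low at the boundary.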
This post assumes basic knowledge of Artificial Neural Network (ANN) architecture, also called fully connected networks (FCN). These notes were originally made for myself. They will benefit others who have already taken Course 4 and quickly want to brush up before interviews, or who need help with theory when getting stuck during development. They are not meant to cover everything from scratch, so for someone who has not taken the course the content might look daunting and scare them away from Deep Learning. My suggestion is not to read beyond section 2 if you haven’t taken the course.
Designing deep neural nets can be a painful task, considering how many parameters are involved and that no general formula seems to fit all use cases. We can use CNNs for image classification and LSTMs for NLP-related tasks, but the number of features, the size of features, the number of neurons, the number of hidden layers, the choice of activation functions, the initialization of weights, etc. will still vary across use cases.
In this post, I am going to discuss Apache Airflow, a workflow management system developed by Airbnb.
Recently I started working on a speech classification problem. As I know very little about speech/audio processing, I had to recap the very basics. In this post, I want to go over some of the things I learned. For this purpose, I work with the ‘speech MNIST’ dataset, i.e., a set of recordings of spoken digits.
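One of the first basics in audio processing is turning a raw waveform into a magnitude spectrogram via a short-time Fourier transform: slice the signal into overlapping frames, window each frame, and take the FFT. A numpy sketch on a synthetic 440 Hz tone (standing in for a recorded digit):

```python
# STFT basics: frame, window, FFT. The signal is a synthetic 440 Hz tone.
import numpy as np

sr = 8000                                # sample rate (Hz)
t = np.arange(sr) / sr                   # 1 second of audio
wave = np.sin(2 * np.pi * 440 * t)       # pure 440 Hz tone

frame_len, hop = 256, 128
frames = np.stack([wave[i:i + frame_len]
                   for i in range(0, len(wave) - frame_len + 1, hop)])
window = np.hanning(frame_len)           # taper frames to reduce leakage
spec = np.abs(np.fft.rfft(frames * window, axis=1))  # magnitude spectrogram

# Each FFT bin spans sr / frame_len = 31.25 Hz, so the tone's energy
# should peak in bin round(440 / 31.25) = 14
peak_bin = spec.mean(axis=0).argmax()
print(spec.shape, "peak bin:", peak_bin)
```

For real recordings this (time × frequency) matrix, or mel-scaled features derived from it, is what typically gets fed into a classifier.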
The differences between NLP, NLU and NLG, and what can be achieved when implementing an NLP engine for chatbots.
Previously released under the name of SQL Operations Studio, Azure Data Studio offers a modern editor experience for managing data across multiple sources with fast IntelliSense, code snippets, source control integration, and an integrated terminal. Azure Data Studio is engineered with the data platform user in mind, with built-in charting of query result-sets and customizable dashboards. Azure Data Studio is complementary to SQL Server Management Studio, with experiences around query editing and data development, while SQL Server Management Studio still offers the broadest range of administrative functions and remains the flagship tool for platform management tasks. Azure Data Studio will continue to be updated on a monthly basis and currently offers built-in support for SQL Server on-premises and Azure SQL Database, along with preview support for Azure SQL Managed Instance, Azure SQL Data Warehouse, and SQL Server 2019 Big Data.
When working on a supervised machine learning problem with a given data set, we try different algorithms and techniques in the search for models that produce general hypotheses, which then make the most accurate predictions possible about future instances. The same principles apply to text (or document) classification, where many models can be used to train a text classifier. The answer to the question ‘What machine learning model should I use?’ is always ‘It depends.’ Even the most experienced data scientists can’t tell which algorithm will perform best before experimenting with them. This is what we are going to do today: use everything that we have presented about text classification in the previous articles (and more) and compare the text classification models we trained in order to choose the most accurate one for our problem.
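The ‘try several models and compare’ workflow is easy to express with scikit-learn: one TF-IDF vectorizer, several classifiers, cross-validated accuracy for each. A minimal sketch on a tiny made-up two-class corpus (the article compares its own models on its own data):

```python
# Compare several text classifiers with the same TF-IDF features and
# cross-validation. The toy corpus (sports vs. tech) is illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = [
    "the team won the match", "a great goal in the final minute",
    "the coach praised the players", "fans cheered at the stadium",
    "the striker scored twice", "a tense penalty shootout",
    "the new laptop has a fast processor", "update the software to fix the bug",
    "the server crashed during deployment", "machine learning models need data",
    "install the latest driver for the card", "the app stores data in the cloud",
]
labels = [0] * 6 + [1] * 6  # 0 = sports, 1 = tech

results = {}
for name, clf in [("naive bayes", MultinomialNB()),
                  ("logistic regression", LogisticRegression(max_iter=1000)),
                  ("linear SVM", LinearSVC())]:
    pipe = make_pipeline(TfidfVectorizer(), clf)
    results[name] = cross_val_score(pipe, texts, labels, cv=3).mean()
    print(f"{name}: {results[name]:.2f}")
```

On a real corpus the ranking of these scores is what drives the final model choice.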
Hi and welcome to an Illustrated Guide to LSTMs and GRUs. I’m Michael, and I’m a Machine Learning Engineer in the AI voice assistant space. In this post, we’ll start with the intuition behind LSTMs and GRUs. Then I’ll explain the internal mechanisms that allow LSTMs and GRUs to perform so well. If you want to understand what’s happening under the hood in these two networks, then this post is for you. You can also watch the video version of this post on YouTube if you prefer.
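To make the gate mechanics concrete, a single LSTM cell step can be written out in a few lines of numpy. This is a generic textbook formulation with random weights, not code from the guide: the forget gate decides what to erase from the cell state, the input gate and candidate decide what to write, and the output gate decides what to expose:

```python
# One LSTM time step in numpy, with the four gates spelled out.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step; W, U, b hold all four gates stacked row-wise."""
    n = h_prev.size
    z = W @ x + U @ h_prev + b      # pre-activations for all gates at once
    f = sigmoid(z[0:n])             # forget gate: what to erase from c
    i = sigmoid(z[n:2 * n])         # input gate: how much to write
    g = np.tanh(z[2 * n:3 * n])     # candidate values to write
    o = sigmoid(z[3 * n:4 * n])     # output gate: what to expose as h
    c = f * c_prev + i * g          # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c

rng = np.random.default_rng(0)
x_dim, h_dim = 3, 4
W = rng.normal(size=(4 * h_dim, x_dim))
U = rng.normal(size=(4 * h_dim, h_dim))
b = np.zeros(4 * h_dim)
h, c = lstm_step(rng.normal(size=x_dim), np.zeros(h_dim), np.zeros(h_dim), W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```

A GRU follows the same pattern with two gates (reset and update) and no separate cell state.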