Anomaly Detection With SQL

There are many ways to detect anomalies in data and, like most things in life, there is no clear-cut ‘right way’ to do it. The approach you take will be dictated by your data, but also by the requirements of the project. Does this need to be as accurate as possible, like credit fraud detection? Do you just need to monitor some metrics for potential issues? Does this need to be up and running fast, or do you have six months to get it into production? What kind of resources do you have available to you?
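One lightweight starting point (my illustration, not necessarily the article’s approach) is to flag values that fall more than two standard deviations from the mean, which can be expressed directly in SQL. A minimal sketch using SQLite with a hypothetical metrics table:

```python
import sqlite3

# Hypothetical table of daily metric values; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (day TEXT, value REAL)")
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?)",
    [("d1", 9.5), ("d2", 10.5), ("d3", 10.0), ("d4", 9.8), ("d5", 10.2),
     ("d6", 10.0), ("d7", 9.9), ("d8", 10.1), ("d9", 10.0), ("d10", 100.0)],
)

# Flag rows more than 2 standard deviations from the mean.
# SQLite has no STDDEV, so compare the squared deviation against
# 4 * variance, where variance = E[x^2] - E[x]^2.
anomalies = conn.execute("""
    WITH stats AS (
        SELECT AVG(value) AS mu,
               AVG(value * value) - AVG(value) * AVG(value) AS var
        FROM metrics
    )
    SELECT day, value
    FROM metrics, stats
    WHERE (value - mu) * (value - mu) > 4 * var
""").fetchall()
print(anomalies)  # [('d10', 100.0)]
```

Note that with very few rows a single outlier inflates the variance enough to hide itself, which is exactly the kind of data-dependent trade-off the paragraph above alludes to.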

Enhancing AR with machine learning

As Augmented Reality (AR) technologies improve, we are starting to see use cases that stretch beyond marketing and simple visualizations. These include product visualization, remote assistance, enhanced learning, quality control and maintenance. Apple’s Measure is one of my favorite AR apps. It’s a simple and reliable way of taking physical measurements with your smartphone and it demonstrates how robust AR tracking has become.

Building an AI that can read your mind

Machine learning for Mental State Classification using EEG data

Automated Machine Learning

Automated Machine Learning (AutoML) is currently one of the most explosive subfields within Data Science. It sounds great to those who are not fluent in machine learning and terrifying to current Data Scientists. The way AutoML has been portrayed in the media makes it seem capable of completely revolutionizing the way we create models by removing the need for Data Scientists. While some companies such as DataRobot aim to fully automate the machine learning process, most in the field are building AutoML as a tool to increase the productivity of current Data Scientists, and to simplify the process for those entering the field to make it more accessible.

T5 – a model that explores the limits of transfer learning

The T5 model treats a wide variety of many-to-many and many-to-one NLP tasks in a unified manner by encoding the different tasks as text directives in the input stream. This enables a single model to be trained in a supervised fashion on a wide variety of NLP tasks such as translation, classification, Q&A, summarization, and even regression (though in reality the regression is handled as a classification over quantized target values).
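Concretely, the text-to-text format prepends a task prefix to each input and reads the answer off as a string. The prefixes below follow examples from the T5 paper, but treat the exact strings here as illustrative:

```python
# Every task becomes "prefix: input text" -> "output text".
def t5_input(prefix, text):
    return f"{prefix}: {text}"

print(t5_input("translate English to German", "That is good."))
# the model is trained to emit the translation, e.g. "Das ist gut."

print(t5_input("summarize", "state authorities dispatched emergency crews ..."))

# Even STS-B regression becomes text generation: target scores are
# quantized to 0.2 increments and emitted as strings such as "3.8",
# which is why it is effectively a classification task.
```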

Time Complexity for Data Scientists

Big data is hard, and the challenges of big data manifest in both inference and computation. As we move towards more fine-grained and personalized inferences, we are faced with the general challenge of producing timely, trustworthy, and transparent inference and decision-making at the individual level. For now, we will concern ourselves with the challenge of ‘timeliness’, and try to gain intuition on how certain algorithms scale and whether or not they can feasibly be used to tackle massive data sets.
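As a toy illustration of how scaling behavior decides feasibility (my example, not the article’s), consider detecting duplicates in a list: the nested-loop version performs O(n²) comparisons, while a set-based version needs only O(n) expected-time lookups.

```python
def has_duplicates_quadratic(items):
    # Compare every pair: roughly n * (n - 1) / 2 comparisons.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # One pass with a hash set: O(n) expected time.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

At n = 10^6, the quadratic version needs on the order of 5 × 10^11 comparisons, which is hopeless for timely inference, while the linear version performs about a million set lookups.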

Deep Learning for Natural Language Processing Using word2vec-keras

A deep learning approach for NLP by combining Word2Vec with Keras LSTM. Natural language processing (NLP) is a common research subfield shared by many research fields such as linguistics, computer science, information engineering, and artificial intelligence. NLP is concerned with the interactions between computers and human natural languages in general, and in particular with how to use computers to process and analyze natural language data (e.g., text, voice, etc.). Some of the major challenges in NLP include speech recognition, natural language understanding, and natural language generation.

Text is one of the most widespread forms of NLP data. It can be treated as either a sequence of characters or a sequence of words, but with the advance of deep learning, the trend is to work at the level of words. Given a sequence of words, it must somehow be converted into numbers before it can be understood by a machine learning or deep learning algorithm/model such as an LSTM. One straightforward way is to use one-hot encoding to map each word to a sparse vector whose length equals the vocabulary size. The other method (e.g., Word2vec) uses word embedding to convert a word into a compact dense vector.
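To make the contrast concrete, here is a minimal one-hot encoder (a word2vec embedding would instead be a learned dense vector of, say, 100–300 dimensions):

```python
def one_hot(word, vocab):
    # Sparse vector: length equals the vocabulary size,
    # with a single 1 at the word's index.
    index = {w: i for i, w in enumerate(vocab)}
    vec = [0] * len(vocab)
    vec[index[word]] = 1
    return vec

vocab = ["deep", "learning", "for", "nlp"]
print(one_hot("learning", vocab))  # [0, 1, 0, 0]
```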

On the sensationalism of artificial intelligence news

When AI articles misinform and mislead. It’s no longer a secret that artificial intelligence (AI) is here to stay. What once was a puzzling and rather niche area of computer science has suddenly started to take over our lives with its many applications. As a result, due to this mysterious and unknown characteristic of AI and its more prominent child, machine learning, news sites and the press in general have taken a liking to overstating the reality behind the successes and advances in the field. This phenomenon often leads to articles of an unsavory nature that seem to sensationalize and even fearmonger about what’s genuinely going on. In this essay, I want to shed some light on this issue.

How To Plot Time Series

Dealing with time series can be one of the most insightful parts of exploratory data analysis, if done right. In this post, we are going to use the check-in log from the Yelp Dataset to explore trends across different time periods using Pandas and Matplotlib.
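The core pattern looks roughly like the sketch below, using synthetic timestamps in place of the Yelp check-in log (the real dataset’s column names may differ):

```python
import pandas as pd

# Synthetic check-in timestamps standing in for the Yelp data.
checkins = pd.DataFrame({
    "date": pd.to_datetime([
        "2019-01-03 10:15", "2019-01-20 18:40", "2019-02-11 12:05",
        "2019-02-14 20:30", "2019-02-28 09:55", "2019-03-07 13:10",
    ])
})

# Count check-ins per month; resample requires a DatetimeIndex.
monthly = checkins.set_index("date").resample("MS").size()
print(monthly)
# monthly.plot(kind="bar")  # renders the trend via Matplotlib
```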

The Serious Downsides To The Julia Language In 1.0.3

I have talked a lot about how much I love Julia. If I were asked what my favorite programming language is right now, ‘Julia’ would certainly be my reply. On the other hand, something I don’t talk about as much are the things I don’t like about Julia. Although today I would like to address some of these weaknesses and personal gripes I have with the language, I firmly believe that the benefits outweigh any compromises that might have to be made to use Julia. If you have never used Julia, I would highly encourage you to check it out! And with my recommendation, I’m also going to preface my gripes with the reasons I love it so much.

Why Intelligence might be simpler than we think

How much information do you need to build a human being and, more specifically, a human brain? After all, we are by far the most complex species on the planet. To take it up a notch, some of our brains think that our brains are the most complex structures in the universe! Nevertheless, a tomato has more genes than a human being. 7000 more, to be precise.

A Hands on Guide to Multiprocessing in Python

Getting started with parallel programming to speed up your Python programs. This guide aims to explain why multiprocessing is needed and how to use it in your programs. As a machine learning researcher, I use it extensively when preparing data for my models and doing feature engineering. Python is slow. To increase processing speed, code can be made to run on multiple processes, and this parallelization distributes work across all the available CPU cores. Long-running jobs can be broken down into smaller, manageable chunks; once the individual jobs run in parallel and the results are returned, the time to process is cut down drastically.
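A minimal sketch of the pattern with the standard library’s multiprocessing.Pool (the transform function is just a placeholder for real per-item work such as feature extraction):

```python
from multiprocessing import Pool

def transform(n):
    # Stand-in for an expensive, CPU-bound step.
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # map splits the iterable into chunks and distributes
        # them across the worker processes.
        results = pool.map(transform, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The `if __name__ == "__main__"` guard matters: on platforms that spawn rather than fork, each worker re-imports the module, and the guard prevents workers from recursively creating pools.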

AutoML Vision – how to train your model?

In the previous post, we learned how to use the Vision API in our project with Python. Thanks to Google, those APIs come pre-trained, so they are very fast and convenient to use directly. But what if we need to train a new model on our own data? How could we achieve that? Here comes AutoML Vision to save our lives.