Deep Learning with DNN Compiler

Deep Neural Network Compiler (DNNC) is an AOT (ahead-of-time) compiler and inference framework. Part 1 of this article series showed how to use DNNC as an inference framework. This article shows how to use DNNC to compile a machine learning model for microcontrollers and microcomputers costing anywhere from 10¢ to $5.
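As a rough sketch of the ahead-of-time workflow (the tiny model and the compile command below are my own assumptions for illustration, not DNNC's verified interface), a trained model is first exported to ONNX and then handed to the compiler, which emits C++ that can be cross-compiled for the target board:

    # Sketch: export a tiny PyTorch model to ONNX as input for an AOT
    # compiler like DNNC. The model is a stand-in, not from the article.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
    model.eval()

    torch.onnx.export(model, torch.randn(1, 4), "tiny_model.onnx")

    # Hypothetical compile step (check the DNNC/deepC docs for the real CLI):
    #   $ dnnc-compile tiny_model.onnx
    # The emitted C++ source is then cross-compiled for the target device.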


How to Turn Dark Lakes Into Gold

Over the past few years, many companies have ‘AI-washed’ their brands to join the AI trend. The terms AI-powered, AI-inside, and AI-driven seem to have been added by default to every software and service offering. The question is whether companies actually monetize their AI strategy (achieve ‘Cognification’) or just lose money, much as happened with the earlier wave of Big Data collection and storage investment. In this blog, I will summarize my takeaways from the AI & Big Data Expo North America 2019 event, which took place last week in Santa Clara. I will cover the potential value created with AI and the opportunity to cut down the famous 80% of project time spent on data preparation for ML model creation. Apparently, this is the industry-wide conversation of the hour.


Data Science and Machine Learning Must-Read Books for 2020

The year 2020 is just around the corner, and with it will come many opportunities to learn new things, strengthen current skills, and develop new ones. The data science field will continue expanding to new horizons: new libraries, algorithms, and analytical tools will keep being developed; more data scientist, data analyst, and machine learning job openings will keep arising in the industry; new majors and academic programs will be designed and offered in academia; and more people with programming and analytical skills will be needed to handle today’s large amounts of data in a growing number of industries that want to extract information, knowledge, insights, and wisdom from it. To achieve this, people must start getting familiar with multiple programming languages and software tools, based on their fields of interest, and should understand the theory behind analytical and machine learning algorithms as well as their statistical and mathematical foundations. Given the increasing popularity of open-source software (e.g. R, Python), a well-established learning plan becomes crucial for truly mastering a new programming language. Below is a list of data science and machine learning must-read books for new and current programmers interested in R, a free software environment for statistical computing and graphics. I hope you find it useful.


Artificially Conscious Machines

What is consciousness? Is it the same as having the ability to think? Or is it like having a soul? Are plants conscious? These are some of the questions that arise after reading the title. Defining consciousness in words is difficult, although according to Dr Harry H. Porter III there are roughly three meanings of consciousness. First, conscious means awake: a person who is asleep or in a coma is said to be unconscious. Second, the word conscious is often used to mean thinking the way a human thinks. Third, being conscious means being aware of yourself and your own thoughts. So what does it mean to have artificial consciousness? How can we artificially create consciousness if we do not have a precise definition? Sometimes referred to as machine consciousness or synthetic consciousness, an artificially conscious machine would be one that can act as humanly as possible and is self-aware of its own existence: a machine that can hold long conversations, listen to music, have hobbies, get embroiled in disputes, feel emotions, do mathematics, and so on. These characteristics come naturally to a normal human, but for a machine these simple tasks are as hard as intergalactic travel is for humans. Today there are more than 10 million machines (robots) on Earth, and this number will keep multiplying; all of them excel at their respective tasks, yet the number of machines that can truly understand this article is vanishingly small. From SHAKEY (so named because of its tendency to tremble during operation) and ELIZA to OpenWorm and Sophia, the world seems to be on a space rocket toward better AI machines. Research in the past decade shows some promising results in the field of AI, but it is still far from human-level machine intelligence (HLMI). Nick Bostrom, in his book Superintelligence: Paths, Dangers, Strategies, thoroughly discusses the paths to HLMI, its aftereffects, and its challenges. But the idea of a truly conscious machine is far from reality. Some may argue that neural networks have promising results to present, but is that real intelligence or just simple statistical pattern matching? Although neural networks are said to emulate the networks of neurons in our brains, suppose that someday we have enough processing power to run an artificial brain. Will its output be the same as the thoughts in our mind while eating delicious food, or just a simple dot product of vectors matching a labelled output?


Lit BERT: NLP Transfer Learning In 3 Steps

BERT (Devlin et al., 2018) is perhaps the most popular NLP approach to transfer learning. The implementation by Huggingface offers a lot of nice features and abstracts away details behind a beautiful API. PyTorch Lightning is a lightweight framework (really more like refactoring your PyTorch code) which lets anyone using PyTorch, whether students, researchers, or production teams, scale deep learning code easily while making it reproducible. Lightning does not add abstractions on top of PyTorch, which means it plays nicely with other great packages like Huggingface! In this tutorial we’ll use their implementation of BERT to do a fine-tuning task in Lightning.
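As a minimal sketch of what such a setup looks like (my own stand-in, assuming the transformers and pytorch_lightning packages; batch layout and hyperparameters are illustrative, and the tutorial's actual code differs), the model becomes a LightningModule and the Trainer handles the training loop:

    # Minimal sketch: BERT fine-tuning as a PyTorch Lightning module.
    import torch
    import pytorch_lightning as pl
    from transformers import BertForSequenceClassification

    class BertFinetuner(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.bert = BertForSequenceClassification.from_pretrained(
                "bert-base-uncased", num_labels=2)

        def training_step(self, batch, batch_idx):
            # Assumes each batch is (input_ids, attention_mask, labels).
            input_ids, attention_mask, labels = batch
            outputs = self.bert(input_ids,
                                attention_mask=attention_mask,
                                labels=labels)
            return outputs[0]  # the classification loss

        def configure_optimizers(self):
            return torch.optim.AdamW(self.parameters(), lr=2e-5)

    # trainer = pl.Trainer(max_epochs=3)
    # trainer.fit(BertFinetuner(), train_dataloader)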


Integrate Jupyter into Your Data Pipeline

How to automate data pipelines with Jupyter and Papermill.


papermill

papermill is a tool for parameterizing, executing, and analyzing Jupyter Notebooks. Papermill lets you:
• parameterize notebooks
• execute notebooks
This opens up new opportunities for how notebooks can be used. For example:
• Perhaps you have a financial report that you wish to run with different values on the first or last day of a month, or at the beginning or end of the year; parameters make this task easier.
• Do you want to run a notebook and depending on its results, choose a particular notebook to run next? You can now programmatically execute a workflow without having to copy and paste from notebook to notebook manually.
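A minimal sketch of the first use case via papermill's Python API (the notebook names and parameter names here are illustrative, not from the article):

    # Execute a parameterized notebook programmatically with papermill.
    import papermill as pm

    # Runs financial_report.ipynb, injecting values into its
    # "parameters"-tagged cell, and saves the executed copy.
    pm.execute_notebook(
        "financial_report.ipynb",
        "financial_report_december.ipynb",
        parameters={"start_date": "2019-12-01", "end_date": "2019-12-31"},
    )

The executed output notebook records the injected parameters, so each run doubles as a self-documenting report.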


Real-time Object Detection Without Machine Learning

For me, this isn’t a clear win for deep learning, and I think there is still a place for a heuristic approach. The more assumptions that can be made about the detection conditions (consistent background and/or scale, constrained object types, distinguishing features such as colour), the more appealing heuristics become. As a developer, I would consider a heuristic-based solution if time and resources were tight and the input constraints were clearly defined. If I wanted greater robustness and flexibility, I would opt for machine learning. Both approaches definitely have their place; it’s a question of choosing the right tool for the job.
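To make the trade-off concrete, here is a minimal sketch of one such heuristic (my own example, assuming OpenCV 4.x, a colour-distinctive object, and a consistent background; the threshold values are illustrative): detection purely by hue thresholding.

    # Heuristic object detection: isolate a known colour band, then
    # box any sufficiently large connected region.
    import cv2
    import numpy as np

    frame = cv2.imread("frame.jpg")
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    # Keep only pixels in a red hue band (tune for the real object).
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:  # ignore small specks
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

No training data, no model; but the moment lighting or background changes, the thresholds break, which is exactly the robustness argument for machine learning above.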


Causality in Machine Learning 101 for Dummies like Me

Recently I started wondering how causal inference is being used with machine learning, and especially where it fits in the data science project lifecycle. I researched this for a while, discussed it with colleagues, and finally came to the conclusion that causal inference can be used after the modeling phase to confirm correlations between variables and the target/outcome. For example, if the model has good accuracy and shows a high correlation/association between input A and target B, you may want to perform causal inference to validate that A actually has an effect on B. But then I wondered whether that is all causality can help me achieve, or whether there are other applications of causality in machine learning. To be honest, finding correlations between variables is simple, but turning them into causal assertions takes extra effort. Causal inference is mostly used to reach a prescription of the form ‘do X so that Y happens’. Causality seems to be the core interest of the scientific community when we think about relationships between entities, variables, and concepts in life. ‘Why’ is the ultimate question when we explore nature and her interactions with the ecosystem.
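A toy numerical illustration of that last point (my own, not from the article): a hidden confounder Z can make A and B strongly correlated even though A has no effect on B, and adjusting for Z exposes the spurious association.

    # Confounding demo: A and B both depend on Z, but not on each other.
    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.normal(size=100_000)          # confounder
    a = z + rng.normal(size=100_000)      # A depends on Z only
    b = z + rng.normal(size=100_000)      # B depends on Z only (not on A)

    print(np.corrcoef(a, b)[0, 1])        # ~0.5: strong marginal correlation

    # Within a narrow slice of Z the correlation disappears, which is
    # what intervening on A (do(A)) would also reveal.
    s = np.abs(z) < 0.05
    print(np.corrcoef(a[s], b[s])[0, 1])  # ~0: no effect of A on B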


How to Make Computers Dream

A soft introduction to Generative Models.