Lagrange multipliers with pictures and code

In this story, we’re going to take an aerial tour of optimization with Lagrange multipliers. When do we need them? Whenever we have an optimization problem with constraints.
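The classic recipe is to set ∇f = λ∇g and impose the constraint g = 0. As a minimal sketch of that idea (the objective and constraint here are my own illustrative choices, not the article’s): to maximize f(x, y) = x + y on the unit circle x² + y² = 1, we can solve the stationarity conditions together with the constraint numerically, here with Newton’s method on the resulting system:

```python
import numpy as np

# Maximize f(x, y) = x + y subject to g(x, y) = x^2 + y^2 - 1 = 0.
# Lagrange conditions: grad f = lam * grad g, together with g = 0.

def residual(v):
    x, y, lam = v
    return np.array([
        1 - 2 * lam * x,      # df/dx - lam * dg/dx
        1 - 2 * lam * y,      # df/dy - lam * dg/dy
        x**2 + y**2 - 1,      # the constraint itself
    ])

def jacobian(v):
    x, y, lam = v
    return np.array([
        [-2 * lam, 0.0, -2 * x],
        [0.0, -2 * lam, -2 * y],
        [2 * x, 2 * y, 0.0],
    ])

v = np.array([1.0, 0.5, 1.0])   # rough initial guess (x, y, lam)
for _ in range(50):             # Newton's method on the stationarity system
    v = v - np.linalg.solve(jacobian(v), residual(v))

print(v)   # x, y and lam all converge to 1/sqrt(2) ~ 0.7071
```

From the positive starting guess, the iteration lands on the maximizer x = y = 1/√2 (objective value √2) rather than the antipodal minimizer; this is the kind of picture the article builds up.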

Calling Bullshit in Data Analytics

Fake news has been a hot topic of discussion in the post-Brexit and Trump world. While the world’s politicians, media and internet giants argue over how to counter the spread of propaganda through fake news, a much more devious evil lurks in the shadows. Harry G. Frankfurt, Professor of Philosophy at Princeton, calls it bullshit in his essay titled ‘On Bullshit’. Unlike fake news, which can be fact-checked and called out for its outright lies, bullshit ‘falls short of lying’ but is deceptive nevertheless. In Professor Frankfurt’s words, it is ‘a greater enemy of truth than lies are’. While fake news may be the domain of online trolls, the art of bullshitting is practiced by the intellectual elite of society.

Fast-SCNN explained and implemented using TensorFlow 2.0

Fast Segmentation Convolutional Neural Network (Fast-SCNN) is an above-real-time semantic segmentation model for high-resolution image data, suited to efficient computation on embedded devices with low memory. The authors of the original paper are Rudra PK Poudel, Stephan Liwicki and Roberto Cipolla. The code used in this article is not the official implementation from the authors but my attempt to reconstruct the model as described in the paper. Since the rise of autonomous vehicles, a model that can process input in real time has become highly desirable. Some state-of-the-art offline semantic segmentation models already exist, but they are large in size and memory footprint and demand expensive computation; Fast-SCNN addresses all of these problems.

Data Science Project: Scraping YouTube Data using Python and Selenium to Classify Videos

I’m an avid YouTube user. The sheer amount of content I can watch on a single platform is staggering. In fact, a lot of my data science learning has happened through YouTube videos! So, I was browsing YouTube a few weeks ago searching for a certain category to watch. That’s when my data scientist thought process kicked in. Given my love for web scraping and machine learning, could I extract data about YouTube videos and build a model to classify them into their respective categories? I was intrigued! This sounded like the perfect opportunity to combine my existing Python and data science knowledge with my curiosity to learn something new. And Analytics Vidhya’s internship challenge offered me the chance to pen down my learning in article form.

Quantum Simulator Qubiter now has a native TensorFlow backend

I am pleased to announce that my quantum simulator Qubiter (available at GitHub, BSD license) now has a native TensorFlow backend-simulator (see its class `SEO_simulator_tf`; the `tf` stands for TensorFlow). This complements Qubiter’s original numpy simulator (contained in its class `SEO_simulator`). A small step for Mankind, a giant leap for me! Hip Hip Hurray! This means that Qubiter can now calculate the evolution of a state vector using a CPU, GPU or TPU. Plus, it can do back-propagation on a quantum circuit. Here is a jupyter notebook I wrote that uses Qubiter’s TF backend to do VQE (Variational Quantum Eigensolving). I like to call VQE ‘mean Hamiltonian minimization’.
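To give a flavor of what ‘mean Hamiltonian minimization’ means, here is a deliberately tiny numpy sketch. It is not Qubiter’s API (the simulator classes named above are the real thing); the one-qubit ansatz and the Hamiltonian are my own illustrative choices. We prepare |ψ(θ)⟩ = RY(θ)|0⟩, whose mean energy ⟨ψ|Z|ψ⟩ = cos θ, and drive it to the ground-state value −1 by gradient descent on θ:

```python
import numpy as np

# Toy 'mean Hamiltonian minimization' (VQE) in plain numpy.
# NOT Qubiter code; a generic one-qubit illustration.
# Ansatz: |psi(theta)> = RY(theta)|0>, Hamiltonian H = Pauli Z.

H = np.array([[1.0, 0.0], [0.0, -1.0]])   # Pauli Z

def state(theta):
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = state(theta)
    return psi @ H @ psi                   # <psi| H |psi> = cos(theta)

theta, lr = 0.1, 0.2
for _ in range(200):
    # gradient descent; d<H>/dtheta = -sin(theta)
    theta += lr * np.sin(theta)

print(energy(theta))   # approaches -1, the ground-state energy of Z
```

A real VQE run does the same loop, except the energy comes from a quantum circuit (simulated or physical) and the gradient from back-propagation through the simulator, which is exactly what the TF backend enables.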

Speedup your CNN using Fast Dense Feature Extraction and PyTorch

Back in March, we open-sourced our implementation of ‘Fast Dense Feature Extraction with CNN’s that have Pooling or Striding Layers’. Although not broadly known, this 2017 BMVC paper offers an efficient and elegant solution for avoiding computational redundancy when using patch-based convolutional neural networks. So in this post I’ll explain how the model works and show how to use it in real applications. I’ll cover two things: first, an overview of the method; and second, how to use this approach on an existing trained patch network to speed up inference time.

Why data science.

… or, why I decided to attend Flatiron School’s data science immersive bootcamp.

The Exciting Future with Blockchain and Artificial Intelligence

AI is going to change the world. We all know that. Blockchain is also (like AI) disruptive, revolutionary and earth-shakingly transformative. So what happens if we take two buzzwords like AI (data science) and blockchain and combine them? Let’s find out! But first, do you know what a blockchain is, and why it’s one of the keys to a democratic digital future? Well, I could repeat what others have already detailed on this blog, finish my word count and be done with it. But I want to inspire you: to give you fresh, striking information that you can hopefully use to ignite the flame of learning and begin this journey, this adventure, in technology.

Testing and Debugging in Machine Learning

Testing and debugging machine learning systems differs significantly from testing and debugging traditional software. This course describes how, starting from debugging your model all the way to monitoring your pipeline in production.

The Conversational AI Playbook

This playbook represents a first step toward defining the governing principles and best practices that will enable developers to build great conversational applications. It is the result of several years of practical experience building and deploying dozens of the most advanced conversational applications achievable. Cutting-edge research and state-of-the-art algorithms are not surveyed here; there are many other resources available for that purpose. Instead, this playbook focuses on helping developers and data scientists build real production applications. The detailed instructions, practical advice, and real-world examples provided here should empower developers to improve the quality and variety of conversational experiences in the coming months and years.

Autoencoders: Deep Learning with TensorFlow’s Eager Execution

Deep Learning has revolutionized the Machine Learning scene in the last years. Can we apply it to image compression? How well can a Deep Learning algorithm reconstruct pictures of kittens? What’s an autoencoder?
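As a teaser for those questions, here is the bare encode-bottleneck-decode idea in plain numpy. The article itself uses TensorFlow’s eager execution; this toy linear autoencoder, with data and dimensions of my own choosing, only illustrates the concept of compressing inputs through a narrow bottleneck and reconstructing them:

```python
import numpy as np

# Toy linear autoencoder: compress 2-D points that lie on a 1-D line
# down to one number, then reconstruct them. My own illustration of
# the encode -> bottleneck -> decode idea, not the article's code.

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, size=(100, 1))
X = t @ np.array([[1.0, 2.0]])           # 2-D data on the line y = 2x

W_enc = rng.normal(size=(2, 1)) * 0.1    # encoder: 2 -> 1 (bottleneck)
W_dec = rng.normal(size=(1, 2)) * 0.1    # decoder: 1 -> 2

lr = 0.1
for _ in range(2000):                    # plain gradient descent on MSE
    Z = X @ W_enc                        # encode (compress)
    X_hat = Z @ W_dec                    # decode (reconstruct)
    err = X_hat - X                      # d(MSE)/d(X_hat), up to a constant
    W_dec -= lr * Z.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

mse = np.mean(((X @ W_enc) @ W_dec - X) ** 2)
print(mse)   # near zero: the line is recovered through the bottleneck
```

A deep autoencoder for kitten pictures is the same loop with convolutional layers and a nonlinearity in place of the two matrices, and with TensorFlow computing the gradients for you.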