# Overview of Andrew Ng’s deeplearning.ai courses

In this post, I'm going to write briefly about Andrew Ng's recently launched Coursera courses on Neural Networks and Deep Learning, which I just finished with certificates. I want to argue that there's merit in taking these courses even if you're already familiar with a good portion of the syllabus. First, a short relevant … Continue reading Overview of Andrew Ng’s deeplearning.ai courses

# What’s up with word embedding?

Word embedding is one of the most interesting areas of research in Natural Language Processing. There is a huge amount of material with a lot of interesting ideas. I have been studying some of it lately, and in this post I'd like to give a brief account of the ideas I have found most interesting so far. … Continue reading What’s up with word embedding?
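To fix intuition before the details: a word embedding assigns each word a dense vector, and "similar" words get vectors with high cosine similarity. The sketch below uses tiny hand-made 3-d vectors (the values are mine, purely for illustration — real embeddings are learned, e.g. by word2vec or GloVe):

```python
import math

# Toy "embeddings": hypothetical 3-d vectors chosen by hand for illustration.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.2, 0.8],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity: the standard closeness measure for word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# "king" should sit closer to "queen" than to "apple" in this toy space.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

Learned embeddings work the same way, just in hundreds of dimensions with vectors fit from co-occurrence statistics rather than written by hand.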

# From Machine Learning to Formal Math and Type Theory

The idea of this post was sparked by the new paper Developing Bug-Free Machine Learning Systems with Formal Mathematics. I have had the idea of writing about what you're about to read for a long time, and this paper happily forced me to finally do it! The first and final parts are about my journey, and … Continue reading From Machine Learning to Formal Math and Type Theory

# Summer Internship at IBM as a Data Scientist

My summer internship at IBM as a Data Scientist has ended gracefully. I am very grateful for the opportunity. During that time, IBMers from various positions and backgrounds were asked by the team behind BDU and DSWB to help fill the educational gaps in Data Science and Big Data. Undoubtedly, BDU is becoming more … Continue reading Summer Internship at IBM as a Data Scientist

# General Monty Hall Simulation

The idea of this post came to mind last night. I'm assuming you have already heard about the famous Monty Hall Problem (if you haven't, watch the quick intro in the Numberphile clip). Here I'd like to demonstrate a simulation that takes the general case into account, i.e. assume we have $latex n$ bins (boxes or doors, … Continue reading General Monty Hall Simulation
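A minimal sketch of such a simulation (the function name `simulate_monty` is mine; it assumes the variant where the host opens all but one of the remaining losing doors, leaving the player's door plus one other):

```python
import random

def simulate_monty(n, switch, trials=100_000):
    """General Monty Hall with n doors: the host opens n-2 losing doors,
    and the player either sticks with the first pick or switches."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n)
        pick = random.randrange(n)
        if switch:
            # Only one closed door remains besides the pick, so switching
            # wins exactly when the initial pick was wrong: P = (n-1)/n.
            wins += pick != prize
        else:
            wins += pick == prize  # sticking wins with P = 1/n
    return wins / trials

# Classic n = 3 case: switching should win about twice as often as sticking.
print(simulate_monty(3, switch=True), simulate_monty(3, switch=False))
```

For larger $latex n$ the advantage of switching grows: the estimate approaches $latex (n-1)/n$ versus $latex 1/n$ for sticking.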

# Restaurant Revenue Prediction with BART Machine

In this post, I'd like to show you how to use the newly written package for Bayesian Additive Regression Trees, i.e. BART Machine, for Restaurant Revenue Prediction with R. The datasets are part of a past Kaggle competition and can be found here. What is BART? BART is the Bayesian sibling of Random Forest. For … Continue reading Restaurant Revenue Prediction with BART Machine
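Real BART puts priors on tree structures and leaf values and samples the whole ensemble with MCMC (which is what the bartMachine package does). The plain-Python sketch below only illustrates the underlying "sum of small trees fit to residuals" idea — every name here is mine, not the bartMachine API, and the greedy backfitting is a stand-in for the Bayesian sampler:

```python
def fit_stump(x, r):
    """Best depth-1 split on a 1-d feature, minimizing squared error on r."""
    best_sse, best = float("inf"), None
    for s in sorted(set(x))[:-1]:
        left = [ri for xi, ri in zip(x, r) if xi <= s]
        right = [ri for xi, ri in zip(x, r) if xi > s]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((ri - (lm if xi <= s else rm)) ** 2 for xi, ri in zip(x, r))
        if sse < best_sse:
            best_sse, best = sse, (s, lm, rm)
    return best

def predict_stump(stump, xi):
    s, lm, rm = stump
    return lm if xi <= s else rm

def sum_of_trees(x, y, m=10, shrink=0.3):
    """Fit m shrunken stumps to successive residuals; the model's
    prediction is the sum of all stumps (BART's sum-of-trees form)."""
    stumps, pred = [], [0.0] * len(y)
    for _ in range(m):
        r = [yi - pi for yi, pi in zip(y, pred)]
        st = fit_stump(x, r)
        stumps.append(st)
        pred = [pi + shrink * predict_stump(st, xi) for pi, xi in zip(pred, x)]
    return stumps, pred

# Toy step-shaped data: the ensemble of stumps recovers it closely.
x = [1, 2, 3, 4]
y = [1.0, 1.0, 3.0, 3.0]
stumps, pred = sum_of_trees(x, y, m=10, shrink=0.3)
print(pred)
```

The shrinkage factor keeps each tree's contribution small, which is the non-Bayesian analogue of BART's prior that individual trees explain only a little of the signal.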

# Vector Bundles, Locally Free Sheaves and Divisors on a Curve

In this post, I'll be summarizing the basics of the correspondence between vector bundles, locally free sheaves and divisors on a smooth curve (defined over an algebraically closed field $latex k$ of characteristic zero) together with some of their individual properties. Locally free sheaves and Vector bundles: Proposition 1: a) A coherent sheaf $latex \mathcal{E}$ on a … Continue reading Vector Bundles, Locally Free Sheaves and Divisors on a Curve
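Since Proposition 1 is truncated above, here is a hedged recap of the standard dictionary on such a curve (following the usual references, e.g. Hartshorne II.5–II.6; this is the standard statement, not necessarily the exact wording of the post's Proposition 1):

```latex
% Standard correspondences on a smooth projective curve $X$ over $k=\bar{k}$, char $0$:
\begin{itemize}
  \item $\{\text{vector bundles of rank } r \text{ on } X\}
        \longleftrightarrow
        \{\text{locally free sheaves of rank } r\}$,
        via $E \mapsto \mathcal{E} = \text{sheaf of sections of } E$.
  \item A coherent sheaf on a smooth curve is locally free
        $\iff$ it is torsion-free.
  \item $\{\text{line bundles}\}/\!\cong \;\;\cong\;\; \operatorname{Pic}(X)
        \;\cong\; \operatorname{Cl}(X) = \operatorname{Div}(X)/\!\sim,
        \qquad D \mapsto \mathcal{O}_X(D).$
\end{itemize}
```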

# Classification of Vector Bundles on Elliptic curves

I'm supposed to give a talk on this subject for one of my courses, so I consider this post a "pre-exposition." In writing it, I learned from and heavily relied on the great exposition "Vector bundles on curves" by Montserrat Teixidor I Bigas. I wrote up the prerequisites here. In 1957, Atiyah, in his famous paper "Vector bundles over … Continue reading Classification of Vector Bundles on Elliptic curves

# Some Homological Algebra Computations

In this post, I'm going to write down detailed proofs of some of the exercises in Rotman's Homological Algebra. They were asked on ML and then answered by me. 1. Let $latex A$ be a torsion abelian group. Then $latex \text{Ext}^1_\mathbb{Z}(A, \mathbb{Z}) \cong \text{Hom}_\mathbb{Z}(A,S^1)$, where $latex S^1$ is the unit circle. One point is … Continue reading Some Homological Algebra Computations
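For exercise 1, the key step is a sketch worth recording: apply $latex \text{Hom}_\mathbb{Z}(A,-)$ to the short exact sequence $latex 0 \to \mathbb{Z} \to \mathbb{R} \to S^1 \to 0$ and use the long exact sequence for Ext:

```latex
\operatorname{Hom}(A,\mathbb{R}) \to \operatorname{Hom}(A,S^1)
  \to \operatorname{Ext}^1_{\mathbb{Z}}(A,\mathbb{Z})
  \to \operatorname{Ext}^1_{\mathbb{Z}}(A,\mathbb{R}).
% A is torsion and R is torsion-free, so Hom(A,R) = 0;
% R is divisible, hence injective as a Z-module, so Ext^1(A,R) = 0.
% The middle map is therefore an isomorphism:
\operatorname{Hom}_{\mathbb{Z}}(A,S^1) \;\cong\; \operatorname{Ext}^1_{\mathbb{Z}}(A,\mathbb{Z}).
```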