Daily Reads

  1. This site outlines a detailed categorization of NLP tasks, recently published datasets, and state-of-the-art papers. [14/12/2018]
  2. PyText, a great deep-learning-based NLP modeling framework built on PyTorch, has been open-sourced by Facebook [Github repo]. It has production-ready models for various NLP tasks: text classifiers, sequence taggers, joint intent-slot models, and contextual intent-slot models. [14/12/2018]
  3. An article on visualizing convolutional neural networks, part of the Stanford course CS231n, “Convolutional Neural Networks for Visual Recognition”. [09/12/2018]
  4. A detailed article on weak supervision [08/12/2018]
  5. A good article by Andrej Karpathy that covers crucial points about the PhD journey. [07/12/2018]
  6. A Medium article that covers the topic of visualizing high-dimensional data using PCA and t-SNE in Python [26/11/2018]
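The PCA-then-t-SNE workflow from that article can be sketched with scikit-learn (a minimal sketch, assuming scikit-learn; the random toy data here stands in for real high-dimensional features such as word embeddings, and the article's own code may differ):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Toy high-dimensional data standing in for real features (e.g. embeddings)
rng = np.random.RandomState(0)
X = rng.rand(200, 50)

# Step 1: reduce to ~30 dimensions with PCA to denoise and speed up t-SNE
X_pca = PCA(n_components=30, random_state=0).fit_transform(X)

# Step 2: t-SNE down to 2 dimensions for plotting
X_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X_pca)
print(X_2d.shape)  # (200, 2)
```

Running PCA first is the common trick the article recommends: t-SNE scales poorly with input dimensionality, so a cheap linear reduction beforehand makes it practical on real datasets.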
  7. Google AI open-sources BERT, a pretraining model for NLP tasks. It mainly addresses domains with small amounts of labeled data, such as sentiment analysis and question answering. BERT is trained to produce a generalized language-model representation, which we then fine-tune on our own task, now requiring much less labeled data. [article][code] [02/11/2018]
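The freeze-then-fine-tune idea behind BERT can be illustrated with a minimal PyTorch sketch; the tiny encoder below is a hypothetical stand-in for a pretrained model, not actual BERT:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained encoder (NOT the real BERT);
# the point is only to show freezing a general representation and
# training a small task-specific head on little labeled data.
encoder = nn.Sequential(nn.Linear(32, 16), nn.ReLU())
for p in encoder.parameters():
    p.requires_grad = False  # keep the generalized representation fixed

head = nn.Linear(16, 2)  # small task head, e.g. binary sentiment
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

x = torch.randn(8, 32)         # 8 toy "sentences" as feature vectors
y = torch.randint(0, 2, (8,))  # toy labels
logits = head(encoder(x))
loss = nn.functional.cross_entropy(logits, y)
loss.backward()  # gradients flow only into the head
opt.step()
print(logits.shape)  # torch.Size([8, 2])
```

Because only the head's few parameters are updated, far fewer labeled examples are needed than when training the whole network from scratch, which is exactly the appeal described above.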
  8. OpenAI develops Random Network Distillation for reinforcement learning, which trains agents using curiosity instead of an extrinsic reward function or any given objective function. [article][code] [02/11/2018]
  9. An interesting news article on what the future of healthcare looks like with AI. [26/10/2018]
  10. Check out the article covering a recent review of the Neural History of NLP. [23/10/2018]
  11. Google AI Research introduces Fluid Annotation, which makes image annotation faster and easier. Previously, an annotator had to carefully click along the boundaries to outline each object in an image, which is tedious; Fluid Annotation makes annotating object boundaries much easier. [Demo][Article] [23/10/2018]
  12. Follow the projects of the following MIT Media Lab research groups [17/10/2018]:
    • Scalable cooperation – Re-imagining human cooperation in the age of social media and artificial intelligence
    • Civic media – Creating technology for social change
  13. Transfer learning explained – some recent papers summarised in this Medium blog post. [17/10/2018]
  14. FastText – Learning high quality word representations in 157 languages [arXiv][Page] [17/10/2018]
  15. MIT Technology Review’s Intelligent Machines topic posts interesting articles from time to time. [24/09/2018]
  16. Google’s AutoML – a new direction in which an estimated 100x computational power is expected to replace machine-learning expertise. The Tree-Based Pipeline Optimization Tool (TPOT) was one of the very first AutoML methods and open-source software packages developed for the data science community. [23/09/2018]
  17. “Deep Learning for Coders”, a free 7-week course provided by fast.ai. Covers topics such as image recognition, CNNs, embeddings, and RNNs. [23/09/2018]
  18. Good article on Matrix decomposition, Singular Value Decomposition and Latent Semantic Indexing [23/09/2018]
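As a quick illustration of how SVD underlies Latent Semantic Indexing, here is a minimal sketch using scikit-learn's TruncatedSVD on a toy TF-IDF term-document matrix (the documents and component count are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "the cat sat on the mat",
    "dogs and cats are common pets",
    "stock markets fell sharply today",
    "investors worry about the markets",
]

# Build the TF-IDF weighted term-document matrix
X = TfidfVectorizer().fit_transform(docs)

# LSI = truncated SVD of that matrix: keep only the top-k
# singular vectors, which act as latent "topic" dimensions
lsi = TruncatedSVD(n_components=2, random_state=0)
X_topics = lsi.fit_transform(X)
print(X_topics.shape)  # (4, 2)
```

Each document is now a 2-dimensional vector in the latent topic space, so documents sharing vocabulary (here, the two pet-related and the two finance-related ones) end up near each other even without exact word overlap.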
  19. A good collection of emotion and sentiment lexicons. [11/09/2018]
  20. The CommonCrawl archive contains 3.4 billion web pages and 270+ TiB of uncompressed content, crawled between February 17th and 26th. [11/09/2018]
  21. CORE ranking of Computer Science conferences [26/08/2018]
  22. What is an ablation study in deep learning? Read this answer on Quora. [15/08/2018]
  23. Read this Quora answer on Preparing Resume for a Data Science position [06/08/2018]
  24. You can also go through my answers on Quora.
  25. If you are starting to write your first research paper and are seriously struggling, you can follow this self-help series by Prof. Jari Saramäki. I found it very useful and am still going through it. [22/07/2018]
  26. A list of upcoming AI conference deadlines in the fields of machine learning, computer vision, natural language processing, robotics, and speech/signal processing. [22/07/2018]
  27. Some Quora answers that I found good – the 2nd answer, by Daniel Bourke. [22/07/2018]
  • I will be adding articles weekly or once every two weeks.