AI News Wednesday, February 7
AI News TLDR / Table of Contents
- TensorFlow for R | RStudio Blog
- Over the past year we’ve been hard at work on creating R interfaces to TensorFlow, an open-source machine learning framework from Google. We are excited about TensorFlow for many reasons, not the least of which is its state-of-the-art infrastructure for deep learning applications.
- deep learning, TensorFlow, deep learning applications, NVIDIA GPU, high-end NVIDIA GPU
- Much about Blockchain Technology
- Cryptocurrency, according to technology experts, is just the tip of the iceberg called Blockchain Technology. There could be many more applications of this technology.
- blockchain technology, smart contracts, incorruptible digital ledger, private key encryption, possible legal applications
- Photonic Computing Company Takes Aim at Artificial Intelligence | TOP500 Supercomputer Sites
- Chip startup Lightmatter has received an infusion of $11 million from investors to help bring the world’s first silicon photonics processor for AI to market. Using technology originally developed at MIT, the company is promising “orders of magnitude performance improvements over what’s feasible using existing technologies.”
- Chip startup Lightmatter, silicon photonics processor, AI chip startups, disruptive AI processor, AI algorithms
- An Overview of Multi-Task Learning for Deep Learning
- I’m a PhD student in Natural Language Processing and a research scientist at AYLIEN. I blog about Machine Learning, Deep Learning, NLP, and startups.
- multi-task learning, tasks, MTL, auxiliary tasks, auxiliary task
- Top 6 errors novice machine learning engineers make
- What common mistakes do beginners make when working on machine learning or data science projects? Here we present a list of the most common errors.
- Machine Learning Engineer, loss function, features, example fraud detection, good idea
Tweeted At: Wed Feb 07 16:47:46 +0000 2018
- Over the past year we’ve been hard at work on creating R interfaces to TensorFlow, an open-source machine learning framework from Google.
- On Saturday, we formally announced our work on TensorFlow during J.J. Allaire’s keynote at rstudio::conf. In the keynote, J.J. describes not only the work we’ve done on TensorFlow but also discusses deep learning more broadly (what it is, how it works, and where it might be relevant to…
- RStudio Server with Tensorflow-GPU for AWS (an Amazon EC2 image preconfigured with NVIDIA CUDA drivers, TensorFlow, the TensorFlow for R interface, as well as RStudio Server).
- We’ve also made a significant investment in learning resources, all of which are available on the TensorFlow for R website. Some of the learning resources include: Deep Learning with R, which is meant for statisticians, analysts, engineers, and students with a reasonable amount…
- If you are an R user who has been curious about TensorFlow and/or deep learning applications, now is a great time to dive in and learn more!
Tweeted At: Wed Feb 07 17:12:22 +0000 2018
Publish Date: 2018-01-13T10:22:30+00:00
Author: The Black
- A concept few are truly familiar with: the basic technology behind cryptocurrency, Blockchain Technology.
- Blockchain Technology, as explained by Don and Alex Tapscott, authors of Blockchain Revolution (2016), is an “incorruptible digital ledger of economic transactions which can be programmed for many things, not just financial transactions.”
- After media attention to cryptocurrencies like Bitcoin gained momentum last year, there has been much talk about the actual technology behind them.
- Cryptocurrencies, according to technology experts, are just the tip of the iceberg called Blockchain Technology.
- Honduras and the Republic of Georgia have introduced blockchain technology into their land-registration procedures.
Tweeted At: Wed Feb 07 19:17:03 +0000 2018
- Chip startup Lightmatter has received an infusion of $11 million from investors to help bring the world’s first silicon photonics processor for AI to market.
- Left to right: Darius Bunandar, Thomas Graham, and Nicholas Harris. The technology was developed over a four-year span at MIT’s Quantum Photonics Laboratory.
- Like other programmable optical processors, the Lightmatter chip uses light, rather than electrons, as the basis for its processing.
- “For decades, electronic computers have been at the foundation of the computational progress that has ultimately enabled the AI revolution, but AI algorithms have a voracious appetite for computational power,” said Harris in the company’s first press release.
- “At Lightmatter, we are augmenting electronic computers with photonics to power a fundamentally new kind of computer that is efficient enough to propel the next generation of AI.”
Tweeted At: Wed Feb 07 19:30:22 +0000 2018
Publish Date: 2017-05-29T13:00:00+00:00
Author: Sebastian Ruder
- Figure 2: Soft parameter sharing for multi-task learning in deep neural networks. The constraints used for soft parameter sharing in deep neural networks have been greatly inspired by regularization techniques for MTL that have been developed for other models, which we will soon discuss.
- Learning just task \(A\) bears the risk of overfitting to task \(A\), while learning \(A\) and \(B\) jointly enables the model to obtain a better representation \(F\) through averaging the noise patterns.
- MTL can help the model focus its attention on the features that actually matter, as other tasks will provide additional evidence for the relevance or irrelevance of those features.
- Some features \(G\) are easy to learn for some task \(B\), while being difficult to learn for another task \(A\).
- Through MTL, we can allow the model to eavesdrop, i.e. learn \(G\) through task \(B\).
- Representation bias – – MTL biases the model to prefer representations that other tasks also prefer.
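The idea behind soft parameter sharing, where each task keeps its own parameters but a regularizer pulls them toward each other, can be sketched in plain Python. The function name, the `strength` value, and the weight vectors below are illustrative assumptions, not taken from the post:

```python
def l2_sharing_penalty(params_a, params_b, strength=0.1):
    """Squared L2 distance between two tasks' parameter vectors.

    Added to the training loss, this term encourages the two tasks'
    parameters to stay similar without forcing them to be identical,
    which is the essence of soft parameter sharing.
    """
    return strength * sum((a - b) ** 2 for a, b in zip(params_a, params_b))

# Hypothetical weights for two related tasks.
task_a_weights = [0.5, -1.2, 0.3]
task_b_weights = [0.4, -1.0, 0.2]

penalty = l2_sharing_penalty(task_a_weights, task_b_weights)
```

In a real training loop this penalty would simply be added to the sum of the per-task losses; identical parameter vectors incur zero penalty, so the model is free to share fully when the tasks allow it.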
Tweeted At: Wed Feb 07 20:01:28 +0000 2018
- Becoming a Machine Learning Engineer | Step 2: Pick a process goes over best practices that you can use to avoid this mistake.
- Take Away: Always look at your data closely before you start your work, and determine if outliers should be ignored or looked at more closely…
- Many new machine learning engineers don’t think to convert these features into a representation that preserves information such as hour 23 and hour 0 being close to each other rather than far apart.
- Many people have asked for a code example; here it is. Take Away: If you have cyclical features and you are not converting them, you are giving your model garbage data to start with.
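The article’s own code example is not reproduced here, but the usual sin/cos transform for cyclical features can be sketched in plain Python (function names and values below are illustrative):

```python
import math

def encode_cyclical(value, period):
    """Map a cyclical value (e.g. hour of day) onto the unit circle
    so that the end of the cycle sits next to its start."""
    angle = 2 * math.pi * value / period
    return (math.sin(angle), math.cos(angle))

def distance(a, b):
    """Euclidean distance between two encoded points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

hour_23 = encode_cyclical(23, 24)
hour_0 = encode_cyclical(0, 24)
hour_12 = encode_cyclical(12, 24)

# Hours 23 and 0 are now close in feature space, while 0 and 12
# are far apart, matching our intuition about time of day.
print(distance(hour_23, hour_0) < distance(hour_0, hour_12))  # → True
```

With the raw integer encoding, 23 and 0 would be the two most distant hours; after the transform their distance shrinks to roughly the same as any other adjacent pair of hours.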
- L1 and L2 regularization penalize large coefficients and are a common way to regularize linear or logistic regression; however, many machine learning engineers are not aware that it is important to standardize features before applying regularization.
- These coefficients often cause novice machine learning engineers to believe that, for linear models, the bigger the value of the coefficient, the more important the feature is.
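The standardization step the article calls for can be sketched in plain Python; the feature names and numbers below are made up for illustration:

```python
import statistics

def standardize(column):
    """Z-score a feature column so that coefficient magnitudes
    (and hence L1/L2 penalties) are comparable across features."""
    mean = statistics.mean(column)
    stdev = statistics.pstdev(column)
    return [(x - mean) / stdev for x in column]

# Two hypothetical features on wildly different scales.
age = [25, 32, 47, 51]
income = [40_000, 52_000, 91_000, 120_000]

# After standardization both columns have mean ~0 and unit variance,
# so the regularizer penalizes their coefficients evenly and the
# coefficient magnitudes become meaningfully comparable.
age_std = standardize(age)
income_std = standardize(income)
```

Without this step, the coefficient on `income` would be tiny simply because the raw values are large, and the regularizer would shrink the two coefficients very unevenly.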
Top AI Courses
Artificial Intelligence A-Z™: Learn How To Build An AI (44,056 students enrolled), by Hadelin de Ponteves
- Build an AI
- Understand the theory behind Artificial Intelligence
- Make a virtual Self Driving Car
- Make an AI to beat games
- Solve Real World Problems with AI
- Master the State of the Art AI models
- Deep Q-Learning
- Deep Convolutional Q-Learning
Advanced AI: Deep Reinforcement Learning in Python (10,029 students enrolled), by Lazy Programmer Inc.
- Build various deep learning agents
- Apply a variety of advanced reinforcement learning algorithms to any problem
- Q-Learning with Deep Neural Networks
- Policy Gradient Methods with Neural Networks
- Reinforcement Learning with RBF Networks
- Use Convolutional Neural Networks with Deep Q-Learning