March 29 · Issue #34
Howdy and welcome to another issue of Deep Learning Weekly. This week we delve into Google's AI strategy behind the Kaggle acquisition, and much more. As we just crossed 4,000 subscribers, we would like to thank you once again for all of your support. As always, if you want to help us grow this great community of deep learning enthusiasts, simply share this issue with friends and colleagues. See you next week!
|
The Shrewd AI Strategy behind Google's Kaggle Acquisition
In our latest blog post, we explore Google's current AI strategy and investigate the role of the recent Kaggle acquisition. Google seems to be paving its way toward leading the rising AI market by making potential future engineers and scientists comfortable with TensorFlow, expanding its cloud platform, and offering new services. All of this is driven by the slowly decreasing importance of search ad placements and the growth of the AI market.
|
Understanding The Limits Of Deep Learning
Ever wondered what the 'AI' everyone is talking about today is actually capable of? This article explains what is currently possible and what known problems, such as individually unreliable neural networks, mean for real-world applications.
|
Can FPGAs Beat GPUs in Accelerating Next-Generation Deep Learning?
A detailed review of a recent Intel paper on FPGAs, which concludes that FPGAs may indeed be the future for running deep neural networks, as they deliver more computational power while using less energy when used for low-precision computations.
|
Transfer Learning - Machine Learning's Next Frontier
In his article, Sebastian Ruder contrasts transfer learning with the familiar supervised learning setup, highlights the advantages of transfer learning, and works out a technical definition using different scenarios. He finishes with applications, practical methods, and a look into the future.
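To make the idea concrete, here is a minimal sketch of the most common transfer-learning recipe in Keras (our illustration, with a placeholder dataset and class count, not code from the article): freeze a convolutional base pre-trained on ImageNet and train only a small classifier head on the new task.

    from keras.applications import VGG16
    from keras.models import Model
    from keras.layers import GlobalAveragePooling2D, Dense

    # Pre-trained convolutional base, without the ImageNet classifier head.
    base = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
    for layer in base.layers:
        layer.trainable = False  # freeze: reuse the learned features as-is

    # New task-specific head (10 target classes is a placeholder).
    x = GlobalAveragePooling2D()(base.output)
    x = Dense(256, activation='relu')(x)
    predictions = Dense(10, activation='softmax')(x)

    model = Model(inputs=base.input, outputs=predictions)
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    # model.fit(x_train, y_train, ...)  # train only the head on the new dataset

Once the head has converged, some of the top convolutional layers can be unfrozen and fine-tuned with a small learning rate.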
|
Fantastic GANs and where to find them
Guim Perarnau gives an overview of generative adversarial networks in general, sums up their evolution over the last year, and highlights the most important improvements and optimizations. The whole article is packed with links to code, more detailed explanations, and further reading material, so you should definitely take a look.
|
Neural Networks Tutorial - A Pathway to Deep Learning
In this exceptionally detailed tutorial on neural networks, you learn the basic concepts and the mathematical details using Python and NumPy. After covering neural networks in general, their structure, gradient descent, and backpropagation, you finally build your own network and assess its accuracy.
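If you want a taste of what the tutorial builds up to, here is a tiny NumPy sketch of our own (not the tutorial's code): a one-hidden-layer network trained with gradient descent and backpropagation on the classic XOR toy problem.

    import numpy as np

    # Toy data: XOR, four samples with two features each.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.RandomState(0)
    W1, b1 = rng.randn(2, 4), np.zeros(4)   # input -> hidden
    W2, b2 = rng.randn(4, 1), np.zeros(1)   # hidden -> output
    lr = 1.0                                # learning rate

    for step in range(10000):
        # Forward pass.
        h = sigmoid(X.dot(W1) + b1)         # hidden activations, shape (4, 4)
        out = sigmoid(h.dot(W2) + b2)       # predictions, shape (4, 1)

        # Backward pass: gradients of the squared error.
        d_out = (out - y) * out * (1 - out)
        d_h = d_out.dot(W2.T) * h * (1 - h)

        # Gradient descent update.
        W2 -= lr * h.T.dot(d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T.dot(d_h);   b1 -= lr * d_h.sum(axis=0)

    print(out.round(2))  # should end up close to [0, 1, 1, 0] (depends on the seed)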
|
Weld
Weld is the product of a collaboration between MIT and Stanford that improves the performance of data-intensive applications. It optimizes across libraries and functions by expressing the core computations of those libraries in a small common intermediate representation, similar in spirit to CUDA and OpenCL. See this article for more details.
|
Cyclical Learning Rates
An implementation of Cyclical Learning Rates for Keras that lets you cyclically vary the learning rate between a lower and an upper bound, which can help your model converge faster.
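To illustrate the idea, here is a sketch of the 'triangular' policy from Leslie Smith's CLR paper written as a Keras callback (our sketch, not the linked implementation); base_lr, max_lr, and step_size below are assumed hyperparameters.

    import numpy as np
    from keras.callbacks import Callback
    from keras import backend as K

    class TriangularCLR(Callback):
        """Linearly cycle the learning rate between base_lr and max_lr."""
        def __init__(self, base_lr=1e-4, max_lr=6e-4, step_size=2000.):
            super(TriangularCLR, self).__init__()
            self.base_lr = base_lr      # lower bound of the cycle
            self.max_lr = max_lr        # upper bound of the cycle
            self.step_size = step_size  # iterations per half cycle
            self.iteration = 0

        def clr(self):
            cycle = np.floor(1 + self.iteration / (2 * self.step_size))
            x = np.abs(self.iteration / self.step_size - 2 * cycle + 1)
            return self.base_lr + (self.max_lr - self.base_lr) * max(0.0, 1 - x)

        def on_batch_end(self, batch, logs=None):
            self.iteration += 1
            K.set_value(self.model.optimizer.lr, self.clr())

    # Usage (hypothetical model):
    # model.fit(x_train, y_train, callbacks=[TriangularCLR()])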
|
Google Cloud Machine Learning APIs tutorials
If you're fed up with implementing and training your deep learning models yourself, take a look at this tutorial series on the Google Cloud Machine Learning APIs and learn how to call the different endpoints from Python.
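For a flavor of what calling one of these endpoints looks like, here is a hedged sketch that hits the Cloud Vision API's REST endpoint directly with an API key; the tutorials themselves may use the official client libraries instead, and API_KEY and image.jpg are placeholders.

    import base64
    import requests

    API_KEY = 'YOUR_API_KEY'  # placeholder
    with open('image.jpg', 'rb') as f:
        image_b64 = base64.b64encode(f.read()).decode('utf-8')

    body = {
        'requests': [{
            'image': {'content': image_b64},
            'features': [{'type': 'LABEL_DETECTION', 'maxResults': 5}],
        }]
    }
    resp = requests.post(
        'https://vision.googleapis.com/v1/images:annotate?key=' + API_KEY,
        json=body)
    print(resp.json())  # labels the API detected in the image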
|
Sequence-to-Sequence Models Can Directly Transcribe Foreign Speech
Fascinating model from DeepMind that directly transcribes speech in one language into text in another language.
|
Ten Deserving Deep Learning Papers that were Rejected at ICLR 2017
Carlos Perez lists some of the rejected papers from this year's ICLR and explains why they are worth a read.
|
Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization
Style transfer has become an omnipresent research topic, and this paper takes it even further: apply any style to an image in real time.
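The core building block of the paper is the adaptive instance normalization (AdaIN) layer, which aligns the channel-wise mean and variance of the content features with those of the style features. A minimal NumPy sketch of that operation (our illustration, not the authors' code):

    import numpy as np

    def adain(content, style, eps=1e-5):
        """content, style: feature maps of shape (channels, height, width)."""
        c_mean = content.mean(axis=(1, 2), keepdims=True)
        c_std = content.std(axis=(1, 2), keepdims=True) + eps
        s_mean = style.mean(axis=(1, 2), keepdims=True)
        s_std = style.std(axis=(1, 2), keepdims=True) + eps

        normalized = (content - c_mean) / c_std   # instance-normalize the content
        return s_std * normalized + s_mean        # impose the style statistics

Roughly speaking, a fixed encoder extracts the features, AdaIN swaps in the style statistics, and a trained decoder maps the result back to an image, which is what allows arbitrary styles at test time without retraining.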