Image source: Alfred Anwander on Wikimedia Commons

Interesting Stuff in AI, Machine Learning, and Deep Learning 2017–09 #1

Shan Tang
Published in BuzzRobot · 5 min read · Sep 4, 2017


A list of ICs and IPs for AI, Machine Learning, and Deep Learning (continuously updated). Recent updates:

1. Add cloud FPGA service information from Aliyun, Tencent Cloud, Baidu Cloud and Huawei Cloud.
2. Add news and articles about Cambricon and Bitmain.
3. Add articles about Microsoft’s BrainWave, Baidu XPU and Wave Computing Dataflow Processing Unit (DPU) after Hot Chips 2017.
4. Add NovuMind in “Startup Worldwide” section.
5. Add Movidius Myriad™ X and Chipintelli in “Startup in China” section.
6. Add HiSilicon Kirin970 presentations.

1. Tools To Design CNNs

If neural network architectures are simpler than a CPU, why are they so difficult to create? Samer Hijazi, senior design engineering architect in the IP group at Cadence, said that for CNN technology to propagate properly, two products need to come to market in the near future, and will. First, new hardware architectures are needed; these will show up as chips and as IP and will become available to chip and system designers. Second, enhanced tools are in the works that both enable the design of CNNs and allow optimization of the network from a power point of view, automating network design for power-conscious applications.

Other tool companies are currently applying compression and pruning techniques in existing development tools and hardware to reduce bandwidth and computation, noted Gordon Cooper, product marketing manager for Synopsys' embedded vision processors.

"What will probably happen is a startup will come along and build the right tool that helps divide things up," Allen said. "Then it will go in that direction. Part of the reason I say a startup is that the EDA companies and the hardware companies don't really have enough knowledge of, or insight into, the way applications are run to come up with something like that. And the application guys don't really have enough insight into the problems of power, or that power even is a problem, to focus on it. So it will probably be someone who spans the two that eventually comes up with something there."
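The pruning idea Cooper mentions can be illustrated with a minimal magnitude-based weight-pruning sketch. This is a generic technique, not Synopsys' actual tooling; the function name and sparsity target are illustrative.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    Small weights contribute little to the output, so removing them
    reduces memory bandwidth and multiply-accumulate work, which is
    the point of pruning for embedded vision hardware.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
pruned = prune_by_magnitude(w, sparsity=0.9)
print(1.0 - np.count_nonzero(pruned) / pruned.size)  # close to 0.9
```

Real pruning flows retrain the network after each pruning round to recover accuracy; this sketch shows only the weight-selection step.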

2. Inside Waymo’s Secret World for Training Self-Driving Cars

An exclusive look at how Alphabet understands its most ambitious artificial intelligence project. In a corner of Alphabet's campus, there is a team working on a piece of software that may be the key to self-driving cars. No journalist had ever seen it in action until now. They call it Carcraft, after the popular game World of Warcraft.

3. How does physics connect to machine learning?

Physics and machine learning are intricately connected, but making the overlaps precise is not easy. If you have a physics background and want to break into ML, this post is for you.

4. HERE’S HOW BOSCH TEACHES CARS TO SEE USING ARTIFICIAL INTELLIGENCE

AI by Bosch is the brains behind many self-driving car platforms, and to see how it works — and how a car sees the world around it — the company gave us a chance to briefly explore the German countryside in one of its prototypes. It turns out Andy Warhol and the future of mobility have more in common than you might think.

5. ‘Cortana, Open Alexa,’ Amazon Says. And Microsoft Agrees.

In an unusual partnership, Amazon and Microsoft are working together to extend the abilities of their voice-controlled digital assistants. For the past year, the two companies have been coordinating behind the scenes to make Alexa and Cortana communicate with each other. The partnership, which the companies plan to announce early Wednesday, will allow people to summon Cortana using Alexa, and vice versa, by the end of the year.

6. ETHICS COMMISSION AUTOMATED AND CONNECTED DRIVING

In June, the ethics commission of the German Federal Ministry of Transport and Digital Infrastructure released guidelines for self-driving vehicles (published in German, with an English-language summary). At the time, German Federal Minister Dobrindt touted the "pioneering" work as the "first guidelines in the world for automated driving" to address these ethical issues. Now, the Ministry has announced it will implement those guidelines.

7. Same Stats, Different Graphs: Generating Datasets with Varied Appearance and Identical Statistics through Simulated Annealing

…make both calculations and graphs. Both sorts of output should be studied; each will contribute to understanding. — F. J. Anscombe, 1973 (and echoed in nearly all talks about data visualization…)
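The paper's core trick is to perturb data points toward a target shape while keeping the summary statistics identical to two decimal places. A minimal sketch of the stat-preserving acceptance step follows (it omits the annealing objective that pulls points toward a target shape; function names and parameters are illustrative):

```python
import numpy as np

def same_stats(a, ref, decimals=2):
    """Mean and std match the reference when rounded to `decimals` places."""
    return (round(a.mean(), decimals) == round(ref.mean(), decimals) and
            round(a.std(), decimals) == round(ref.std(), decimals))

def perturb_preserving_stats(x, y, steps=20_000, scale=0.05, seed=0):
    """Randomly jitter points, accepting only moves that keep the
    original means and standard deviations fixed to two decimals."""
    rng = np.random.default_rng(seed)
    x0, y0 = x.copy(), y.copy()   # reference statistics come from here
    x, y = x.copy(), y.copy()
    for _ in range(steps):
        i = rng.integers(len(x))
        nx, ny = x.copy(), y.copy()
        nx[i] += rng.normal(scale=scale)
        ny[i] += rng.normal(scale=scale)
        if same_stats(nx, x0) and same_stats(ny, y0):
            x, y = nx, ny
    return x, y

x0 = np.linspace(0, 10, 50)
y0 = 2 * x0 + np.random.default_rng(1).normal(size=50)
x1, y1 = perturb_preserving_stats(x0, y0)
# x1/y1 now look different from x0/y0 but share means and stds to 2 d.p.
```

The full method in the paper additionally scores each candidate move by its distance to a target silhouette (a dinosaur, a star, etc.) and accepts worse moves with decreasing probability, which is the simulated-annealing part.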

8. Deep Learning (DLSS) and Reinforcement Learning (RLSS) Summer School, Montreal 2017

The Deep Learning Summer School (DLSS) is aimed at graduate students and industrial engineers and researchers who already have some basic knowledge of machine learning (and possibly but not necessarily of deep learning) and wish to learn more about this rapidly growing field of research. In collaboration with DLSS we will hold the first edition of the Montreal Reinforcement Learning Summer School (RLSS). RLSS will cover the basics of reinforcement learning and show its most recent research trends and discoveries, as well as present an opportunity to interact with graduate students and senior researchers in the field. The school is intended for graduate students in Machine Learning and related fields. Participants should have advanced prior training in computer science and mathematics, and preference will be given to students from research labs affiliated with the CIFAR program on Learning in Machines and Brains.

9. Transformer: A Novel Neural Network Architecture for Language Understanding

Neural networks, in particular recurrent neural networks (RNNs), are now at the core of the leading approaches to language understanding tasks such as language modeling, machine translation and question answering. In Attention Is All You Need we introduce the Transformer, a novel neural network architecture based on a self-attention mechanism that we believe to be particularly well-suited for language understanding. In our paper, we show that the Transformer outperforms both recurrent and convolutional models on academic English to German and English to French translation benchmarks. On top of higher translation quality, the Transformer requires less computation to train and is a much better fit for modern machine learning hardware, speeding up training by up to an order of magnitude.

10. Online neural doodle

Can you draw like Monet? Probably not on your own, but with a little help from modern technology you can. This little application presents a scientific approach that lets you draw like famous artists: the drawing process is reduced to sketching a five-color doodle, and everything else is done by a neural network.

Weekly Digest Aug. 2017 #2

Weekly Digest Aug. 2017 #3

Weekly Digest Aug. 2017 #4

Weekly Digest Aug. 2017 #5


Shan Tang

Since 2000, I have worked as an engineer, architect, or manager on different types of IC projects. Since mid-2016, I have been working on hardware for Deep Learning.