How is PyTorch different from TensorFlow?

Debarko De 🐦 · Published in HackerNoon.com · Jan 19, 2017

PyTorch's early-release version was announced yesterday. PyTorch is currently maintained by Adam Paszke, Sam Gross and Soumith Chintala. The first question that comes to mind is: what exactly is PyTorch? To put it in the words of the makers, PyTorch gives

GPU Tensors, Dynamic Neural Networks and deep Python integration.

Itā€™s a Python first library, unlike others it doesnā€™t work like C-Extensions, with a minimal framework overhead, integrating with acceleration libraries such as Intel MKL and NVIDIA (CuDNN, NCCL) to maximise speed.

Letā€™s take a pause here and try to realise that till last few months, people were under the assumption that the deep learning library ecosystem was stabilising but it was far from the ground reality. Cutting edge tech in that ecosystem is ensuring efficient support for dynamic computation graphs and PyTorch just aces that is all aspects.

Dynamic computation graphs arise whenever the amount of work that needs to be done is variable. This may be when we’re processing text, where one example is a few words and another is paragraphs long, or when we’re performing operations against a tree structure of variable size. This problem is particularly prominent in certain subfields, such as natural language processing, where I spend most of my time.
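As a rough sketch of what ā€œvariable amount of workā€ looks like in PyTorch, the snippet below runs a tiny recurrent cell over inputs of different lengths; the sizes and the RNNCell are illustrative choices of mine, not from the article. Each call builds a graph whose depth matches that particular input.

```python
import torch
import torch.nn as nn

# Illustrative sizes: 8-dimensional inputs, 16-dimensional hidden state.
cell = nn.RNNCell(input_size=8, hidden_size=16)

def encode(tokens):
    # tokens: float tensor of shape (seq_len, 8); seq_len varies per example.
    h = torch.zeros(1, 16)
    for t in range(tokens.size(0)):        # an ordinary Python for loop
        h = cell(tokens[t].unsqueeze(0), h)
    return h

short = torch.randn(3, 8)     # a ā€œfew wordsā€
longer = torch.randn(40, 8)   # a ā€œparagraphā€
print(encode(short).shape, encode(longer).shape)  # both torch.Size([1, 16])
```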

PyTorch is heavily influenced by Chainer and DyNet. In Chainer’s words, it is the difference between ā€œDefine-and-Runā€ frameworks and ā€œDefine-by-Runā€ frameworks. TensorFlow is a ā€œDefine-and-Runā€ framework, where one defines conditions and iterations inside the graph structure, whereas Chainer, DyNet and PyTorch are all ā€œDefine-by-Runā€ frameworks: the system generates the graph structure at runtime. This is closer to writing ordinary code in any language, as a for loop in code behaves as a for loop inside the graph structure as well. TensorFlow doesn’t handle dynamic graphs very well, though it does offer some primitive dynamic constructs (e.g. tf.cond and tf.while_loop) that are not very flexible and frankly quite limiting.
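Here is a hedged sketch of the ā€œDefine-by-Runā€ idea in PyTorch (using the current autograd API): the while loop and the if branch are ordinary Python, and autograd simply records whichever path actually executed.

```python
import torch

x = torch.randn(5, requires_grad=True)

y = x
while y.norm() < 10:   # number of iterations depends on the data at runtime
    y = y * 2

if y.sum() > 0:        # an ordinary Python if, not a special graph op
    out = y.sum()
else:
    out = -y.sum()

out.backward()         # gradients flow back through exactly the path that ran
print(x.grad)
```

Running this with different random inputs can execute a different number of loop iterations, which is exactly the data-dependent, built-at-runtime graph behaviour described above.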

Do follow me on Twitter, and you can also sign up for the small and infrequent mailing list that I maintain. If you want to understand deep learning, go through this Medium post.

