
Julia Users Can Now Rejoice, Google Cloud Has Powerful Capabilities To Support The Language


The new technological era is one in which task-specific hardware and software are on the rise. At Google I/O 2018, Google launched a new generation of its Tensor Processing Unit (TPU), already in use to turbocharge a range of its products. Now the Mountain View search giant has announced enhanced Julia capabilities for the TPU ecosystem. To stay relevant in this new era, Julia Computing has developed a method for offloading suitable sections of Julia programs to TPUs using an API and Google's XLA compiler. This development adds another option alongside TensorFlow for leveraging Google Cloud.

Google CEO Sundar Pichai said that the new line of TPUs is around eight times more powerful than the previous generation. The added Julia support will help Google Cloud reach a bigger pool of developers and data scientists who use a combination of Julia and machine learning, a community for whom matrix-heavy operations have long been made easy by the massive compute available on Graphics Processing Units (GPUs).

Julia And Machine Learning

Julia is a high-level programming language specially designed for numerical analysis and computational science, though it can also be used for server-side web applications or as a specification language. Julia supports concurrent, parallel and distributed computing, and it can call C and other low-level programming languages directly from within Julia code.
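As a quick illustration of that last point, here is a minimal sketch using Julia's ccall interface, in the spirit of the example in the Julia manual, invoking the C standard library directly with no wrapper or glue code:

    # Call the C standard library's clock() function directly from Julia.
    # ccall takes the function name, the return type and a tuple of argument types.
    ticks = ccall(:clock, Int32, ())
    println("CPU clock ticks since process start: ", ticks)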

It also comes packed with Flux, a powerful framework for machine learning and AI tasks. ML has become increasingly complex, and there is a growing need for differentiable languages, where ordinary code can be used to represent algorithms and still be differentiated end to end. Julia's syntax is well suited to expressing such algorithms, and Google Cloud TPU's support for Julia is also a recognition of the popularity of the language and its utility in modern machine learning.
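A minimal sketch of what differentiable programming looks like in practice, assuming the Flux.jl package is installed:

    using Flux

    # An ordinary Julia function; no graph-building API is required.
    f(x) = 3x^2 + 2x + 1

    # gradient differentiates the plain code directly: f'(x) = 6x + 2, so f'(2) = 14.
    df = gradient(f, 2.0)   # returns (14.0,)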

Specially adapted for neural networks, Julia's Flux offers a layer-stacking interface for simple neural network models and can also handle variational auto-encoders and other complex architectures. Interestingly, Julia also supports many favourite frameworks such as TensorFlow and MXNet, securing itself a place in the data science toolkit and workflow.
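A minimal sketch of that layer-stacking interface, again assuming Flux.jl; the layer sizes here are illustrative, chosen for a flattened 28x28 image:

    using Flux

    model = Chain(
        Dense(28^2, 128, relu),   # hidden layer over a flattened 28x28 input
        Dense(128, 10),           # 10 output scores, one per class
        softmax)                  # normalise scores into class probabilities

    x = rand(Float32, 28^2)       # a dummy input vector
    ŷ = model(x)                  # forward pass producing class probabilities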

According to Viral Shah, co-creator of Julia, “Apart from C and CUDA from NVIDIA, Julia is the only widely used language which has natively built CUDA code generation. So you can write your code in Julia and deploy it to GPUs without knowing any C or C++.”
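A minimal sketch of that native GPU code generation, assuming an NVIDIA GPU and the CUDA.jl package (the current home of the CUDAnative compiler stack from this era):

    using CUDA

    # A SAXPY kernel written in plain Julia; @cuda compiles it to native GPU code.
    function saxpy!(y, a, x)
        i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
        if i <= length(y)
            @inbounds y[i] += a * x[i]
        end
        return nothing
    end

    x = CUDA.fill(1.0f0, 1024)
    y = CUDA.fill(2.0f0, 1024)
    @cuda threads=256 blocks=4 saxpy!(y, 3.0f0, x)   # 1024 threads cover the array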

Google Announces XLA Compiler

Julia Computing wants to expand its services and offerings and make them available across a large number of workflows. In mid-2017, Julia Computing raised $4.6M in seed funding from investors General Catalyst and Founder Collective. It is one of the more prominent modern high-performance computing startups and wants to grow fast, and the language itself has climbed into top-10 programming language rankings, with more than a million downloads.

When Google made its big cloud announcements in 2018, all features were tuned towards TensorFlow only. In September 2018, much to the joy of Julia's creators, Google opened up lower-level access to Cloud TPUs. XLA (“Accelerated Linear Algebra”) is a partially open-source compiler project from Google with a rich intermediate representation (IR) for describing algebraic operations to the target accelerator. XLA's inputs and outputs are arrays of basic data types, as well as tuples of such arrays. Its high-level operations include basic arithmetic, generalised linear algebra operations, high-level array operations, special functions and some basic distributed-computation operations.
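As a hedged illustration of the kind of computation this IR expresses, the plain Julia snippet below decomposes entirely into such primitives; no XLA package is assumed here:

    W = rand(Float32, 10, 5)
    x = rand(Float32, 5)
    b = rand(Float32, 10)

    # W * x corresponds to a generalised linear algebra (Dot-style) operation,
    # .+ to a broadcast Add, and tanh. to an element-wise Map.
    y = tanh.(W * x .+ b)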

Julia Computing CTO Keno Fischer wrote in the paper, “Google opened up access to TPUs via the IR of the lower level XLA compiler. This IR is general purpose and is an optimizing compiler for expressing arbitrary computations of linear algebra primitives and thus provides a good foundation for targeting TPUs by non-Tensorflow users as well as for non-machine learning workloads.”

Adapting Julia To Cloud TPU

Each high-level operation (HLO) has two kinds of operands, as illustrated in the sketch after this list:

  1. Static operands, whose values must be available at compile time. Additionally, a few of these operands may reference other computations that are part of the same HLO module.
  2. Dynamic operands, which consist of tensors and need not be available at compile time.
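A hedged analogy in plain Julia terms: in an HLO Reshape, the target shape plays the role of a static operand fixed at compile time, while the input tensor is the dynamic operand.

    x = rand(Float32, 6)      # dynamic operand: the tensor data, known only at run time
    y = reshape(x, (2, 3))    # (2, 3) acts like a static operand, fixed at compile time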

The researchers outline methods that use Julia's middle-end compiler to determine sufficiently precise information about sufficiently large subregions of the program to amortise any launch overhead. They also emphasised in the paper, “We now have the ability to compile Julia programs to XLA, as long as those programs are written in terms of XLA primitives. Julia programs are not, however, written in terms of arcane HLO operations; they are written in terms of the functions and abstractions provided by Julia’s base library. Luckily Julia’s use of multiple dispatch makes it easy to express how the standard library abstractions are implemented in terms of HLO operations.”
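A hedged sketch of that multiple-dispatch idea follows; the HLOTensor type below is hypothetical, not the actual XLA.jl API. The point is that overloading a Base function (here *) for a new tensor type lets unchanged standard-library code lower to accelerator primitives:

    # Hypothetical wrapper standing in for a handle to a device buffer.
    struct HLOTensor{T,N}
        data::Array{T,N}
    end

    # Map Julia's matrix multiply onto the HLO Dot primitive; a real backend
    # would emit an HLO Dot instruction here instead of computing eagerly.
    import Base: *
    *(a::HLOTensor{T,2}, b::HLOTensor{T,2}) where {T} = HLOTensor(a.data * b.data)

    a = HLOTensor(rand(Float32, 2, 3))
    b = HLOTensor(rand(Float32, 3, 4))
    c = a * b    # generic code dispatches to the HLO-backed method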

For many data scientists, the ability to compile Julia programs through Google Cloud's XLA, and thereby offload them to TPU devices, will be a welcome addition.


