With the new division, which will be led by DeepMind co-founder and CEO Demis Hassabis, Google expects to combine expertise in AI with computing power, infrastructure, and resources to drive AI breakthroughs and products across the company and Alphabet.

Alphabet to combine its AI research groups into a single unit called Google DeepMind. (Credit: Coolcaesar at English Wikipedia/Wikimedia Commons)

Google’s parent firm Alphabet has announced that it will combine its two artificial intelligence (AI) research groups, the Brain team from Google Research and DeepMind, into a single unit called Google DeepMind.

The combined unit, backed by Google’s computational resources, is intended to accelerate the American technology major’s progress in AI.

With the new division, Google expects to combine expertise in AI with computing power, infrastructure, and resources to develop AI breakthroughs and products across Google and Alphabet.

Alphabet and Google CEO Sundar Pichai said: “The pace of progress is now faster than ever before. To ensure the bold and responsible development of general AI, we’re creating a unit that will help us build more capable systems more safely and responsibly.”

Google Brain was founded in 2011 as an exploratory lab by Jeff Dean, Greg Corrado, Andrew Ng and other engineers.

The division’s accomplishments include TensorFlow, its open-source machine learning framework; sequence-to-sequence learning, which led to Transformers and BERT; and AutoML, which brought automated machine learning into production use.

DeepMind was launched in 2010 by Demis Hassabis, Shane Legg and Mustafa Suleyman, and was acquired by Google in 2014.

Hassabis will be the CEO of Google DeepMind and will lead the development of responsible general AI systems and research to power the next generation of products and services.

Hassabis said: “The research advances from the phenomenal Brain and DeepMind teams laid much of the foundations of the current AI industry, from Deep Reinforcement Learning to Transformers, and the work we are going to be doing now as part of this new combined unit will create the next wave of world-changing breakthroughs.”

Earlier this month, Google released new details about its TPU v4 supercomputer, which has been used to train AI models. The tech major claimed it is faster and more power-efficient than comparable systems from Nvidia.