Google says its custom machine learning chips are often 15-30x faster than GPUs and CPUs

It’s no secret that Google has developed its own custom chips to accelerate its machine learning algorithms. The company first revealed those chips, called Tensor Processing Units (TPUs), at its I/O developer conference back in May 2016, but it never went into all that many details about them, except for saying that they were optimized around the company’s own TensorFlow machine-learning framework. Today, for the first time, it’s sharing more details and benchmarks about the project.

If you’re a chip designer, you can find all the gory details of how the TPU works in Google’s paper. The numbers that matter most here, though, are that based on Google’s own benchmarks (and it’s worth keeping in mind that this is Google evaluating its own chip), the TPUs are on average 15x to 30x faster at executing Google’s regular machine learning workloads than a standard GPU/CPU combination (in this case, Intel Haswell processors and Nvidia K80 GPUs). And because power consumption counts in a data center, the TPUs also offer 30x to 80x higher TeraOps/Watt (and with faster memory in the future, those numbers will probably increase).
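To make the two metrics concrete, here is a back-of-the-envelope sketch. The chip figures below are illustrative assumptions, not Google's published benchmark data; the point is only how raw speedup and TeraOps/Watt are computed and why they can differ so much:

```python
# Illustrative comparison of two accelerators on the two metrics the
# paper reports: raw throughput and throughput per watt.

def relative_speedup(ops_a: float, ops_b: float) -> float:
    """How many times faster chip A is than chip B, by raw throughput."""
    return ops_a / ops_b

def perf_per_watt(tera_ops: float, watts: float) -> float:
    """TeraOps/Watt: throughput normalized by power draw."""
    return tera_ops / watts

# Hypothetical chips: a TPU-like ASIC vs. a GPU baseline (made-up numbers).
tpu_tops, tpu_watts = 92.0, 40.0
gpu_tops, gpu_watts = 2.8, 150.0

speedup = relative_speedup(tpu_tops, gpu_tops)
efficiency_ratio = perf_per_watt(tpu_tops, tpu_watts) / perf_per_watt(gpu_tops, gpu_watts)

print(f"raw speedup: {speedup:.1f}x")
print(f"perf/watt advantage: {efficiency_ratio:.1f}x")
```

Because the efficiency ratio folds in power draw as well as throughput, a chip that draws far less power can show a much larger perf/watt advantage than its raw speedup, which is exactly the pattern in Google's reported 15-30x vs. 30-80x numbers.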

It’s worth noting that these numbers are about running machine learning models in production (inference), by the way, not about training the models in the first place.
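The training/inference split is the crux of the benchmark, so a toy model helps. This sketch (my own illustrative example, not anything from Google's paper) uses a logistic-regression classifier: training is the loop that repeatedly updates the weights, while inference is a single forward pass with frozen weights, and it is that second, cheaper operation the TPU is built to accelerate:

```python
import numpy as np

# Toy logistic-regression model illustrating the training/inference split.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

# Training: many passes over the data, each updating the weights.
w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # forward pass
    w -= 0.1 * X.T @ (p - y) / len(y)    # gradient update

# Inference: one forward pass with the frozen, trained weights.
def predict(x):
    return 1.0 / (1.0 + np.exp(-(x @ w)))

accuracy = np.mean((predict(X) > 0.5) == y)
```

In production, `predict` runs millions of times while the training loop runs once, which is why Google built a custom ASIC for inference while buying off-the-shelf GPUs for training.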

Google also notes that while most architects optimize their chips for convolutional neural networks (a type of neural network that works well for image recognition, for example), those networks account for only about 5 percent of its own data center workload; the majority of its applications instead use multi-layer perceptrons.
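The architectural difference matters for chip design. As a rough sketch (the layer sizes below are made-up, purely for illustration): a multi-layer perceptron layer is a dense matrix multiply in which every input connects to every output, whereas a convolutional layer reuses one small kernel across the whole input, so the two put very different pressure on memory and compute:

```python
# Illustrative parameter counts for the two layer types (biases ignored).

inputs = 28 * 28              # e.g. a small grayscale image, flattened
hidden = 128

# MLP: a dense weight matrix connecting every input to every hidden unit.
mlp_params = inputs * hidden

# CNN: one 3x3 kernel per filter, shared across every position in the image.
kernel = 3 * 3
filters = 32
cnn_params = kernel * filters

print(f"dense layer weights: {mlp_params}")   # far more weights to fetch
print(f"conv layer weights:  {cnn_params}")   # far more reuse per weight
```

A conv layer reuses each weight many times per input, so it is compute-bound; a dense MLP layer touches each weight once, so it is memory-bandwidth-bound. A chip tuned for Google's MLP-heavy workload therefore has different priorities than one tuned for CNNs.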

Google says it started looking into how it could use GPUs, FPGAs and custom ASICs (which is essentially what the TPUs are) in its data centers back in 2006. At the time, though, there weren’t all that many applications that could really benefit from this special hardware, because most of the heavy workloads could simply make use of the excess hardware that was already available in the data center anyway. “The conversation changed in 2013 when we projected that DNNs could become so popular that they might double computation demands on our data centers, which would be very expensive to satisfy with conventional CPUs,” the authors of Google’s paper write. “Thus, we started a high-priority project to quickly produce a custom ASIC for inference (and bought off-the-shelf GPUs for training).” The goal here, Google’s researchers say, “was to improve cost-performance by 10x over GPUs.”

Google isn’t likely to make the TPUs available outside of its own cloud, but the company notes that it expects that others will take what it has learned and “build successors that will raise the bar even higher.”

Google merges YouTube, Play Music teams as it looks to create a streamlined experience

Google’s YouTube Music and Play Music apps have always been two ships in need of a single rudder, offering an overlapping set of features with separate logins and interfaces. Now, Google has taken the first step toward streamlining its music streaming experience.

According to a report by The Verge, Google has merged its YouTube Music and Google Play Music teams into a single unit, marking a first step toward the possible creation of a unified experience across a single app. While a subscription to Google Play Music or YouTube Red already includes access to the other service (and both have a decent chunk of content that can be accessed for free), Google told The Verge that improvements to the way the two services interact could be coming:

“Music is very important to Google and we’re evaluating how to bring together our music offerings to deliver the best possible product for our users, music partners and artists. Nothing will change for users today and we’ll provide plenty of notice before any changes are made.”

When asked about the rate of YouTube Red signups during Alphabet’s fourth-quarter conference call last month, Google CEO Sundar Pichai also alluded to some changes to Google’s music streaming strategy. “We have YouTube Red, YouTube Music and we do offer it across Google Play Music as well,” he said. “You will see us invest more, more countries, more original content. And we’ll bring together the experiences we have over the course of this year, so it’s even more compelling for users.”

Streaming is rapidly becoming one of the music industry’s biggest businesses, but it’s unclear how much of the pie Google actually owns. Spotify is still far and away the biggest music streaming service with some 40 million subscribers, but Apple Music is gaining fast, having crossed the 20 million threshold after just a year and a half. Google has yet to release subscriber numbers for either Play Music or YouTube Red (the two are bundled), but it has a built-in advantage by pre-installing the app on most Android phones, much like Apple does with Apple Music. And a simple, single experience across YouTube and Play Music could prove to be a serious threat to Spotify’s dominance.

This story, “Google merges YouTube, Play Music teams as it looks to create a streamlined experience” was originally published by Greenbot.

Google, Microsoft, Facebook, IBM, and Amazon Collaborate on AI

In a major boost to artificial intelligence (AI) research, five top-notch tech companies – Facebook, Amazon, Google, IBM and Microsoft – have joined hands to announce a historic partnership on AI and machine learning.

Called the Partnership on Artificial Intelligence to Benefit People and Society – or Partnership on AI for short – the alliance means that these companies will discuss advancements and conduct research in AI, and in how to develop the best products and services powered by machine learning, TechCrunch reported on Thursday.

Initial financial help will come from these companies and as other stakeholders join the group, the finances are expected to increase.

“We want to involve people impacted by AI as well,” Mustafa Suleyman, co-founder and head of applied AI at DeepMind, a subsidiary of Alphabet (parent company of Google), was quoted as saying.

According to the report, the organisational structure has been designed to allow non-corporate groups to have equal leadership side-by-side with large tech companies.

“The power of AI is in the enterprise sector. For society at-large to get the benefits of AI, we first have to trust it,” Francesca Rossi, AI ethics researcher at IBM Research, told TechCrunch.

AI-powered bots will become the next interface, shaping our interactions with the applications and devices we rely on, and Microsoft’s latest solutions are set to change the way HP interacts with its customers and partners, Microsoft CEO Satya Nadella said recently.

At Microsoft’s Worldwide Partner Conference in August, Nadella had said that AI-powered chatbots will “fundamentally revolutionise how computing is experienced by everybody.”

Google Now Owns abcdefghijklmnopqrstuvwxyz.com


Google recently restructured itself into a number of different companies, all owned by the new parent company Alphabet.

The popular Alphabet domains like ABC.com or Alphabet.com aren’t owned by Google, so the company has used ABC.xyz, and now it has acquired another new domain that contains all the letters of the alphabet: abcdefghijklmnopqrstuvwxyz.com.


Google has recently acquired the abcdefghijklmnopqrstuvwxyz.com domain name and the company is now listed as the official owner of the domain.

Obviously the company plans to use the domain name, although it is not yet clear whether this new one will become the main domain for Alphabet or whether the company will continue with ABC.xyz.