Google Makes Its Special A.I. Chips Available to Others

Google is always up to something new, even when it doesn’t appear so. A few years back, the company created a special kind of computer chip called the Tensor Processing Unit (TPU). These chips have been powering the enormous artificial intelligence systems that Google uses. The chips were designed to meet specific needs: handling the complex computations involved in running Google’s search engine and other systems like Google Assistant.

Until now, the technology was kept private and confidential, not to be shared with or exposed to any potential competitor. But things are changing. On Monday, the ‘big G’ announced that it is giving other companies access to those chips through its cloud-computing service. Is it free? No; to be specific, it will be paid access.
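To give a sense of what that paid access looks like in practice, here is a minimal sketch (not taken from Google’s announcement) of how a customer might point a TensorFlow training job at a Cloud TPU. The TPU node name "my-tpu-node" is a hypothetical placeholder for whatever the customer provisions in their own Google Cloud project, and the exact API has changed over time.

import tensorflow as tf

# Resolve the Cloud TPU the customer provisioned in their Google Cloud
# project. "my-tpu-node" is a hypothetical name used here for illustration.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu-node")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Replicate training across the TPU's cores with a distribution strategy.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Any Keras model built inside this scope is compiled to run on the TPU.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

The design choice worth noting is that customers never buy the chip itself; they rent time on it through Google’s cloud.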

So Why is Google Making These Special AI Chips Available?


Well, Google sees the potential to build a new business around these chips. This can be linked to the fact that, by now, most companies are already set up to use cloud-computing services. In doing so, Google will be rubbing shoulders with companies like Nvidia and Intel, which are already established in the AI-chip business, though with a different approach: instead of selling chips outright, Google rents access to them through its cloud.

“We want to reach a bigger audience as quickly as we can,” said Zak Stone, an engineer on the team that designs these chips. You may wonder about the haste, but it is clear that, much like a start-up entering a new market, Google must prove that it is serious about this new business.

A Shift In How Companies Handle Modern Technology

Looked at closely, the move highlights a shift in how modern technology is built, tested, and made available to the public. Reports have it that Google is busy designing other chips for artificial intelligence, the new force that has brought a lot of attention to Nvidia, Qualcomm, Intel, and dozens of start-ups.

Worth mentioning is that we are no longer in the days when companies stuck to one line of production. We have known Google, Amazon, and Microsoft as major internet firms, but these days they are also major hardware makers. Why is that? Much of what fuels this approach is that it is very expensive for these companies to have someone else make their hardware and chips for them.

Besides that, Google designs much of its hardware to maximize efficiency and optimize space. Its engineers then build racks of networked servers so that all the machines work together inside these massive facilities. The two other internet giants mentioned above do exactly the same.

Google’s Current AI-Related Projects

Right now, the search giant is focused on its latest service built around computer vision technology, which trains computers to identify objects. Among other things, that capability would benefit driverless cars by keeping them from running into obstacles. “With time, this new chip will be used in other business areas that might not yet be visible as of now,” said Mr. Stone.

Popular Projects Linked to Tensor Processing Units


According to the company, TPU chips play a key role in nearly everything at Google: from the popular service that helps Android phones recognize voice commands, also known as Google Assistant, to Google Translate, the system that translates websites and text into the language the user wants.

Google’s Need for Chips and Hardware

For the longest time, Google depended on outside suppliers for chips and hardware. Most of its chips came from Intel and Nvidia, while Cisco, HP, and Dell were among its biggest hardware suppliers. Eventually, Google decided to build its own servers, networking hardware, and the software that runs them, largely to free itself from dependence on outside makers.

According to Casey Bisson of Samsung, when you run a large online operation you have little choice but to build your own hardware; it is the only way to run an efficient service and keep costs down. “It is about squeezing as much computing power as possible into the small space, and within a heat/power budget,” added Mr. Bisson.

Is it a Smart Move?

Some people think it is not a good idea for Google to focus on the AI-chip business, because the competition might be stiffer than it appears on the surface. But the company’s main game plan may be to lure more businesses into using its cloud service. If that works, AMD, Intel, Nvidia, and all the other chip vendors will still need platforms through which to sell their chips, and the best platform for that would be Google’s cloud.
