After so many products leaked ahead of time, how does Google explain its sudden interest in hardware?

On October 4th, Google's hardware event arrived right on schedule: Google really did release two Pixel phones, one large and one small, really did pull out a VR headset, and really did show more of the Google Home it had teased in May. In terms of secrecy, Google's October event leaked about as badly as September's iPhone.

So is there anything left to look forward to at a fully leaked launch event? For a technology company whose market value swings year-round between first and second in the world, the answer is of course yes. Consumers wait for prices and features to decide whether to buy; manufacturers and developers watch for "what Google wants to do" and "how it will affect my business." For the former the stakes are a purchase; for the latter they can sometimes be fatal.

Let's take a look at what went on sale after tonight (setting aside whether you can actually buy any of it from China):

Two Pixel phones, one large and one small, starting at $649;

The first Daydream mobile VR headset, $79;

Google Wifi, the second router after the OnHub, $129 (single) to $299 (multi-pack);

Chromecast Ultra with 4K support, $69;

Google Home, a speaker version of Google Assistant, $129.

Google’s AI strategy

At both of Google's events this year, in May and October, it was Sundar Pichai rather than Larry Page who took the stage first.

Pichai reviewed the stages the internet, and Google with it, has been through, describing the evolution as PC → Web → Mobile → AI. We are now, he said, living through the shift from mobile to AI.

How does this evolution show itself? Google's Knowledge Graph has accumulated more than 70 billion facts, and machine learning is the "engine" bolted onto that mass of data; DeepMind has just defeated the strongest human Go player; Google's image recognition accuracy was 89.6% in 2014 and has now reached 93.9%.

Google will deliver these capabilities to users in the form of products. That product is Google Assistant, which Google defines as "a personalized Google for everyone." Because you accumulate data year after year as you use Google, and because Google has that data plus the computing power and the algorithms, it can serve your needs better and better.

At the same time, Sundar Pichai emphasized that these are still very early days.

What Pichai conveyed is that for a long time to come, for users, Google AI will mean Google Assistant (a side note: DeepMind is positioned closer to an enterprise AI solution).

Why did Google suddenly do its own hardware?

Former Motorola president Rick Osterloh returned to Google six months ago, and the company authorized him to build a new hardware business unit. As soon as Osterloh took the stage, he posed the question to himself: why is Google starting to make its own phones and hardware?

First, he cited how phones drove the explosion of digital photography and music, arguing that combining software and hardware is the general trend.


Second, "(making our own hardware) lets us better meet the needs of users."


Third, "We have made hardware before, and (making hardware under our own brand) is a natural next step."

Osterloh's answers were careful. After all, Google sits at the heart of the Android ecosystem, and downstream partners were the deciding factor in Android catching up with iOS. At the same time, Osterloh drew no clear "boundary": when making hardware is justified by "meeting the needs of users," Google will no doubt look more and more like Apple.

For Google itself, the new hardware business unit finally pulls the scattered internal fiefdoms together, while externally unifying the product line and the brand. Google's current hardware is half content-related (Daydream, Chromecast...) and half AI-related.

Opening up Google Assistant

Strategically, Google is moving from Mobile First to AI First; in product-platform terms, from Android to Google Assistant. Android got awkwardly little mention at this event: Google wants developers to focus on Google Assistant.

Scott Huffman previously led advanced research projects inside Google and is now the head of the Google Assistant project.

Google said in May that Google Assistant would be open to third-party services. In question answering, for example, it can draw on Wikipedia's data in addition to the Knowledge Graph; when the user hails a ride, it can call Uber. Huffman also confirmed that it will open to third-party hardware in the future.

Like Amazon Echo, Google Assistant has a set of interfaces for interacting with third-party services, officially called "direct actions and conversation actions." The idea is to interpret what the user needs in different contexts and invoke the corresponding application.
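The distinction between the two kinds of actions can be sketched in a few lines. The following is a purely illustrative model, not Google's actual SDK (every name here is invented): a direct action is a one-shot request the Assistant can fulfill immediately, while a conversation action keeps state across multiple turns until it has everything it needs.

```python
# Hypothetical sketch of "direct" vs. "conversation" actions.
# None of these names come from the real Actions on Google API.

from dataclasses import dataclass, field


@dataclass
class Conversation:
    """Minimal multi-turn state for a conversation action."""
    slots: dict = field(default_factory=dict)
    done: bool = False


def handle_direct_action(intent: str, params: dict) -> str:
    # One shot: the Assistant already has all the parameters,
    # so the third-party service is invoked exactly once.
    if intent == "ride.book":
        return f"Booking a ride to {params['destination']}"
    return "Sorry, I can't do that yet."


def handle_conversation_action(conv: Conversation, utterance: str) -> str:
    # Multi-turn: keep asking the user until required slots are filled,
    # then hand the completed request to the third-party service.
    if "destination" not in conv.slots:
        conv.slots["destination"] = utterance
        return "Got it. When do you want to leave?"
    conv.slots["time"] = utterance
    conv.done = True
    return f"Ordering a ride to {conv.slots['destination']} at {conv.slots['time']}."
```

In this toy model, "hail a ride to the airport, now" resolves as a direct action, while a vaguer "get me a ride" opens a conversation that gathers the destination and time turn by turn.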

Huffman said Google Assistant is "the next open ecosystem we are trying to build." But there is not much concrete news about it yet; Google expects to announce more in December.


Attached: Google Assistant developer site https://developers.google.com/actions/
