Leverage AI on the Cloud to Transform Your Business (Cloud Next '18)
Good afternoon, welcome to Next. I'm going to be talking about how to leverage AI on the cloud so that you can transform your business. I'm Valliappa Lakshmanan; everyone calls me Lak. This talk comes about after having talked to dozens of customers over the years, basically answering a very simple question: "I'm hearing all of this hype about AI, but where do I actually use it? What kinds of use cases come to mind?" Typically the answer to this is to show them a laundry list: here's a whole bunch of people who have used it. What I've tried to do here is step back from that a little bit and come up with principles, the kinds of things that you see, the very broad areas. We can look at it from a couple of perspectives: I'll talk about how you see these in Google products, and I'll also talk about how we see these in terms of our customers' use cases.

So when I talk about AI, and when you hear about AI, you probably think of things like Google Photos, you think of things like Smart Reply. These are incredibly buzzworthy. You hear about image models and sequence models; if you're a technical person, you think about convolutional neural networks and recurrent neural networks and LSTMs, et cetera. But when you talk about how to use AI, I would encourage you to forget about all of that buzz, because ultimately what drives value in most companies is machine learning on structured data: data that lives in a database, data that lives in a data warehouse. Even at Google. And this is something we were able to measure, because you can go through the entire Google codebase, look at all of the models, and see what kinds of models people are using. An MLP is a multilayer perceptron; it's essentially your standard neural network, with nothing fancy in it, just a few layers. That is 60%
of all the models at Google, and this is the kind of model that you use if you have data that is just numeric and categorical, just structured data. LSTMs are the kind of model that you use if you have time-series data, or if you have text data: translation, text summarization, smart replies are examples of LSTM models. That is another 30% of all the models at Google. So all of the buzzworthy image models that we hear about, that every tutorial talks about, that everybody puts on their slide to say "this is what you can do with AI": that's less than 5% of the models.

Okay, so what do you focus on when you say "I want to use
AI in my business"? Structured data. That's the first thing to remember: the value actually comes from the data that you have in your data warehouse. So how do we use structured data for AI? Again, we talk about AI and we talk about machine learning, and machine learning is the part of AI that works today, so when we talk about what we actually do, it's machine learning. So how do we use machine learning? It's essentially a way to use standard algorithms to get predictive insights from data, to make repeated decisions. Let's break that down a little bit. The first thing, of course, is that machine learning only works when you have data, and preferably lots and lots of it. So you have the data, but unlike the traditional, typical ways that you use your data, the difference here is that the algorithms that you apply to your data are standard. Whether you're a retail company, or an oil and gas company, or a media company, the way that you take your data and the algorithms that you apply are relatively standard, and that's basically what has driven this big growth in the applications of machine learning: the fact that the same kinds of models work across industries, work across verticals. So you apply these standard algorithms to the data, but those standard algorithms themselves aren't new either. When we talk about decision trees, or random forests, they've been around for a while, and those are what I mean by a standard algorithm: a decision tree is a standard algorithm that you can apply regardless of which industry you're in. But the key thing is that you apply these algorithms to data to create predictive insights. Normally,
when we look at data, what are we doing? We are creating backward-looking insights: we're looking at your data and trying to understand what happened. Machine learning is about doing a predictive thing, predicting something that will happen. But that's not all of it; until this point, I could be describing everything that a business analyst does, pretty much at every company. The big difference between what a human analyst does and what a machine does is this last bit: it's a repeated decision. The idea is that you don't apply machine learning if you need to make a decision once a year. If you need to make a decision once a month, that is not an ML use case. An ML use case is a decision that you make over and over again, many, many times a day. It's the kind of thing that you do every time a customer visits you. If you want to determine whether a shopping cart is going to be abandoned, that is a decision that you have to make for every customer who visits your webpage. If you're trying to decide "should I place my new store in this location," that is a decision that you may make 50 times a year. The fifty-times-a-year decision is not a good candidate for machine learning. The thousands and millions of decisions a day, that is a great candidate for machine learning, and somewhere in between you have to make a judgment call of whether it is a data analysis task or a machine learning task.

To give you an example of where this idea comes about, let me start with Google. Our flagship application, of course, is Search; we're a search company. The way Search used to work a few years ago was that you would go to the search bar and type in the keyword "giants," and we had to show you either the San Francisco Giants, which is a baseball team, or the New York Giants, which is a football team. Which one did we show you first?

Well,
the way this used to work was that there was a rule base, deep in the bowels of the search code, that said: if the query is "giants" and the user is located in the Bay Area, show them results for the San Francisco Giants; if the user is in New York, show them results for the New York Giants; and if the user is somewhere else, show them results about tall people. And that is basically the rule base for that one word, "giants." Imagine how complex that codebase gets, for every single thing that could have multiple meanings.

So that's essentially the problem that you end up facing, because we were writing hand-coded rules like this for each of these query phrases. Machine learning comes about because we say: this is a decision that we have to make over and over again, and we need to make it lots and lots of times. So how do we scale this, so that we don't have to have someone hand-craft a rule for every possible query?
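As a rough illustration of the kind of rule base being described, here is a hypothetical sketch; the function name, location strings, and return values are all invented for illustration, and this is of course not Google's actual search code:

```python
# A sketch of a hand-coded disambiguation rule base -- hypothetical
# illustrative code, not Google's actual search logic.

def results_for_giants(user_location: str) -> str:
    """Hand-written disambiguation rule for the single query 'giants'."""
    if user_location == "Bay Area":
        return "San Francisco Giants"   # baseball
    if user_location == "New York":
        return "New York Giants"        # football
    return "tall people"                # everyone else

# The problem: a separate rule like this is needed for every ambiguous
# query ("jaguar", "apple", ...), so the rule base grows without bound.
```

The unbounded growth of this kind of rule base is exactly the scaling problem the talk describes.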
That's basically what prompted RankBrain, which is essentially a machine learning algorithm that started from this premise: when someone comes to our webpage and they do a query, we show them a list of possible results, and we know which of them they actually clicked on. So now we can train a machine learning model to predict what the best possible result ought to be for a specific query term. We have enough examples of this; we don't have to go around creating separate rules for every one of the query terms. We have a machine learning model do this for us. Now, Search is not purely machine learning; there are many, many signals that we use. But the machine learning signal turned out to create the kind of improvement that we normally see over two years of work. That one signal did, and that was amazing, and that basically was what prompted Google to step back and say, man, this machine learning thing, it has legs.

Here's another example, this time from our customers: Rolls-Royce. You know Rolls-Royce for cars, but Rolls-Royce also does shipping, and they needed to interpret marine data sets. Again, the way they would do this was with a humongous data set; they needed lots and lots of rules, and those rules had to be crafted by hand. They were able to create safety measures simply by, instead of doing this rule creation by hand, taking those marine datasets and inferring, through machine learning, what kind of actions ought to be taken. So the basic idea, the first premise here of how you use AI, how you use machine learning to leverage your business: number one, look back into your business and think about all the problems for
which you are creating rules today, any problem for which you're creating rules today. And if you have been doing the good thing, then every time you've made a rule-based decision, you have a data set that captures the information that went into your decision tree, into the rule, and the outcome: was it actually correct or not? If you identified fraud using a bunch of rules, was this transaction actually fraudulent, or was it not? That's your label.
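To make this concrete, here is a minimal, hypothetical sketch of how logged rule-based decisions plus their recorded outcomes become a training set; all the field names and numbers below are made up for illustration:

```python
# Toy sketch: converting a log of rule-based fraud decisions and their
# recorded outcomes into ML training data. Field names are invented.
decision_log = [
    # signals the rule looked at                                what actually happened
    {"amount": 9800, "country_match": False, "rule_said_fraud": True,  "was_fraud": True},
    {"amount":   35, "country_match": True,  "rule_said_fraud": False, "was_fraud": False},
    {"amount": 4200, "country_match": False, "rule_said_fraud": False, "was_fraud": True},
]

# Features: the same signals the hand-written rules consumed.
X = [[row["amount"], int(row["country_match"])] for row in decision_log]
# Labels: the collected outcome, not the rule's own verdict.
y = [int(row["was_fraud"]) for row in decision_log]

# X and y can now be handed to any standard algorithm
# (decision tree, random forest, simple neural network).
print(X, y)
```

The key point encoded here is that the label comes from the outcome you collected, not from what the rule decided at the time.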
If you've been making these rule-based, data-driven decisions for a period of time, that becomes a machine learning data set, and you can get out of the business of hand-crafting rules and turn it over to a machine learning model, which can infer what the rules ought to be, and do this in a more holistic way. So, number one, how do you leverage AI in the cloud? Look at all your rule-based systems, especially the ones for which you have been collecting outcomes. If you've not been collecting the outcomes of the decisions that you're making, please start. Start saving whether those decisions actually worked out or not, and use those as the input to your machine learning models. So that's number one.

Let's take a second use case. Here I'll be using Google Maps to illustrate the machine learning journey, the journey of a company that's going through the machine learning transformation. I use Google Maps all the time: I live somewhere in the south of that diagram, and I work in Google Kirkland, in the north part of the diagram, and every day I look at Maps and it gives me a route to get from home to work. Great. But is that machine learning? You could say, well, I told Google where I live, I told Google where I work, and this is basically the route. Going from A to B: this is Dijkstra's algorithm, this is the A* algorithm; we teach it in undergraduate computer science classes. It's a deterministic algorithm, so it's not machine learning. Going from A to B is not machine learning; it is essentially a deterministic rule, a rule that you can write down on paper and follow.

How about this, though? I was in Japan, in a subway station called Roppongi, and Google Maps essentially
told me, "you're on floor number two of Roppongi station, and to get to Google Japan, this is the route that you have to take." How does Google Maps know that I'm on floor number two of the subway station? GPS? This is underground; I'm not going to get altitude out of it. At that point, the only way to know where I am is to use a whole bunch of other data sources. But the point is, to get to that stage, where you're now thinking about how to figure out where the user is from a variety of different data sources, you have to have solved problem number one, which is how to get from A to B. That is the core of your business: provide navigation information from A to B. And then you're saying, okay, now how do I use data, how do I use ML models, to improve that experience? That's what the second bit is: improving the core business experience that you built in number one. And then you get the icing on the cake. I was looking at a Google Now card and said, this is where I am, what can I do in between meetings? And it was able to suggest to me that I should go look at the Suntory Museum of Art, which has the kind of paintings that I like. If you're a sports fan, it might instead have suggested a sumo wrestling match close by. The point being that this is now completely personalized, and it's still in that same realm of location and location-based services, giving you that additional information. So this is basically the journey that enterprises go through with machine learning: you start with the core of your business, which you have typically already solved, then you look at how to address individual use cases of that user experience in a better way, and then finally
you look at how to reach that very long tail of the very rare
situations that you need to handle very, very well.

Okay, let me take another example of this, again talking about this journey, to give you an idea of how it works. Let's say that we're a media company and we want to find the best time slot for a TV show. How do we do this? Well, we might have data that says: here's the age, and the number of viewers of that particular age group, who have viewed shows like this one before; and I have location information for people who have viewed this show in the past. Then, based on the location information, based on the demographics, based on the desirability of that demographic, I would say: okay, we're going to take this new show, and it looks like 7:30 p.m. on Wednesday is the best time slot for it. This is essentially your traditional data analysis; there's no machine learning here yet. But in order to do this, what data have you collected? You've collected a bunch of data around the demographics of people watching this show, you've found the genre of each of your different shows, and you know the locations of all of these people who have been watching these shows. So now, given all of this information, can we do the next thing, the second step of that journey? What's the second step? Let's build a movie recommendation system. Your first approach, the simple approach, would be something like this: you would say, I'm going to recommend popular movies in the genre that this user likes. So if I have a user who's between 30 and 39 years old, is male, makes between $50,000 and $100,000, and lives in South Korea, and over time I know that this person has watched a romantic
comedy, followed by a romance, followed by a comedy, then my prediction algorithm might be: find the genre that the user has watched most often, in this case comedy; then find where they live, which is South Korea; and find the five most popular comedy movies in South Korea. That is my movie recommendation for that user. It's a pretty straightforward rule-based algorithm that takes advantage of the data. But the thing is, it relies on the data that you already have: this structured data of the history of things that a person has watched, and something that you know about this particular user. How does this become machine learning? You flip it around. Notice that we have a model, we have data, and we have a prediction. Instead, you start from the data. You don't start from the idea that "I know what the model ought to be, and therefore I'm going to go to the data and make my prediction." You start from the data. You say: what's my data? I have all the movies, and I have all of the ratings that every user has given to all of the movies.
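The rule-based recommender just described can be sketched in a few lines; the genre names, titles, and data below are toy stand-ins invented for illustration:

```python
from collections import Counter

def recommend_rule_based(watch_history, popular_by_country, country, k=5):
    """Most popular titles, in the user's country, of their most-watched genre."""
    top_genre = Counter(watch_history).most_common(1)[0][0]
    return popular_by_country[country][top_genre][:k]

history = ["romantic comedy", "romance", "comedy", "comedy"]
popular = {"South Korea": {"comedy": ["C1", "C2", "C3", "C4", "C5", "C6"]}}
print(recommend_rule_based(history, popular, "South Korea"))
# -> ['C1', 'C2', 'C3', 'C4', 'C5']
```

Note how the hard boundaries (one genre, one country) are baked right into the rule.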
And then your model is: if I want to do a prediction, I'm going to find all the similar movies across all the similar users, using criteria like the user's current preferences, their age, and a variety of other things. But notice that now, all those very hard boundaries of age, income, and genre don't exist anymore. This is much more amorphous. It takes into account that you can't bin people; instead, it's a very continuous range, and you're finding similarity measures. That's basically what machine learning helps you do: it helps you take into account multiple factors and weight them appropriately.

So this is essentially the journey that you go through: first taking a core problem and solving it with data, then thinking about how to take that data and solve it with analytics, and then rethinking it in such a way that you can account for different factors. In other words, the second way that you leverage your business with machine learning is that this is how you personalize your applications, and this is how you reach the people who are unlike every other thirty-to-thirty-nine-year-old who lives in South Korea. You're able to find smaller groups of people who are alike and give them much more targeted recommendations. Mostly, what we do in business is say: we're going to take the middle part of our distribution and target everything towards the 80% of our users, and we're going to forget about the other 20% who do unusual things. Instead, what machine learning lets you do is walk down into that tail, find similar users for every individual user, and make recommendations for them.
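A minimal sketch of this flipped, similarity-based approach: a toy nearest-neighbour collaborative filter over a made-up ratings table. This is illustrative only, not a production recommender, and all users, movies, and ratings are invented:

```python
import math

ratings = {               # user -> {movie: rating}; toy data
    "u1": {"m1": 5, "m2": 4, "m3": 1},
    "u2": {"m1": 4, "m2": 5, "m4": 5},
    "u3": {"m1": 1, "m3": 5, "m4": 1},
}

def cosine(a, b):
    """Cosine similarity over the movies two users have both rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[m] * b[m] for m in common)
    na = math.sqrt(sum(a[m] ** 2 for m in common))
    nb = math.sqrt(sum(b[m] ** 2 for m in common))
    return dot / (na * nb)

def recommend(user, k=1):
    """Recommend up to k movies the most similar user liked but this user hasn't seen."""
    _, nearest = max((cosine(ratings[user], ratings[o]), o)
                     for o in ratings if o != user)
    unseen = [m for m in ratings[nearest] if m not in ratings[user]]
    return sorted(unseen, key=lambda m: -ratings[nearest][m])[:k]

print(recommend("u1"))    # -> ['m4']
```

There are no demographic buckets anywhere in this code: similarity is computed directly from behavior, which is the "flip" being described.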
So, on to the third thing. I'll start with one of my favorite quotes, which is from Andrew Ng. Andrew Ng is, of course, a famous machine learning researcher, and you would think that he would be all gung-ho about the latest advances in machine learning theory. But instead, Andrew Ng says it's not about who has the best algorithm; it's about who has the most data. Because that's the lesson that we have learned over and over again: the best machine learning algorithms are the ones that have access to more data than any of the others. So when you compare two algorithms, forget about what everybody tells you. If you have two products that essentially do the same thing, ask what data they were trained on. If you have two image models, ask what data they were trained on, and if one model was trained on more data, and better data, than the other, then it doesn't matter what the quality of the algorithm is: that data is going to control the quality of the result that you get. So inevitably, it's about the data, the quality of the data and the quantity of the data. So what does that have to do with the way we approach
our data stewardship? Because everything starts with data management. Think back to how you collect your data. Something that I'm noticing a lot is that we take our data, we aggregate it, we store it, and we try to build algorithms, machine learning algorithms and data analysis tasks, on the aggregated data. What do I mean by that? Instead of doing machine learning on every individual transaction, you try to forecast sales by aggregating all the transactions over days, and you take that daily data and use it to predict sales. That doesn't work. Because what have you just done? You've taken this very rich data source that you have, of every individual transaction that has ever happened, and you've thrown it all away. You've combined everything, and now you have a data set of 365 points, and you're trying to use that to predict sales. Stop. Don't train your model on 365 points. Go back to the original data, where you had eighteen million transactions. You need to train your models not on the filtered data, not on the aggregated data, but on the raw data, the original data.

Machine learning is about doing things on as much data as you can. Don't aggregate things too early, don't filter things too early, don't throw away data too early. That doesn't mean that you shouldn't clean things up or make sure the data is good quality. We're saying: try to get to the point where you don't need to throw away data, where you don't need to aggregate this data to reduce the noise in it, where each individual transaction has the information that you need to build a better machine learning model.

But it's not just about the data that you have. Also think about all the ancillary things that affect your business; it's not just about the data that you have in your data warehouse.
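The aggregation mistake described above can be sketched in a few lines; the numbers here are toy stand-ins for the eighteen million transactions versus 365 daily points:

```python
# Toy stand-in for a transactions table: one row per transaction.
transactions = [
    (1, 10.0), (1, 25.0), (1, 5.0),   # (day, amount)
    (2, 40.0), (2, 8.0),
]

# Aggregated view: one point per day, almost all signal discarded.
daily_totals = {}
for day, amount in transactions:
    daily_totals[day] = daily_totals.get(day, 0.0) + amount

print(len(transactions))   # rows at raw granularity: 5
print(len(daily_totals))   # rows left after aggregating by day: 2

# Train on the raw rows, not the aggregates: at real scale this is
# 18,000,000 training examples versus 365.
```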
Think about weather data, think about traffic data, think about political events. All of these things affect your business. So think about how to join your data sets with all these other diverse factors, and that is the data set that you should be using to build your machine learning models, to build your AI systems.

Then, having taken your data warehouse and considered all the third-party and partner information that you can join with it, ask yourself a third question: what data could I be collecting that I'm not collecting today? Sensors are getting really cheap. Something like 8 billion devices came online last year. It's just an astounding number, and this is just connected devices; I'm not talking about devices that are not permanently connected. And it's
getting to the point where data is ubiquitous and you can get it. The question is: are you collecting the data that actually impacts your business? Make sure that you have a data strategy that involves collecting this data from as many sources as you can, whether they're phones or IoT devices or whatever. Then think about how to take that big data that you now have. What I mean by big data is the quantity of the data (unaggregated, unfiltered, the raw data), joined with everything else that affects you, with streaming data coming in. In the old days we called these the three V's: volume, variety, and velocity. That's what I mean here. When you think about it that way, and you thoughtfully consider all of these things, it's changing businesses. Let's just take the first one: games. Ten years ago, games were board games. Now, games are essentially completely customized to the person playing them. The characters that you encounter when you play a game are very different, and have different skills, than the characters that somebody else encounters, and the only reason that's possible is the amount of data that gaming companies are able to collect, and how they're able to customize the game as you play it. You see this over and over again as you walk down that list: companies are getting completely transformed by rethinking the data that they're collecting and what they can do with it.

As another example, Schlumberger, one of GCP's customers, talks about how Cloud IoT Core allowed them to focus their engineering efforts on building reliable, economical things. They're talking about 30 terabytes of petrochemical data that they can now use to build their models. So that's the scale that you should be thinking at.

So the
third aspect I would leave you with is this idea: when you design your systems, design them with the expectation that next year you will have more data. Don't start with the defeatist idea that the data you have today is all the data you'll ever have. You'll be able to get more data next year, and you want to work towards getting that more plentiful, more diverse data, and build your models to account for it. So that's the third aspect: design for more data, for unfiltered, unaggregated data.

Here's another side of this idea, talking about the kinds of devices, being able to connect them, and creating new businesses that just didn't exist a few years ago. Many of us use things like Philips Hue, which allows us to change the way lighting works, and now Philips processes 25 million remote lighting commands a day. Something
that did not exist three years ago. It's possible because of all of these trends that are coming together: data, connected devices, the ability to train your models and deal with unstructured data like lighting commands.

That's all great, but what happens when you collect petabytes and exabytes of data? If you're going to be spending all your time doing provisioning, worrying about reliability, worrying about handling the growing scale, worrying about utilization, you're not going to be deriving value from any of this data. You're going to be spending all of your head count, all of your human resources, all of your engineers, on just keeping this data up, and that's not what you want to be doing. You want to be deriving value from it; you want to understand the data. This is something that we at Google have a lot of experience with, and this is why we've been building serverless data analysis, serverless ETL tools, serverless machine learning tools. The reason that pretty much everything on our data platform is serverless, everything is a fully managed service, is simply that there is no way to do both things. There's no way to have an engineer both worry about whether the data is going to be there when you need it, and be able to collect it and create models to derive value from it. So even though we started out with MapReduce and the equivalents of these in 2004, we quickly realized that that cluster-centric way of thinking wasn't going to scale, and we moved on to things like Dataflow, which is open-sourced as Apache Beam, to deal with this idea that we wanted serverless ETL (extract, transform, load)
pipelines. So don't think of serverless as purely being about low-level functions. Serverless should be about your entire workflow; your entire workflow needs to be serverless. That is the way you manage petabytes and exabytes of data. If you've used BigQuery, you know exactly how we think about it: you write your SQL code, and you don't worry about the thousands of slots that come up to execute those queries for you. That is how you should be thinking: in terms of code, not infrastructure. BigQuery is one of those things that people immediately get when they see it, and it drives a lot of transformation at a lot of our customers. AirAsia, for example, said: we just need a platform that can scale to match our appetite for this amazing growth in data that we're seeing, and BigQuery was ideal for that task. So when you have increasing volumes of data and you need to continue deriving value from it, think about fully managed, serverless solutions. Don't settle for anything that involves you having to spin up a cluster. The idea is: spend time on what you need to do, not on how to do it. Focus on insight, not infrastructure. Another example of this is Blue Apron, who talk about how, after moving to BigQuery, their query time was reduced exponentially, so they were able to do a lot more queries and, importantly, accelerate decision making. An extra day is very, very valuable. And then, of course, the other question that we get is: "I have lots of data; can I really stop managing all of this myself?" My favorite anecdote here is Evernote, because our professional services team migrated three and a half petabytes of content in just 70 days.
And all of these are very small documents owned by millions of users, and we were able to do that. So it's definitely possible: you can move fast, and you can get out of the business of managing your infrastructure. So, for four (a): use a platform that lets you forget about infrastructure. And if there's a four (a), there's got to be a four (b). What is that? Four (a) is about the data part, but there's another bit that any cloud platform has to give you if you want to be successful in AI, and it is flexibility. You need the ability to run your machine learning runtimes in different scenarios. For example, on Google Cloud you can prototype with Cloud Datalab or the Deep Learning images (you can even do this prototyping locally, on your little laptop), then run it on-prem with Kubeflow, and migrate over time to ML Engine. It gives you these different scenarios. For example, if you have a device out in the field and you want to be able to run a model on that device, you want to make sure that your models are portable, such that you can train on the cloud but do the prediction on the device out in the field. So make sure that any system that you build and run your machine learning runtimes on supports all these different scenarios: training on the cloud, predicting on the device, training on the device, et cetera.

Okay, the other thing to realize is that deep learning only works because our data sets are large. The graph on the left shows the training data size. This is for a specific problem, but for different problems we see the exact same thing. For those of you in the back who can't quite see it, the x-axis is 2^20, 2^21, 2^22, et cetera, so each gridline is a doubling of the data size, and each time your data size doubles, your error rate drops linearly.
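This doubling relationship can be sketched with a toy error model; the constants below are invented for illustration, and real curves vary by problem:

```python
import math

def error_rate(n_examples, e0=0.30, drop_per_doubling=0.02, n0=1_000):
    """Toy model: error falls by a fixed step each time the data doubles."""
    return e0 - drop_per_doubling * math.log2(n_examples / n0)

e1, e2, e4 = error_rate(1_000), error_rate(2_000), error_rate(4_000)
print(round(e1 - e2, 4), round(e2 - e4, 4))   # equal-sized steps per doubling

# Consequence: a fixed absolute error reduction costs exponentially
# more data. Going from 1,000 to roughly 1,000,000 examples is about
# ten doublings, and buys only ten equal steps under this toy model.
```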
So remember: to keep cutting your error rate by the same amount, your dataset has to keep going up by powers of two; your dataset doubles to get a linear improvement in error rate. It only works with more data, and when we talk about more data, we're not talking ten percent more data or twenty percent more data; we're talking a thousand times more data, a million times more data. We're talking powers here, lots and lots more data. That's the thing to remember on the left-hand side.

The right-hand side is the other scary part of the curve. These are the state-of-the-art machine learning models. The x-axis there is the year, and the y-axis is the compute capability you need, in petaflop/s-days of training. AlexNet, on the far left, the model that essentially started the deep learning revolution, is way down in the corner. And again, the y-axis is logarithmic, so every time a model goes up one gridline, the compute goes up 10 times, 100 times, 1,000 times. Now look at how the compute power has changed over the last six years: from 0.0001 at the bottom of the y-axis to 10,000 at the top. In other words, the compute power needed to do state-of-the-art machine learning went up roughly ten million times in the last six years. Just let that settle in for a minute.

The amount of data you need doubles, quadruples; the data requirement goes in powers, and the model requirement goes in powers. You basically have two power laws going on. And this is why traditional chip architectures, Moore's law, don't really work anymore. This is what our leadership saw three or four years ago that prompted us to start developing the TPUs.
It was this need to completely change the game, in terms of what kind of data you need to handle and what kind of compute you need to be able to bring to bear. So if you want to do machine learning, you need efficient, cost-effective places to do it. To offer that, we have Compute Engine, to which you can attach CPUs, GPUs, or
TPUs; Kubeflow, which gives you a hybrid environment, with Kubernetes Engine giving you a great place to run it on Google Cloud; and Cloud ML Engine, which gives you a fully managed, serverless way to use all of this hardware and to do it in a distributed way. The way you handle the increasing amounts of data is distribution, and the way you handle the increasing amounts of compute is better and better hardware, better chips. So you need distributed hardware and chips, and that's basically what something like ML Engine gives you: distribution, plus best-of-breed chips. You need both of those in order to do machine learning successfully.

The other thing to realize is that machine learning is moving toward increasing levels of abstraction. Let me start with what happened about three years ago. Aucnet is a Japanese auction company, and they wanted to rethink the way car auctions work. The way it used to work was that if you wanted to sell a car, you would go fill out a form that said what car you had, how old it was, and so on. Filling out a form: boring. So what Aucnet did was say: just walk around your car, take a bunch of photographs, upload those photographs to our cloud, and we'll tell you how much your car is worth. Imagine the difference in friction between filling out a form and taking photos of your car. It completely changed their business. But in order to do that, they had to build a custom image model so they could price the cars; they had to write the TensorFlow code themselves. We helped them do it, they wrote it, it worked, they changed their business. Great. But I'm talking about increasing levels of abstraction, and Aucnet wrote TensorFlow code. The next level up: Ocado.
Ocado is a UK grocer, and they wanted to handle customer service emails. The way it used to work was that a customer sends an email, somebody at Ocado reads the email and says, "Oh, this customer is saying something about our produce," and forwards it to the produce department, and a person in the produce department reads it and decides what to do about it. It's a pretty wasteful process. So what did Ocado do? They used the Natural Language API: a Google API, built on Google data, that works out of the box. It's not going to identify very specific things that are Ocado-specific, but it will tell you that the customer is happy or not happy, that they're talking about produce, and so on. Ocado then took the output of that NLP API and built a second model on top of it that said: given these tags from the NLP API, which department should the email go to? In other words, they did not have to start from the raw text and build the complete model themselves. They could build off of the NLP API and build a smaller, easier model.

That is the kind of abstraction level I'm talking about. As the technology has improved, you increasingly don't have to go down to the level of dealing with raw images and raw text; you can build on top of existing APIs.

Meanwhile, Seenit lets people upload content, and sometimes, for whatever reason, people upload inappropriate content. Seenit wants to reject it, and they could reject it using the Vision API. Straightforward: just use it as is.

And then the next level up: UNIQLO, a Japanese retailer, wanted to create a chatbot. Again, they didn't need to go down to the level of programming each individual word and what it means in a conversation.
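The two-stage pattern in the Ocado story above, generic API output feeding a small model of your own, can be sketched in a few lines. Everything here is hypothetical: the tags and sentiment score would really come from the Natural Language API, and the second stage would really be a trained model rather than a rule table.

```python
# Sketch of the two-stage idea: the NLP API (not called here) yields
# generic signals such as entity tags and a sentiment score; a small
# second stage maps those signals to a department. A rule table stands
# in for what would really be a trained classifier.

ROUTING_RULES = {          # tag -> department (illustrative only)
    "produce": "Produce",
    "delivery": "Logistics",
    "refund": "Billing",
}

def route_email(tags, sentiment):
    """Pick a department from NLP-style tags; escalate very negative mail."""
    if sentiment < -0.5:
        return "Customer Care"           # urgent: clearly unhappy customer
    for tag in tags:
        if tag in ROUTING_RULES:
            return ROUTING_RULES[tag]
    return "General Inbox"               # nothing matched; triage by hand

print(route_email(["produce"], 0.2))     # Produce
print(route_email(["refund"], -0.9))     # Customer Care
```

In a real pipeline the rule table would be replaced by a classifier trained on historical (tags, department) pairs, which is a much smaller problem than classifying raw email text from scratch.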
They could write intents, saying: this is what a conversation is, when somebody wants to find a matching blouse for
their shirt. And the chatbot is able to take that and create a realistic experience for the user. So you can think, not in terms of very low-level LSTM models, but at a very high level: what is the customer's intent, what is the typical transaction they're going to walk through, and this is how you handle it.

So if you are doing ML, what does this mean for you? It means that you want to pick a framework that lets you work at every level of that abstraction hierarchy. You want to build a low-level TensorFlow model, completely custom? No problem: use ML Engine. You want to use out-of-the-box models that have been trained on massive amounts of data? No problem: use the Vision API, the Translate API, the Speech API. You have something in the middle, where you want to take advantage of everything the Vision API gives you but customize it and do your own thing? Use AutoML. You heard in the keynote today that AutoML is no longer just about vision; we also have text classification in AutoML. The idea being that you can build on top of what Google has already trained.

So that is 4B, when you think about how to do machine learning on the cloud. Realize that machine learning, in spite of all the hype, is ultimately still software, and you want to make a buy-versus-build decision, and you want to make it based on quality, based on the kinds of things you want to be able to do.

With that, let me summarize. Number one: machine learning can be used to solve many problems for which you're writing rules today. Number two: machine learning is how you personalize all your applications; this is how you reach the long tail. Number three: design your systems with the expectation that you'll have a lot more data next year. And number four, combining the A and B here: use a platform
that lets you forget about infrastructure and that gives you a lot of great pre-built models. Thank you.