Accelerating business transformation with smart data analytics innovations
Good afternoon, Singapore. I'm Sab, and I head product and solutions marketing for smart analytics. Today we are going to talk about how Google Cloud is innovating with smart analytics and AI. It's going to be a power-packed 35 minutes. I'll be joined by my colleague Reza, who is going to do a live demo to make things real, and then by TK, Chief Investment Officer of Grasshopper, who will talk about how they are using smart analytics to move their business forward. So, without further ado, let's get started.

I go around the globe talking about smart analytics, and every time I talk to a customer or a partner they ask: what's new? Why is Google Cloud investing so deeply? What's fundamentally different? If you think about modern statistics, it goes back to 1749: Bayes' theorem, Laplace, algorithms. So what has changed? The fundamental premise of data analytics is going through a transformation. We cannot apply analytics the same way we did a hundred years back. And why is this changing? Think about the digital era, which started back in 2002, almost 17 years ago: for the first time, the total digital data storage capacity surpassed the total analog data storage capacity. From then on, the amount of data we generate on our planet every year has been growing at a compound annual growth rate of more than 60 percent. So what does this all mean to you, as you run your businesses and try to be data-driven companies? Traditional analytics systems will not be able to live up to this astronomic data growth; they were not designed to handle data at this scale. Data analytics has fundamentally become big data analytics. This figure is from IDC, and I'm sure many of you have seen it: the amount of data we will generate by 2025 is going to be 175 zettabytes. It's a lot of data; let's keep it simple. So what
does this mean for you? As I was saying, think about the MPP systems of data warehousing. A lot of traditional data warehouse vendors built those systems because SMP-based data warehouses were no longer good enough; you needed distributed computing capacity. But when you are exposed to this amount of data, even those traditional on-premise architectures are not going to be sufficient. So what's the answer?

Google has been in the big data space for more than a couple of decades. It's the only company with nine products that each have a billion-plus users, so our engineers are managing big data problems every day. Google is a big data company, and with Google Cloud we are bringing our best innovations in distributed data analytics and data processing and making them accessible to our customers, so that you can build smart, intelligent, data-driven applications.

Let's quickly go through the smart analytics stack. Smart analytics is not a piecemeal problem anymore. It's not just about BI, and it's not just about Hadoop- and Spark-based data processing; when you are exposed to this much data, you need to think about the data pipeline end to end. We are working with customers who are ingesting data from everything from offshore oil rigs to traditional CRM and ERP systems, ingesting data at massive scale, and we are going to give you some of those examples today: processing that data in real time, storing it in the data warehouse, and then visualizing and analyzing it using traditional dashboards and reports, or building machine learning models using our AI technology. What's important is that all the services, from ingestion to processing to storage and analysis, are integrated with each other, so that you don't spend a lot of time stitching services together. The
APIs can talk to each other, and you get seamless handoff between data ingestion and data processing services such as Cloud Pub/Sub and Cloud Dataflow. Reza is going to show how you can build a planet-scale streaming ingestion service using our technology. So, number one, it's comprehensive: we are investing in a platform that lets you build analytics solutions end to end, from ingestion out of on-premise or cloud-based systems all the way to visualization and building ML models.

If you take nothing else away from my talk, this is the most important slide, and I talk to CIOs across the globe about it: the notion of serverless analytics. What does this mean? I started my career 15 years back as an analytics engineer, and if you talk to anybody in the business of building analytics systems, 80 percent of the time is spent on system engineering tasks: building indexes, disaster recovery and backup, multi-tenancy, capacity planning for peak-hour load. What we are doing is building analytics systems so that you don't do any of the work in those gray boxes. We're going to talk about BigQuery, our serverless data warehouse. You move the data in; there is no need to build indexes, no disk vacuuming, no performance tuning, no patching or updates. Everything is taken care of by Google. You move data in, you fire queries, and you get your insights.

This is fundamentally different from traditional on-premise MPP analytics architectures. Moving to the cloud on its own is not enough; you need to reimagine how you architect the solution, and we are making that easy and simple for you with serverless analytics. Customers across the globe, from retail, financial services, logistics, and healthcare, are embracing Google Cloud analytics to build solutions, and we are going to hear from TK how Grasshopper is building financial services applications on the smart analytics platform. Think about HSBC,
close to our house here, AirAsia, Go-Jek: great customers in Southeast Asia building analytics and smart applications on Google Cloud Platform, and we are going to go through some specific examples in the subsequent sections.
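To make the "move data in and fire queries" workflow concrete, here is a minimal sketch using the BigQuery Python client. The project, table, and column names are invented for illustration, and the actual client call is kept inside a function because it needs Google Cloud credentials; `build_query` is plain string logic you can run anywhere.

```python
def build_query(table: str, days: int) -> str:
    """Aggregate sales per store over the last `days` days (hypothetical schema)."""
    return (
        "SELECT store_id, COUNT(*) AS num_sales "
        f"FROM `{table}` "
        f"WHERE sale_ts > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL {days} DAY) "
        "GROUP BY store_id ORDER BY num_sales DESC"
    )


def run_query(table: str, days: int):
    """Fire the query at BigQuery; there is no cluster to create or index to build first."""
    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client()  # picks up project and credentials from the environment
    return list(client.query(build_query(table, days)).result())


sql = build_query("my-project.sales.orders", 7)
```

Note there is no capacity-planning step anywhere in the sketch: the same two calls are used whether the table holds gigabytes or petabytes.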
So, we talked about the smart analytics platform and how comprehensive it is. It's serverless, which means you don't do any system engineering work; you just move your data in and start doing analysis immediately. What can you do with the platform, and what types of solutions can you build? Of course you can build vertical solutions and business applications, but I'm going to talk about a few horizontal solutions that could be applicable to everyone in this room. You can build a modern data warehouse on the Google Cloud smart analytics platform that can go from a few gigabytes to petabytes of scale. You can build streaming analytics solutions for real-time analytics. You can modernize your Hadoop- and Spark-based on-premise systems by moving them to Google Cloud, and we're going to talk about how customers are getting value out of that. And when you do all this, we don't want you to create another set of data silos just because some of your data systems run in Google Cloud, some are on-premise, and maybe you have data in other SaaS applications; you need to bridge the gap between on-premise systems, Google Cloud, and SaaS. And while you do all of this, your data needs to be secure and governed, so that you have complete control and peace of mind. So let's jump right in and talk about modernizing the data warehouse with Google Cloud.

How many of you here have a data warehouse in your company? Let me ask this question. OK, it's kind of evident: every company has some sort of centralized repository of data, and all of those data warehouse systems are now massively challenged to handle this astronomic data growth. We talk to customers day in and day out, and their data warehouse systems are melting. So what does Google BigQuery do? Google BigQuery is a fully managed, serverless enterprise data warehouse. We have decoupled storage from compute infrastructure, so you can independently
grow your data storage and compute in the BigQuery system, and you can run queries on petabytes of data using simple SQL. But BigQuery is not just another data warehouse similar to what already exists in the market; it's magical. It has a built-in in-memory BI engine for fast, sub-second, low-latency query performance. It has built-in real-time analytics capability: you can stream up to a million rows per second into the data warehouse without impacting query performance on that data. And it has built-in machine learning with BigQuery ML, which we introduced earlier this year along with new algorithms, so data scientists can build ML models using simple SQL, with no need to know Python or R. It's game-changing.

So what does it all mean? When you move to BigQuery, you are able to analyze data at scale, you are not doing system engineering, and you are able to do new types of analytics. It also saves customers a lot of money. Enterprise Strategy Group ran a study comparing BigQuery's total cost of ownership with on-premise data warehouse systems, the traditional MPP systems: BigQuery is at least 52 percent lower in cost than on-premise. We also interviewed customers who moved those legacy data warehouse systems to AWS, and they are still paying 41 percent more than they would have ended up spending with BigQuery. So just lifting and shifting the data warehouse to the cloud is not the most efficient option.

We keep hearing: are all cloud data warehouses the same? If we just move to a new cloud data warehouse, is it going to give us the cost advantage? Cost is top of mind for all of you; when we talk to CIOs, data warehouse cost is something they want to cut down. But not all data warehouses are the same. BigQuery is architected fundamentally
In a different, way and that has, caused huge cost implication, I'm going, to show you a slide I think this is the first time showing this in any summit across the club we, recently published, a report with. Enterprise strategy group where we compared, the TCO of bigquery with. AWS, redshift, snowflake. And Azure sequel data warehouse and bigquery. Is 26. To 34, percent less, expensive, than any other cloud data warehouses, that are out there just. Take a pause and think, it's. Game-changing. It's. Completely, new type of architecture that's, giving, customers, this magnificent. Cost efficiency. And. Customers. Are using bigquery if, you look at the spectrum, of some of the larger customers, you're using like, think of Spotify, or snapchat like really. Large digital native, companies, to, traditional, companies, like HSBC, BNP, Paribas July. 20th. Century-fox, like, big. Brands, across the globe small. Digital native startups. Large. Digital native planet-scale companies, across the globe are using bigquery and getting value. Earlier. This year we, announced an offer for our customers, the, customers, who want to move from on-premise, data warehouse systems to bigquery Google. Is offering free training. Deployment. Planning services, and POC, credit, for customers, to help expedite, this migration journey if you have questions, see, us at the booth we'll be helping you to answer these questions how, to apply and and get started. So. We talked about modernizing. The data warehouse and moving, to bigquery but. We don't want, you to create data silos, because you will not have all your data in the warehouse if, you're, using Salesforce, Marketo, Zendesk, you have data in those SAS applications, if you're using SSP, you have data in SOP HANA so. How do you make sure that we bridge the gap between all, those data systems. Earlier. This year we, announced a new service, called cloud data fusion, it's. It abstracts. Data. 
Proc underneath, so it's basically an abstraction layer on top of data, proc it's a visual, UI based ETL. Service, that lets you create data pipeline, bring. Data from. Hundreds, of data sources including, sa P Hana and other applications. Seamlessly. Into bigquery or Google Cloud storage. This. Service is available for, you to go ahead and try and play with we. Also announce, hundred-plus, bigquery, data transfer, services, with, key SAS applications. Such as Marketo, stripe. Salesforce.com. So, if you're using those. Applications. In a few clicks without, writing, a single line of code you can start moving data from those, SAS applications, in bigquery and then start analyzing, the data. And. Some. Of our customers, they said hey we have already have, data we have data with Tera data or, we, have already started using AWS, redshift.
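Setting up one of those transfers programmatically looks roughly like the sketch below, using the `google-cloud-bigquery-datatransfer` Python client. The project, dataset, and `data_source_id` values here are placeholders (each connector has its own ID and parameters, which you would take from that connector's documentation); `transfer_spec` is pure dictionary logic, while the API call sits in a separate function because it needs credentials.

```python
def transfer_spec(dataset: str, source_id: str, params: dict, schedule: str) -> dict:
    """Assemble the fields a transfer config needs (placeholder values throughout)."""
    return {
        "destination_dataset_id": dataset,
        "display_name": f"{source_id} -> {dataset}",
        "data_source_id": source_id,  # e.g. a SaaS connector ID from its docs
        "params": params,
        "schedule": schedule,
    }


def create_transfer(project: str, spec: dict):
    """Create the scheduled transfer; after this, no pipeline code runs on your side."""
    from google.cloud import bigquery_datatransfer  # pip install google-cloud-bigquery-datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    config = bigquery_datatransfer.TransferConfig(**spec)
    return client.create_transfer_config(
        parent=client.common_project_path(project), transfer_config=config
    )


spec = transfer_spec("crm_raw", "hypothetical_saas_source", {"table_filter": "*"}, "every 24 hours")
```

The design point is that the transfer is declarative: you describe the source, destination, and schedule once, and the managed service owns the recurring loads.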
What do we do? We announced a Data Transfer Service to move data from Teradata to BigQuery; these are one-time loads for customers migrating off those systems. There are also customers moving off AWS Redshift and coming to BigQuery because of scale and efficiency issues, and we are making that simple with the data transfer service between Redshift and BigQuery.

Let's go on and talk about Hadoop and Spark. Don't we love them? They were the poster boys of the big data processing paradigm. A lot of our customers have invested in Hadoop and Spark systems, and they are finding them really difficult to manage in the long term. Google Cloud offers a few options here. If you're running an on-premise system to do distributed computing with Hadoop and Spark, you do pretty much everything on your own: managing the infrastructure, scaling, dev integration, monitoring, and writing the code. You can take those jobs and run them in Google Cloud on Google Compute Engine, and Google takes care of the hardware infrastructure complexity, so that you can focus on writing code and doing the integration and scaling on your own. That gives you some flexibility: you can bring your code as-is and get started with moving to the cloud. But a lot of our customers are asking: can we modernize our Hadoop and Spark deployments and use a completely serverless topology, so that we are not worried about scaling issues or integration and can just focus on writing MapReduce or Spark jobs? This is what Dataproc does. And I keep being asked what it means for businesses when you move to Dataproc: what do you end up saving? You end up saving a lot of money. The ESG study compared data processing jobs running in Dataproc with on-premise Hadoop- and Spark-based systems: Dataproc is 57 percent less expensive. And when you run the same jobs in AWS EMR compared
to Dataproc, it's almost 32 percent less expensive to run them in Cloud Dataproc. So again, the meta point is that just running Hadoop and Spark in the cloud is not good enough. You need to think about the technology stack more holistically and look at the architectural choices, because they have cost implications that hit your IT budget directly.

Any Pandora users here? Do you listen to podcasts or music on Pandora? Pandora is one of the world's leading music streaming and podcast companies, and they had an on-premise Hadoop- and Spark-based system with seven-plus petabytes of data in their on-premise data lake. But they have 60-plus million users using the Pandora app from 200-plus different device types, and they wanted to understand the real-time behavior
of those users: which song they are listening to, so that they can serve them personalized content, personalized podcast and music recommendations. So they came to Google Cloud, moved their on-premise data lake to Google Cloud Storage, and are now bringing in real-time streaming data from those 60-plus million users into BigQuery and doing real-time analytics using simple SQL, at a fraction of the cost and effort. Just imagine the complexity of this type of system: they are running it in a serverless environment, and they were up and running in record time. They are able to do A/B testing much faster and see what type of content resonates with customers.

Streaming analytics is becoming real. IDC predicts that by 2022 almost a quarter of the world's data will be real-time in nature. I'm sure you have heard the term "becoming data-driven." Becoming data-driven is a strategic choice for a company: which market should I invest in, what type of products should I launch? These are data-driven decisions, and strategic decisions have long-term business implications. But businesses are also trying to be more event-driven. They want to change the price of a product dynamically depending on how the market is receiving it or how the competition is pricing. This type of analytics system is fundamentally different from traditional batch-oriented data systems. Companies are increasingly investing in real-time, event-driven architectures so that they can act on data as it arrives. In the Pandora example, they are getting real-time data from devices and trying to understand how users are interacting; we also have many customers doing real-time website personalization based on clickstream data. Google Cloud offers a serverless, unified streaming and batch processing service.
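Before the demo, it may help to see the shape of that event-driven computation in miniature. The sketch below is plain Python, not Dataflow: it keeps a running count of sales per (store, product) as events arrive, which is the kind of per-key aggregation a streaming pipeline performs continuously instead of in a nightly batch. The event shape is an assumption made up for illustration.

```python
from collections import Counter
from typing import Iterable, Tuple

SaleEvent = Tuple[str, str]  # (store_id, product_id); assumed shape for the sketch


def aggregate_sales(events: Iterable[SaleEvent]) -> Counter:
    """Running count of sales per (store, product), updated one event at a time."""
    counts: Counter = Counter()
    for store, product in events:
        counts[(store, product)] += 1  # a streaming engine would also window this by time
    return counts


stream = [("nyc", "milk"), ("nyc", "milk"), ("sg", "rice")]
totals = aggregate_sales(stream)
# totals[("nyc", "milk")] is 2, so inventory can react before any end-of-day batch runs
```

In a real streaming system the same logic runs over an unbounded stream, with the engine handling windowing, parallelism, and late-arriving data.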
So: Cloud Pub/Sub allows you to ingest data from any device or system. With Cloud Dataflow, or with Spark if you're more comfortable using Spark, you can do real-time or batch processing of that data, and then you can store it in BigQuery in real time and start querying it in SQL. Reza is going to make that real for us and show the magic of how real-time data can flow into a data warehouse where you can immediately run queries over it. That's magical, and it's what we are seeing customers build; those who are comfortable using open-source Kafka and Spark have the option to use those services too. So with that, I want to make it real and call Reza up to show the magic of streaming.

Good afternoon, everybody. For this demo we're going to use a food retailer with stores open 24 hours, 7 days a week. The head of sales has asked us to do more targeted couponing based on real-time inventory data, which requires access to real-time sales data. For the head of IT this is a bit of a problem, because right now they have two issues: first, all of the point-of-sale data is processed in batch at the end of the day; second, he has a small team, so any new system we build has to be operationally very efficient. So we're going to rely on the serverless analytics technology Sab was talking about and build an architecture that meets those requirements. First, we're going to take that point-of-sale data and, rather than processing it in a batch at the end of the day, send it as it happens to Google Cloud Pub/Sub. We are then going to process that data using Google Cloud Dataflow, and we're going to do two things with every piece of information we get: first, send it to our data warehouse using its streaming capabilities; second,
as we're now doing stream analytics, connect our data processing systems with our operational systems by sending our inventory folks a real-time data feed. OK, so if we can move over to the laptop, please. Sorry, bear with me. There we go, thank you very much. Because Pub/Sub is fully serverless, I can't show you a deployment or an install: all you need to do is create the topic, and we send our point-of-sale data to it. What I can show you is data flowing into the topic. These graphs are from Stackdriver, which monitors our systems. If I just highlight this one: right now our point-of-sale systems are sending around 200 messages a second into Google Cloud Pub/Sub. Obviously
it's a very successful food store; that's a lot of sales over a day. The good news for our IT team is that if this 200 suddenly becomes two thousand, twenty thousand, or hundreds of thousands, there is nothing extra they need to do: this is a fully managed service that will automatically deal with that load. On the right-hand side we have a different graph showing how long each element stays within Pub/Sub, i.e. how quickly things are being processed as we see them; on average, each message is processed within one or two seconds. Now, what is doing the processing? Google Cloud Dataflow. To show you Dataflow, I'm just going to switch tabs. Here we can see the monitoring interface for Google Cloud Dataflow. The graph you see is a visual representation of the code I've written to do the processing. At the top we have Pub/Sub, so we are reading information from our topic; on the right-hand side, if I just expand this, you can see that some 70 million elements have been processed since this pipeline was started. In terms of what the Dataflow job is actually doing: in the middle branch we're sending information to BigQuery using the BigQuery streaming API, so that data stays nice and fresh, and we'll come back and show that in a moment. But now that we've got a streaming system, how do we connect it directly to other parts of our business? Well, we can do some stream analytics. What I've done here is some processing of the data as it comes in, at around 200 messages per second as you can see: I'm doing aggregations, counting the number of sales per product per store. And if you'll notice, at the bottom I'm sending this back into Pub/Sub. So Pub/Sub is no longer just a way to ingest information;
it is being used as the glue connecting Dataflow to other parts of our systems, in this case the inventory department. So now our data department is sending up-to-date, live information to inventory, and they know when they need to ship things out to different stores. That's the processing done by Dataflow; now let's look at the data warehouse. I'm now on the BigQuery user interface, and true to Sab's serverless message, this is fully managed: there is no "create database" button, no "create cluster" button, no maintenance button. So again we're hitting our targets in terms of low operational cost for my sales and IT teams. In
terms of what we're going to run the query on: on the left-hand side, if I just expand this, we have the datasets available to me. In particular, the orders-with-lines table is the one receiving all that information from the point-of-sale systems. It's got our schema, as you would expect, under Details. If I just go through here: since this table was created, we've put in about 1.7 billion rows of information. Scrolling further down, we can see that the streaming buffer is active; in other words, while I'm speaking, this table is being updated with that point-of-sale information. OK, so let's run a little query on this table; let me just zoom out a bit. For this query I'm going to do a couple of things. I'm going to join that table with a location table that contains information about the city each store is in. On the left-hand side you can see that table actually belongs to a different dataset. Again, Sab talked about this: we can just load data in and break down silos. All the facilities department had to do to give me access to that data, so I could join it in my SQL statement, was go through the proper security and access controls and grant me permission. There was no import/export or moving of data around between departments, and that helps break down the silos. The query I'm going to run joins those two pieces of information, and I have a simple predicate that essentially looks for sales in the last ten minutes. So let's run that. Zoom out a little. While that's running... so, this has completed against that table, the one with the billion-plus rows. In the information being returned, I have for convenience placed a column showing the current time, 12:24:41, and you'll notice that the last sale in the New York store was at 12:24:38. So, within
three seconds of a sale happening in the store, our data warehouse has the right information, and, if you recall, thanks to our stream analytics so does the inventory department. So with this we've been able to show that we can build a streaming analytics pipeline end to end. I'll go back to the slides now, please. Thank you. So here we built a system that takes point-of-sale information, processes it through Pub/Sub, works with Dataflow to connect to our inventory department, and lands it in our data warehouse. There is an extended version of this demo at the booth outside, where we also connect to our CRM system and do some machine learning on the data to drive the couponing we talked about at the start. OK, so that was a retail demo. To talk about using this architecture in a different type of industry, I'm going to invite on stage TK, the Chief Investment Officer of Grasshopper, based out of Singapore. TK, could you walk us through how you use this architecture in your industry, and also three of the key learnings you had along that journey? Sure, thanks.

Hello, everyone. My name is TK, and I am the Chief Investment Officer of Grasshopper. Grasshopper is a proprietary trading company that has been providing liquidity to major exchanges around the world, including the local one here, the Singapore Exchange, Japanese exchanges such as the Tokyo Stock Exchange and OSE, and, in the U.S., CME. We consider ourselves an innovation-driven company; we have a team of about 70 people, and we've been working with Google Cloud since 2017. At the end of the presentation I will share with you the three biggest takeaways from our journey. So here's our problem statement. In
trading, and especially high-frequency trading, we are in a highly competitive field. It requires us to constantly evolve and innovate, and we need to innovate fast. Since 2006, Grasshopper has gone from being a trading company that uses technology to a technology company that trades, and in 2017 we pivoted again and evolved into a data-driven company by using GCP. GCP allows us to innovate on our quantitative research and trading without boundaries. As you can see on the slides, changes in technology constantly affect the way we trade and the world of trading itself. CPUs are becoming faster, latency is decreasing, and that is creating more data; storage costs have also decreased, and hard drives are becoming larger, allowing us to store more data. Both Sab and Reza talked about how this growth in data forces you to spend much more time storing it and figuring out how to query it, and we have essentially passed that responsibility on to Google Cloud. So what does this mean for Grasshopper? It means we are able to apply more AI, spend more time on research, and let our quantitative researchers study a lot more data, which is why we use BigQuery.

So, just to wrap up with my three big takeaways from the journey, starting on the left: enabling scale. Imagine somebody on my research team having to come and talk to me to predict the infrastructure requirements or the storage space they need before they can do their research. To be truthful, a lot of a researcher's job is to explore the unknowns, and if you are exploring the unknowns, you do not actually know how much infrastructure you need before you start. If you request too much, it becomes wasteful; if you request too little, which is most
of the situations we end up in, you limit yourself. That brings me to the second point: cycles of innovation. If you are limited by infrastructure, so is your rate of innovation. Imagine your current innovation cycle is seven days, and every seven days you come up with a new trading model. What if you could 10x that, or 100x it, or better still 1,000x it, and reduce the period you take to compute and do your research to just ten minutes? What would that do to your company's rate of innovation? And the last part is something we actually discovered along the journey itself: design with abundance. We focused on how we could use the cloud to design in a different way. Most of us who have been developing technology over the last twenty years normally developed with limited infrastructure, a test server, or building on our own laptops: basically with scarcity, and also with cost constraints. What if you could solve these problems with abundance, meaning you could design assuming a lot more hardware than you actually need, start with a lot more compute power, and store everything? One of the questions we used to ask was: what data do you want to store? Twenty years ago we had to decide how much data to keep, because disk sizes were very, very tiny; now we can store everything coming in from the exchange and figure out afterwards what we want to keep. With that, I appreciate your time in letting me share our journey, and I look forward to the discussions later at this conference. Thank you very much.

Thank you, TK, really appreciate it, and with that we'll bring Sab back on stage. Thanks, Reza, awesome demo. And what a great story: big data analytics is helping customers innovate faster and operate in an unconstrained environment. What
about data protection, security, and governance? This is becoming top of mind for all the data leaders and CIOs in companies, many of you here. Google Cloud offers end-to-end security and governance across the entire data lifecycle. You can encrypt data at rest or in motion. You can mask data on the fly using Cloud DLP, the Data Loss Prevention service, manage access controls, and manage exfiltration risk. We have access transparency policies, and Google Cloud regularly subjects itself to rigorous third-party audits for regulatory certification and compliance purposes, which is why a lot of leading banks and healthcare companies come and use our services. We also introduced Data Catalog earlier this year; if you haven't played with it, it's a metadata management and data discovery service where you can tag data with both business and technical metadata and do data discovery and analysis. It's fundamental to our portfolio. But we are not on this journey alone. Many of you have already invested in other tools: ETL tools such as Informatica, BI tools such as Tableau, Looker, and others. Google Cloud is deeply integrated across data ingestion, data processing, and BI and reporting, so you can bring the best of those partner tools to Google Cloud and get complete value from your data; you're not throwing anything away.
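To give a feel for what "masking data on the fly" means, here is a deliberately tiny sketch. The real Cloud DLP service detects and transforms dozens of info types as a managed API; this stand-in handles only email addresses with a regular expression, purely to illustrate the idea.

```python
import re

# Naive email pattern, good enough for a sketch; Cloud DLP's detectors are far more robust.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")


def mask_emails(text: str) -> str:
    """Replace anything that looks like an email with a fixed redaction token."""
    return EMAIL.sub("[EMAIL_REDACTED]", text)


masked = mask_emails("Contact alice@example.com or bob@shop.sg for the report.")
# -> "Contact [EMAIL_REDACTED] or [EMAIL_REDACTED] for the report."
```

The point of doing this in the pipeline, rather than in each downstream tool, is that sensitive values never reach the warehouse or the dashboards in the first place.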
Forrester and Gartner have both recognized Google as a leader in the data analytics space, and if you are interested in learning more, those reports are available for download from our website. So with that, I couldn't be more excited to have shared these stories with you today. We talked about customers and about examples of innovation, and I look forward to seeing what you do with data and how you use Google Cloud to take your business forward. Thank you very much.