AWS re:Invent 2020: Streamlining manufacturing and supply chain at Novartis


Hi everyone, I'm very excited to talk to you about how AWS and Novartis are streamlining manufacturing and supply chain at Novartis. My name is Amit Nastik. I've been with Novartis for nearly 17 years, and for the last two years I've been working in our unit called Novartis Technical Operations, that is, manufacturing and supply chain, where we produce nearly 70 billion units a year and supply our products to more than 800 million patients around the globe. I will be co-presenting together with Ian Meyers, my colleague from AWS, who will introduce himself in a bit.

About a year ago we announced a collaboration between AWS and Novartis. This collaboration is really about looking into the value chain of the whole company and identifying opportunities where we can use technology and the best use of data to drive our productivity and efficiency. As part of this collaboration we have been looking deeply into operations, manufacturing, and supply chain, but also into other areas where we engage with our patients.

Before I talk about the two cases we have been looking into in detail, I would like to zoom out a little and talk about Novartis and the journey we are on as a company. We want to be the leading medicines company powered by data science and digital, and as I said, we are on a journey, a journey that started some time back. If you go 100 years back, Novartis was actually an industrial conglomerate: we had a dye business, a chemicals business, even a food business, and a part in healthcare. It was then, about 10 years back, that Novartis became a 100% healthcare company, but it was still very diversified. Over the last 10 years we made some changes to our portfolio through divestments and spin-offs. For example, Alcon, our surgical business, was spun off two years ago, and our consumer business, a joint venture with GSK, was divested some years ago. At the same time we also bought new assets, advanced technology platforms, to become a 100% focused medicines company.

Now, it is not only that we are changing; we also see that the industry is changing, and three areas in particular are important. First, we have a much better understanding of human biology today. Second, we see an explosion in data science, and an explosion in data in general, driven by digital health, so we are producing much more data not only with healthcare professionals but also with patients. Third, we see continuous pressure on our prices, driven largely by an aging population and the increased demand for chronic treatments; a lot of costs are going up and driving the pressure in our industry. We know that the company that gets it right, that can leverage the better understanding of human biology, that can leverage data and data science the best, and that is able to counter the pricing pressure, will win in this industry.

In all of this, manufacturing and supply chain plays a key role because, as I said, we not only produce 70 billion units and reach more than 800 million patients, but we also have to cover a complex portfolio. In our portfolio we have very traditional medication, best known as pills and tablets, but we also have personalized medicines where we essentially re-engineer the genes in the patient's body. We also have to counter demand volatility, something we saw this year driven by COVID, when a lot of stocking up of painkillers, paracetamol, and other medication was taking place; we have to counter that.
But we also have to maintain a global footprint while being local as well: global, because we must not be too dependent on single regions, and local, because we have to shorten the product journey from the factory to the patients. And in everything we do, our cost of goods sold (COGS) is key, because the more cost-effectively we can produce, the more patients we can reach.

It is also well known that the pharma industry in general is very slow in adopting new technologies, and to a large extent this is driven by our past. We have been for many years, and still are, a highly regulated industry. Every change we make, we have to re-register and re-document, which causes a natural hesitation to make changes. Historically we have also been a very strong, high-margin business, so it was all about innovation, new products, quality, and supply, not so much about optimizing our processes and our productivity, with a lot of focus on top-line growth, driven also by the interests of our investors. All of this has led to a manufacturing and supply chain in our industry that is behind other industries. We still operate with batch processes where many other industries have continuous processes. We have a lot of manual processes and manual work, not a high degree of automation. We do not have high supplier integration, something we also saw dramatically during the COVID period. And when it comes to quality, a lot of our steps are post-process quality controls rather than in-process quality controls. All four factors I just mentioned, to some extent, drive the high prices of our products.

Now, we know that leveraging data and digital can change this. Inventory management is one example: if we have end-to-end transparency, we can shorten our buffer times and, with that, the time of production. Complex supply chains are very similar: end-to-end transparency allows us to take the right steps at the right time. New ways of working means moving away from paper and really embracing modern ways of working, especially in the area of controls: if you can move from batch documentation on paper to digital, that will be a tremendous new way of working in our industry.

With all of this in mind, we have worked with AWS on two specific cases. As I said, one key element of everything we do is end-to-end transparency, and one product we have been working on is called Spot On. Spot On is an insight center that looks at all the steps along our whole value chain. It is about overcoming the limited transparency that we have today, given the many systems we operate; about allowing tactical decision making by having all the information at our fingertips; and about reducing manual transactional effort, so much more automation in everything we do. The other product we have is our Buying Engine, and this is all about leveraging our scale. We are a big company, so how can we make sure we engage with our suppliers in the best possible way and leverage the size that we have? How can we ensure we get the best pricing among all the suppliers in our ecosystem? How can we make sure we order at the right time, so we can lower our inventory as well as our transportation costs? And with all that we do, how can we improve our productivity? So Spot On and Buying Engine are the core of this presentation.
Obviously, we chose AWS to work with us because AWS, for us, is one of the key players in the industry when it comes to supply chain and manufacturing. We truly believe that working with AWS we can deploy the best, most modern, state-of-the-art technologies and make them part of our systems. We truly believe that we can challenge the status quo and really think about what is possible and what innovation means. It is also about learning from other industries: what AWS has done in the automotive industry and in many other industries where they are present. And one thing that always impresses me most is the approach of working backwards when it comes to innovation: at some point having a press release looking into the future, asking what it means if you launch this product in one or two years, and then working backwards to see what steps we need to take to get where we want to be. So this is what I'm really excited about, and with that I hand over to Ian, who will take you through some of the technical details of the products that we have developed.

Thanks very much, Amit. Hi there, I'm Ian, Director of Technology with AWS's Global and Strategic Accounts business. I'm going to talk with you about the two projects that Amit mentioned, very exciting projects that we're proud to have partnered with Novartis on.

We're going to talk about the Buying Engine first, a really exciting project where we have the opportunity and the objective to reduce Novartis's overall procurement costs by 5 percent: by increasing the volumes for the types of lab equipment and PPE that they're purchasing, by making sure that buyers were procuring the right products for the task they were performing, and particularly by trying to reduce the amount of over-specification in the goods and services they might be buying. We also wanted to help provide a much more Amazon-like shopping experience for all of the products that they purchase across a large number of suppliers, and really streamline that process to make it easier to buy the right product at the right time.

On the Spot On side, we built an application called Onsite, and its objective is manufacturing analytics: to improve the visibility of the batches being produced, and in fact literally during the production cycle, to facilitate handovers between shift groups where there may have been milestones or tasks that were exceptionally taken during the production of a therapy. We wanted to track production and any disruptions that might occur, help facilitate root-cause analysis, and ultimately be able to predict and project completion dates and yield rates, and improve the amount of data we could yield for the purposes of batch planning.

So first, let's talk about the intelligent procurement platform in the Buying Engine. The first thing we needed to do was to bring the supplier catalogs from the myriad of suppliers that Novartis uses for lab supplies in-house, and build the front-end experience of an internal product registry and the ability to create that shopping-cart experience. This would include things like recommendations, collaborative filtering, and price comparisons between the different products that might be chosen. Ultimately the goal is to integrate directly with the procurement systems to really simplify the ordering process, reduce the amount of human tasks, and move, in the end, to completely automated purchasing of the right product at the right time.
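As a toy illustration of the collaborative-filtering idea just mentioned, here is a minimal item-based sketch in Python. This is not the Buying Engine's actual recommender; the purchase matrix, product names, and scoring are all invented for illustration.

```python
import numpy as np

# Rows = lab groups, columns = products; entries = historical purchase counts.
# All values here are invented for illustration.
products = ["beaker_200ml", "pipette_tips", "nitrile_gloves", "flask_500ml"]
purchases = np.array([
    [5, 0, 2, 1],   # lab group A
    [4, 1, 0, 2],   # lab group B
    [0, 3, 4, 0],   # lab group C
])

def item_similarity(matrix: np.ndarray) -> np.ndarray:
    """Cosine similarity between product purchase columns."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    normalized = matrix / np.where(norms == 0, 1, norms)
    return normalized.T @ normalized

def recommend(group_idx: int, k: int = 2) -> list:
    """Score products a group has not yet bought by similarity to its history."""
    sim = item_similarity(purchases)
    history = purchases[group_idx]
    scores = sim @ history
    scores = scores.astype(float)
    scores[history > 0] = -np.inf          # only recommend new products
    top = np.argsort(scores)[::-1][:k]
    return [products[i] for i in top if np.isfinite(scores[i])]

print(recommend(0))  # suggestions for lab group A
```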
The architecture of this is what you see here. We used the suppliers' catalogs in the form of their websites, and we built an application on Amazon Elastic Container Service (ECS) that pulls that data down and processes it with a variety of models in order to turn raw HTML into product information. We use that product information to create what we call a knowledge graph: multiple ways of describing the same products in terms of what tasks they fulfill and how they are used together, with the ability to track a robust amount of metadata, and also to track the volume purchased of those products over time against each product in the knowledge graph. In the end, we needed to create an intranet application that would allow users to search that knowledge graph, receive product recommendations, and then make the right choice.

One of the areas we want to zoom in on was a really tricky problem the team had to solve: applying machine learning models to the downloaded supplier catalogs in order to build this knowledge graph. An example of the type of products that might be purchased is these three examples of a 200-milliliter beaker. They are very similar in terms of what they can do, and in fact what they would be used for, and as you can see they have very different prices: the product on the right is up to 10 times more expensive than the most basic product. So how do you choose which one is the right one? Well, we applied a machine learning model to the product descriptions that you can see below. The low form beaker, the simplest one, has the basic properties of a beaker, nothing particularly exceptional about it. But as we get into the more complex items that were purchased, we can see there is a lot more information to grab. Our machine learning model uses a variety of techniques to extract these property features out of the text on the supplier's website and make it contextual and relevant for the purchasing and lab technicians who will be responsible for buying these products. In the example of the middle item, the low form beaker, descriptions like "Type 1," "Class A," or "meets ASTM" are features we can extract from the description and classify as specifications, whereas things like "mechanically stronger" or "a heavier base and thicker walls" on the heavy-duty low form beaker are things we classify as physical properties. We use this information, the number of features we are able to extract from the description, plus the price, to create a concept of a hierarchy: some products have a small number of features, some have a large number, and that, combined with the pricing information, creates a hierarchy of understanding of which product is good for which task.

When we drill even further into this architecture, we want to give you a really clear view of how this works. The first step is to get the product supplier detail pages down and into a format that is processable. We use Apache Nutch, a page-crawling application, on Amazon EMR to process all of the target pages and domains, crawl down through the description detail pages, and store them directly into Amazon S3 as raw HTML. We then use an ECS cluster to cleanse, standardize, and label that data, consolidate duplicate pages together, and create versioned sets of pages across time.
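The talk describes the real classifier as a trained model hosted in Amazon SageMaker. As a toy stand-in to make the idea concrete, here is a keyword-based Python sketch; the marker lists, sample description, and scoring function are all assumptions, not the actual model.

```python
# Illustrative keyword lists -- assumptions for this sketch, standing in for
# a trained model that learns these distinctions from supplier text.
SPEC_MARKERS = ["type 1", "class a", "astm", "iso", "usp"]
PHYSICAL_MARKERS = ["heavier base", "thicker walls", "mechanically stronger", "low form"]

def classify_phrases(description: str) -> dict:
    """Tag phrases in a product description as specification or physical property."""
    text = description.lower()
    features = {"specification": [], "physical_property": []}
    for marker in SPEC_MARKERS:
        if marker in text:
            features["specification"].append(marker)
    for marker in PHYSICAL_MARKERS:
        if marker in text:
            features["physical_property"].append(marker)
    return features

def hierarchy_score(features: dict, price: float) -> float:
    """Combine feature count and price into a crude hierarchy score, mirroring
    the idea that more extracted features plus a higher price place a product
    higher in the product hierarchy. Toy scoring; the real ranking is learned."""
    n_features = sum(len(v) for v in features.values())
    return n_features * price

# Hypothetical heavy-duty beaker description for the example.
desc = "Heavy-duty low form beaker, Type 1, Class A, meets ASTM; heavier base and thicker walls."
feats = classify_phrases(desc)
print(feats)
print(hierarchy_score(feats, price=42.0))
```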
We then apply a machine learning model hosted in Amazon SageMaker, where we feed in the raw HTML and get back a set of feature vectors that allow us to create the knowledge graph. The knowledge graph itself lives in Amazon Neptune, a fully managed graph database, and we also take the main features, as well as some simplified containers for the product descriptions, and store those into Amazon Elasticsearch Service, a fully managed Elasticsearch cluster.

On the back end, we then have to create an application that can work with this data, and again we leveraged the Elastic Container Service to create microservices. These included things like search; the description service, which accesses metadata; the comparison service, which does some similarity matching; a faceting service, which allows you to divide the products returned into their various facets; recent purchases; and an insight service. Those are all exposed through a Flask application running on ECS into the intranet application browsers, and the ECS services in the background also integrate with the procurement API, in this case SAP Ariba, in order to actually dispatch the purchases that end users select.

The next phase for the Buying Engine is to integrate historical purchasing and volume information sourced from Novartis's data lake, the F1 data lake. We will then be able to use Amazon Forecast, a fully managed forecasting service, to push the historical buying information through and understand where seasonal peaks and troughs may occur, as well as Amazon Personalize, a personalization and recommendation engine, to improve the value the end user gets from the insight service. That will inform the purchasing decisions better: buy the right product at the lowest cost at the right time.

But we can go even further. In the long term, we want to roll out what we call auto-replenishment, where we create a brand-new ability, subject to human oversight, to automatically purchase the lab supplies and services that we know we always need to purchase. We can aggregate those volumes across multiple lab groups and multiple business units, and achieve much larger volumes, without having to involve a human buyer in that purchasing activity. That should significantly streamline things while also giving us the needed human approval workflow. So, a very exciting product today, and a really exciting path forward, I think, for what we can achieve with automated replenishment.
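To make the auto-replenishment idea concrete, here is a minimal Python sketch of the decision logic under stated assumptions: the consumption figures, cover target, and approval hook are all invented, and the real system would draw demand from the F1 data lake and dispatch approved orders via SAP Ariba.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class SupplyItem:
    name: str
    on_hand: int
    weekly_use_by_group: Dict[str, int]  # lab group -> avg weekly consumption (assumed)

def propose_order(item: SupplyItem, weeks_of_cover: int = 4) -> Optional[dict]:
    """Aggregate demand across lab groups and propose one consolidated order.
    The proposal is routed to a human approval workflow rather than placed
    directly, matching the 'subject to human oversight' requirement."""
    total_weekly = sum(item.weekly_use_by_group.values())
    target_stock = total_weekly * weeks_of_cover
    shortfall = target_stock - item.on_hand
    if shortfall <= 0:
        return None  # enough stock on hand; no order needed
    return {
        "item": item.name,
        "quantity": shortfall,
        "status": "PENDING_APPROVAL",   # human sign-off before dispatch to procurement
    }

gloves = SupplyItem("nitrile_gloves", on_hand=300,
                    weekly_use_by_group={"lab_a": 120, "lab_b": 80, "lab_c": 50})
print(propose_order(gloves))
```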
Now let's talk about manufacturing analytics, the other side of the projects we've been working on, and drill into the details of Onsite. We had some pretty important requirements to achieve, very different from the Buying Engine, in a completely different way of working, with a completely different set of end users. We needed, ultimately, to facilitate improving the planning process and capture more data about manufacturing to supply retrospective analytics, but also to facilitate real-time learning about how a batch is going and where the batch milestones were. We captured loads of data from legacy plant-level software and even new sensors, and we had to support both streaming and batch-oriented data feeds. All of that data streams into the F1 data lake I mentioned previously, where it sits categorized and secured for long-term analysis of NTO's operations overall. We then implement a series of forecasting and prediction models on top of this data and leverage those models both for batch inference, so that we can forecast forward in time, and to inform real-time decisions that are being made, whether during a shift handover or for what-if analysis as we look into the details of how a particular batch is being built.

Not unlike the Buying Engine, we needed to build an intranet web-based application to surface this data, but we also had to serve the needs of our site users who are working in a complex manufacturing environment with team handover stations. That included the need to create TV walls: six very large screen televisions that surface all of the relevant data needed at a glance for real-time control and monitoring, as well as for facilitating shift handovers as new folks come in to work on the manufacturing line.

We'll build up the picture of what we actually built piece by piece, because as you can imagine this is a fairly complex manufacturing process, with some of the most sophisticated and sensitive products being made, and we need to tick all the boxes for all the different aspects we have to serve. Within the existing sites where Onsite is rolled out, we have lots of industrial equipment. This is legacy equipment that has been in place for many years and can't be interrupted as we improve our ability to extract data. Where we needed to increase the amount of data we could capture, for instance by using computer vision techniques, we added new camera infrastructure as required, and integration had to be achieved with existing PLC and DCS systems. We also had a view to supporting connected workers: the ability to directly interface on a per-manufacturing-staff basis in order to support the processes. The data within the factory is captured on local storage on the manufacturing device itself; it may be captured inside a manufacturing execution system, or MES; and then there is an extensive infrastructure of historians, time-series data applications that capture all the sensor information coming from these legacy devices, collate it, and forward it on for subsequent analysis or operational control.

The first thing we added in order to work with this data was a direct interface to the equipment, and for that we used AWS IoT Greengrass, which allows you to run both real-time compute and machine learning models directly on existing hardware sitting within the factory, integrate with the various protocols those pieces of hardware may be using, whether OPC UA or MQTT, and then interact in real time with those sensors. That could mean, for instance, filtering: saying, here is a particular sensor producing results that need subsequent handling, or where we need to exert operational control and change what was happening. A variety of components, including AWS IoT SiteWise, allow us to capture all of those different data feeds and then react to them.
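To illustrate the edge-filtering pattern just described, here is a hedged sketch of a Greengrass (v1-style) Lambda function in Python that forwards only out-of-range sensor readings. The topic name, threshold, and payload fields are assumptions, not the actual Onsite configuration.

```python
import json
import greengrasssdk  # AWS IoT Greengrass (v1) Core SDK

client = greengrasssdk.client("iot-data")

# Assumed threshold and topic name for this sketch.
TEMP_LIMIT_C = 80.0
ALERT_TOPIC = "factory/alerts/temperature"   # hypothetical topic

def function_handler(event, context):
    """Invoked per sensor message (e.g., routed from an MQTT/OPC UA bridge).
    Publishes only anomalous readings, implementing the 'filter at the edge,
    alert on anomalies' pattern described in the talk."""
    reading = json.loads(event) if isinstance(event, str) else event
    if reading.get("temperature_c", 0.0) > TEMP_LIMIT_C:
        client.publish(
            topic=ALERT_TOPIC,
            payload=json.dumps({
                "sensor_id": reading.get("sensor_id"),
                "temperature_c": reading["temperature_c"],
                "status": "ANOMALY",
            }),
        )
    # In-range readings are dropped here; the local historian still captures them.
```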
The next thing we needed to do was take that local storage of historians and MESes and enable it in the cloud, so we took those enterprise business applications and moved them into the AWS environment through a component called a global historian: lots of sites each have their own local historians, and they push all their data up into a global historian. We then push the MES data into the manufacturing data hub (MDH) infrastructure in the AWS cloud.

In order to react to the real-time processing happening within the factory, we leverage AWS IoT Core and IoT Events. We can use events, for instance, to say, here's a situation that has occurred on a particular device, or a pattern that we've observed. We can also forward the data to Kinesis Data Streams, which allows a virtually unlimited number of consumers to tune into that data and do something interesting with it. We also leverage Amazon SNS, the Simple Notification Service, to send notifications about what is happening on the site, where perhaps a Greengrass Lambda function will say, look, I've seen an anomaly in a sensor's data feed and I'm going to alert somebody to the fact that it's happening, and we can then react to that.

Now, the data streams we capture through IoT Core and Kinesis Data Streams are, as I said, archived into the F1 data lake, but we also take all of the data we need from our enterprise business applications, including the historians and the MDH, and lift it up into the data lake. That data sits on Amazon S3 as raw data, and Novartis processes it with the Databricks cloud service, which does the cleansing, cataloging, and advanced transformations that might be needed.

Now we get to the actual Onsite service. The back end for Onsite, shown at the bottom, is comprised of DynamoDB, a fully managed NoSQL database, as well as Amazon Aurora, a highly available and highly performant relational database service with PostgreSQL and MySQL compatibility. Amazon Redshift is used on the back end for the more sophisticated multi-dimensional analysis that sits within the Onsite back-end platform. On top of that, we built a series of asset- or batch-based microservices that link back to the asset being produced, which might be a line or a specific unit made up of multiple sensors, and we use Lambda, a fully managed serverless compute engine, to control how we work with the back-end data stores; that is where our business logic sits about what Onsite does and what views of the data it can surface through Amazon API Gateway. We also host central models with Amazon SageMaker that integrate through IoT Core and IoT Events, so we have a highly available, fully trained model able to do real-time inference for IoT Core and IoT Events to integrate with. So our asset and batch microservices are really a combination of compute, the associated API, and the machine learning models that facilitate that compute.

Lastly, we have a separate Onsite front end, which again leverages Lambda for serverless compute, but in this case we combine all of the back-end asset- and batch-based microservices into front-end-appropriate services that let us expose our web UI and our mobile UI using AWS Amplify and create outgoing notifications. We did a combination of web and mobile UIs because that was the fastest way to create both the television view for site handover and the intranet-facing applications. So there are quite a few moving pieces to this application, but they are all very modular and able to be decoupled from each other, so they can be operated independently, scaled independently, and made highly available.
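As an illustration of what one of those batch-based microservices might look like, here is a minimal sketch of a Lambda handler behind API Gateway reading a batch record from DynamoDB. The table name, key schema, and field names are all assumptions for the sketch, not the actual Onsite schema.

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("onsite-batches")  # hypothetical table name

def lambda_handler(event, context):
    """GET /batches/{batchId}: return the lifecycle state and predicted
    completion date for one in-flight batch (field names are assumed)."""
    batch_id = event["pathParameters"]["batchId"]
    resp = table.get_item(Key={"batch_id": batch_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "batch not found"})}
    return {
        "statusCode": 200,
        "body": json.dumps({
            "batch_id": item["batch_id"],
            "lifecycle_state": item.get("lifecycle_state"),
            "progress_pct": str(item.get("progress_pct")),  # DynamoDB numbers are Decimal
            "predicted_completion": item.get("predicted_completion"),
        }),
    }
```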
So what did we build? What is the end result that we got to? Well, we created the Onsite front end, as mentioned, and that includes a variety of real-time profiling capabilities associated with the manufacturing line. This, which we call the sunburst diagram, is overall plant health: it tells us how the site is doing from an overall area perspective and whether or not we have seen particular anomalies or events associated with the different dimensions of how the site operates; in general this follows an ISA-95 hierarchy for how we report the data and KPIs upward. For example, we created a batch listing: all the batches in flight within this particular site, where we can see the various lifecycle states, start and end dates, progress states, and so on for each batch. We can zoom in and look at the overall genealogy of a batch, how it came to be in the status that it is, and see where we have future tasks to be done. Each of the blocks you see provides an additional layer of detail that we can zoom into, in the form of a batch detail view, where we can see how and when the batch was progressing, its specific dates of milestone events, any delays associated with the batch being created (the orange and yellow lines), and the gray lines, which are the predicted completion dates based on a forecasting model that runs within the Onsite application overall.

So I hope that gives you an overview of how we managed to achieve these two really different, really exciting, and we think groundbreaking platforms. I'll hand it back now to Amit to wrap up and give you a summary of where we got to. Thanks very much.

Thank you so much, Ian, for explaining this so well. It is always impressive for me, as a non-technology person, to learn about all the work that goes into the products that we have. So let me wrap up our session here. We started by talking about the announcement we made about a year ago, then about the ambition we have at Novartis to become the leading medicines company in the industry, the role that data and digital play, and the role that manufacturing and supply chain plays on our journey to extend people's and patients' lives. We talked about Buying Engine and Spot On, two of the use cases we have worked on together with AWS. Both Buying Engine and Spot On are implemented in some of our sites today. It is too early to talk about dollars and how much savings or productivity we get from these two products, but what we know is that the potential is big. For Buying Engine, the data is now coming together: we have transparency about the different prices of the suppliers, and we also know when we should order, so the potential is huge. It is very similar with Spot On: the information is coming together, and by having end-to-end transparency we learn how to see, how to act using the information that is available, and how to learn based on the insights that we have. But we also know that to be successful, to have these products in full motion, we need to drive change management. Our associates, we as teams, need to be able to navigate these systems; we need to get used to our decisions being driven by data and by insights, both when it comes to Buying Engine and when it comes to Spot On. So I'm very, very excited that these two products will unleash a lot of potential, and I'm really looking forward to seeing the impact they will have across our entire network. With that, I hope you enjoyed our session, and thank you very much for watching and listening. Ian, it has been great to partner with you.
I'm really looking forward to much more of the work that we have been doing together. So thanks a lot, everyone, and have a wonderful day.
