In this workshop, you will learn to design a data pipeline solution that uses Azure Cosmos DB both for scalable ingest of streaming data and for globally distributed serving of pre-scored data and machine learning models. The solution combines the Cosmos DB change feed with Azure Databricks Delta to build a modern data warehouse that supports risk reduction by scoring transactions for fraud in two modes: an offline, batch approach and a near real-time, request/response approach.
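To illustrate the two scoring modes mentioned above, the sketch below scores transactions with a simple stand-in rule model, once over a batch and once per request. All names, fields, and thresholds here are hypothetical placeholders, not part of the workshop materials; a real deployment would apply a trained model served from Cosmos DB instead.

```python
from typing import Dict, List

def score_transaction(txn: Dict) -> float:
    """Stand-in fraud model: returns a risk score in [0, 1].
    A real pipeline would load a trained model instead of these rules."""
    score = 0.0
    if txn.get("amount", 0) > 1000:                     # large transactions are riskier
        score += 0.5
    if txn.get("country") != txn.get("card_country"):   # cross-border mismatch
        score += 0.25
    return min(score, 1.0)

def score_batch(txns: List[Dict]) -> List[float]:
    """Offline, batch mode: score a full set of historical transactions."""
    return [score_transaction(t) for t in txns]

def score_request(txn: Dict, threshold: float = 0.5) -> bool:
    """Near real-time, request/response mode: flag a single transaction."""
    return score_transaction(txn) >= threshold

batch = [
    {"amount": 25, "country": "US", "card_country": "US"},
    {"amount": 5000, "country": "FR", "card_country": "US"},
]
print(score_batch(batch))       # [0.0, 0.75]
print(score_request(batch[1]))  # True: flagged for review
```

The point of the split is that both modes share one scoring function; only the invocation pattern (bulk over history vs. single call at request time) differs.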
Who should attend
This workshop is intended for cloud architects and IT professionals who have architectural expertise in infrastructure and solution design for cloud technologies and want to learn more about Azure and its services. Attendees should also be experienced with other, non-Microsoft cloud technologies, meet the course prerequisites, and want to cross-train on Azure.
After completing this workshop, students will be able to:
- Implement solutions that leverage the strengths of Cosmos DB (high-throughput ingest, low-latency serving, and global scale) in combination with scalable machine learning, big data, and real-time processing capabilities to support advanced analytics.
Module 1: Whiteboard Design Session - Cosmos DB real-time advanced analytics
- Review the customer case study
- Design a proof of concept solution
- Present the solution
Module 2: Hands-on Lab - Cosmos DB real-time advanced analytics
- Collecting streaming transaction data
- Understanding and preparing the transaction data at scale
- Creating and evaluating fraud models
- Scaling globally
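The first lab step, collecting streaming transaction data, typically reads the Cosmos DB change feed into Azure Databricks and lands the raw events in a Delta table. A minimal configuration sketch follows; the endpoint, key, database, and container names are placeholders, and the option keys assume the azure-cosmos-spark connector, which may differ by connector version. This is not runnable outside a Databricks (or Spark) cluster with that connector installed.

```python
# Sketch only: assumes a Spark session (`spark`) on a cluster with the
# azure-cosmos-spark connector installed; all values are placeholders.
change_feed = (
    spark.readStream
    .format("cosmos.oltp.changeFeed")
    .option("spark.cosmos.accountEndpoint", "https://<account>.documents.azure.com:443/")
    .option("spark.cosmos.accountKey", "<key>")
    .option("spark.cosmos.database", "payments")        # hypothetical names
    .option("spark.cosmos.container", "transactions")
    .option("spark.cosmos.changeFeed.startFrom", "Beginning")
    .option("spark.cosmos.changeFeed.mode", "Incremental")
    .load()
)

(change_feed.writeStream
    .format("delta")                                    # persist raw events as Delta
    .option("checkpointLocation", "/delta/checkpoints/transactions")
    .outputMode("append")
    .start("/delta/transactions"))
```

Once the change feed is flowing into Delta, the remaining lab steps (data preparation, model training and evaluation, and global scale-out) operate over that table rather than querying the operational store directly.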