This document was created to show the integrations between Qlik, Databricks and SAP, with a focus on supporting Databricks' new Delta Lake offering. The use case documented below is based on SAP IDES Sales and Distribution data stored in an R/4 system on Azure. The raw ECC transaction data is moved from SAP to Databricks using Attunity Replicate. A quick description:
Attunity Replicate empowers organizations to accelerate data replication, ingest and streaming across a wide range of heterogeneous databases, data warehouses and Big Data platforms. Used by hundreds of enterprises worldwide, Attunity Replicate moves your data easily, securely and efficiently with minimal operational impact. Find out more at https://www.qlik.com/us/products/attunity-replicate
The base tables were then transformed into a data mart schema by leveraging Attunity Compose for Data Lakes to prepare the data for processing by the Databricks ML engine. After processing, the transformed data is landed in Delta Lake format, a new feature of Compose 6.5.
Attunity Compose for Data Lakes automates the data pipeline to create analytics-ready data. By automating data ingest, Hive schema creation, and continuous updates, organizations realize faster value from their data lakes. Find out more at https://www.qlik.com/us/products/attunity-compose-data-lakes
Delta Lake is an open-source storage layer that brings ACID transactions to Apache Spark™ and big data workloads. What makes it unique is that, unlike most Hadoop file formats, Delta Lake supports inserts, updates and deletes, which are core to working with SAP ECC transactional systems. Find out more about Delta Lake at https://delta.io/. Databricks, in our use case, has run a series of machine learning algorithms to predict delivery status based on multiple factors in the data mart. More about Databricks:
Databricks’ mission is to accelerate innovation for its customers by unifying Data Science, Engineering and Business. Databricks provides a Unified Analytics Platform powered by Apache Spark for data science teams to collaborate with data engineering and lines of business to build data products.
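To see why insert/update/delete support matters when replicating a transactional source like SAP ECC, the sketch below applies a stream of change records (the kind a CDC tool such as Replicate emits) to a keyed table. This is a toy model in plain Python, not Delta Lake's actual MERGE API; the operation names and row shapes are illustrative assumptions.

```python
# Toy model of applying CDC change records (insert/update/delete) to a
# keyed table -- the semantics Delta Lake's MERGE makes possible on Spark.
# Illustrative plain Python, not the Delta Lake API.

def apply_changes(table, changes):
    """table: {primary_key: row_dict}; changes: [(op, key, row), ...]"""
    for op, key, row in changes:
        if op in ("INSERT", "UPDATE"):
            table[key] = row          # upsert: last write wins
        elif op == "DELETE":
            table.pop(key, None)      # deleting a missing key is a no-op
        else:
            raise ValueError(f"unknown operation: {op}")
    return table

orders = {1: {"status": "OPEN"}}
changes = [
    ("UPDATE", 1, {"status": "SHIPPED"}),  # existing order updated in SAP
    ("INSERT", 2, {"status": "OPEN"}),     # new order created
    ("DELETE", 1, None),                   # first order cancelled
]
print(apply_changes(orders, changes))      # {2: {'status': 'OPEN'}}
```

Append-only file formats would force a full rewrite of the table for the UPDATE and DELETE events above; Delta Lake's transactional layer is what lets the replicated copy track the source row by row.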
Qlik Sense is used as the data integration engine to combine the data from the raw ECC tables and the machine learning output from Delta Lake, correlate with the Qlik Indexing Engine, and then visualize the combined data set.
In this demo, Qlik visualizes SAP Sales and Distribution data that has been loaded into Databricks Delta Lake. Machine Learning jobs have been run against the data to predict On-Time Deliveries and compare against how the Deliveries actually performed.
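The prediction-versus-actual comparison can be sketched in miniature: score a model's on-time predictions against whether each delivery actually arrived by the promised date. The field names and the trivial rule-based "model" below are assumptions for illustration only, not the demo's actual Databricks ML pipeline.

```python
from datetime import date

# Toy evaluation of on-time-delivery predictions against actual outcomes.
# Field names and the rule-based "model" are illustrative assumptions,
# not the demo's real ML job.

def is_on_time(promised, actual):
    """Ground truth: delivery arrived on or before the promised date."""
    return actual <= promised

def predict_on_time(order):
    """Stand-in model: flag orders from a (hypothetical) congested plant."""
    return order["plant"] != "1000"

orders = [
    {"plant": "1000", "promised": date(2019, 5, 1), "actual": date(2019, 5, 3)},
    {"plant": "2000", "promised": date(2019, 5, 1), "actual": date(2019, 4, 30)},
]

hits = sum(
    predict_on_time(o) == is_on_time(o["promised"], o["actual"])
    for o in orders
)
print(f"accuracy: {hits}/{len(orders)}")  # accuracy: 2/2
```

In the demo itself this comparison is done at scale in Delta Lake, and Qlik visualizes the predicted and actual delivery statuses side by side.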
Watch how easy it is to offload #SAP into #Azure SQL Data Warehouse with LIVE change data capture using #Attunity Replicate and #Qlik Sense. Take control of your SAP data with the power of Azure!
This video shows how easy it is to extract data from a SAP R/4 system into an Azure SQL DW database and keep it synchronized in real time with changes made in SAP. Qlik is using a prebuilt SAP Sales and Distribution app to visualize the data from the Azure SQL DW database.
In this video, we will show how easy it is to offload data from SAP ERP using Attunity Replicate into Snowflake DB, and then visualize that data with Qlik Sense. The key advantage of Replicate is that the data in Snowflake is synchronized in real time with the SAP system as new transactions trickle in. The demo is a multicloud solution: SAP is on AWS/Oracle, Attunity is on GCP, and Snowflake is back on AWS. Qlik Sense is running on GCP as well.
The data used in this demo is SAP IDES data from the SAP ECC module Sales and Distribution.
Attunity Replicate for SAP is a high-performance, automated and easy-to-use data replication solution that is optimized to deliver SAP application data in real time for Big Data analytics. It moves the right SAP application data easily, securely and at scale to any major database, data warehouse or Hadoop platform, on premises or in the cloud. This solution builds on decades of leadership in enterprise data replication and SAP integration.
This application demonstrates a direct load from SAP ECC into Cloudera. The data is loaded directly from SAP into HDFS and then turned into Impala tables that Qlik connects to, applying complex transforms that add business-friendly terms and time-series analytics capabilities.
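As a small illustration of the "business-friendly terms" step, the sketch below renames SAP's terse technical field names to readable ones before the table is exposed to analysts. The mapping shows a few standard SAP SD fields (VBELN, KUNNR, ERDAT, NETWR), but the transform itself is a plain-Python stand-in, not the app's actual Qlik load script.

```python
# Illustrative rename of SAP technical field names to business-friendly
# terms -- the kind of transform the Qlik app applies on top of the Impala
# tables. The mapping is a small example, not the app's full script.

FRIENDLY_NAMES = {
    "VBELN": "Sales Document",
    "KUNNR": "Customer Number",
    "ERDAT": "Created Date",
    "NETWR": "Net Value",
}

def rename_columns(row):
    """Replace technical column names; pass unknown columns through."""
    return {FRIENDLY_NAMES.get(col, col): val for col, val in row.items()}

row = {"VBELN": "0000012345", "KUNNR": "0000001001", "NETWR": 2500.0}
print(rename_columns(row))
# {'Sales Document': '0000012345', 'Customer Number': '0000001001', 'Net Value': 2500.0}
```

Renaming at the semantic layer, rather than in the replicated tables, keeps the Impala copy faithful to the SAP source while still giving business users readable field names.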