What SAP’s updated API policy means for customers 

Earlier this week, SAP announced updates to its policy governing the use of SAP APIs. The announcement quickly prompted discussion across the SAP Community and LinkedIn, which is understandable given that it signals a clear change in direction for how SAP data is expected to be integrated going forward. SAP’s strategic intent now leaves little room for speculation: the integration of SAP data with non‑SAP data is expected to go through SAP Business Data Cloud (SAP BDC), and customers will need to start working towards alignment with this direction. 

While adapting existing architectures may be inconvenient in the short term, this clarity also provides certainty. All roads now lead to SAP BDC, and while the transition may require effort, SAP BDC offers strong capabilities around governance, compliant data sharing, rich semantics and data products, as well as enabling AI across SAP‑centric processes. 

Below is my current understanding of the updated policy.

Disclaimer: I’m not a lawyer, and this isn’t legal advice. This article reflects my understanding and interpretation only—you should always validate with SAP and/or your legal advisers before acting on it. 

What policy has changed? 

The policy is titled simply ‘SAP API Policy’. You can find it here on SAP’s website. It’s not a long document and uses plain language. It boils down to this:  

You may only use SAP Published APIs and SAP-endorsed data access mechanisms, and only for their documented purposes. 

This potentially means that many customers will need to rethink how they ingest SAP data into their data platforms. Let’s look at the most common ways customers ingest SAP data today, and what the new policy means for each. 

  • Third-party products (Fivetran, Qlik, Theobald, SNP Glue, etc.)    

It depends. Some connectors may need to adapt. Customers should look to their vendors for clarification; vendors are likely to publish an updated position soon. 

  • CDS views 

This is the tricky part. My understanding is that you remain compliant if you use SAP-delivered CDS views that are specifically enabled for extraction. 
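To make this concrete, here is a minimal sketch of how a client might page through an extraction-enabled CDS view exposed as an OData v2 service. The host, service and entity names are purely illustrative assumptions, not real SAP endpoints; the `$top`/`$skip` paging options and the `d/results` payload wrapper are standard OData v2 conventions.

```python
import urllib.parse

# Hypothetical OData v2 service wrapping an SAP-delivered, extraction-enabled
# CDS view. Host, service and entity names are illustrative only.
BASE_URL = "https://my-sap-host/sap/opu/odata/sap/Z_EXAMPLE_SRV"

def build_query_url(entity: str, page_size: int = 1000, skip: int = 0) -> str:
    """Build a paged OData query URL using the standard $top/$skip options."""
    params = {"$format": "json", "$top": str(page_size), "$skip": str(skip)}
    return f"{BASE_URL}/{entity}?{urllib.parse.urlencode(params)}"

def extract_rows(odata_response: dict) -> list:
    """Pull the result rows out of a standard OData v2 JSON payload."""
    return odata_response.get("d", {}).get("results", [])
```

A caller would fetch `build_query_url("SalesOrderItems", skip=0)`, then `skip=1000`, and so on, feeding each JSON response through `extract_rows` until an empty page comes back.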

  • ODP Framework 

ODP was designed for large-scale data extraction and, when accessed over OData, was specifically endorsed in the infamous Note 3255746. Under the new API policy, however, the ODP mechanism over OData effectively creates a custom interface for large-scale replication, which is no longer compliant. At the time of writing, Note 3255746 was not available on the SAP website; I expect the wording to be updated in the next few days. 
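For context on why ODP over OData amounts to a replication interface: after a full load, the service hands back a delta link, and requesting that link later returns only the records changed since the previous call. The sketch below shows the general shape of that handshake; the service path and delta-token format are illustrative assumptions, not a documented SAP endpoint.

```python
# Illustrative shape of an OData v2 payload from a hypothetical ODP-based
# service: a full load carries a "__delta" link that a client stores and
# calls on the next run to receive only changed records.
def next_delta_link(odata_response: dict):
    """Return the delta link from an OData v2 payload, or None if absent."""
    return odata_response.get("d", {}).get("__delta")

full_load = {
    "d": {
        "results": [{"Material": "M-01"}, {"Material": "M-02"}],
        "__delta": "https://host/sap/opu/odata/sap/Z_ODP_SRV/Items"
                   "?!deltatoken='D20240101'",  # token format illustrative
    }
}
delta_url = next_delta_link(full_load)  # persist this for the next run
```

It is exactly this store-a-token-and-poll-for-changes pattern that turns a request/response API into a large-scale replication pipeline.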

  • SAP SLT and SAP Data Services 

Both are SAP products, and the use case of extracting data at scale is documented by SAP. Use of these products remains compliant. However, both Data Services and SLT are legacy products and have little value in a modern data architecture. 

  • SAP Business Data Cloud (SAP BDC) 

Welcome to the future. Although the API Policy doesn’t specifically refer to SAP BDC, it is clear that SAP regards SAP BDC as the only compliant way to expose data to third-party data platforms. SAP has invested heavily in better integration with Snowflake, Databricks, Microsoft Fabric and Google Cloud Platform. How seamless the experience is depends on the platform, although over time I expect every platform (including AWS) to gain some form of seamless integration through zero-copy cloning. That is great, but there are a few reasons why customers are not jumping for joy just yet: 

  • Implementing SAP BDC is a significant change to customers’ data architecture and requires considerable effort. 

  • It might make the landscape more complex instead of simplifying it. 

  • SAP BDC, and specifically its integration with third-party cloud platforms, is still new or a work in progress for some platforms, which comes with its own risks. 

There is also, of course, the question of cost. SAP BDC represents a broader proposition than simply extracting data from SAP, encompassing governance, security, shared semantics and data products, and deeper integration across the data landscape, all of which need to be considered alongside pricing. 

Action is needed 

Customers using third‑party ingestion tools should look out for updates from their vendors, as some connectors may need to change. 

If you are using OData interfaces, now is the time to talk to SAP. SAP is unlikely to pull the plug immediately; the more probable approach is to agree a transition roadmap toward compliance, most likely centred on SAP BDC. 

And if your landscape still relies heavily on legacy tools such as Data Services or SLT, this may be the catalyst to modernise. These platforms often stand in the way of faster, more flexible access to SAP data. 

We’ll keep tracking updates from SAP and the ecosystem and will continue to publish follow-up posts as new information emerges. Follow the blog if you’d like to stay informed. 

Making Sense of SAP: 16 Questions on Data, BDC, and Modern Analytics

We’ve been hearing the same thing more and more in conversations with customers and organisations navigating SAP: “SAP data is messy”, “What’s going on with SAP Business Data Cloud?”, “It feels complicated”.

At the same time, SAP has been introducing major changes to how SAP data is accessed, governed, and analysed, particularly with SAP Business Data Cloud (BDC), Datasphere, and tighter integration with platforms like Snowflake and Databricks. For many organisations, it’s not always clear what problem SAP is actually trying to solve, or how these pieces fit together.

To unpack this, I sat down with Jorel Digman, our Head of New Business, to answer the 16 questions we’re most often asked about SAP.

1. What do most people misunderstand about SAP data? 

Most people assume SAP data is messy. In reality, it is highly structured. Its complexity simply reflects the complex business processes it supports. 

2. Is that a technology problem or more of a legacy problem? 

It’s primarily a problem of limited understanding. SAP contains tens of thousands of tables because it models diverse business processes. Within their own domain, people usually understand the data well, but cross‑domain understanding is often limited. 

3. When clients say their SAP data is a mess, what are they actually dealing with? 

They are usually referring to how SAP data appears in analytics or management reports. By the time data reaches those reports, it has been extracted, transformed, and often mixed with other sources, sometimes even manually manipulated in Excel. The mess is typically in the extraction and modelling, not the SAP source data itself. 

4. How would you explain SAP Business Data Cloud to a finance or operations leader? 

SAP Business Data Cloud provides SAP‑authored reporting models built on deep understanding of SAP processes. These models get organisations most of the way to meaningful reporting out of the box. BDC also introduces AI‑enabled analytics, allowing faster insights with prebuilt models and AI features. 

5. What problem is SAP trying to solve with BDC? 

Traditionally, analytics systems were separate from SAP and required overnight data extractions. BDC makes SAP data instantly available for analytics, dramatically reducing time and effort to move data and enabling near real‑time insights. 

6. What doesn’t BDC try to do? 

BDC is not designed to be a repository for all enterprise data. It focuses on SAP sources, while enabling easy integration with external big‑data platforms such as Databricks and Snowflake. 

7. BDC isn’t built for a single destination, right? 

Correct. BDC integrates seamlessly with popular cloud data platforms like Snowflake and Databricks, giving customers flexibility to use BDC alongside their preferred data cloud environments. 

8. How should customers think about Snowflake vs. Databricks in relation to BDC? 

The choice typically comes down to cost and in‑house skill sets. Teams strong in Python and code‑first engineering may prefer Databricks; teams focused on SQL‑based warehousing may prefer Snowflake. From a BDC standpoint, both are supported equally well. 

9. Where does SAP Datasphere fit into all of this? 

Datasphere focuses on preparing SAP data for analytics, machine learning, and AI. Workloads may run in Snowflake or Databricks, but Datasphere governs and models the SAP data and makes it available to those workloads. 

10. Does Datasphere matter more once AI or advanced analytics enter the picture? 

Yes. AI requires high‑quality, trusted data. Because SAP data represents critical business processes, it is essential for AI initiatives. Datasphere provides well-modelled, accessible SAP data to support AI workloads effectively. 

11. What does zero‑copy data change in practice? 

Zero‑copy data allows workloads to access data without moving or duplicating it. It enables rapid provisioning of non‑production environments for development or testing, often in minutes, without the overhead of traditional data cloning. 
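The mechanics behind this can be illustrated with a toy copy-on-write sketch. This is a conceptual analogy only, not how SAP, Snowflake or Databricks actually implement zero-copy cloning: the clone starts as a pointer to the source’s data, and a private copy is materialised only when the clone is first written to.

```python
class ZeroCopyClone:
    """Toy illustration of zero-copy cloning: the clone initially just
    points at the source's data, and only materialises its own copy the
    first time it is written to (copy-on-write)."""

    def __init__(self, source_data: dict):
        self._shared = source_data   # pointer to the source, nothing copied
        self._own = None             # private copy, created lazily

    def read(self, key):
        store = self._own if self._own is not None else self._shared
        return store[key]

    def write(self, key, value):
        if self._own is None:
            self._own = dict(self._shared)  # the copy happens only now
        self._own[key] = value

prod = {"orders": 120, "invoices": 80}
dev = ZeroCopyClone(prod)   # instant "clone": no data moved
dev.write("orders", 0)      # first write triggers the private copy
```

After the write, `dev` sees its own modified data while `prod` is untouched, which is why warehouse-style clones can be provisioned in minutes: the expensive copy is deferred until, and unless, the clone diverges.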

12. If an organisation is still on SAP ECC, do they need to wait to use BDC? 

No. BDC works with ECC as well as S/4HANA. While S/4HANA provides additional features, ECC customers can already benefit from BDC’s predefined models and data‑management capabilities. 

13. If an organisation has shifted to S/4HANA, what changes for their data architecture? 

S/4HANA combined with BDC creates a fully integrated, real‑time data platform. This enables near real‑time analytics, embedded AI capabilities, and tighter coupling between operational systems and analytics. 

14. What’s the biggest myth to clear up? 

The myth that SAP data is difficult. With the right expertise and accelerators, both those built into BDC and those provided by partners, working with SAP data can be efficient and highly effective. 

15. What is one thing companies walk away with from Snap Analytics’ SAP BDC Readiness Workshop? 

A clear roadmap showing where BDC will deliver business value. The workshop highlights opportunities to move beyond traditional BI reporting and begin leveraging AI‑enabled analytics supported by BDC’s accelerators and governance features. 

16. Is the retirement of BW 7.5 a good opportunity to evaluate future direction? 

Yes. BDC’s bridge functionality allows organisations to maintain their existing BW backend while modernising the front end through BDC. This enables a controlled, phased migration from BW to BDC without a disruptive big‑bang transition.