SAP licence constraints – explainer

Ever wondered how you can get data out of SAP without violating the license agreement? You’re not alone. Most organisations planning to move SAP data up to a Cloud Data Platform are struggling with that very question. Here is a little explainer which hopefully helps you understand what you can or cannot do. But first:

Disclaimer
The terms and conditions of your contract with SAP are agreed between your company and SAP. I don’t know the specifics of your contract, nor am I able to provide legal advice. This article is based on my observation and interpretation. When you are planning to take data out of SAP, I recommend you consult with your SAP account manager and your legal team to ensure you comply with the terms and conditions of the contract.
Enterprise license vs Runtime license

You might have heard that certain extraction methods require an ‘enterprise license’. This relates to how the database on which the SAP system runs is licensed. When you run an SAP system, you have to install a database system first. SAP ERP can run on a variety of database systems (Oracle, IBM Db2, MS SQL Server, and so on), as well as on SAP HANA (the database). The SAP S/4HANA ERP system only runs on SAP HANA. The license restriction applies regardless of which database system you run the SAP ERP application on.

Runtime license

You can purchase a runtime license for the database together with your SAP ERP license. The runtime license allows you to run SAP ERP, but nothing else. The SAP application is the only direct user of the underlying database; all other users and usage are managed through the SAP application. With a runtime license you cannot create tables directly in the database, but you can create tables in the SAP application (which in turn creates a table in the database, with the SAP system as owner). The license agreement does not let you create stored procedures or extraction programs directly in the database. You are also not allowed to read the database directly, extract data from it directly, or extract data from the database log tables, without going through the SAP application.

Enterprise license

An enterprise license gives you unlimited rights on the database. You can create your own tables and applications on the database as well as run the SAP application. In this case, you are allowed to extract data from tables directly, either by using a third-party application which connects to the database or by creating your own extraction processes. An enterprise license will be significantly more expensive than a runtime license. If your company does not have an enterprise license and you want to take data out of SAP, you need to find a way to go through the application layer instead of the database layer.

Using standard SAP interfaces

SAP has APIs and OData services for getting data out of SAP. Most of these are designed for operational purposes, for example: give me the address of customer abc, or: update the price of material abc. They are not really suitable for bulk data extraction. The exception is the set of function modules related to the ODP framework: they can be consumed through OData, and this is still allowed by SAP. You can find more information on using OData for data extraction through ODP here.
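To make the OData route concrete, here is a minimal sketch of the extraction loop: an initial full load, then follow-up requests using the delta link the service returns. This is a sketch under stated assumptions – the host, service path (`Z_MY_ODP_SRV`) and entity set names are hypothetical placeholders; a generated ODP OData service in your system will have its own names.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical base URL of a generated ODP OData service; substitute your own.
BASE_URL = "https://sap-host:443/sap/opu/odata/sap/Z_MY_ODP_SRV"

def build_extraction_url(entity_set: str, delta_token: str = None) -> str:
    """Build the request URL for a full load or, given a token, a delta load."""
    if delta_token:
        # Delta requests address the entity set with a delta token suffix.
        path = f"{entity_set}!deltatoken='{urllib.parse.quote(delta_token)}'"
    else:
        path = entity_set
    return f"{BASE_URL}/{path}?$format=json"

def fetch_page(url: str, auth_header: str):
    """Fetch one page of results; returns (rows, next_or_delta_link)."""
    req = urllib.request.Request(url, headers={"Authorization": auth_header})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)["d"]
    # "__next" points at the next page of the current load; "__delta" is the
    # link to request only the changes on the next run.
    return body["results"], body.get("__next") or body.get("__delta")
```

The point of the delta token is that the second and subsequent runs only transfer changed records, which is what makes OData viable for recurring loads rather than one-off lookups.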

Note that it is not permitted to use the ODP function modules through an RFC connection.
Please refer to this blog for more info on that.

There is a standard function module which can be used through RFC: RFC_READ_TABLE or, even better, one of its successors (RFC_READ_TABLE can’t handle wide tables). Which versions are available depends on your system version, so it is best to search for them on the SAP system itself. I have the fewest problems with /BODS/RFC_READ_TABLE2. I wouldn’t recommend building a data warehouse solution on this extraction method, not least because I am pretty sure SAP have specified somewhere that these FMs are meant for internal use and might be changed at any time. I wouldn’t be surprised if SAP announces it will forbid the use of these function modules in the same fashion as the ODP function modules.
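For illustration, here is a sketch of reading a table this way with SAP’s PyRFC connector. RFC_READ_TABLE returns rows as fixed-width strings (field WA) plus a FIELDS table describing each column’s offset and length, so most of the work is parsing that output. The connection parameters are placeholders for your own system details; treat this as a sketch, not a production extractor.

```python
from typing import Dict, List

def parse_rfc_read_table(result: dict) -> List[Dict[str, str]]:
    """Convert RFC_READ_TABLE output (FIELDS metadata plus fixed-width DATA
    rows in field WA) into a list of dicts keyed by field name."""
    rows = []
    for line in result["DATA"]:
        wa = line["WA"]
        row = {}
        for f in result["FIELDS"]:
            offset, length = int(f["OFFSET"]), int(f["LENGTH"])
            row[f["FIELDNAME"]] = wa[offset:offset + length].strip()
        rows.append(row)
    return rows

def read_table(conn_params: dict, table: str, max_rows: int = 100):
    """Read up to max_rows rows from a table via RFC. Requires the pyrfc
    package and the SAP NW RFC SDK; conn_params holds your system details
    (ashost, sysnr, client, user, passwd)."""
    from pyrfc import Connection  # imported lazily; the parser works without it
    conn = Connection(**conn_params)
    try:
        result = conn.call("RFC_READ_TABLE", QUERY_TABLE=table, ROWCOUNT=max_rows)
        return parse_rfc_read_table(result)
    finally:
        conn.close()
```

The successor function modules take broadly similar parameters but relax the 512-character row limit; check the interface of whichever version exists in your system before relying on it.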

Third party applications

Third party applications can either use the APIs (Function Modules) mentioned above or create their own application logic to get data out of SAP. If they are using the standard function modules then the same restrictions apply. This means ODP extraction through RFC is not allowed – even if this process is managed by a 3rd party application.

Applications which implement their own interfaces on the SAP system are ‘safe’ – at least for the time being. The small downside of this approach is that you need to implement a piece of code (delivered by the application vendor) in each SAP system you want to connect to. The upside is that the end-to-end process is more robust, better performing and easier to maintain than solutions built on the SAP standard APIs.

Be mindful of third-party applications which read the database log or otherwise connect directly to the database layer: you will need an enterprise license for this; using a third-party application does not make a difference from a licensing perspective.

SAP Datasphere

And then there is SAP itself. SAP Datasphere is perfectly capable of getting data out of SAP and onto the cloud data platform of your choice. If this were the only use case you had for SAP Datasphere, I imagine it would be a very pricey solution. Still, I wanted to make sure I cover all the options.

SAP’s cynical move to keep control of your enterprise data (aka note 3255746)

SAP has rocked the boat. They have issued an SAP note (3255746) declaring a popular method for moving data from SAP into analytics platforms out of bounds for customers. Customers and software vendors are concerned: they have to ensure they operate within the terms and conditions of the license agreement with SAP, and it seems unfair that SAP unilaterally changes those terms after organisations have purchased their product. I will refrain from giving legal advice, but my understanding is that SAP notes are not legally binding. I imagine the legal teams will have a field day trying to work this all out. In this article I will explain the context and consequences of this SAP note. I will also get my crystal ball out and try to predict SAP’s next move, as well as giving you some further considerations which may help you decide how to move forward.

What exactly have SAP done this time?

SAP first published note 3255746 in 2022. In the note, SAP explained that third parties (customers, product vendors) could use the SAP APIs for the Operational Data Provisioning (ODP) framework, but that these APIs were not supported: they were meant for internal use, and SAP reserved the right to change their behaviour and/or remove them altogether. Recently, SAP have updated the note (version 4). Out of the blue, SAP declared it is no longer permitted to use the APIs for ODP. For good measure, SAP threatens to restrict and audit the unpermitted use of this feature. With a history of court cases decided in SAP’s favour over license breaches, it is no wonder that customers and software vendors are getting a bit nervous. So, let’s look at the wider context. What is this ODP framework, and what does this change actually mean for customers and product vendors?

SAP ODP – making the job of getting data out of SAP somewhat less painful

Getting data out of SAP is never easy, but ODP offered very useful features to take away some of the burden. It enabled external data consumers to subscribe to datasets. Instead of receiving difficult-to-decipher raw data, these datasets would contain data which was already modelled for analytical consumption. Moreover, the ODP framework supports ‘delta-enabled’ datasets, which significantly reduces the volume of data to refresh on a day-to-day basis. When the ODP framework was released (around 2011(1)), third-party data integration platforms were quick to provide a designated SAP ODP connector. Vendors like Informatica, Talend, Theobald and Qlik have had an ODP connector for many years; more recently, Azure Data Factory and Matillion released their connectors as well. SAP also offered a connection to the ODP framework through the open data protocol OData, which means you can easily build your own interface if the platform of your choice does not have an ODP plug-in.

One can imagine that software vendors are not best pleased with SAP’s decision to no longer permit the use of the ODP framework by third parties. Although all the platforms mentioned above have other types of SAP connectors(2), the ODP connector has been the go-to solution for many years. The fact that this solution was not officially supported by SAP never really scared the software vendors: ODP was, and remains, deeply integrated in SAP’s own technology stack, and the chances that SAP will change the architecture in current product versions are next to zero.

Predicting SAP’s next move

You might wonder why SAP is doing this. Well, in recent years, customers have voted with their feet and moved SAP data to more modern, flexible and open data & analytics platforms. There is no lack of competition: AWS, Google, Microsoft, Snowflake and a handful of other contenders all offer cost-effective data platforms with limitless scalability. On these platforms you are free to use the data and analytics tools of your choice, or take the data out to wherever you please without additional costs. SAP also has a data & analytics platform, but it is well behind the curve. There are two SAP products to consider: SAP Analytics Cloud (SAC) and SAP Datasphere.
The first is a planning and analytics toolset for business users, introduced in 2015. For a long time it was largely ignored, but in recent years it has come to maturity and should now be considered a serious contender to Power BI, Tableau, Qlik and so on. I’m not going to do a full-blown comparison here, but the fact that SAC has integrated planning capabilities is a killer feature.
SAP Datasphere is a different story. It is relatively new (introduced as SAP Data Warehouse Cloud in 2020), and seasoned SAP professionals know what to do with new products: if you’re curious, you can do a PoC or innovation project; if not, or if you don’t have the time or means for this kind of experimenting, you just sit and wait until the problems are flushed out. SAP Datasphere is likely to suffer from teething problems for a while longer, and it will take time before it is as feature-rich as the main competitor data platforms. One of the critical features missing until very recently was the ability to offload data to cloud storage (S3/Blob/buckets, depending on your cloud provider). That feature was added in February 2024 – around the same time SAP decided that third parties could no longer use the ODP interface to achieve exactly the same. Coincidence?

So where is SAP going with this? Clearly they want all their customers to embrace SAP Datasphere. SAP charges for storage and compute, so of course they try to contain as many workloads and as much data as they can on their platform. This is no different from the other platform providers. What is different is that SAP deliberately puts up barriers to taking the data out, where other providers let you take your data wherever you want. SAP’s competitors know they offer a great service at a very competitive price. It seems SAP doesn’t want to compete on price or service, but chooses to put up a legal barrier to keep the customer’s data on their platform.

SAP Certification for 3rd party ETL tools no longer available

Blocking the use of ODP by third-party applications is only the beginning. SAP has already announced it will no longer certify third-party ETL tools for the SAP platform(3). The out-and-out SAP specialists have invested heavily in creating bolt-on features on the SAP platform to replicate large SAP datasets efficiently, often in near real time. The likes of Fivetran, SNP Glue and Theobald have all introduced their own innovative (proprietary) code purely for this function. SAP used to certify this code, but has now stopped doing so. Again, the legal position is unclear and perhaps SAP will do a complete U-turn on this, but for now it leaves these vendors wondering what the future will be for their SAP data integration products.

What do you need to do if you use ODP now through a 3rd party application?

My advice is to start by involving your legal team. In my opinion an SAP note is not legally binding the way terms and conditions are, but I appreciate my opinion in legal matters doesn’t count for much.
If you are planning to stay on your current product version for the foreseeable future and you have no contract negotiations with SAP coming up then you can carry on as normal. When you are planning to move to a new product version though, or if your contract with SAP is up for renewal, it would be good to familiarise yourself with alternatives.

As I mentioned before, most 3rd party products have multiple ways of connecting to SAP, so it would be good to understand what the impact is if you had to start using a different method.
It also makes sense to stay up to date with the SAP Datasphere roadmap. When I put my rose-tinted glasses on, I can see a future where SAP provides an easy way to replicate SAP data to the cloud storage of your choice, in near real time, in a cost-effective way. Most customers wouldn’t mind paying a reasonable price for this. I expect SAP and its customers might have very different expectations of what that reasonable price is, but until the solution is there, there is no point speculating. If you are looking for some inspiration to find the best way forward, come talk to Snap Analytics. Getting data out of SAP is our core business and I am sure we can help you find a future-proof, cost-effective approach.


Footnotes and references

(1) – The ODP framework version 1.0 was released around 2011, specifically with SAP NetWeaver 7.0 SPS 24, 7.01 SPS 09, 7.02 SPS 08. The current version of ODP is 2.0, which was released in 2014 with SAP NetWeaver 7.3 SPS 08, 7.31 SPS 05, 7.4 SPS 02. See notes 1521883 and 1931427 respectively.

(2) – Other types of SAP connections: One of my previous blog posts discusses the various ways of getting data out of SAP in some detail: Need to get data out of SAP and into your cloud data platform? Here are your options

(3) – Further restrictions for partners on providing solutions to get data out of SAP, see this article: Guidance for Partners on certifying their data integration offerings with SAP Solutions

Better, cheaper, faster – you can have all three!

You may have heard the saying “Better, cheaper, faster – pick two”. The idea isn’t new. If you want something really good and you want it quickly, you’re going to have to pay. If you want to save money and still keep the quality, you’ll need to wait. And so on.

But in big data that mantra is being subverted. Thanks to the cloud, you can now deliver data solutions that are more flexible, scalable and, crucially, cheaper and faster. Best of all, it doesn’t mean abandoning quality or reliability – if well designed, you can achieve better quality and consistency.

Not so long ago, if you were planning a data project you might have had to set aside a big chunk of your budget for licences, servers and a data warehouse in which to store your data. On top of this, you’d need a specialised (and potentially expensive) team of people to set up the infrastructure and operate the hardware. This effectively put data analysis out of the reach of many smaller businesses.

The cloud has changed all that – revolutionising the delivery of data analytics. So, what exactly can it offer you?

BETTER

Today’s cloud-based technology is so simple that even the least tech-savvy people can reap the rewards. You no longer need to know how much storage you require up front, as companies like Snowflake simply offer small, medium or large solutions plus the option of almost infinite scalability. For many new and smaller businesses the entry package will be enough, allowing you to upload millions of rows of data. And as you expand, you can simply scale up.

Conventional wisdom once said that there was no more secure way of storing data than keeping it all on your premises, maintained and managed by a member of staff. In 2019 that is no longer true. Even the most conscientious IT person will be constrained by your budget and facilities. By handing this responsibility over to the likes of Microsoft, with their near-infinite resources, there is arguably no safer way of storing your valuable data.

CHEAPER

The maths is simple: with modern data platforms like Snowflake, you just pay for what you use. Whereas previously you had to work out up front how much space you needed and hope you hadn’t over- or underestimated (with the associated painful time and cost implications), now you can simply scale up or down as and when your business requires. If, for example, your business acquires a new company, you can simply increase the size of your data warehouse instantly. At the time of writing, a terabyte of storage with Snowflake costs an astonishing $23 per month.
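As a back-of-envelope illustration of pay-for-what-you-use pricing: the $23/TB/month storage figure is from the text above, while the per-credit price here is an illustrative assumption (it varies by edition and region; check your contract). An XS Snowflake warehouse consumes 1 credit per running hour.

```python
# Rough monthly Snowflake cost sketch under the stated assumptions.
STORAGE_PER_TB_MONTH = 23.00  # storage price quoted in the text
PRICE_PER_CREDIT = 3.00       # assumed; varies by edition and region
CREDITS_PER_HOUR = 1          # an XS warehouse burns 1 credit per hour

def monthly_cost(storage_tb: float, compute_hours: float) -> float:
    """Estimate one month's bill: storage plus metered compute."""
    storage = storage_tb * STORAGE_PER_TB_MONTH
    compute = compute_hours * CREDITS_PER_HOUR * PRICE_PER_CREDIT
    return round(storage + compute, 2)

# e.g. 2 TB of data with the warehouse running one hour a day:
# monthly_cost(2, 30) -> 2*23 + 30*1*3 = 136.0
```

The striking part is the compute term: a warehouse that is suspended costs nothing, which is exactly the waste-reduction point made below.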

This flexibility also means reduced waste. Once, you had to pay for a solution that might only be fully used on one day a month, when you had to run 10,000 reports; the rest of the month it sat idle, costing you money. Now you can pay for the smallest package for the majority of the month and set it to scale up automatically when you really need the resources.

FASTER

Remember the sound of whirring fans and the wall of heat that would hit you when you went anywhere near the server room? Thanks to the cloud you can do away with the racks upon racks of energy-guzzling storage and move it all off site, possibly thousands of miles away. This doesn’t make it slower; thanks to modern high-bandwidth networks, you can access your data in a fraction of the time, generating reports in 10 seconds rather than 20 minutes.

Several years ago Snap Analytics was hired by a large automotive manufacturer for a major project based on their premises. At the time cloud storage didn’t have quite the same functionality and wasn’t trusted to do the job. As a result we had to work on site with their people, working within their existing systems just to set up the architecture. It added nearly 6 months to the project – and quite a few zeros to the final invoice. Thankfully, with modern data platforms, these overheads are completely eliminated, the scalability is infinite and the speed is truly phenomenal. And all delivered at a fraction of the price!