SAP’s cynical move to keep control of your enterprise data (aka note 3255746)

SAP has rocked the boat. They have issued an SAP note (3255746) declaring a popular method for moving data from SAP into analytics platforms out of bounds for customers. Customers and software vendors are concerned: they have to ensure they operate within the terms and conditions of their license agreement with SAP, and it seems unfair that SAP can unilaterally change those terms after organisations have purchased the product. I will refrain from giving legal advice, but my understanding is that SAP notes are not legally binding. I imagine the legal teams will have a field day trying to work this all out. In this article I will explain the context and consequences of this SAP note. I will also get my crystal ball out and try to predict SAP's next move, and offer some further considerations to help you decide how to move forward.

What exactly have SAP done this time?

SAP first published note 3255746 in 2022. In the note, SAP explained that 3rd parties (customers, product vendors) could use the SAP APIs for the Operational Data Provisioning (ODP) framework, but that these APIs were not supported: they were meant for internal use, and SAP reserved the right to change their behaviour or remove them altogether. Recently, SAP updated the note (version 4). Out of the blue, SAP declared it is no longer permitted to use the APIs for ODP. For good measure, SAP threatens to restrict and audit unpermitted use of this feature. With a history of court cases decided in SAP's favour over license breaches, it is no wonder that customers and software vendors are getting nervous. So, let's look at the wider context. What is this ODP framework, and what does all this actually mean for customers and product vendors?

SAP ODP – making the job of getting data out of SAP somewhat less painful

Getting data out of SAP is never easy, but ODP offered very useful features to take away some of the burden. It enabled external data consumers to subscribe to datasets. Instead of receiving difficult-to-decipher raw data, these datasets contain data already modelled for analytical consumption. Moreover, the ODP framework supports 'delta-enabled' datasets, which significantly reduce the volume of data to refresh on a day-to-day basis. When the ODP framework was released (around 2011(1)), 3rd party data integration platforms were quick to provide a designated SAP ODP connector. Vendors like Informatica, Talend, Theobald and Qlik have had an ODP connector for many years; more recently, Azure Data Factory and Matillion released connectors as well. SAP also offered a connection to the ODP framework through the open data protocol OData. This means you can easily build your own interface if the platform of your choice does not have an ODP plug-in.
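To give a feel for how simple such a do-it-yourself OData interface can be, here is a minimal sketch in Python. The host, service path and entity set name are hypothetical placeholders, not a real SAP endpoint; only the general OData query-option syntax (`$format`, `$top`, `$skip`) is standard.

```python
# Illustrative sketch (not an official SAP client): composing and paging an
# OData request of the kind an ODP-exposed service would accept.
# The host, service path and entity set below are hypothetical placeholders.
import base64
import json
import urllib.parse
import urllib.request

def build_odata_url(base_url, entity_set, top=1000, skip=0):
    """Compose an OData query URL with JSON output and paging options."""
    query = urllib.parse.urlencode(
        {"$format": "json", "$top": top, "$skip": skip},
        safe="$",  # keep the $ of OData system query options unescaped
    )
    return f"{base_url.rstrip('/')}/{entity_set}?{query}"

def fetch_page(url, user, password):
    """Fetch one page of results using HTTP basic authentication."""
    request = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    request.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(request) as response:
        return json.load(response)["d"]["results"]

# Example: build the URL for the second page of 500 records from a
# (hypothetical) extractor exposed as an entity set.
url = build_odata_url(
    "https://sap-host:443/sap/opu/odata/sap/Z_MY_ODP_SRV",
    "EntityOf0FI_GL_4", top=500, skip=500,
)
print(url)
```

In a real pipeline you would loop, incrementing `skip` until `fetch_page` returns an empty list.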

One can imagine that software vendors are not best pleased with SAP's decision to no longer permit the use of the ODP framework by 3rd parties. Although all the platforms mentioned above have other types of SAP connectors(2), the ODP connector has been the go-to solution for many years. The fact that this solution was not officially supported by SAP never really scared the software vendors: ODP was and remains deeply integrated in SAP's own technology stack, and the chances that SAP will change the architecture in current product versions are next to zero.

Predicting SAP’s next move

You might wonder why SAP is doing this. Well, in recent years customers have voted with their feet and moved SAP data to more modern, flexible and open data & analytics platforms. There is no lack of competition: AWS, Google, Microsoft, Snowflake and a handful of other contenders all offer cost-effective data platforms with limitless scalability. On these platforms you are free to use the data and analytics tools of your choice, or take the data out to wherever you please without additional costs. SAP also has a data & analytics platform, but it is well behind the curve. There are two SAP products to consider: SAP Analytics Cloud (SAC) and SAP DataSphere.
The first is a planning and analytics toolset for business users and was introduced in 2015. For a long time, it was largely ignored. In recent years, it has come to maturity and should now be considered a serious contender to PowerBI, Tableau, Qlik and so on. I’m not going to do a full-blown comparison here but the fact that SAC has integrated planning capabilities is a killer feature.
SAP DataSphere is a different story. It is relatively new (introduced as SAP Data Warehouse Cloud in 2020), and seasoned SAP professionals know what to do with new products: if you're curious, run a PoC or innovation project; if not, or you don't have the time or means for that kind of experimenting, you sit and wait until the problems are ironed out. SAP DataSphere is likely to suffer from teething problems for a while longer, and it will take time before it is as feature-rich as the main competitor platforms. One critical feature missing until very recently was the ability to offload data to cloud storage (S3/Blob/buckets, depending on your cloud provider). That feature was added in February 2024, around the same time SAP decided that 3rd parties could no longer use the ODP interface to achieve exactly the same. Coincidence?

So where is SAP going with this? Clearly they want all their customers to embrace SAP DataSphere. SAP charges for storage and compute, so of course they try to contain as many workloads and as much data as they can on their platform. This is no different from the other platform providers. What is different is that SAP deliberately puts up barriers to taking the data out, where other providers let you take your data wherever you want. SAP's competitors know they offer a great service at a very competitive price. It seems SAP doesn't want to compete on price or service, but chooses to put up a legal barrier to keep the customer's data on its platform.

SAP Certification for 3rd party ETL tools no longer available

Blocking the use of ODP by 3rd party applications is only the beginning. SAP has already announced it will no longer certify 3rd party ETL tools for the SAP platform(3). The out-and-out SAP specialists have invested heavily in creating bolt-on features on the SAP platform to replicate large SAP datasets efficiently, often in near real-time. The likes of Fivetran, SNP Glue and Theobald have all introduced their own innovative (proprietary) code purely for this function. SAP used to certify this code, but has now stopped doing so. Again, the legal position is unclear and perhaps SAP will do a complete U-turn on this, but for now it leaves these vendors wondering what the future holds for their SAP data integration products.

What do you need to do if you use ODP now through a 3rd party application?

My advice is to start by involving your legal team. In my opinion an SAP note is not legally binding the way terms & conditions are, but I appreciate my opinion in legal matters doesn't count for much.
If you are planning to stay on your current product version for the foreseeable future and have no contract negotiations with SAP coming up, then you can carry on as normal. If you are planning to move to a new product version, though, or if your contract with SAP is up for renewal, it would be good to familiarise yourself with the alternatives.

As I mentioned before, most 3rd party products have multiple ways of connecting to SAP, so it would be good to understand the impact if you had to switch to a different method.
It also makes sense to stay up to date with the SAP DataSphere roadmap. When I put my rose-tinted glasses on, I can see a future where SAP provides an easy way to replicate SAP data to the cloud storage of your choice, in near real-time, in a cost-effective way. Most customers wouldn't mind paying a reasonable price for this. I expect SAP and its customers might have very different ideas of what that reasonable price is, but until the solution is there, there is no point speculating. If you are looking for some inspiration to find the best way forward, come talk to Snap Analytics. Getting data out of SAP is our core business and I am sure we can help you find a future-proof, cost-effective way forward.


Footnotes and references

(1) – The ODP framework version 1.0 was released around 2011, specifically with SAP NetWeaver 7.0 SPS 24, 7.01 SPS 09, 7.02 SPS 08. The current version of ODP is 2.0, released in 2014 with SAP NetWeaver 7.3 SPS 08, 7.31 SPS 05, 7.4 SPS 02. See notes 1521883 and 1931427 respectively.

(2) – Other types of SAP connections: One of my previous blog posts discusses the various ways of getting data out of SAP in some detail: Need to get data out of SAP and into your cloud data platform? Here are your options

(3) – Further restrictions for partners on providing solutions to get data out of SAP, see this article: Guidance for Partners on certifying their data integration offerings with SAP Solutions

AI Implementation Race

The first AI winter took place between the 1970s and 1980s. This describes a period in which funding, particularly for emerging technologies, shifted between active and inactive cycles. A lack of progress and unrealistic expectations were the main drivers behind the cut in funding. The situation deteriorated further from the 1980s to the 1990s, leading to the second AI winter, driven primarily by widespread doubt about the capabilities of AI technologies. By 2018, however, the tide had shifted. Compared to other industries, healthcare received the most substantial support for its projects from governments and investors. In the United States between 2017 and 2019, over $100 million was invested in American start-ups specializing in pathology AI, with a focus on developing practical AI solutions for diagnostic purposes. As part of the UK government's second Life Sciences Sector Deal, a £1.3 billion investment was announced to aid experts in detecting diseases earlier with the use of AI (Serag et al., 2019). Furthermore, 38 projects received £36 million of UK government investment in 2021, in the hope of driving innovation and improving the quality of care in the UK healthcare industry (GOV.UK, 2021). Consequently, opportunities for partnerships between academic institutions, industry organisations, and the NHS have opened up, with the aim of collaboration and innovation towards future technological development. An inevitable "AI implementation race" has since ensued across the manufacturing, financial and banking, retail, transportation, and hospitality industries.

Although tempting, it is advisable not to enter this race guns blazing. Businesses must carefully consider several factors to assess their readiness to implement AI, including investment, strategic alignment, and regulatory and ethical considerations. The fundamental functions of management (planning, leading, organising, and controlling) are geared to steer management efforts towards organizational objectives. When coupled with AI, these functions become formidable allies for decision-making, advancing businesses towards their primary objectives of profitability, market leadership and growth, and stakeholder value creation.

Business Goals

Profitability

Enhanced predictive analytics powered by AI can play a pivotal role in guiding strategic decision-making, enhancing the precision of financial forecasts, and improving risk management. This is closely tied to the functions of planning, organising, and controlling within management. AI-infused project management solutions leverage predictive analytics to foresee project risks, allocate resources effectively, and streamline project timelines and finances. Various project management solutions incorporate AI. Microsoft Project includes AI features such as automated scheduling, risk assessment, and resource optimization to help project managers plan and execute projects more effectively. Asana, another widely used project management tool, employs AI to assist with task prioritization, resource allocation, and scheduling, helping teams streamline their workflows. Companies like Accenture may leverage AI to analyse project data and identify trends that can inform project management decisions, and consulting firms like McKinsey may use AI-infused project management tools to improve project planning and optimize client engagements.

This translates into:

  • Optimizing resource utilization
  • Driving cost efficiencies
  • Optimizing revenue streams
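To make the predictive-analytics idea above concrete, here is a deliberately simple sketch of a project risk score. Real AI-infused tools learn their weights from historical project data; the factors, weights and thresholds here are illustrative assumptions only.

```python
# Hand-weighted sketch of the kind of project risk score that AI-infused
# project management tools derive from historical data. The factors and
# weights below are illustrative assumptions, not a real model.
def project_risk_score(pct_budget_spent, pct_scope_done, open_issues, team_turnover):
    """Return a 0-1 risk score; higher means a greater chance of overrun."""
    # spending running ahead of delivery is the strongest warning sign
    burn_gap = max(0.0, pct_budget_spent - pct_scope_done)
    score = (0.5 * burn_gap
             + 0.3 * min(open_issues / 50, 1.0)   # cap the issue-count factor
             + 0.2 * team_turnover)
    return min(score, 1.0)

# A project 80% through its budget but only 50% through its scope,
# with 40 open issues and 25% team turnover:
at_risk = project_risk_score(
    pct_budget_spent=0.8, pct_scope_done=0.5,
    open_issues=40, team_turnover=0.25,
)
print(f"risk score: {at_risk:.2f}")
```

A trained model would replace the hard-coded weights, but the shape of the output (a score that planning and controlling functions can act on) is the same.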

Market Leadership and Growth

AI-driven insights can identify market trends, consumer preferences, and competitive dynamics, facilitating the development of innovative products and services. Companies like Amazon may leverage AI to optimize warehouse operations, forecast demand, and coordinate product deliveries. Similarly, AI-fortified risk management systems evaluate risks across the enterprise, simulate various scenarios, and propose strategies for risk mitigation, ensuring adherence to regulations and protection against unforeseen circumstances. AI algorithms analyse data from multiple sources, including supplier performance metrics, geopolitical events, and weather forecasts, to assess supply chain risks proactively. Such information may affect where and when organisations form partnerships and expand.

This results in:

  • Entering markets with a competitive edge
  • Maintaining a strong position by outperforming competitors
  • Identifying opportunities for strategic partnerships
  • Exploring expansion possibilities in new markets

Stakeholder Value Creation

AI has the potential to elevate customer experiences through personalized recommendations, efficient customer service, and tailored product offerings. E-commerce platforms like Amazon and Netflix use AI-powered recommendation systems to personalize product recommendations based on user behaviour, preferences, and purchase history. Sentiment analysis tools, powered by AI, monitor feedback from employees, reviews from customers, and interactions on social media. This is then used to assess sentiment and pinpoint areas for enhancing leadership strategies and boosting employee engagement. Amazon Comprehend is a natural language processing service from Amazon Web Services (AWS) that offers sentiment analysis features. Amazon Comprehend offers multi-language support and integrates with other AWS services for seamless integration into existing workflows. Additionally, leadership development programs augmented by AI utilize machine learning algorithms to scrutinize leadership attributes and conduct, pinpointing areas of proficiency and areas needing improvement, and offering tailored coaching and training suggestions. Platforms such as Peakon use AI to analyse employee feedback data and provide actionable insights to leaders and HR professionals for improving leadership effectiveness and employee engagement.

Thus, leading to:

  • Enhanced brand reputation
  • Customer loyalty
  • Employee retention
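The sentiment analysis described above can be illustrated with a toy classifier. Managed services such as Amazon Comprehend use trained language models; this word-list version is purely a sketch of the input and output, with made-up word lists.

```python
# Toy sketch of what managed sentiment services (e.g. Amazon Comprehend)
# do at scale: label a piece of feedback POSITIVE, NEGATIVE or NEUTRAL.
# The word lists are illustrative; real services use trained models.
POSITIVE = {"great", "love", "helpful", "excellent", "happy"}
NEGATIVE = {"poor", "slow", "frustrating", "unhappy", "broken"}

def sentiment(text):
    """Classify text by counting positive vs negative words."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "POSITIVE"
    if score < 0:
        return "NEGATIVE"
    return "NEUTRAL"

print(sentiment("The new dashboard is great and the team is happy"))
print(sentiment("Support has been slow and frustrating"))
```

Feeding every customer review or employee survey comment through such a classifier is what lets leadership dashboards surface sentiment trends over time.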

Stakeholder Engagement and the Role of Transparency

The power-interest matrix gives decision makers insight into key stakeholders. As this can vary across organisations, identifying key stakeholders when discussing the implementation of AI systems is pivotal. It is vital for organisations to have a clear view of who holds the highest levels of influence or interest, as these stakeholders may be affected by, or may negatively influence, their decisions.
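The matrix maps each stakeholder's power and interest to one of four standard engagement strategies, which can be sketched directly in code. The 0-1 scores and the example stakeholders below are illustrative assumptions.

```python
# Minimal sketch of the power-interest matrix: map each stakeholder's
# power and interest (scored 0-1 here, an illustrative scale) to the
# standard engagement strategy for their quadrant.
def engagement_strategy(power, interest, threshold=0.5):
    """Return the classic quadrant strategy for a power/interest pair."""
    if power >= threshold and interest >= threshold:
        return "manage closely"
    if power >= threshold:
        return "keep satisfied"
    if interest >= threshold:
        return "keep informed"
    return "monitor"

# Hypothetical stakeholders in a healthcare AI rollout: (power, interest)
stakeholders = {
    "hospital board": (0.9, 0.8),
    "clinical staff": (0.4, 0.9),
    "general public": (0.3, 0.4),
}
for name, (power, interest) in stakeholders.items():
    print(f"{name}: {engagement_strategy(power, interest)}")
```

Scoring stakeholders this way makes the discussion of who could delay or derail an AI implementation explicit rather than anecdotal.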
Stakeholder skepticism about the implementation of AI, particularly in the healthcare industry, often stems from its lack of explainability. Accountability and trustworthiness are significant driving forces for successful AI implementation, as they directly address the issues raised in discussions of explainable AI. Regardless of the industry, trust needs to be built and maintained with stakeholders. Stakeholders such as the general public and employees can have a negative impact on implementation, particularly if they display resistance to change. If resistance stems primarily from key stakeholders, the implementation may face delays or failure. It is important to assess the origins of this resistance in order to address it. Resistance to change is often rooted in misinformation accumulated over time. In such cases, informing potential users of the benefits, and openly and honestly addressing their fears, may result in less pushback. An undercurrent of resistance from healthcare employees caused Welsh Health Boards difficulties when attempting to implement technological solutions to increase operational efficiency. Bridges (2003) states that the first task of transition management involves convincing people that there is more to life than where they are right now. However, if they are consumed by fear and kept in the dark, they will never be open to the idea of change.

Resource Allocation for AI Implementation

Financial Resources

While the introduction of AI systems may strain business finances, it’s essential to consider the Return on Investment (ROI). Conducting a thorough ROI analysis provides a clearer picture of the associated benefits, whether short or long term. Factors such as efficiency gains, cost savings, and potential revenue growth should be considered.
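The ROI analysis mentioned above amounts to straightforward arithmetic once the factors are estimated. The figures in this sketch are illustrative placeholders, not benchmarks.

```python
# Simple ROI sketch for an AI initiative: projected gains (efficiency
# savings plus new revenue) against upfront and running costs over a
# chosen horizon. All figures below are illustrative placeholders.
def roi(upfront_cost, annual_running_cost, annual_gain, years):
    """Return ROI as a fraction: (total gain - total cost) / total cost."""
    total_cost = upfront_cost + annual_running_cost * years
    total_gain = annual_gain * years
    return (total_gain - total_cost) / total_cost

# e.g. a £200k build, £50k/yr to run, £150k/yr in combined savings and
# revenue, evaluated over 3 years
three_year_roi = roi(200_000, 50_000, 150_000, 3)
print(f"3-year ROI: {three_year_roi:.0%}")
```

Running the same calculation over different horizons is what separates a short-term loss-maker from a sound long-term investment, which is why the time frame of the benefits matters as much as their size.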

People

During the talent acquisition phase, valuing previous experience working with AI is essential. Alongside recruiting new talent, businesses should consider existing employees and explore strategies for training and upskilling. Temporary hiring of experts to train existing employees can empower and instill confidence in working alongside AI.

Technological Resources

For businesses unaccustomed to technology, investing in a robust and scalable technological infrastructure is crucial for effective AI implementation. Considerations around cloud computing, data storage, and computational power are necessary components in this technological transformation.

More and more organisations are joining the "AI implementation race". However, the last thing you want is to join a race you are not ready for.
Recommended considerations before entering the race:

  • SWOT Analysis (Strengths, Weaknesses, Opportunities and Threats)
  • Capacity Analysis
  • Feasibility Analysis
  • Technology Acceptance Model (Davis, 1989)

Planning, leading, organising, and controlling are pivotal when driving the primary objectives of profitability, market leadership and growth, and stakeholder value creation. AI implementation offers a range of benefits, including increased efficiency, accuracy, insight, innovation, automation, and risk mitigation. As a result, embedding AI tools into an organisation's daily operations accelerates the business towards achieving its objectives.


Sources used for this article

Bridges, W. (2003). Managing transitions: Making the most of change (2nd ed.). Nicholas Brealey.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. Management Information Systems Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008
GOV.UK. (2021, June 16). £36 million boost for AI technologies to revolutionise NHS care. https://www.gov.uk/government/news/36-million-boost-for-ai-technologies-to-revolutionise-nhs-care
Serag, A., Ion-Margineanu, A., Qureshi, H., McMillan, R., Saint Martin, M. J., Diamond, J., O'Reilly, P., & Hamilton, P. (2019). Translational AI and deep learning in diagnostic pathology. Frontiers in Medicine, 6, 185. https://doi.org/10.3389/fmed.2019.00185

Sigma Computing – the newest data visualisation tool?

Sigma Computing, already a transformative force in the U.S., has now crossed the pond to make its mark in the UK as a leading-edge cloud-based data visualization tool. This article explores the benefits of Sigma and why it stands poised as a formidable contender in the ever-evolving realm of data visualization tools.

Works well with cloud data platforms 

At the core of Sigma's appeal lies its seamless integration with major cloud data platforms, with compatibility for industry giants such as Snowflake, Azure, Redshift, and Databricks. This strategic alignment positions Sigma as a versatile solution that optimizes performance and cost-effectiveness by relying exclusively on cloud data warehouses. The tool distinguishes itself by pushing calculations down to the data warehouse level, eliminating the need for on-premise servers. Notably, Sigma refrains from storing data within the software, opting instead to intelligently cache results. According to Sigma, this caching allows users to increase their Snowflake usage by 20-30% without incurring additional credits, a testament to its savvy approach to resource utilization.

Interesting Feature: Input tables for ad-hoc scenario modelling 

A standout feature in Sigma's arsenal is input tables, a tool for users engaged in ad-hoc scenario modelling. This functionality empowers users to modify specific cells with different values, providing a dynamic view of how visualizations might evolve. Crucially, alterations within Sigma do not impact the underlying data in the cloud data warehouse, preserving data integrity. It also means you don't have to download the data into Excel to play with it and analyse it.

Easy to use 

Ease of use is a cornerstone of Sigma's design, positioning it as a low-code, user-friendly solution that caters intuitively to diverse users. The interface mirrors the simplicity of spreadsheets, with everything stemming from a central table within the Sigma workbook. This simplicity extends to the creation of child tables, filters, and various visualizations. Unlike tools that necessitate intricate coding for calculations, Sigma employs a straightforward notation, using expressions like [column 1]*[column 2] for row-level calculations. The platform also offers automatic calculations for time dimensions, covering periods, dynamic rolling 4-week analyses, and dates relative to the current day.
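To show what that bracket notation amounts to, here is a toy evaluator that applies an expression such as `[quantity] * [unit price]` to each row of a table. This mimics the spreadsheet-style notation only; the parser, column names and sample data are all hypothetical, not Sigma internals (in Sigma itself the calculation is pushed down to the warehouse).

```python
# Toy illustration of spreadsheet-style row-level calculations: evaluate a
# bracket-notation expression like "[quantity] * [unit price]" per row.
# The mini-parser and sample data are hypothetical, for illustration only.
import re

def row_calc(rows, expression):
    """Evaluate a [column name] expression against each row dict."""
    results = []
    for row in rows:
        # substitute each [column name] with that row's literal value
        expr = re.sub(r"\[([^\]]+)\]",
                      lambda m: repr(row[m.group(1)]), expression)
        # eval is fine for a sketch; never eval untrusted input in real code
        results.append(eval(expr))
    return results

orders = [
    {"quantity": 3, "unit price": 9.99},
    {"quantity": 1, "unit price": 24.50},
]
line_totals = row_calc(orders, "[quantity] * [unit price]")
print(line_totals)
```

The point of the notation is exactly this directness: the analyst writes the formula once and it applies to every row, with no loop or SQL in sight.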

Speed and collaboration

Speed is a critical factor in data visualization, and Sigma excels in this domain by virtue of not storing data directly within the software. Instead, as a web browser-based application, it provides swift access to reports from anywhere with an internet connection. This real-time accessibility sets Sigma apart from competitors like PowerBI, allowing users to collaborate seamlessly on the same workbook—reminiscent of the collaborative features seen in Google Sheets. This real-time collaboration eliminates the need for cumbersome email exchanges, streamlining workflow and enhancing overall productivity.

Costing

Cost considerations play a pivotal role in the adoption of any software, and Sigma is strategically positioned for large organizations. Viewer access is entirely free, with costs limited to a platform fee and charges for content creators. This pricing model caters to organizations where sharing data with clients or partners is integral to day-to-day operations. A case in point is DoorDash, the American equivalent of Deliveroo, which leverages Sigma as its data visualization tool. DoorDash utilizes Sigma to share reports with all restaurants on its app, incurring no additional expenses for viewer licensing—a testament to Sigma’s cost-effective approach for data-sharing businesses.

In conclusion, Sigma Computing is more than just the newest data visualization tool: its strategic alignment with major cloud data warehouses, coupled with innovative features, user-friendly design, and cost-effective pricing, positions it as a key player in the dynamic and ever-expanding field of data visualization. As organizations continue to navigate the complex landscape of data management and analysis, Sigma stands out for its efficiency, collaboration, and forward-thinking approach. Having gained prominence in the U.S., its entry into the UK market signals a potential shift in the landscape for organizations seeking an efficient, user-friendly, and cost-effective data visualization platform.