Your Enterprise Data Isn’t Aligned and It Shows 

Luka Jasionyte, from the marketing team at Snap Analytics, catches up with Raj Shah, Director of Strategic Accounts, to dive into his journey in data and analytics and explore the key challenges facing enterprise businesses today. Raj shares his insights on making sense of enterprise data, tackling data silos, the importance of strong governance, and how AI and high-quality data engineering are shaping the future. 

What inspired you to get into data and analytics, Raj? 

I was inspired by the opportunity to work across a variety of clients and industries, as data exists in every organisation. The business value that proper analytics can deliver feels almost magical, and although I didn’t enjoy programming, I found SQL surprisingly easy to write and understand.

What is the most exciting trend right now? 

AI, without a doubt. It has really focused everyone’s attention on getting their data right. It’s no longer something organisations can put off while continuing with manual or Excel-based analytics. AI depends on high-quality data, and ‘human-in-the-loop’ AI is the stepping stone for most organisations. Those that fail to embrace it risk being left behind.

What are some common data-related challenges that large enterprises typically face? 

Large enterprises often face several recurring data challenges that impact their ability to deliver reliable analytics. One major issue is the lack of a common data definition or language across the organisation. For example, a term like ‘margin’ can mean different things to different departments, leading to inconsistent reporting and decision-making. Another challenge is the reliance on flat files and manual data processing methods. These approaches are time-consuming, error-prone and make it difficult to scale analytics effectively. Excessive data transformation and manipulation often happens within front-end tools rather than being pushed down to the data warehouse. This creates inefficiencies, performance bottlenecks and governance risks.
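
To make the 'margin' problem concrete, here is a small illustrative sketch (the figures and the two competing definitions are hypothetical, not drawn from any client):

```python
# Illustrative only: the same order, two departmental definitions of "margin".
order = {"revenue": 100.0, "cost_of_goods": 60.0, "shipping": 10.0}

# One team defines margin against cost of goods only...
gross_margin = (order["revenue"] - order["cost_of_goods"]) / order["revenue"]

# ...while another also deducts shipping before reporting.
net_margin = (order["revenue"] - order["cost_of_goods"] - order["shipping"]) / order["revenue"]

print(f"gross: {gross_margin:.0%}, net: {net_margin:.0%}")  # 40% vs 30% from the same order
```

Two dashboards built on the same order book now disagree, and neither is "wrong" — which is exactly why a shared, governed definition in the warehouse matters.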

What are the top client priorities for those looking to drive successful outcomes in data and analytics? 

For clients aiming to achieve successful outcomes in data and analytics, several priorities stand out. First, making data a true business asset is essential. This involves consolidating information from multiple systems into a clean, unified data warehouse that provides a single source of truth. Second, building data literacy across the organisation is critical. When teams understand and trust the data, they can make informed decisions and fully leverage analytics capabilities. Finally, reducing reliance on manual processes and Excel-based reporting is a key step towards scalability and efficiency. Moving to automated, integrated solutions not only saves time but also improves accuracy and enables advanced analytics. Together, these priorities create the foundation for delivering actionable insights and driving measurable business value.

Raj, why Snap Analytics? 

Clients choose Snap because we combine deep industry expertise with technical excellence. Our consultants have broad experience across the modern data stack, enabling us to design and deliver solutions that meet diverse business needs. We also bring specialised knowledge of complex systems such as SAP, supported by proven frameworks that reliably extract data and integrate it into modern cloud platforms. This ensures accuracy, scalability and speed. Snap focuses on delivering real business value. Every project is driven by outcomes that matter, whether that’s improving decision-making, reducing costs or accelerating innovation. Our approach uses lean teams, automation and reusable frameworks to achieve efficiency, standardisation and strong governance, giving clients confidence that their investment translates into measurable results.

Real-Time Data Is the Revolution We’ve Been Waiting For

Luka Jasionyte, from the marketing team at Snap Analytics, catches up with Jan Van Ansem, Co-founder and Head of SAP at Snap Analytics, to get the lowdown on his journey through data and analytics, and to share insights on the transformative power of Generative AI, the shift towards real-time data warehousing, and strategies for overcoming enterprise data integration and governance challenges.

What inspired you to get into data and analytics, Jan? 

My passion for coding started early – I was just 12 when I bought my first home computer, eager to dive into programming with Basic. That initial curiosity soon turned into a career in software development, where I honed my skills in building applications and systems. At the time, data & analytics wasn’t recognised as a distinct field; rather, it was an underlying component of software engineering and database management. 

The landscape shifted dramatically when Ralph Kimball published The Data Warehouse Toolkit, introducing methodologies that sparked widespread discussions about data modelling and architecture. This led me to explore the contrasting philosophies of Kimball vs. Inmon, each offering unique approaches to structuring enterprise data. That debate ignited a deep interest in understanding how businesses could best harness their data and analytics to drive decision-making. 

Since then, I’ve remained captivated by the challenge of designing optimal enterprise data models, ones that not only store information efficiently but also empower users with actionable insights. Whether it’s crafting intuitive data warehouses, refining business intelligence strategies, or integrating modern analytics tools, I’ve always been driven by the pursuit of solutions that transform raw data into meaningful knowledge. 

What is the most exciting trend right now?

AI is the obvious one. It’s completely reshaping how we work and live. From automation to predictive analytics, personalised experiences, and smarter decision-making, it’s changing the game across industries. And the exciting part? It’s not some distant futuristic concept anymore. AI is already making businesses more efficient and innovative every day. 

But for me, the real breakthrough is something that has been talked about forever but never fully realised at an enterprise data scale. Real-time data warehouses. For years, this has been the holy grail of data warehousing. Something that’s always been promised but never quite delivered in a way that works seamlessly for large businesses. The problem has always been the reliance on batch processing, which means companies are stuck making decisions based on outdated data instead of seeing what’s happening right now. 

Now though, we’re finally seeing real-time analytics become a reality. Thanks to advancements in cloud computing, AI-powered data processing, and cutting-edge architectures, enterprises can move beyond traditional reporting and actually act on insights as they happen. Whether it’s fraud detection in finance, predicting stock levels in retail, or optimising supply chains in real time, these innovations are making data warehouses more powerful than ever. 

We’re not quite at full-scale adoption yet, but we’re closer than ever. And soon, real-time enterprise data won’t just be a competitive edge. It’ll be the standard for businesses that want to stay ahead. 

What are some common data-related challenges that large enterprises typically face?

One of the biggest challenges is integrating missing pieces of data quickly. Despite our best efforts, business users almost always require more detail than what is readily available in a data warehouse. Closing the gap from 90 per cent complete to 100 per cent often demands disproportionate effort, consuming significant time and resources. 

A major reason for this difficulty is that enterprise data is typically spread across multiple platforms, legacy systems, and departmental silos, making it difficult to locate and consolidate. Additionally, data governance policies and security restrictions can further delay the process, adding layers of complexity when accessing specific datasets. 

More agile tools and processes, along with improved visibility of where data resides, can help enterprises tackle this issue more effectively. Solutions such as real-time data integration, AI-driven analytics, and self-service data platforms are making it easier for business users to retrieve insights on demand without needing to rely entirely on IT teams. As technology progresses, organisations that prioritise agility and seamless data access will gain a competitive advantage in leveraging their information for strategic decision-making. 

What are the top client priorities for those looking to drive successful outcomes in data and analytics? 

One of the biggest priorities for clients is improving customer engagement, understanding what truly drives customer behaviour through data. Businesses don’t just want surface-level insights; they need a deep understanding of how customers interact with their products, services, and digital channels. Customers now expect personalised experiences, but delivering them at scale is only possible with smart enterprise data systems that can identify patterns and preferences in real time. Whether it’s AI-driven recommendations, predictive analytics, or automated insights, businesses are looking for ways to refine their offerings and strengthen customer relationships. 

A major challenge is connecting scattered data sources. Many organisations struggle with fragmented data spread across multiple platforms, making it difficult to get a complete picture of customer behaviour. By integrating data from different systems and applying machine learning models, companies can go from reactive decision-making to proactive, tailored engagement. 

Jan, why Snap Analytics? 

At Snap, delivering great solutions isn’t just about technical expertise, it’s about teamwork, learning, and making a real impact. Our consultants work closely with customers’ teams, sharing knowledge and refining strategies to create the best possible outcomes. 

We also have strong relationships with vendors, helping shape product roadmaps and drive innovation that benefits our clients. But beyond the work itself, what really sets us apart is the people. We have a talented, forward-thinking team that’s not only skilled but also genuinely passionate about problem-solving. And most importantly, we create an environment where people actually enjoy working together, making Snap a great place to collaborate and grow. 

SAP and Snowflake Unite: Zero-Copy Data Sharing Transforms the Business Data Fabric 

The enterprise data landscape has reached an inflection point. For decades, organisations have grappled with a fundamental tension: SAP systems hold their most critical business data, yet extracting that data for advanced analytics has meant choosing between preserving semantic richness and achieving cloud-scale performance. SAP and Snowflake’s newly announced partnership fundamentally changes this equation by enabling organisations to seamlessly leverage Snowflake’s AI Data Cloud and SAP Business Data Cloud with semantically rich data through zero-copy sharing. 

The Hidden Tax on SAP Data 

Every organisation running SAP knows the story. Your most valuable business data (financials, supply chain operations, customer relationships) lives in SAP. But when you want to combine that data with external sources, build advanced analytics, or train machine learning models, you face a painful reality: traditional integration approaches strip away the business context that makes SAP data valuable in the first place. 

For many customers, integrating data across multi-cloud and hybrid environments adds complexity, especially when bringing transactional and analytical workloads together, and that process comes with a hidden data tax: it strips away the business context and semantics that give data its meaning. 

This isn’t just a technical inconvenience. When you lose SAP’s semantic layer (the carefully defined business logic, KPIs, and relationships that took years to build), your analytics teams spend months reconstructing it. Different business units develop their own versions of truth. Data governance becomes a nightmare. And the promise of AI-driven insights remains perpetually out of reach. 

Two Products, One Unified Vision 

The partnership introduces two complementary offerings: SAP Snowflake, a solution extension for SAP Business Data Cloud that brings Snowflake’s data and AI platform directly to SAP BDC customers, and SAP Business Data Cloud Connect for Snowflake, which enables bidirectional, zero-copy data sharing with existing Snowflake instances. 

SAP Snowflake: Extending SAP BDC with Cloud-Scale Analytics 

SAP Snowflake unites SAP’s deep expertise in mission-critical business processes and semantically rich data with Snowflake’s unified platform capabilities for building AI and machine learning solutions. For organisations already invested in SAP BDC, this means gaining access to Snowflake’s full ecosystem (advanced analytics, AI capabilities through Cortex, the Snowflake Marketplace, and data collaboration features) without sacrificing the semantic richness of their SAP data products. 

The architecture enables something previously difficult to achieve: customers can harmonise SAP and non-SAP data while optimising total cost of ownership across workloads and build agents and AI applications fueled by trusted SAP data products. 

SAP BDC Connect: Meeting Customers Where They Are 

Many enterprises have already made significant investments in Snowflake. For these organisations, SAP BDC Connect for Snowflake enables integration of existing Snowflake instances with SAP Business Data Cloud for more seamless, zero-copy access, providing Snowflake users with real-time access to semantically rich SAP data products without duplication. 

This is where the partnership shows its strategic sophistication. Rather than forcing a rip-and-replace approach, SAP recognises that customers have diverse technology stacks and provides a path forward that respects existing investments. 

Why Zero-Copy Matters 

The term “zero-copy” might sound like technical jargon, but it represents a fundamental shift in how enterprise data integration works. Traditional approaches require copying data from SAP into Snowflake, creating multiple versions of truth, introducing latency, and multiplying storage costs. Worse, they break the connection to SAP’s semantic layer. 

The integration happens in near real time through a zero-copy connection, allowing organisations to build AI applications that combine SAP and non-SAP data while maintaining unified governance and performance. This means your data teams can work with live SAP data products in Snowflake, complete with all the business context and definitions SAP has built up over decades. 
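
The difference can be illustrated with a toy sketch (plain Python, not the Snowflake or SAP API): a copied dataset goes stale the moment the source changes, while a zero-copy share is effectively a live reference to the same data.

```python
# Conceptual illustration only: copy-based integration vs zero-copy sharing.
import copy

source = {"open_orders": 120}      # the "SAP side" of the share

snapshot = copy.deepcopy(source)   # traditional ETL: a duplicated dataset
live_view = source                 # zero-copy: a reference, nothing duplicated

source["open_orders"] = 95         # the business data changes

print(snapshot["open_orders"])     # 120 -- yesterday's copy, now stale
print(live_view["open_orders"])    # 95  -- current data, no second copy
```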

What This Means for Your Data Architecture 

If you’re running SAP and contemplating your analytics strategy, this partnership addresses several critical pain points: 

Semantic Preservation: Your SAP data products arrive in Snowflake with their business meaning intact. The KPIs, relationships, and definitions your organisation has invested years developing don’t need to be rebuilt. 

Real-Time Access: Near real-time data availability means your analytics, reports, and AI models work with current business data, not yesterday’s snapshot. 

Unified Governance: A single governance framework spans both platforms. You define access controls, data quality rules, and compliance policies once, and they apply across your integrated environment. 

Flexibility and Choice: Whether you’re starting fresh with SAP BDC or have years of investment in Snowflake, there’s a path forward that respects your current architecture. 

Cost Optimisation: Zero-copy sharing eliminates redundant storage costs and the compute overhead of traditional ETL processes. You can right-size your analytics infrastructure based on actual usage patterns. 

The Competitive Context 

This partnership follows integrations with Databricks in February 2025 and Google Cloud Platform BigQuery, making it the third such partnership SAP has announced in recent months. The pattern is clear: SAP is building an open data ecosystem where customers can choose the best tools for their specific use cases while maintaining the integrity of their SAP data. 

Industry analysts have taken note. Scott Bickley, an advisory fellow at Info-Tech Research Group, observed that Snowflake was the missing link SAP needed to enable bi-directional, zero-copy data sharing with non-SAP data sources. Sanchit Vir Gogia, chief analyst and CEO of Greyhound Research, noted this partnership feels less like a technical upgrade and more like SAP finally recognising how its customers actually work. 

Looking Ahead 

SAP Snowflake is planned to be generally available in Q1 2026, with SAP BDC Connect for Snowflake expected in H1 2026. These timelines give organisations the runway to assess how this partnership fits their data strategy and to begin planning for integration. 

The implications extend beyond immediate technical capabilities. This partnership signals a broader shift in enterprise data architecture, away from monolithic, vendor-locked systems and toward flexible, semantically rich data fabrics that can adapt to changing business needs while preserving the institutional knowledge embedded in business data definitions. 

How Snap Analytics Can Help 

Snap Analytics was born from the SAP data and analytics space. Our founders all came from SAP-specialist Bluefin, and we’ve spent years helping organisations navigate the complex intersection of SAP data, ecosystem data, next-generation data teams, and cloud analytics platforms. The SAP-Snowflake partnership creates new opportunities to unlock value from your SAP investments, but realising that value requires careful planning and execution.

Complimentary 1-Hour SAP BDC Roadmap Session 

Snap Analytics is offering a complimentary 1-hour roadmap workshop to help you explore your SAP BDC options and chart your path forward in 2026. During this session, our experts will work with your team to: 

  • Evaluate your current SAP and analytics landscape 
  • Review SAP BDC integration options (SAP Snowflake vs. SAP BDC Connect) 
  • Identify quick wins and strategic priorities 
  • Develop a phased implementation roadmap aligned with the current 2026 availability timelines 
  • Assess resource requirements and budget considerations 

This is an ideal opportunity to understand how the SAP-Snowflake partnership fits your organisation’s specific needs now that general availability is approaching.  

Our Full Range of Services 

Beyond the roadmap session, we can help you: 

  • Assess Your Current State: Evaluate your existing SAP and analytics architecture to identify opportunities for optimisation through the SAP-Snowflake integration. 
  • Design Your Target Architecture: Develop a blueprint that leverages SAP BDC and Snowflake while preserving your semantic models and governance frameworks. 
  • Execute Strategic Migrations: Implement the integration with minimal disruption to ongoing operations, ensuring data quality and business continuity throughout the process. 
  • Optimise for Performance and Cost: Right-size your infrastructure, configure zero-copy sharing for optimal performance, and implement cost management strategies. 
  • Enable Advanced Analytics and AI: Help your teams leverage the combined capabilities of SAP BDC and Snowflake for sophisticated analytics, machine learning, and AI-driven insights. 

The convergence of SAP’s mission-critical business applications with Snowflake’s cloud-native data platform represents more than a technical integration. It’s an opportunity to fundamentally rethink how your organisation leverages data for competitive advantage. 

Conclusion 

The SAP and Snowflake partnership addresses a challenge that has frustrated enterprise data architects for years: how to preserve the semantic richness of SAP data while achieving cloud-scale analytics performance. Through zero-copy data sharing and bidirectional integration, organisations can finally have both. 

Whether you’re already running SAP BDC or have significant investments in Snowflake, this partnership offers a pragmatic path forward that respects your existing architecture while opening new possibilities for advanced analytics and AI. 

The question isn’t whether your SAP data should power your next generation of analytics and AI initiatives. The question is how quickly you can architect an integration that preserves your business context while delivering cloud-scale capabilities. With the SAP-Snowflake partnership, that timeline is now dramatically shorter. 

Ready to explore how the SAP-Snowflake partnership can transform your data architecture?

Contact Snap Analytics today to schedule your complimentary 1-hour SAP BDC roadmap session and develop a strategy for success in 2026.

Book your discovery call now.


Primary Sources

Primary Sources on the Partnership: 

  1. SAP News Center – Official announcement of the SAP and Snowflake partnership, detailing the two product offerings (SAP Snowflake and SAP BDC Connect for Snowflake), zero-copy data sharing capabilities, and planned availability timelines. 
  2. Snowflake Blog – Partnership announcement covering the integration details, technical architecture, and the vision for combining SAP’s semantic data models with Snowflake’s AI Data Cloud capabilities. 
  3. CIO Magazine – “SAP and Snowflake add zero-copy sharing between their systems” – Comprehensive coverage of the partnership announcement at SAP TechEd Berlin, including executive quotes from Irfan Khan (SAP’s data and analytics president and COO) on preserving semantics and high-fidelity data exchange and Christian Kleinerman (EVP of Product at Snowflake), plus technical details on the differences between SAP Snowflake and SAP Databricks. 
  4. AstraZeneca Customer Quote – Russell Smith, Vice President of ERP Transformation Technology at AstraZeneca, provided commentary on the value of real-time data access and AI capabilities enabled by the partnership (via CIO). 

Industry Analyst Perspectives: 

  1. Info-Tech Research Group – Scott Bickley, Advisory Fellow, commented on Snowflake being the missing link for SAP’s bi-directional, zero-copy data sharing strategy (via CIO). 
  2. Greyhound Research – Sanchit Vir Gogia, Chief Analyst and CEO, provided analysis noting the partnership “feels less like a technical upgrade and more like SAP finally recognizing how its customers actually work” (via CIO). 
  3. Moor Insights & Strategy – Robert Kramer, VP and Principal Analyst, discussed how the joint solution preserves contextual meaning and maintains governance controls while shifting the relationship from informal integration to formal alignment (via CIO). 

Context on Related Partnerships: 

  1. Previous SAP Integrations – References to SAP’s earlier partnerships with Databricks (February 2025) and Google Cloud Platform BigQuery, establishing the pattern of SAP building an open data ecosystem (via CIO). 

SAP, Sapphire and… Snowflake? 

As SAP Sapphire 2025 kicks off tomorrow, the data and analytics community is buzzing with anticipation. Last year, I wrote about the “great expectations” surrounding the SAP Datasphere platform (which has since been re-branded as SAP Business Data Cloud, or SAP BDC). This year, those expectations have only grown, especially with the promise of more concrete innovations and, potentially, a major new partnership. 

The rumour mill: A Snowflake partnership for SAP BDC? 

There’s been speculation that SAP might announce a partnership with Snowflake, mirroring the strategic alliance it already has with Databricks. I have no idea if this is true, and it contradicts another rumour that Databricks explicitly negotiated an exclusive partnership with SAP for four years. I really hope to see at least a roadmap for Snowflake, as Snowflake customers planning their SAP integration are now in limbo. Enabling seamless integration between SAP and non-SAP platforms is still the holy grail for many enterprises, and the technology is there. Let’s hope SAP can come up with a model that works for them, their customers and their technology partners.  

I would be very surprised if there is an update about Snowflake integration at this year’s Sapphire event, but who knows? I am just adding to the rumour mill here, and perhaps when enough people do that, it will become reality one day! 

What I’m hoping to see: Data Products and Intelligent Apps 

One of my biggest wishes for this year’s Sapphire is to see more tangible progress on data products and intelligent apps within the BDC. These pre-delivered, semantically rich content packages could be game-changers for organisations looking to build analytics and AI solutions faster and with more confidence. 

SAP has generated a lot of hype around these concepts, but so far, the actual content has been limited. I’m hoping this year’s event will bring clarity, especially around timelines, availability, and real-world use cases.

Reflections on last year’s predictions:

In my 2024 ‘Sapphire’ blog post, I highlighted three standout features that set SAP’s data & analytics solution apart. So what has become of them, now that we are one year on?  

Integrated Planning – This remains a unique strength of SAP Analytics Cloud (SAC). The ability to combine planning and analytics in a single platform is something few others can match. 

Change Data Capture (CDC) and Near Real-Time Integration – Over the past year, this SAP BDC capability has matured. While it may not rival third-party replication tools in raw power, its seamless integration into BDC makes it incredibly user-friendly and effective for most enterprise needs. 
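
The CDC pattern behind this capability can be sketched in a few lines; the following is a purely illustrative apply step, not SAP BDC's actual mechanism or record format:

```python
# Purely illustrative CDC apply step: change records are merged into a
# target table instead of reloading it in full.
def apply_changes(target: dict, changes: list) -> dict:
    """Apply insert/update/delete records, keyed by 'id', to a target store."""
    for change in changes:
        op, key = change["op"], change["id"]
        if op in ("insert", "update"):
            target[key] = change["row"]   # upsert the new row image
        elif op == "delete":
            target.pop(key, None)         # remove the deleted key
    return target

target = {1: {"status": "open"}}
stream = [
    {"op": "update", "id": 1, "row": {"status": "closed"}},
    {"op": "insert", "id": 2, "row": {"status": "open"}},
    {"op": "delete", "id": 1},
]
result = apply_changes(target, stream)
print(result)  # {2: {'status': 'open'}}
```

The appeal of BDC's version is that this plumbing is handled natively, rather than through a separate third-party replication tool.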

AI Copilot Evolution – Last year, SAP introduced both Just Ask and Joule, which I found confusing. Fast forward one year: looking at the sessions at Sapphire, “Just Ask” is mentioned in zero sessions and “Joule” in 58. I guess we have a winner.  

Sessions I’m attending:

From the comfort of my home office (I didn’t bag a ticket to Orlando or Barcelona this year), I will be attending the following sessions:  

Unleash the Power of Your Data with SAP Business Data Cloud 
May 20, 19:30 BST 
Session ID: BDC5092v 

Explore the Road Map for SAP Business Technology Platform 
May 21, 16:30–16:50 BST 
Session ID: BTP2122v 

SAP Business Data Cloud: Best Practices and Upcoming Innovations 
On-demand 
Session ID: BDC2772v 

For those lucky enough to be in Orlando, I would recommend the following session:  

SAP and Databricks open a bold new era of data and AI 
In person 
Session ID: BDC2767 

I hope you all enjoy Sapphire this year! 

The SAP Databricks partnership: Combining expert knowledge of business critical processes with world-class data engineering capabilities

Today, SAP and Databricks announced SAP Business Data Cloud (BDC) with SAP Databricks. The new offering brings Databricks capabilities into the SAP data platform, along with Delta Sharing for SAP data and Insight Applications. These new capabilities put SAP firmly on the map as a leading global enterprise data platform provider. Here is what customers need to know.

SAP Business Data Cloud? What is that?

The short and incomplete answer is that BDC is SAP Datasphere with Databricks integration, some new product features and a different pricing model. For the long answer, you can visit the product tour or register for the webcast.

What problem is the SAP Databricks partnership going to solve?

I see two main problems with the current SAP platforms (SAP Datasphere and SAP BW/4HANA):

  1. Poor integration of data engineering and data science toolsets in the SAP ecosystem
    Data engineers and data scientists love to work with languages like Python, Scala and R, using notebooks and controlling their source code with git-based solutions. Databricks is built on this premise; on SAP data platforms, this way of working is all but impossible. Bringing Databricks to SAP bridges a very significant shortcoming of the current SAP platforms.   
  2. Inability to provide scalable solutions at competitive prices
    SAP HANA infrastructure is expensive compared to the infrastructure Databricks uses. Databricks runs on cheap file storage with Spark engines, offered at competitive prices by all hyperscalers, while SAP HANA runs on high-spec database servers. HANA compute is expensive by nature, and that is not a problem SAP can easily resolve. SAP Business Data Cloud will continue to run on SAP HANA, but it will support Databricks’ ‘Delta Sharing’ capabilities, meaning datasets residing outside SAP HANA can be seamlessly integrated into analytical models and reported on. In theory, this would also open the door to offloading SAP data to the Delta Lake, reducing the HANA footprint and the cost of the SAP Business Data Cloud license. Whether SAP will allow this remains to be seen – probably not without a fee.

What’s in it for Databricks and for Databricks customers?

That is two questions with very different answers. For Databricks, the partnership means that SAP will sell Databricks as a white label product with their SAP Business Data Cloud. This will result in increased revenue for Databricks, with very little effort required in the sales process.

For Databricks customers, it looks like not much will change initially. Their biggest challenge will still be getting data out of SAP onto the Databricks platform. If customers have a Datasphere license and move over to SAP BDC, then in theory they could benefit from Delta Sharing, integrating HANA delta files directly into Databricks, though I doubt SAP will allow this. Databricks has been working on an SAP connector for a while; unconfirmed rumours are that this connector is to be released in Q1 2026. Whether Databricks will still go ahead with this is unclear, as is the mechanism that will be used for getting data from SAP into Databricks. For now, if you’re not on SAP BDC and you want SAP data in Databricks, you will still need to rely on a third-party solution or use a sub-optimal approach based on OData.

New feature: Insight Applications & data packages

Insight Applications & data packages are end-to-end solutions, offering data transformation and modelling, reporting, and AI-driven insights for a specific business function. For those familiar with ‘Business Content’, think of this as Business Content on steroids. Or think of it as Google Cortex, which does much the same.

SAP has promised that the Insight Applications & data packages will provide integrated insights across all SAP products, not just S/4HANA. For now though, the scope is limited to just ERP. If SAP lives up to its promise, then I can see this will be a great way to deliver value to the business very early on in a migration/implementation process. It is not yet clear to me how you can extend the content packages with customised tables or how you integrate non-SAP data in standard models. I guess we’ll have to wait a bit longer, perhaps until we can actually get hands-on with this.

Final thoughts

I believe SAP and Databricks are onto something great here. SAP Datasphere took a while to mature, but in the last year or so the platform has become quite good. There were two major problems left: poor integration with popular programming languages and the inability to scale at competitive prices. With the Databricks partnership, these problems should now be resolved, and SAP can once again become a leader in the data platform space.

Customers who are still on SAP BW have no reason to drag their feet any longer. SAP BDC is the future, and you’re missing out on innovation if you stick with SAP BW.

It will be interesting to see Snowflake’s response to today’s announcement. For a long time, Snowflake was almost the de facto choice for customers moving away from SAP data platforms. Now, the ability to integrate SAP data within Snowflake is lagging behind Google (Cortex) and Databricks. Snowflake has been promoting SNP Glue as a preferred tool for SAP data integration. Perhaps Snowflake will respond with a strategic acquisition of its own?

I do hope SAP will take a sensible approach to customers who have good reasons not to adopt SAP BDC. There are other great data platforms out there – Snowflake, Databricks, Google BigQuery – and customers have their own reasons why they might prefer one of those. Delta Sharing for SAP data could be incredibly beneficial for all these platforms, and customers would be happy to pay a fee for this service. Surely SAP could come up with a commercial model to make this work, and truly live up to their claim that they put the customer first?
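For context, here is a hedged sketch of what consuming SAP data over Delta Sharing could look like from the consumer’s side. The endpoint, token, share, schema and table names below are all hypothetical placeholders, not anything SAP or Databricks has published:

```python
# A Delta Sharing 'profile' is a small JSON credential document issued by the
# data provider; the endpoint and token here are placeholders, not real values.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "<token-issued-by-provider>",
}

def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the '<profile>#<share>.<schema>.<table>' locator that the
    open-source delta-sharing client libraries expect."""
    return f"{profile_path}#{share}.{schema}.{table}"

# With the open-source client installed (pip install delta-sharing),
# a consumer could then load a shared SAP table directly:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(
#       table_url("profile.json", "sap_share", "erp", "sales_orders"))
```

The appeal of the protocol is exactly this: the consumer needs only a credential file and a table locator, regardless of which platform hosts the data.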

SAP licence constraints – explainer

Ever wondered how you can get data out of SAP without violating the license agreement? You’re not alone. Most organisations planning to move SAP data up to a Cloud Data Platform are struggling with that very question. Here is a little explainer which hopefully helps you understand what you can or cannot do. But first:

Disclaimer
The terms and conditions of your contract with SAP are agreed between your company and SAP. I don’t know the specifics of your contract, nor am I able to provide legal advice. This article is based on my observation and interpretation. When you are planning to take data out of SAP, I recommend you consult with your SAP account manager and your legal team to ensure you comply with the terms and conditions of the contract.
Enterprise license vs Runtime license

You might have heard that for certain extraction methods an ‘enterprise license’ is required. This is to do with how the database on which the SAP system runs is licensed. When you run an SAP system, you have to install a database system first. SAP ERP can run on a variety of database systems (Oracle, IBM Db2, MS SQL Server and so on, as well as SAP HANA, the database), whereas SAP S/4HANA only runs on SAP HANA. The license restriction applies regardless of which database system you run the SAP ERP application on.

Runtime license

You can purchase a runtime license for the database with your SAP ERP license. The runtime license allows you to run SAP ERP, but nothing else. The SAP application is the only direct user of the underlying database; all other users and usage are managed through the SAP application. Having a runtime license means you cannot create tables directly in the database, but you can create tables in the SAP application (which in turn results in a table being created in the database, with the SAP system as owner). The license agreement does not let you create stored procedures or extraction programs in the database directly. You are also not allowed to read the database directly, extract data from it directly, or extract data from the database log tables, without going through the SAP application.

Enterprise license

An enterprise license gives you unlimited rights on the database. You can create your own tables and applications on the database as well as running the SAP application. In this case, you are allowed to extract data from tables directly, either by using a 3rd party application which connects to the database or by creating your own extraction processes. An enterprise license will be significantly more expensive than a runtime license. If your company does not have an enterprise license and you want to take data out of SAP, you need to find a way to go through the application layer, instead of the database layer.
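To make the distinction concrete: with an enterprise license you could read application tables straight from the database layer. A minimal sketch, assuming SAP HANA and SAP’s hdbcli Python driver – the host, port, credentials and schema here are hypothetical examples:

```python
def extract_query(table: str, columns: list, where: str = "") -> str:
    """Build a direct-read SQL statement against an SAP table.
    Reading the database like this is only legitimate under an enterprise
    license, never under a runtime license."""
    sql = f'SELECT {", ".join(columns)} FROM "{table}"'
    if where:
        sql += f" WHERE {where}"
    return sql

# With SAP's HANA Python client (pip install hdbcli), the read itself would be:
#   from hdbcli import dbapi
#   conn = dbapi.connect(address="hana.example.com", port=30015,
#                        user="EXTRACT_USER", password="...")
#   cur = conn.cursor()
#   cur.execute(extract_query("SAPSR3.MARA", ["MATNR", "MTART"]))
#   rows = cur.fetchall()
```

Under a runtime license, the same data would have to come out via the application layer instead, using the interfaces described below.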

Using standard SAP interfaces

SAP has APIs and OData services for getting data out of SAP. Most of these are designed for operational purposes, for example: give me the address of customer abc, or: update the price of material abc. They are not really suitable for data extraction. The exception is the function modules related to the ODP framework: they can be consumed through OData, and this is still allowed by SAP. You can find more information on using OData for data extraction through ODP here.
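As a rough illustration of the OData route: an ODP-based extraction service on SAP Gateway exposes entity sets for full loads, which can be consumed with nothing more than HTTP and basic auth. The host and service name below are hypothetical, and the `FactsOf<ODP>` entity-set naming is my reading of how these generated services are structured – check the service metadata on your own system:

```python
import base64
import json
import urllib.request
from typing import Optional

GATEWAY = "https://gw.example.com/sap/opu/odata/sap"  # hypothetical Gateway host

def facts_url(service: str, odp: str, top: Optional[int] = None) -> str:
    """URL for a full load from the 'FactsOf<ODP>' entity set of an
    ODP-based OData extraction service (OData v2, JSON)."""
    url = f"{GATEWAY}/{service}/FactsOf{odp}?$format=json"
    if top is not None:
        url += f"&$top={top}"
    return url

def fetch_rows(url: str, user: str, password: str) -> list:
    """Fetch one page of results; OData v2 wraps rows in d/results."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)["d"]["results"]
```

For large extractors you would page through results rather than pull everything in one request, but the shape of the interaction stays the same.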

Note that it is not permitted to use the ODP function modules through an RFC connection. Please refer to this blog for more info on that.

There is a standard function module which can be used through RFC: RFC_READ_TABLE or, even better, one of its successors (RFC_READ_TABLE can’t handle wide tables). Which versions are available depends on your system version, so it is best to search for them on the SAP system itself. I have had the fewest problems with /BODS/RFC_READ_TABLE2. I wouldn’t recommend building a data warehouse solution on this extraction method, not least because I am pretty sure SAP have specified somewhere that these FMs are meant for internal use and might be changed at any time. I wouldn’t be surprised if SAP announces it will forbid the use of these function modules in a similar fashion to the ODP function modules.
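For illustration, these function modules return a FIELDS/DATA structure with each row packed into a delimited `WA` string, which can be unpacked like this. The PyRFC call in the comment is only a sketch of the usual shape, and the sample values are made up, not output from a real system:

```python
def parse_read_table_result(result: dict, delimiter: str = "|") -> list:
    """Turn the FIELDS/DATA structure returned by RFC_READ_TABLE-style
    function modules into a list of row dicts."""
    names = [f["FIELDNAME"] for f in result["FIELDS"]]
    return [
        dict(zip(names, (v.strip() for v in row["WA"].split(delimiter))))
        for row in result["DATA"]
    ]

# A call via SAP's PyRFC library would look roughly like:
#   from pyrfc import Connection
#   conn = Connection(ashost="sap.example.com", sysnr="00", client="100",
#                     user="EXTRACT", passwd="...")
#   result = conn.call("/BODS/RFC_READ_TABLE2", QUERY_TABLE="MARA",
#                      DELIMITER="|", ROWCOUNT=1000)

sample = {  # illustrative response shape, not real data
    "FIELDS": [{"FIELDNAME": "MATNR"}, {"FIELDNAME": "MTART"}],
    "DATA": [{"WA": "100001 |FERT"}, {"WA": "100002 |HALB"}],
}
```

The fiddly parsing is part of why I wouldn’t base a data warehouse on this route: you inherit every quirk of the fixed-width, delimiter-packed output.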

Third party applications

Third party applications can either use the APIs (Function Modules) mentioned above or create their own application logic to get data out of SAP. If they are using the standard function modules then the same restrictions apply. This means ODP extraction through RFC is not allowed – even if this process is managed by a 3rd party application.

Applications which implement their own interfaces on the SAP system are ‘safe’ – at least for the time being. The small downside of this approach is that you need to implement a piece of code (delivered by the application vendor) in each SAP system you want to connect to. The upside is that the end-to-end process is more robust, better performing and easier to maintain than solutions built on the SAP standard APIs.

Be mindful of third party applications which read the database log or otherwise connect directly to the database layer: you will need an enterprise license for this, and using a 3rd party application makes no difference from a licensing perspective.

SAP Datasphere

And then there is SAP itself. SAP Datasphere is perfectly capable of getting data out of SAP and onto the cloud data platform of your choice. If this were the only use case you had for SAP Datasphere, I would imagine it is a very pricey solution. Still, I wanted to make sure I cover all the options.

SAP’s cynical move to keep control of your enterprise data (aka note 3255746)

SAP has rocked the boat. They have issued an SAP note (3255746) declaring a popular method for moving data from SAP into analytics platforms out of bounds for customers. Customers and software vendors are concerned: they have to ensure they operate within the terms & conditions of their license agreement with SAP, and it seems unfair that SAP unilaterally changes these Ts and Cs after organisations have purchased the product. I will refrain from giving legal advice, but my understanding is that SAP notes are not legally binding. I imagine the legal teams will have a field day trying to work this all out. In this article I will explain the context and consequences of this SAP note. I will also get my crystal ball out and try to predict SAP’s next move, as well as giving you some further considerations which may help you decide how to move forward.

What exactly have SAP done this time?

SAP first published note 3255746 in 2022. In the note, SAP explained that 3rd parties (customers, product vendors) could use the SAP APIs for the Operational Data Provisioning (ODP) framework, but these APIs were not supported. The APIs were meant for internal use, and as such SAP reserved the right to change their behaviour and/or remove them altogether. Recently, SAP updated the note (version 4). Out of the blue, SAP declared it is no longer permitted to use the APIs for ODP. For good measure, SAP threatens to restrict and audit the unpermitted use of this feature. With a history of court cases decided in SAP’s favour over license breaches, it is no wonder that customers and software vendors are getting a bit nervous. So, let’s look at the wider context. What is this ODP framework, and what does this actually mean for customers and product vendors?

SAP ODP – making the job of getting data out of SAP somewhat less painful

Getting data out of SAP is never easy, but ODP offered very useful features to take away some of the burden. It enabled external data consumers to subscribe to datasets. Instead of receiving difficult-to-decipher raw data, these datasets contained data already modelled for analytical consumption. Moreover, the ODP framework supports ‘delta-enabled’ datasets, which significantly reduces the volume of data to refresh on a day-to-day basis. When the ODP framework was released (around 2011(1)), 3rd party data integration platforms were quick to provide a designated SAP ODP connector. Vendors like Informatica, Talend, Theobald and Qlik have had an ODP connector for many years, and more recently Azure Data Factory and Matillion released connectors as well. SAP also offered a connection to the ODP framework through the open data protocol OData, which means you can easily build your own interface if the platform of your choice does not have an ODP plug-in.

One can imagine that software vendors are not best pleased with SAP’s decision to no longer permit the use of the ODP framework by 3rd parties. Although all the platforms mentioned above have other types of SAP connectors(2), the ODP connector has been the go-to solution for many years. The fact that this solution was not officially supported by SAP never really scared the software vendors: ODP was and remains deeply integrated in SAP’s own technology stack, and the chances that SAP will change the architecture in current product versions are next to zero.

Predicting SAP’s next move

You might wonder why SAP is doing this. Well, in recent years, customers have voted with their feet and moved SAP data to more modern, flexible and open data & analytics platforms. There is no lack of competition. AWS, Google, Microsoft, Snowflake and a handful of other contenders all offer cost-effective data platforms with limitless scalability. On these data platforms, you are free to use the data and analytics tools of your choice, or take the data out to wherever you please without additional costs. SAP also has a data & analytics platform, but it is well behind the curve. There are two SAP products to consider: SAP Analytics Cloud (SAC) and SAP Datasphere.
The first is a planning and analytics toolset for business users and was introduced in 2015. For a long time, it was largely ignored. In recent years, it has come to maturity and should now be considered a serious contender to Power BI, Tableau, Qlik and so on. I’m not going to do a full-blown comparison here, but the fact that SAC has integrated planning capabilities is a killer feature.
SAP Datasphere is a different story. It is relatively new (introduced as SAP Data Warehouse Cloud in 2020), and seasoned SAP professionals know what to do with new products: if you’re curious, you can do a PoC or innovation project; if not, or if you don’t have the time or means for that kind of experimenting, you sit and wait until the problems are ironed out. SAP Datasphere is likely to suffer from teething problems for a while longer, and it will take time before it is as feature-rich as the main competitor platforms. One critical feature missing until very recently was the ability to offload data to cloud storage (S3/Blob/buckets, depending on your cloud provider). That feature was added in February 2024 – around the same time SAP decided that 3rd parties could no longer use the ODP interface to achieve exactly the same thing. Coincidence?

So where is SAP going with this? Clearly they want all their customers to embrace SAP Datasphere. SAP charges for storage and compute, so of course they try to contain as many workloads and as much data as they can on their platform. This is no different from the other platform providers. What is different is that SAP deliberately puts up barriers to taking the data out, where other providers let you take your data wherever you want. SAP’s competitors know they offer a great service at a very competitive price. It seems SAP doesn’t want to compete on price or service, but chooses to put up a legal barrier to keep the customer’s data on its platform.

SAP Certification for 3rd party ETL tools no longer available

Blocking the use of ODP by 3rd party applications is only the beginning. SAP has already announced it will no longer certify 3rd party ETL tools for the SAP platform(3). The out-and-out SAP specialists have invested heavily in creating bolt-on features on the SAP platform to replicate large SAP data sets efficiently, often in near real-time. The likes of Fivetran, SNP Glue and Theobald have all introduced their own innovative (proprietary) code purely for this function. SAP used to certify this code, but has now stopped doing so. Again, the legal position is unclear and perhaps SAP will do a complete U-turn on this, but for now it leaves these vendors wondering what the future holds for their SAP data integration products.

What do you need to do if you use ODP now through a 3rd party application?

My advice is to start with involving your legal team. In my opinion an SAP note is not legally binding like terms & conditions are, but I appreciate my opinion in legal matters doesn’t count for much.
If you are planning to stay on your current product version for the foreseeable future and you have no contract negotiations with SAP coming up then you can carry on as normal. When you are planning to move to a new product version though, or if your contract with SAP is up for renewal, it would be good to familiarise yourself with alternatives.

As I mentioned before, most 3rd party products have multiple ways of connecting to SAP, so it would be good to understand what the impact is if you had to start using a different method.
It also makes sense to stay up to date with the SAP Datasphere roadmap. When I put my rose-tinted glasses on, I can see a future where SAP provides an easy way to replicate SAP data to the cloud storage of your choice, in near real-time and in a cost-effective way. Most customers wouldn’t mind paying a reasonable price for this. I expect SAP and its customers may have very different ideas of what a reasonable price is, but until the solution is there, there is no point speculating. If you are looking for some inspiration to find the best way forward, come talk to Snap Analytics. Getting data out of SAP is our core business, and I am sure we can help you find a future-proof, cost-effective solution.


Footnotes and references

(1) – The ODP framework version 1.0 was released around 2011, specifically with SAP NetWeaver 7.0 SPS 24, 7.01 SPS 09 and 7.02 SPS 08. The current version of ODP is 2.0, released in 2014 with SAP NetWeaver 7.3 SPS 08, 7.31 SPS 05 and 7.4 SPS 02. See notes 1521883 and 1931427 respectively.

(2) – Other types of SAP connections: One of my previous blog posts discusses the various ways of getting data out of SAP in some detail: Need to get data out of SAP and into your cloud data platform? Here are your options

(3) – Further restrictions for partners on providing solutions to get data out of SAP, see this article: Guidance for Partners on certifying their data integration offerings with SAP Solutions

Better, cheaper, faster – you can have all three!

You may have heard the saying “Better, cheaper, faster – pick two”. The idea isn’t new. If you want something really good and you want it quickly, you’re going to have to pay. If you want to save money and still keep the quality, you’ll need to wait. And so on.

But in big data that mantra is being subverted. Thanks to the cloud, you can now deliver data solutions that are more flexible, scalable and, crucially, cheaper and faster. Best of all, it doesn’t mean abandoning quality or reliability: a well-designed solution can achieve better quality and consistency.

Not so long ago, if you were planning a data project you would have had to set aside a big chunk of your budget for licences, servers and a data warehouse in which to store your data. On top of this, you’d need a specialised (and potentially expensive) team of people to set up the infrastructure and operate the hardware. This effectively put data analysis out of the reach of many smaller businesses.

The cloud has changed all that – revolutionising the delivery of data analytics. So, what exactly can it offer you?

BETTER

Today’s cloud-based technology is so simple that even the least tech-savvy people can reap the rewards. You no longer need to know how much storage you require up front, as companies like Snowflake simply offer small, medium or large solutions plus the option of almost infinite scalability. For many new and smaller businesses the entry package will be enough, allowing you to upload millions of rows of data. And as you expand, you can simply scale up.

Conventional wisdom once said that there was no more secure way of storing data than keeping it all on your premises, maintained and managed by a member of staff. In 2019 that is no longer true. Even the most conscientious IT person will be constrained by your budget and facilities. By handing this responsibility over to the likes of Microsoft, with their near-infinite resources, there is arguably no safer way of storing your valuable data.

CHEAPER

The maths is simple: with modern data platforms like Snowflake, you just pay for what you use. Whereas previously you would have had to work out up front how much space you needed and hope you hadn’t over- or underestimated (with the associated painful time and cost implications), now you can simply scale up or down as and when your business requires. If, for example, your business acquires a new company, you can simply increase the size of your data warehouse instantly. At the time of writing, a terabyte of storage with Snowflake costs an astonishing $23 per month.
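The pay-for-what-you-use maths really is that simple. A tiny sketch – the $23/TB/month figure is the one quoted above, and the volumes are purely illustrative:

```python
def monthly_storage_cost(terabytes: float, rate_per_tb: float = 23.0) -> float:
    """Pay-as-you-go storage bill: volume times rate, nothing up front."""
    return terabytes * rate_per_tb

# Growing from 2 TB to 5 TB after an acquisition just changes the bill,
# with no procurement cycle and no stranded hardware:
before = monthly_storage_cost(2)  # current estate
after = monthly_storage_cost(5)   # after absorbing the new company's data
```

Contrast that with the old model, where the whole 5 TB would have been bought, racked and paid for on day one, just in case.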

This flexibility also means reduced waste. Once, you had to pay for a solution that might only be used one day a month, when you ran 10,000 reports; the rest of the month it sat idle, costing you money. Now you can pay for the smallest package for most of the month and set it to scale up automatically when you really need the resources.

FASTER

Remember the sound of whirring fans and the wall of heat that would hit you when you went anywhere near the server room? Thanks to the cloud you can do away with the racks upon racks of energy-guzzling storage and move it all off site, possibly thousands of miles away. This doesn’t make it slower: thanks to modern high-bandwidth networks, you can access your data in a fraction of the time, generating reports in 10 seconds rather than 20 minutes.

Several years ago Snap Analytics was hired by a large automotive manufacturer for a major project based on their premises. At the time cloud storage didn’t have quite the same functionality and wasn’t trusted to do the job. As a result we had to work on site with their people, working within their existing systems just to set up the architecture. It added nearly 6 months to the project – and quite a few zeros to the final invoice. Thankfully, with modern data platforms, these overheads are completely eliminated, the scalability is infinite and the speed is truly phenomenal. And all delivered at a fraction of the price!