Leveraging Data Connection Gateways to Harness Full Value from Analytics Investments
By Kumar Singh, Research Director, Automation & Analytics, SAPinsider
The critical role of analytics in digital transformation
Organizations today are rapidly evolving into mathematical corporations, driven by analytics and technology. Fueled by an explosion of data, they are making strategic as well as day-to-day decisions based on insights generated from that data. It is no great leap, then, to say that analytics is the driving force behind most successful organizations across industries, and that a carefully crafted combination of business strategy and analytics strategy can propel most organizations forward. The push toward digital transformation, especially during the pandemic, has further cemented the strategic role analytics will play in deciding the success or failure of organizations. Another accelerant is the rapid adoption of cloud technology. As the cloud breaks down many of the barriers organizations ran into from a data and analytics infrastructure perspective, it is providing a whole new playground for analytics tools. Many best-of-breed analytics platforms, like SAP Analytics Cloud (SAC), are cloud native.
However, an extremely important point to keep in perspective is that every analytics tool and technology is driven by the underlying data. As analytics tools like SAC become more advanced and sophisticated, their potential is, in many instances, constrained by that critical input: data. To leverage these tools to their full potential, it is critical to ensure they have access to all the right and required data sets, and this is where many organizations run into a bottleneck. SAPinsider recently had the opportunity to discuss this critical topic with Ofir Gil, Chief Technology Officer (CTO) at APOS Systems. You can watch the full video of the discussion here:
Data connectivity challenges of analytics
The fact is, it is not just about the success of analytics initiatives. The whole imperative of digital transformation depends heavily on the capability to build seamless data connections. As Ofir highlighted in our conversation: “In our conversations with customers, it is clear that data connectivity is key to the success of their digital transformation process.” Investments in key digital transformation tools, such as advanced analytics platforms like SAC, rely on seamless data connections to a plethora of data sources. In the SAP ecosystem, this connectivity extends beyond SAP-based data sources. Organizations have fragmented data sources across their data and analytics landscape, and most analytics platforms need to pull data from all of these sources in order to generate effective insights. As per Ofir: “Connectivity, especially live connectivity, to non-SAP data sources is a hurdle that customers are facing in order to get a complete analytical picture of their data landscape.”
In an increasingly uncertain world, organizations are feeling the imperative to be more resilient and agile. Ofir highlighted this as well when he stated: “One of the biggest challenges is rapid response to change. New business questions and processes surface continuously which causes IT driven business intelligence to become a bottleneck for analytics driven business activity.” A key aspect of the agility that organizations seek comes from the capability to get insights fast and then react to them in an agile way. This is where the criticality of analytics, and specifically the ability to tap into all forms of data, comes into play. You cannot have an agile analytics process unless you have seamless data connections to a wide portfolio of data sources.
The fact is that data is generated in multiple silos, in a very fragmented way, in organizations today, both on-premise and in the cloud. Add to that the complexity of tapping into the data generated by edge devices. It is obvious that you need live data connections to build any type of real-time analytics capability, but that is easier said than done. Also, as we transition into the age of self-service, augmented analytics tools, the ability to establish these live data connections and run analytics on this data seamlessly is critical to building user trust in analytics tools and technologies. Ofir emphasized this as well: “The model of IT building on premise data warehouses with complex ETL processes to facilitate the data needs for driving growth in the business cannot keep up with the pace of change. This has led to the creation of large cloud-based data sources that can respond to change much more quickly. Utilizing those sources by Business Intelligence solutions requires live connections and fast querying engines. This allows for ad-hoc self-service data analysis and modeling to drive business growth.”
Data gateways: a solution to a critical challenge of data connectivity
To start, let us visit a simple definition of a data gateway. A data gateway is a solution that allows your organization to connect to multiple data sources, providing a single, central point of access to those sources. In many instances, a single gateway can be used with all supported services. These gateways are well-suited to complex scenarios in which multiple people access multiple data sources. From this very definition, you can start to see the benefits these gateways can generate. A best-in-class data gateway should be able to provide virtualized connections to a gamut of data sources, both on-premise and cloud. This essentially means that the gateway handles the complexity of being a single interface for all the disparate sources and, as Ofir put it, “provides a unified semantic layer to hide the complexities of the data sources.”
This allows different applications to consume data through the gateway from different sources without the need to configure each consumer for the different protocols required by each source. Each consumer has just one access point to all the data sources they need, and the semantic layer makes the consumed data identical across all consumers. A good peek into the capability of these gateways was evident in this statement by Ofir: “So, whether you need to analyze 1.3 billion rows in Snowflake while rapidly changing measures and dimensions in your charts and graphs or looking at hundreds of millions of rows in Google Big Query, SAP Analytics Cloud can rapidly handle all that data live using our gateway.”
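The single-access-point pattern described above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, not the APOS Live Data Gateway's actual API: the `DataGateway` and `WarehouseSource` names, and the in-memory rows standing in for real warehouse connectors, are all assumptions made for the example. The point it demonstrates is that consumers call one uniform interface while the gateway routes each query to the right registered source.

```python
# Hypothetical sketch of the data-gateway pattern: one access point
# routing queries to several registered sources behind a uniform interface.
from abc import ABC, abstractmethod


class DataSource(ABC):
    """A backend the gateway can route queries to."""

    @abstractmethod
    def fetch(self, query: str) -> list:
        ...


class WarehouseSource(DataSource):
    """Stand-in for a cloud warehouse connector (e.g. Snowflake or BigQuery).

    A real connector would speak the source's native protocol; here we
    simply return canned rows so the routing logic is visible.
    """

    def __init__(self, rows: list):
        self._rows = rows

    def fetch(self, query: str) -> list:
        return self._rows


class DataGateway:
    """Single point of access: consumers never talk to sources directly."""

    def __init__(self):
        self._sources = {}

    def register(self, name: str, source: DataSource) -> None:
        self._sources[name] = source

    def query(self, source_name: str, query: str) -> list:
        # One uniform call signature regardless of the underlying source,
        # playing the role of the "unified semantic layer" in miniature.
        return self._sources[source_name].fetch(query)


gateway = DataGateway()
gateway.register("sales", WarehouseSource([{"region": "EMEA", "revenue": 120}]))
gateway.register("web", WarehouseSource([{"page": "/home", "visits": 9000}]))

# Consumers issue every request through the same gateway method.
print(gateway.query("sales", "SELECT * FROM revenue"))
```

Note that adding a new source in this sketch means registering one more connector with the gateway; none of the consumers need to change, which is the core benefit the gateway model promises.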
What does this mean for SAPinsiders?
While the definition of digital transformation may vary from one organization to another, there is no denying the fact that it all begins with data. With the rapid acceleration in technology and computing power, we now have a wide variety of analytics tools available in all domains, which means organizations can choose what works best for them and for their end users. That creates a complex landscape of siloed and fragmented data. The good news is that regardless of this complexity, there is a robust way to centralize data access responsively and responsibly: a solution like the APOS Live Data Gateway. However, as SAPinsiders embark on the journey to build this capability, there are certain aspects they will need to keep in perspective.
Understand your data landscape. Amid all the focus on analytics tools and technologies, data can end up taking a back seat in many organizations. However, data is foundational to the success of your analytics initiatives. So if you are looking to upgrade your analytics capabilities, a critical first step is to verify that your data infrastructure will be in sync with your upgraded analytics capabilities.
Invest in data quality, integration, and connection. While there are cultural, organizational, and numerous other aspects required for analytics to take deep root in any organization, the underlying data plays an extremely important role. One important aspect is obviously data quality and integrity. GIGO, a frequently used acronym in the analytics world, stands for Garbage In, Garbage Out: it does not matter if you have designed the world’s best models if your underlying data is bad. Data integration is another key foundational requirement; we have discussed it in several SAPinsider articles and webinars and shared insights on tools and technologies that can help eliminate data silos. The reality, however, is that data sources are still fragmented in most organizations, and hence when organizations invest in best-of-breed analytics tools like SAP Analytics Cloud, one of their pain points is data connectivity. Providing data from all sources to drive the analytics process becomes the key to success, and if this seamless connection to data sources is not in place, this one pain point can lead to multiple challenges.
Invest in a data connection gateway that aligns best with your unique landscape. The value of a data gateway has already been emphasized in this article, so we can conclude with a quote from Ofir: “Organizations can rest assured that, regardless of which data sources are in their mix, there is a way to centralize data access responsively and responsibly, and that way is the APOS Live Data Gateway. The APOS Live Data Gateway brings better time to value.”