Fivetran’s Hybrid Deployment offers a secure, flexible solution for seamless data movement

Reading time: 5 mins

Meet the Authors

  • Mark Vigoroso

    CEO, ERP Today & Chief Content Officer, Wellesley Information Services

Key Takeaways

⇨ Fivetran has introduced Hybrid Deployment, a solution that allows enterprises to manage data pipelines securely within their own environments, enhancing control over sensitive information while leveraging Fivetran's management tools.

⇨ The new deployment model addresses critical data challenges for businesses, enabling seamless access to high-quality data necessary for AI adoption, while adhering to stringent regulatory compliance and security standards.

⇨ Hybrid Deployment provides enterprises full visibility and control over data movement processes, compatibility across various cloud and on-premises environments, and simplifies management with a user-friendly interface for monitoring all data pipelines.

Fivetran, an SAP partner and provider of data movement solutions, recently announced Hybrid Deployment, a new solution that allows customers to securely run data pipelines within their own environment while managing them from the Fivetran platform. It provides a single control plane for all data sources, whether they are cloud-based SaaS apps or legacy databases whose data must be tightly controlled and managed for regulatory or compliance purposes. With a comprehensive selection of connectors, destinations and features, Fivetran empowers enterprises to centralize data efficiently, reliably and securely, while supporting governance and compliance requirements and benefiting from 24/7 support.

Fivetran is addressing what recent SAPinsider research uncovered as the number one obstacle to AI adoption. In the report AI: State of Adoption 2024, 53% of organizations cited challenges with legacy data and applications as the leading obstacle to adopting AI. High-quality data ensures that AI models make accurate predictions and decisions; if the data is noisy, incomplete, or biased, the model’s performance will suffer, leading to poor outcomes. AI models also generally improve with larger datasets: the more data a model has access to, the more patterns it can learn and the better it performs in real-world applications.

Enterprises with sensitive data or in regulated industries often struggle to build data-driven practices. Seamless data access is critical for boosting efficiency, enabling AI-powered analytics and producing reliable data products. Yet centralizing that data in-house can lead to complex, costly systems that burden engineering teams. Traditional SaaS tools often lack the needed control and security, forcing organizations into more cumbersome solutions. DIY or self-hosted solutions often fail to scale effectively, especially as data priorities evolve in the age of AI, resulting in inefficiencies and an urgent need for modernization. Fivetran’s Hybrid Deployment simplifies this by enabling businesses to securely centralize their data and prepare it for AI and machine learning, all while maintaining control over sensitive information.

“Introducing a hybrid cloud deployment model to the Fivetran platform opens up entirely new possibilities for businesses of all sizes,” said George Fraser, Fivetran CEO. “Businesses no longer have to choose between managed automation and data control. They can now securely move data from all their critical sources—like Salesforce, Workday, Oracle, SAP, other cloud and on-premises databases and ERPs—into a data warehouse or data lake, all while keeping that data under their own control. This is especially valuable for industries like healthcare and life sciences and financial services, where secure, compliant and reliable data is critical.”

Hybrid Deployment keeps sensitive data within the customer’s environment while taking advantage of Fivetran’s management and monitoring tools. A lightweight local connector securely moves the data, while Fivetran’s user-friendly interface makes configuration and monitoring simple, all from a single control plane.
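
To make that separation concrete, the sketch below illustrates the general pattern: a local agent inside the customer environment pulls pipeline configuration from the managed control plane, runs each sync locally so row-level data never leaves the network, and reports only status metadata back. This is a conceptual illustration, not Fivetran’s actual agent interface; the endpoint paths, payload fields and the local_extract_and_load helper are assumptions for explanation only.

```python
# Conceptual sketch of a control-plane / data-plane split.
# Endpoint paths, payload fields, and the local connector call are illustrative
# assumptions, not Fivetran's actual Hybrid Deployment agent interface.
import time
import requests

CONTROL_PLANE = "https://control-plane.example.com"  # managed service: metadata only
API_KEY = "REDACTED"                                  # agent token issued by the platform
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def fetch_pipeline_configs() -> list[dict]:
    """Pull pipeline definitions and schedules from the managed control plane."""
    resp = requests.get(f"{CONTROL_PLANE}/agent/pipelines", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["pipelines"]

def local_extract_and_load(config: dict) -> int:
    """Placeholder for the local connector: reads from the source and writes to the
    destination entirely inside the customer network. Returns rows moved."""
    return 0

def report_status(result: dict) -> None:
    """Send status metadata only (never row-level data) back to the control plane."""
    requests.post(f"{CONTROL_PLANE}/agent/status", headers=HEADERS, json=result, timeout=30)

def run_once() -> None:
    for cfg in fetch_pipeline_configs():
        rows = local_extract_and_load(cfg)
        report_status({"pipeline_id": cfg["id"], "status": "success", "rows": rows})

if __name__ == "__main__":
    while True:
        run_once()
        time.sleep(60)  # poll interval for new configuration and schedules
```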

“Fivetran’s new hybrid deployment option is a game-changer. Now customers can easily centralize all their data, regardless of security or compliance requirements, given the ability to separate the control and data plane,” said Vinay Kumar Katta, Managing Delivery Architect, Capgemini. “Customers will appreciate Fivetran’s best-in-class platform that offers the flexibility to choose how and where their pipelines run. We’re excited to advance our partnership with Fivetran and help more enterprises modernize their data infrastructure.”

The separate control and data plane architecture ensures data never leaves the customer’s secure environment, giving them ultimate control over how their data is moved while still benefiting from Fivetran’s managed platform. This solution automates and centralizes the data movement process, while also giving enterprises the benefits of:

  • Full visibility: Monitor all pipelines from a single interface.
  • Data security: Control access, mask sensitive data and track movement to ensure compliance.
  • Compatibility across environments: Works across AWS, Microsoft Azure, Google Cloud, and on-premises environments.
  • Simple setup: Quick installation with no complex maintenance required.
  • Flexibility: Scale and customize pipelines with integrations for APIs and tools like Terraform (see the sketch after this list).
  • Cost management: Track usage and control budgets with detailed reporting tools.
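
As an illustration of the API-driven flexibility noted above, the hedged sketch below checks a connector’s status and pauses it programmatically. The endpoint paths, authentication scheme and field names are assumptions based on typical REST conventions; Fivetran’s API documentation is the authoritative contract, and the Terraform provider can manage the same resources declaratively.

```python
# Illustrative sketch of driving pipelines programmatically over a REST API.
# The base URL, endpoints, auth scheme and fields below are assumptions for
# illustration; check Fivetran's API documentation for the real contract.
import base64
import requests

API_BASE = "https://api.fivetran.com/v1"   # assumed base URL
API_KEY, API_SECRET = "key", "secret"      # credentials issued from the dashboard

def _auth_header() -> dict:
    token = base64.b64encode(f"{API_KEY}:{API_SECRET}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def get_connector(connector_id: str) -> dict:
    """Fetch one connector's configuration and latest sync status."""
    resp = requests.get(f"{API_BASE}/connectors/{connector_id}",
                        headers=_auth_header(), timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]

def pause_connector(connector_id: str, paused: bool = True) -> None:
    """Pause or resume a connector, e.g. during a maintenance window."""
    resp = requests.patch(f"{API_BASE}/connectors/{connector_id}",
                          headers=_auth_header(), json={"paused": paused},
                          timeout=30)
    resp.raise_for_status()

if __name__ == "__main__":
    info = get_connector("my_connector_id")  # hypothetical connector ID
    print(info.get("status", {}))
```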

As government regulations like GDPR, HIPAA and CCPA continue to tighten, and new ones emerge, enterprises are increasingly focused on ensuring their data practices meet strict security and compliance standards. Recent breaches in healthcare and financial services have highlighted the risks of exposing sensitive information and compromising critical systems. DIY or self-managed pipelines can introduce vulnerabilities, especially as teams must make frequent updates to keep pace with rapidly changing data schemas, which increases the risk of errors that can lead to data breaches. Fivetran’s Hybrid Deployment mitigates these risks by providing a managed solution that runs within the customer’s secure environment, ensuring compliance while eliminating common points of failure.

“The simplicity of management is the best value of Hybrid Deployment,” said Ajay Bidani, Data and Insights Manager at Powell Industries. “I know the status of our pipelines and can launch and manage cloud-based and on-premises pipelines directly from one platform. Given our history with on-premises applications, I can say that pipelines going down and requiring restart was quite cumbersome, but Hybrid Deployment has been considerably different. I can realize the ease of monitoring and maintaining pipelines with immediacy. The ability to quickly stand up Hybrid Deployment for on-premises data movement, while managing it from a straightforward and familiar cloud-based control plane, is a great value add.”

What this means for SAPinsiders

Share your data management strategies. The focus on enterprise data management is intensifying with the proliferation of 5G, IoT, AI/ML and other transformative technologies. SAP customers are increasingly looking for new data management models for the storage, migration, integration, governance, protection, transformation, and processing of all kinds of data, ranging from transactional to analytical. Balancing the risks, compliance needs, and costs of data management in SAP HANA on-premises and in the cloud, while also providing reliable, secure data to the organization, is increasingly important to the business. We will be releasing the 2025 Data Management Strategies research report in February 2025. Contribute to the research by completing this survey: https://www.research.net/r/DataMgt25.

Handle sensitive data properly when moving data across environments. Implement data masking techniques to protect sensitive information (e.g., personally identifiable information, payment data) when used in non-production environments or during pipeline processing. For datasets containing personal information, use anonymization techniques to remove or obscure identifiable data before processing in the pipeline. Ensure compliance with privacy laws like GDPR or CCPA.
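
As one hedged illustration of the masking guidance above, the sketch below pseudonymizes identifiable fields with a keyed hash and drops payment data before records move into a non-production environment. The field names and salt handling are assumptions to adapt to your own schema and key-management policy.

```python
# Minimal sketch of masking/anonymizing PII before data lands in a
# non-production environment. Field names and salt handling are illustrative.
import hashlib
import hmac

PII_FIELDS = {"email", "phone", "ssn"}        # fields to pseudonymize
DROP_FIELDS = {"card_number", "cvv"}          # payment data removed outright
SECRET_SALT = b"load-from-a-secrets-manager"  # never hard-code in production

def pseudonymize(value: str) -> str:
    """Deterministic keyed hash: preserves joinability, hides the raw value."""
    return hmac.new(SECRET_SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    masked = {}
    for key, value in record.items():
        if key in DROP_FIELDS:
            continue                          # drop payment data entirely
        if key in PII_FIELDS and value is not None:
            masked[key] = pseudonymize(str(value))
        else:
            masked[key] = value
    return masked

sample = {"customer_id": 42, "email": "ada@example.com",
          "card_number": "4111111111111111", "country": "DE"}
print(mask_record(sample))  # card_number dropped, email replaced by a keyed hash
```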

Monitor and audit data pipelines. Implement real-time monitoring for your data pipelines to detect unusual activity or performance issues. Enable detailed logging to capture events such as data access, modification, or execution of pipeline tasks. Logs should include user activity, changes to pipeline configurations, and data movements. Use AI/ML-powered tools to detect anomalies in data flows or pipeline execution, such as unexpected spikes in data transfer or unauthorized access attempts.
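
A simple way to start on the anomaly-detection point is a statistical check on per-run transfer volumes. The sketch below flags a run whose row count deviates sharply from recent history and writes the result to an audit log; the threshold and the source of the metrics are assumptions to tune for your own pipelines.

```python
# Sketch of a volume anomaly check for pipeline runs: flag a run whose
# transferred row count deviates sharply from its recent history.
import logging
import statistics

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline-audit")

def is_anomalous(history: list[int], latest: int, z_threshold: float = 3.0) -> bool:
    """Z-score test of the latest volume against prior runs."""
    if len(history) < 5:
        return False                           # not enough history to judge
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero
    return abs(latest - mean) / stdev > z_threshold

recent_rows = [10_200, 9_950, 10_480, 10_100, 9_800]  # rows moved in prior runs
latest_rows = 48_700                                   # unexpected spike

if is_anomalous(recent_rows, latest_rows):
    log.warning("Pipeline 'orders_sync': volume spike (%d rows) - review access logs", latest_rows)
else:
    log.info("Pipeline 'orders_sync': volume within expected range (%d rows)", latest_rows)
```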
