Enabling Data Fabric Analytics Platform


What is Data Fabric?

Data Fabric is a data management architecture that supplies the Business with the data it needs, when it needs it, to make decisions. The fabric is an integrated layer woven from multiple sources and varieties of data to provide insights. In this digital journey, data grows exponentially day by day, in a variety of formats, and numerous sources capture and share it. For instance, a marketing leader can gauge the market pulse of a new product launch using a data fabric.

This fabric is woven from internal product master data, external social media data and internal CRM data from the Cloud. A traditional ETL process would take several weeks or months to build this solution, owing to the complexity of the data structures and the availability of source connectors. Data Fabric architecture simplifies this and enables the swift availability of data. It also focuses on sharing business data rapidly with others.

Gartner defines Data Fabric as “a design concept that serves as an integrated layer (fabric) of data and connecting processes. A data fabric utilizes continuous analytics over existing, discoverable and inferenced metadata assets to support the design, deployment and utilization of integrated and reusable data across all environments, including hybrid and multi-cloud platforms.” In a nutshell, a Data Fabric Analytics platform helps the Business use any data from anywhere and share its data with anyone quickly.

Conceptual View of the Data Fabric

Components of Data Fabric Architecture

The key tenets of Data Fabric architecture are:

Robust Data Integration Layer

This layer is the backbone of the Data Fabric architecture. The data integration layer covers the ingestion and transformation framework that handles structured, semi-structured and unstructured data from different sources such as On-Prem/Cloud databases, streaming devices, external data providers, Cloud storage, enterprise products, Data Lakes, etc. The Data Fabric provides native connectors or SDKs to connect to any source and bring in the data. It offers options to parse semi-structured data from JSON, XML or APIs, and it handles the transformations that create the cleansed Data Mart or Lake which eventually serves as input for generating the Data Fabric.
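As a minimal sketch of the parsing step this layer performs, the snippet below flattens a nested JSON payload into table-ready rows. The event shape and field names (order, customer, lines) are illustrative assumptions, not tied to any specific product or source.

```python
import json

# Hypothetical raw event from an external API or streaming source
# (field names are assumptions chosen for the example).
raw_event = json.dumps({
    "order_id": "ORD-1001",
    "customer": {"id": "C-42", "region": "EMEA"},
    "lines": [
        {"sku": "SKU-1", "qty": 2},
        {"sku": "SKU-2", "qty": 1},
    ],
})

def flatten_order(payload: str) -> list:
    """Parse a nested JSON order into flat rows suitable for a Data Mart."""
    order = json.loads(payload)
    return [
        {
            "order_id": order["order_id"],
            "customer_id": order["customer"]["id"],
            "region": order["customer"]["region"],
            "sku": line["sku"],
            "qty": line["qty"],
        }
        for line in order["lines"]
    ]

rows = flatten_order(raw_event)
print(rows)
```

In a real fabric the same idea scales out: connectors pull the raw payloads, and a transformation framework applies flattening rules like this one before loading the cleansed rows into the Data Mart or Lake.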

Micro Services Layer

It constitutes the building blocks of the data fabric solution, with each service acting as an isolated entity that exchanges data in real time. Data can be received or shared through this layer. Exposing data to the external world demands robust security and authorization techniques, and this layer offers options for encrypting datasets during transfer. Its metadata should be readily available so that consumers of the layer can understand the attributes and object structures.
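To illustrate the security requirement on data exchange, here is a hedged sketch of signing a dataset before sharing it and verifying the signature on receipt. The shared secret and HMAC scheme are assumptions for the example; production services would typically use OAuth2/JWT plus TLS encryption in transit.

```python
import hashlib
import hmac
import json

# Assumption for the sketch: a shared secret provisioned per consumer.
SECRET = b"demo-secret"

def sign(payload: bytes) -> str:
    """Compute an HMAC-SHA256 signature over the raw payload bytes."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def share_dataset(records: list) -> dict:
    """Package records with an integrity signature before exchange."""
    body = json.dumps(records).encode()
    return {"data": body.decode(), "signature": sign(body)}

def receive_dataset(message: dict) -> list:
    """Verify the signature before accepting data from a peer service."""
    body = message["data"].encode()
    if not hmac.compare_digest(sign(body), message["signature"]):
        raise ValueError("signature mismatch - reject payload")
    return json.loads(body)

msg = share_dataset([{"id": 1, "value": "ok"}])
print(receive_dataset(msg))
```

The point of the sketch is the contract: a consuming microservice never trusts an inbound payload until the signature check passes.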

Intelligent Knowledge Graph Layer

It creates a semantic layer whose enriched data and metadata make the fabric more valuable to the Business. By connecting isolated datasets, it builds a collection of interlinked concepts and entities that meets actual business needs. Combined with metadata, the Knowledge Graph becomes far more powerful, helping the Business search the data and gain insights quickly.
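A knowledge graph can be sketched as a set of subject-predicate-object triples linking entities that originally lived in separate datasets. The entity and relation names below are illustrative assumptions, echoing the product-launch example from the introduction.

```python
# Minimal triple store linking previously isolated datasets:
# product master data, social media mentions and CRM purchases.
# All names are illustrative, not from any real dataset.
triples = [
    ("ProductX", "launched_in", "2023-Q1"),
    ("ProductX", "mentioned_by", "SocialPost-17"),
    ("SocialPost-17", "sentiment", "positive"),
    ("Customer-C42", "purchased", "ProductX"),
]

def neighbors(entity: str) -> list:
    """Return every fact that mentions the entity, in either position."""
    return [(s, p, o) for s, p, o in triples if entity in (s, o)]

for fact in neighbors("ProductX"):
    print(fact)
```

Even this toy graph shows the value: starting from one entity, a user can traverse across master data, social signals and CRM records without knowing which source system each fact came from.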

Data Governance Layer

This layer governs the Data Fabric platform: defining standards, setting the approaches for the different ingestion methods, establishing security principles, managing the different data stores and authorizing users to access the relevant data.
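The authorization aspect can be sketched as a simple role-to-domain policy check. The role and domain names here are assumptions for illustration; real governance layers typically integrate with an enterprise identity provider and a data catalog.

```python
# Illustrative governance policy mapping roles to readable data domains.
# Role and domain names are assumptions chosen for the example.
POLICY = {
    "marketing_analyst": {"crm", "social_media"},
    "finance_analyst": {"orders", "billing"},
}

def authorize(role: str, domain: str) -> bool:
    """Return True only if the governance policy grants the role access."""
    return domain in POLICY.get(role, set())

print(authorize("marketing_analyst", "crm"))
print(authorize("marketing_analyst", "billing"))
```

Centralizing checks like this in the governance layer keeps every ingestion and consumption path subject to the same policy.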

Data Consumption Layer

This layer covers the delivery of data to the Business or to external teams. Delivery can be through business intelligence solutions, web solutions or APIs, providing different perspectives on, and deeper insights into, the data.

How to enable Data Analytics Platform as Data Fabric?

Data Fabric is a design concept comprising a framework of different products and solutions that together create the fabric. An existing Data Analytics platform may already have data integration tools and custom-built solutions, yet still struggle to consume IoT data in real time; in that case it has to be supplemented with new tools, or its custom solutions enhanced, to ingest IoT data. A Data Analytics Platform has to be newly built or enhanced to handle the scenarios below:

  • Data ingestion methods for different varieties of data and multiple sources
  • Metadata management
  • Data parsing and transformation processes
  • Building semantic Knowledge Graphs
  • Data sharing across teams
  • Data governance

Benefits of Data Fabric solution

  • Delivers data insights to the Business rapidly
  • Enables self-service data ingestion and data consumption
  • Enables data access from anywhere and in any format

How Data Finz accelerates building a Data Fabric solution

Data Finz is a “No Code Data Integration Platform” designed to resolve technical challenges and handle data integration use cases with simple configurations. It can form part of the Integration and Micro Services layers in the Data Fabric Analytics Platform. Being No Code, it brings agility and productivity to development teams and accelerates the journey to the Data Fabric state. It ships with built-in pipelines designed for specific use cases: configure a pipeline with the required connection, and Data Finz takes care of the data integration use case automatically. It has the features below:

  • Numerous Connectors – REST API with different authentication methods, Salesforce, JDBC, NoSQL, Cloud Storage (S3, Blob, Dropbox, Sharepoint)
  • Consume any format of data
  • Parse API, JSON or XML structures into structured data
  • Generate Entity Relationship Diagram (ERD) from JSON or XML or API metadata
  • Generate Payload from structured data
  • Publish any data (Table, View, Stored Proc, Flat File) as an API
  • Generate Swagger or OpenAPI documentation
  • Copy data across Cloud environments or between On-Prem and Cloud environments
  • Execute SQL as an ELT approach in target Data Marts, Big Data stores or the EDW
  • Build Operational Data Store for transactional systems
  • Modernization of the EDW using a Lift & Shift approach
  • Validation of Data in the modernization process
  • Profiling of datasets with visualizations