The Datawave Platform

Take charge of your data processes

HCL Datawave provides a data management platform that enables businesses to build, manage and control their data flows across multiple pipelines.

One of the key challenges that organisations face when trying to gain access to data is complexity: legacy applications, multiple platforms, multiple storage methods and multiple data integration jobs ranging from traditional programs and ETL platforms (Extract, Transform, Load) to the latest Big Data, Cloud and streaming technologies.

The Datawave Platform is designed to build, manage and control all of these processes on a single platform. Organisations can optimise and combine the flow of data across the environments, technologies and applications that they have today and plan to build in the future – giving 360° management of entire data flows and inter-dependencies. We call this process orchestration.
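Orchestration here means sequencing inter-dependent pipelines so that each one runs only after the data it depends on is ready. The Datawave Platform's own engine is proprietary, but the underlying idea can be sketched with a dependency-aware scheduler in a few lines of Python (the pipeline names below are invented for illustration):

```python
# Minimal sketch of dependency-aware pipeline orchestration.
# Pipeline names and dependencies are illustrative, not Datawave's API.
from graphlib import TopologicalSorter

# Each pipeline lists the pipelines whose output it consumes.
dependencies = {
    "stage_orders": set(),                  # land raw transactional data
    "stage_customers": set(),               # land raw reference data
    "conform_sales": {"stage_orders", "stage_customers"},
    "publish_dashboard": {"conform_sales"},
}

def run(pipeline: str) -> None:
    # Placeholder for handing the job to whatever engine executes it
    # (an ETL tool, a Spark job, a stored procedure, ...).
    print(f"running {pipeline}")

# TopologicalSorter yields the pipelines in an order that respects
# every dependency, so downstream jobs never start too early.
order = list(TopologicalSorter(dependencies).static_order())
for pipeline in order:
    run(pipeline)
```

A real orchestrator adds scheduling, retries and monitoring on top, but the dependency graph is the core of "360° management of inter-dependencies".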

Evolving data landscape

Enterprises have traditionally focused on delivering single enterprise warehouse programmes. These were typically very large, technically difficult programmes with extended timelines.

Customer focused data

Driven by a more customer-centric approach, today's data landscape is vastly different from even a few years ago. A cloud-first, big data solution is increasingly the new normal, with an emphasis on valuable real-time self-service analytics. This approach allows enterprises to better serve their customers and to monetise data that was previously locked away in disparate, unconnected systems.

Platform modernisation

Enterprises rely on legacy applications, which need to remain in place and run alongside the latest technologies. As data landscapes become increasingly complex, data integration is about more than just one proprietary platform. Enterprise applications span many platforms, running hundreds of data pipelines that are inter-dependent in terms of data assets, resources, or both.

Advantages

At HCL Datawave, we approach any data movement task from its constituent parts and look to automate it where possible. The Datawave Platform enables rapid no-code data delivery and orchestration of new processes alongside existing legacy applications, giving the organisation 360° management of entire data flows and inter-dependencies.

Benefits

  • Automation of data provisioning – code generation
  • Trusted data – data orchestration
  • Agile data delivery and continuous integration
  • Consistent – cleaned, governed, commoditised data provisioned
  • 50%+ smaller delivery teams
  • Fewer rockstar developers needed
  • Automated testing

Evolving data landscape

Data Types
  • Transactional data
  • Reference data
  • Social feeds, email & webchat
  • Streaming data

Platforms
  • RDBMS
  • NoSQL databases
  • Enterprise applications
  • Cloud
  • Hybrid
  • Mainframe

Processes
  • ETL platforms
  • Hand coding
  • Big Data streaming & analytics
  • Data wrangling
  • BI tools

Code Generation
  • Pattern-based code automation
  • Web-based UI – on-site, cloud, hybrid
  • Out-of-the-box patterns

Data Control & Monitoring
  • Job management framework for Dev, Test and Production
  • Operational dashboard monitoring

Data Architecture & Delivery Approach
  • Development approach for data delivery
  • CREST Data Architecture (Controllable Reusable Enterprise for Staging and Transformation of data)
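Pattern-based code automation means generating repetitive data-movement code from metadata rather than hand-writing it for every table. As a hedged illustration only (the pattern, table and column names below are invented, and this is not Datawave's generator), a staging pattern might expand a small metadata record into SQL:

```python
# Illustrative pattern-based code generation: expand table metadata
# into a staging INSERT ... SELECT statement. All names are hypothetical.
from string import Template

STAGING_PATTERN = Template(
    "INSERT INTO staging.$table ($cols)\n"
    "SELECT $cols FROM source.$table\n"
    "WHERE load_date = CURRENT_DATE;"
)

def generate_staging_sql(table: str, columns: list[str]) -> str:
    """Instantiate the staging pattern for one source table."""
    return STAGING_PATTERN.substitute(table=table, cols=", ".join(columns))

sql = generate_staging_sql("orders", ["order_id", "customer_id", "amount"])
print(sql)
```

Because every table goes through the same vetted pattern, the generated jobs are consistent and testable, which is where the smaller delivery teams and automated testing claimed above come from.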

Orchestrate. Automate. Transform.

 