Data Mining: Data Warehouse Process

  • Last Updated : 23 Jun, 2022

A data warehouse is a collection of information gathered from multiple sources and stored under a unified schema at a single site. It is built with the aid of several techniques, including the following processes:

1. Data Cleaning: Data cleaning is the process of preparing data for analysis by removing or correcting incorrect, incomplete, irrelevant, duplicate, or irregularly formatted records. Such data is not useful for analysis because it can disrupt the process or produce false results.
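The cleaning rules above can be sketched in plain Python. The record list, field names, and validation rules below are illustrative assumptions, not part of any specific warehouse tool:

```python
def clean_records(records):
    """Drop duplicate, incomplete, and irregularly formatted rows."""
    seen = set()
    cleaned = []
    for rec in records:
        key = (rec.get("name"), rec.get("age"))
        if key in seen:
            continue  # duplicate row
        if rec.get("name") is None or rec.get("age") is None:
            continue  # incomplete row
        if not str(rec["age"]).isdigit():
            continue  # irregularly formatted value
        seen.add(key)
        cleaned.append({"name": rec["name"], "age": int(rec["age"])})
    return cleaned

raw = [
    {"name": "Ana", "age": "34"},
    {"name": "Ana", "age": "34"},      # duplicate
    {"name": "Bob", "age": None},      # incomplete
    {"name": "Cy",  "age": "thirty"},  # badly formatted
]
print(clean_records(raw))  # only the first Ana row survives
```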

2. Data Integration: Data integration is the process of combining data from different sources into a unified view. The integration process begins with ingestion and includes steps such as cleansing, ETL mapping, and transformation. Data integration ultimately allows analytics tools to produce effective and affordable business intelligence. In a typical data integration procedure, the client sends a request for data to the master server. The master server retrieves the required records from internal and external sources, extracts the data, and integrates it into a single data set, which is then returned to the client for use.
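The request-and-merge flow described above can be sketched as follows. The two dicts stand in for an internal and an external source, and the ids and fields are invented for illustration:

```python
# Two toy "sources" keyed by record id.
internal = {101: {"name": "Ana"}, 102: {"name": "Bob"}}
external = {101: {"region": "EU"}, 103: {"region": "US"}}

def integrate(request_ids, *sources):
    """Extract the requested ids from each source and merge them into one view."""
    unified = {}
    for rid in request_ids:
        merged = {}
        for src in sources:
            merged.update(src.get(rid, {}))  # later sources add to earlier ones
        if merged:
            unified[rid] = merged
    return unified

view = integrate([101, 102, 103], internal, external)
# view[101] now combines fields from both sources
```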

3. Data Transformation: The process of converting data from one format or structure to another is referred to as data transformation. Data transformation is critical for activities such as data integration and data management. Data transformation serves several purposes: you can change record types to match the needs of your project, and enrich or aggregate records by removing invalid or duplicate data. Generally, the process consists of two stages.

In the first step, you should:

  • Perform data discovery to identify the sources and data types.
  • Determine the structure of the data and the transformations that must occur.
  • Perform data mapping to define how individual fields are mapped, modified, filtered, and stored.

In the second step, you must:

  • Extract data from the original source. Sources can range from a connected device to a structured resource such as a database, or streaming sources such as telemetry or log files from clients using your web application.
  • Send the data to the target site. The target may be a database or a data warehouse that handles structured and unstructured records.
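The two stages above can be sketched together. Here, stage 1 is represented by a field mapping that was "discovered" beforehand; stage 2 extracts rows from a source list and writes the transformed rows to a target list. All field names are assumptions for illustration:

```python
# Stage-1 result: a mapping from source field names to target field names
# (assumed to have been produced by data discovery and mapping).
FIELD_MAP = {"dob": "birth_year", "amt": "amount"}

def transform(row):
    """Rename fields per the mapping and change record types as needed."""
    out = {}
    for src_field, dst_field in FIELD_MAP.items():
        if src_field in row:
            out[dst_field] = row[src_field]
    if "amount" in out:
        out["amount"] = float(out["amount"])  # type change: string -> float
    return out

# Stage 2: extract from the source and send to the target.
source_rows = [{"dob": 1990, "amt": "19.99"}, {"dob": 1985, "amt": "5"}]
target = [transform(r) for r in source_rows]
```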

4. Loading Data: Data loading is the process of copying and loading data from a file, folder, or application into a database or similar system. It is usually done by copying digital data from the source and pasting or loading it into a data warehouse or processing tool. Data loading is used in extract-and-load workflows. Typically, the data is loaded in a different format than it had at the source.
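A minimal loading sketch using the standard-library `sqlite3` module: already-extracted rows are copied into an in-memory SQLite table that stands in for the warehouse. The table and column names are assumptions:

```python
import sqlite3

# Rows already extracted (and transformed) from a source.
rows = [("Ana", 34), ("Bob", 28)]

conn = sqlite3.connect(":memory:")  # in-memory stand-in for the warehouse
conn.execute("CREATE TABLE customers (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)  # the load step
conn.commit()

loaded = conn.execute("SELECT name, age FROM customers ORDER BY name").fetchall()
# loaded now mirrors the source rows, in the warehouse's own format
```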

5. Data Refreshing: In this process, the data stored in the warehouse is periodically refreshed so that it maintains its integrity. A data warehouse models data as multidimensional structures known as “Data Cubes”, in which each dimension represents an attribute or set of attributes in the schema and each cell stores a value. Data is gathered from various sources such as hospitals, banks, and other organizations, and goes through a process called ETL (Extract, Transform, Load).

  • Extract: Reads the data from the databases of the various sources.
  • Transform: Converts the data stored in the databases into data cubes so that it can be loaded into the warehouse.
  • Load: Writes the transformed data into the data warehouse.
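The three ETL steps above can be combined into one small end-to-end sketch: rows are extracted from a toy source, transformed into a (branch, year) data cube by aggregating a measure per cell, and loaded into a dict standing in for the warehouse. All names and figures are invented for illustration:

```python
from collections import defaultdict

bank_db = [{"branch": "North", "year": 2021, "amount": 100},
           {"branch": "North", "year": 2021, "amount": 50},
           {"branch": "South", "year": 2021, "amount": 70}]

def extract(*dbs):
    """Read rows from each source database in turn."""
    for db in dbs:
        yield from db

def transform_to_cube(rows):
    """Aggregate rows into a (branch, year) -> total cube."""
    cube = defaultdict(int)
    for r in rows:
        cube[(r["branch"], r["year"])] += r["amount"]  # one cell per dimension pair
    return dict(cube)

warehouse = {}
def load(cube):
    """Write the transformed cube into the warehouse."""
    warehouse.update(cube)

load(transform_to_cube(extract(bank_db)))
# warehouse[("North", 2021)] aggregates both North rows
```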

This process can be seen in the illustration below:  

Features of Data Warehouse: Please refer – Features of Data Warehouse.
