Erroneous data, duplicate data, and data inconsistency are major problems in database systems. Many researchers are trying to design and implement models for data cleaning that help maintain high-quality data in the data warehouse. Data quality is a critical factor for data warehouses and data integration systems. Improving data quality is important because the data is used in decision support, which requires accuracy. Data cleaning is the process of identifying and removing or correcting errors in the data, and it is a key precondition for analytical decision-making and data integration. Rule-based design is a new approach to data cleaning. We therefore chose this technique, designed its model and algorithm, and evaluated it against existing data-cleaning models. The main objective of data cleaning is to reduce processing time and complexity while increasing data quality and producing error-free data. This book therefore provides a new metric of success for database administrators and professional programmers working on DBMS and data warehouse systems who want to enhance data quality in real life.
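The rule-based cleaning described above can be pictured as a small sketch: each rule pairs a validity check with a correction, and exact duplicates are eliminated after correction. The record fields, rules, and sample data here are hypothetical illustrations, not the model from the book.

```python
# Minimal sketch of rule-based data cleaning: rule-driven error
# correction followed by duplicate elimination. All field names,
# rules, and data are illustrative assumptions.

def clean(records, rules):
    """Apply correction rules to each record, then drop exact duplicates."""
    cleaned = []
    seen = set()
    for rec in records:
        rec = dict(rec)                 # work on a copy
        for check, fix in rules:        # each rule: (predicate, correction)
            if not check(rec):
                rec = fix(rec)
        key = tuple(sorted(rec.items()))
        if key not in seen:             # duplicate elimination
            seen.add(key)
            cleaned.append(rec)
    return cleaned

# Hypothetical rules: age must be non-negative; name must be trimmed and title-cased.
rules = [
    (lambda r: r["age"] >= 0, lambda r: {**r, "age": abs(r["age"])}),
    (lambda r: r["name"] == r["name"].strip().title(),
     lambda r: {**r, "name": r["name"].strip().title()}),
]

data = [
    {"name": " alice ", "age": -30},
    {"name": "Alice", "age": 30},       # becomes a duplicate after correction
    {"name": "Bob", "age": 25},
]
print(clean(data, rules))
# → [{'name': 'Alice', 'age': 30}, {'name': 'Bob', 'age': 25}]
```

The first two records collapse into one because the corrections make them identical, illustrating how error correction and deduplication interact in a single pass.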
Md. Alomgir Hossain is a PhD researcher at Dhaka University of Engineering and Technology. He holds an MSc in Computer Science and Engineering (Jahangirnagar University), a BSc in CSE (IUBAT), and a Diploma in Computer Technology (KPI). His major interest areas are digital image processing, DDBMS, and cloud computing. He is currently working as an Assistant Professor in the Dept. of CSE.
Number of Pages:
LAP LAMBERT Academic Publishing
Data Cleaning, Data Quality, Data Integrity, Rule-based Design, Source Data, Target Data, Modata, Contextual Anomalies, Data Error, Data Duplication
COMPUTERS / General