In Incremental Load in QlikView – Part 1 we saw incremental load with the Insert Only option.
In this session, we will see how to do incremental load with
– > Insert and Update
– > Insert, Update and Delete
Insert and Update
Incremental load with Insert and Update is similar to Insert Only (see my earlier blog Incremental Load in QlikView – Part 1).
In this type of incremental load you take the new and changed rows from the database and append them to the existing QVD.
In our previous exercise, we loaded the following rows/table into a QlikView QVD.
For this example, I will add 2 new rows to my existing database table.
Now our objective is to load only these 2 new rows and not to reload the entire table.
1. We will start with the initial load.
2. Incremental load, i.e. load new rows based on the LastUpdate date.
3. Concatenate. Here you check the primary key and load from the QVD only those rows whose primary key was not already loaded. This ensures that new rows are added and existing rows are not duplicated.
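The three steps above can be sketched in a load script like the one below. The table, field, and file names (Employees, PrimeId, Name, LastUpdate, Employees.qvd) and the variable vLastExecTime holding the previous reload time are assumptions for illustration; substitute your own.

```
// Step 2: load only rows inserted or updated since the last reload.
// vLastExecTime is assumed to hold the timestamp of the previous run.
Employees:
LOAD PrimeId,
     Name,
     LastUpdate;
SQL SELECT PrimeId, Name, LastUpdate
FROM Employees
WHERE LastUpdate >= '$(vLastExecTime)';

// Step 3: concatenate the old QVD, keeping only rows whose primary
// key was NOT already loaded above. Updated rows therefore come from
// the database, not from the stale QVD copy.
Concatenate (Employees)
LOAD PrimeId,
     Name,
     LastUpdate
FROM Employees.qvd (qvd)
WHERE NOT Exists(PrimeId);

// Write the combined result back, replacing the old QVD.
STORE Employees INTO Employees.qvd (qvd);
```

For the very first run (step 1, the initial load), you simply load the full table from the database and STORE it to the QVD; the script above then takes over for subsequent reloads.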
Insert, Update and Delete
This option is similar to Insert and Update, with the only exception that you also need to exclude records that have been deleted from the source system.
To explain my point, I have deleted a row and added a new row in my database table. I have deleted the row where PrimeId = 6 and added a row where PrimeId = 16. Now my database table looks like
Using Insert, Update and Delete, you should be able to load the new rows, exclude the ones deleted from the source system, and store the result in the QVD.
Follow the same steps as above (initial load, incremental load, concatenate) and then use an Inner Join to exclude records that were deleted from the source system.
After executing this load, you will see that the new rows are added to your QVD and the row that was deleted from the database is removed from your QVD.
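A sketch of this variant follows. It is the same script as before with one extra Inner Join step at the end; as above, the table, field, and file names and the vLastExecTime variable are illustrative assumptions.

```
// Load rows inserted or updated since the last reload.
Employees:
LOAD PrimeId,
     Name,
     LastUpdate;
SQL SELECT PrimeId, Name, LastUpdate
FROM Employees
WHERE LastUpdate >= '$(vLastExecTime)';

// Bring back unchanged rows from the old QVD.
Concatenate (Employees)
LOAD PrimeId,
     Name,
     LastUpdate
FROM Employees.qvd (qvd)
WHERE NOT Exists(PrimeId);

// Inner join with the full list of keys currently in the source.
// Rows whose key no longer exists in the database (e.g. PrimeId = 6)
// drop out of the resident table, handling the Delete case.
Inner Join (Employees)
LOAD PrimeId;
SQL SELECT PrimeId FROM Employees;

STORE Employees INTO Employees.qvd (qvd);
```

Note that the Inner Join step reads the full key column from the source on every reload, which is still far cheaper than reloading the whole table.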
If you load from the QVD and create a table box, it will look like the following: the record for PrimeId = 6 has been deleted from your QVD and a new record with PrimeId = 16 has been added.
Incremental loads are very useful when you have to load large transactional tables frequently while maintaining data integrity.
Further reading: quickintelligence