Hi All,
We have an incremental load application that is updated daily. Bad data has now entered the QVD. The very strange thing is that the backup job ran even though it was disabled, so the bad data also got into the backup QVDs.
Please share your thoughts. This is very critical for us.
Hey,
Create a sample application like this:
Load the data from the QVD into an internal table first. Then load the data from that internal table into a fresh table with a RESIDENT load, using a WHERE clause to remove the bad rows (for example, de-duplicating on the primary key or a load-time column).
DROP TABLE Table1;
Now STORE the second table into a QVD with the same name as your source QVD.
That's it.
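A minimal sketch of those steps, assuming a source file called SourceData.qvd and a KeyField column to de-duplicate on (both are placeholders for your actual names):

Table1:
LOAD *
FROM SourceData.qvd (qvd);

CleanTable:
NOCONCATENATE            // prevent auto-concatenation, since both tables share the same fields
LOAD DISTINCT *          // drop exact duplicate rows
RESIDENT Table1
WHERE Len(Trim(KeyField)) > 0;   // example condition; replace with your own bad-data rule

DROP TABLE Table1;

STORE CleanTable INTO SourceData.qvd (qvd);

Test this against a copy of the QVD first, since the STORE overwrites the source file.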
BR,
Chinna
Hi,
Try like this
Copy the QVD and place it in some other folder. Load that QVD in a separate QVW file like below.

TableName:
LOAD *
FROM QVDName.qvd (qvd)
WHERE Conditions;    // to remove bad data

STORE TableName INTO QVDName.qvd (qvd);

Now copy this QVD and deploy it in your actual folder.
Regards,
Jagan.
Removal of bad data depends on its type:
1) Duplicate entries
2) Junk characters
3) Null values
etc.
A WHERE clause helps resolve each of these.
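For example, the three cases above can all be handled in one load; SourceData.qvd, KeyField, and the junk-character list are placeholders for your actual names and rules:

CleanTable:
LOAD DISTINCT *                                // 1) drops exact duplicate rows
FROM SourceData.qvd (qvd)
WHERE Len(Trim(KeyField)) > 0                  // 3) drops null/blank keys
  AND KeyField = PurgeChar(KeyField, '#@!');   // 2) keeps rows whose key contains none of these junk characters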
Hi Jagan,
Thanks for the reply. I have one doubt: we have transformed QVDs as well. If I try your suggestion on the transformed QVDs, will it help? Because we have a lot of transformations in them. Please suggest.
If possible, remove the bad data at the source level (in the DB), then drop the QVD and do a full load, so that the bad data is removed from the QVD.
Hi,
A QVD is the final table after transformation, so you can filter or change the data on top of that QVD and save the transformed/filtered data back into the QVD.
Hope this helps you.
Regards,
Jagan.