Hi everyone.
I am trying to extract a table from our data warehouse with approx. 10 million records. It keeps timing out, so the DB team have suggested pulling the data in chunks instead of doing one single load.
Can anyone help with what I need to do to write a loop around my table load, please?
The loop just needs to be on the number of records, not on a date/month/year or any other field.
So I would extract the first 100,000 records, then loop back to the start and extract records 100,001 to 200,000, etc., until all records have been extracted.
Thanks
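The record-count loop described above can be sketched in Python. This is a minimal sketch, not tied to any particular driver: `chunk_ranges()` just computes the row windows; the actual fetch per window would go inside the loop and depends on your warehouse and connection library, which the thread does not specify.

```python
def chunk_ranges(total_rows, chunk_size):
    """Yield inclusive (start, end) row-number windows covering total_rows."""
    start = 1
    while start <= total_rows:
        end = min(start + chunk_size - 1, total_rows)
        yield start, end
        start = end + 1

# Example: 10 million rows in 100,000-row chunks -> 100 windows.
windows = list(chunk_ranges(10_000_000, 100_000))
# Each window would drive one extraction pass, e.g. rows 1-100,000,
# then 100,001-200,000, and so on.
```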
In the data, do you have any fields like country or region (geographical data)? Then you could extract the data using those fields, like:
sel * from table where country='IND';
sel * from table where country='AUS';
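The suggestion above amounts to looping over the distinct values of a partitioning field and pulling one slice per value. A small sketch of generating those per-value queries (the table and column names here are illustrative, taken from the example above; in real code, use parameterized queries rather than string formatting to avoid SQL injection):

```python
def build_queries(table, column, values):
    """Build one SELECT per distinct value of the partitioning column.

    Illustrative only: production code should bind values as query
    parameters instead of interpolating them into the SQL string.
    """
    return [f"SELECT * FROM {table} WHERE {column} = '{v}';" for v in values]

queries = build_queries("route_table", "country", ["IND", "AUS"])
```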
Hi,
Unfortunately not; it's route data (long and lat). The only fields in this specific table are:
RouteID
Long
Lat
It's then linked to other tables on the RouteID field.
So I need to extract small chunks of the data at a time, I guess based on something like ROW_NUMBER().
So I would need the loop to start at rows 1-1,000, then the next loop would extract rows 1,001-2,000, etc., until all records have been extracted.
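A runnable demo of that row-window loop, using SQLite as a stand-in for the warehouse (the table layout matches RouteID/Long/Lat above). Rather than numbering rows on each pass, this sketch uses keyset pagination: each query resumes after the last RouteID already fetched, which stays cheap on large tables and assumes only that RouteID is unique and orderable.

```python
import sqlite3

# In-memory stand-in for the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE routes (RouteID INTEGER PRIMARY KEY, Long REAL, Lat REAL)")
conn.executemany(
    "INSERT INTO routes VALUES (?, ?, ?)",
    [(i, i * 0.01, i * 0.02) for i in range(1, 26)],  # 25 demo rows
)

CHUNK = 10
extracted = []
last_id = 0
while True:
    # Keyset pagination: fetch the next window of rows after last_id.
    rows = conn.execute(
        "SELECT RouteID, Long, Lat FROM routes WHERE RouteID > ? "
        "ORDER BY RouteID LIMIT ?",
        (last_id, CHUNK),
    ).fetchall()
    if not rows:
        break  # all records have been extracted
    extracted.extend(rows)  # in real code: write this chunk out
    last_id = rows[-1][0]
```

The same shape works against the warehouse by swapping the connection and, if RouteID cannot be relied on, ordering by ROW_NUMBER() instead.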
I wouldn't loop at record level; instead I'd use rounding on lat or long, like round(lat, 9), to get theoretically 20 chunks for the possible -90° to 90° range. Depending on the actual lat/long values in your data and/or the format in which they are stored, you may adjust the rounding to get roughly equally sized chunks.
- Marcus
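Marcus's idea, read as bucketing latitude into 9-degree bands, can be sketched as follows. This is an assumption about the intended chunking (20 equal bands across -90° to 90°); the per-band WHERE clause is illustrative, and the last band would need to include Lat = 90 exactly.

```python
def lat_bands(width=9, lo=-90, hi=90):
    """Return (low, high) latitude bands of the given width covering [lo, hi]."""
    edges = list(range(lo, hi + 1, width))
    return list(zip(edges, edges[1:]))

bands = lat_bands()  # 20 bands of 9 degrees each
# Per-band query (illustrative; make the final band's upper bound inclusive):
# SELECT * FROM routes WHERE Lat >= <low> AND Lat < <high>;
```

As Marcus notes, if the real data clusters in a narrow latitude range, narrow or widen the bands until the chunks come out roughly equal in size.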