When I run my script I see the status in the Script Execution Progress window, and when it gets to the last step it just sits there without making any progress. I assumed the last step was causing a failure/hang, but when I look at the log I see: "Execution finished."
To make matters worse, if I press the Abort button the entire app crashes so I am never actually able to use the data.
Any insight into what is happening?
Thank you all for your wonderful suggestions (I will have to incorporate some of them)...but it turns out there was an error in an OnPostReload trigger action that was causing the hang!!!
After removing all my OnPostReload actions the script executes/exits normally.
Hi Jessica,
What is the size of the data?
Are you able to do a limited load successfully?
Regards
Neetha
If you look at the Windows Task Manager, is QV.exe still consuming CPU?
Even after I see "Execution finished" in the log, the CPU usage for QV.exe is about 24-26% and the private working set memory keeps increasing.
I don't have the exact size of the data (because I can't get it to load successfully and save it before the application crashes), but the last successful load I did resulted in a file that was about 5000K, and my current app is 600K without any data. So I'm guessing it is approx 4400K.
I am unable to do a partial load. It fails saying there is a missing table, which is a table that I had dropped after joining with another.
You may have a Cartesian join somewhere in the GUI, or maybe in the data model.
Go into the script editor debug mode and do a limited load of only 10 rows. Hopefully this will finish the reload and let you get in to see what's what.
Hi,
Are there any complex calculations in the script?
The application usually hangs when there are complex aggregations on huge data,
even though execution is complete.
Regards
Neetha
The section of my script that appears to be causing problems is a calendar generator. My script gets a series of logs (in a Logs table) where the timestamp is rounded to the nearest hour. I then take the min/max timestamp of those logs and generate all the missing hours (needed so that when I chart later, hours with no logs are not excluded from the chart axis). Then I create a LogCalendar table where I get each unit I need. If I comment out the entire MinMaxDate section and keep the LogCalendar section, my script exits properly. So, is there something wrong with my MinMaxDate section?
MinMaxDate:
Load Min( LogTime ) as MinDate,
     Max( LogTime ) as MaxDate
Resident Logs;

Let vMinDate = Peek('MinDate', -1, 'MinMaxDate') - 1;
Let vMaxDate = Peek('MaxDate', -1, 'MinMaxDate');

FOR vHourStep = MakeTime(0) to MakeTime(23) STEP MakeTime(1)
    Join ( LogCalendar )
    Load Timestamp( $(vMinDate) + IterNo() + $(vHourStep) ) as LogTime
    AUTOGENERATE 1
    WHILE $(vMinDate) + IterNo() <= $(vMaxDate);
NEXT vHourStep;

LogCalendar:
Load DISTINCT LogTime,
     DayName( LogTime ) as LogDate,
     DayName( LogTime ) as Day,
     Year( LogTime ) as Year,
     MonthName( LogTime ) as Month,
     Date( WeekStart( LogTime ), 'M/D/YYYY' ) as Week
Resident Logs;
Bill has a good point.
In my experience, another behavior that may cause this lock-up is the creation of a substantial number of synthetic keys. Try the Debug Reload with limited records and check for synthetic keys in the Table Viewer.
Best,
Peter
Yes, the loop is performing a JOIN of the complete log table to the entire log table 24 times over. These JOINs are incredibly expensive in time and resources, even if the resulting table is only as big as intended.
You can obtain the same result by starting from vMinDate and vMaxDate, generating a calendar with all dates in between (use DayStart()), and then joining 24 hour values to each and every date value. Note that DayStart(vMinDate) + 1/24 gets you 1 o'clock, so the while loop only has to count from 0 to 23.
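As a rough sketch of what that could look like (untested; it assumes vMinDate and vMaxDate are set from the MinMaxDate table exactly as in the original script, and LogCalendarBase is just a placeholder name):

// 1. One row per calendar day between the min and max log dates.
LogCalendarBase:
Load DayStart( $(vMinDate) + IterNo() - 1 ) as LogDay
AutoGenerate 1
While $(vMinDate) + IterNo() - 1 <= $(vMaxDate);

// 2. Cross-join 24 hour offsets onto every day in a single pass.
//    (No common field, so this is a deliberate, small Cartesian join:
//    number of days x 24 rows.)
Join ( LogCalendarBase )
Load ( IterNo() - 1 ) / 24 as HourOffset
AutoGenerate 1
While IterNo() <= 24;

// 3. Build the final hourly timestamps from day + hour fraction.
LogCalendar:
Load Timestamp( LogDay + HourOffset ) as LogTime
Resident LogCalendarBase;

Drop Table LogCalendarBase;

The key difference from the original loop is that the Logs table is never joined against itself; the only join here is days x 24 hours, which stays tiny no matter how large the log data gets.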
Peter