Anonymous
Not applicable

Problem updating rows in one table of my active MySQL

I have already tried to extract the ID from my client table in a MySQL database via tMap, comparing the name rows of my table with an Excel file containing the client names. I wrote a routine named searchIdByName to do the filtering: when the names are the same it returns the ID, and when they are not equal it returns "0". But with about 55 million rows, Talend just gets stuck and freezes. Can somebody help me, please?
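A minimal sketch of what such a routine might look like (the exact signature and types are assumptions, since the post does not show the code):

// Hypothetical Talend routine: compare the DB name against the Excel name
// and return the ID on a match, "0" otherwise.
public static String searchIdByName(String dbName, String excelName, String id) {
    if (dbName != null && dbName.equals(excelName)) {
        return id;
    }
    return "0";
}

The routine itself is cheap; as the replies below point out, the cost is in comparing every Excel row against all 55 million MySQL rows.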

4 Replies
Anonymous
Not applicable
Author

Try something like this....
1) Connect the Excel file to the tMap as the Main row.
2) Connect the MySQL db as a lookup row.
3) Set the lookup table to "Reload At Each Row".
4) You will see a section appear where you can set a globalMap variable. Set this to the field in the Excel file you want to match on.
5) In the SQL of the MySQL lookup query, use the globalMap value in the WHERE clause.

This will mean that you will not have to search through ALL of your MySQL table to find matches with the Excel file.
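As a rough illustration of steps 4 and 5, the query in the MySQL lookup component might look like this (the table, column, and globalMap key names are assumptions):

// Hypothetical query string for the MySQL lookup component. With
// "Reload At Each Row", tMap stores the current main-row value under the
// globalMap key configured in step 4 (assumed here to be "clientName"),
// so each lookup only fetches rows that can actually match.
"SELECT c.id, c.name FROM clients c " +
"WHERE c.name = '" + (String) globalMap.get("clientName") + "'"

Note that plain string concatenation assumes the names contain no quote characters, and an index on clients.name is essential; otherwise each of these per-row queries will itself scan the table.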
vapukov
Master II

rhall_2.0 wrote:
Try something like this....
1) Connect the Excel file to the tMap as the Main row.
2) Connect the MySQL db as a lookup row.
3) Set the lookup table to "Reload At Each Row".
4) You will see a section appear where you can set a globalMap variable. Set this to the field in the Excel file you want to match on.
5) In the SQL of the MySQL lookup query, use the globalMap value in the WHERE clause.

This will mean that you will not have to search through ALL of your MySQL table to find matches with the Excel file.

Just to add:
it will work fine, but depending on the size of the Excel file, the total time could still be slow. A sometimes faster method:
- connect to the Excel file
- load all the data to match into a temp table in the db (with an index on the matched column): drop the table if it exists, bulk load, commit every 10,000 rows
- create a dbInput component with a SQL query that JOINs (LEFT, INNER, etc.) the original table and the new temp table

The total time of the operation will be less than reading row by row and doing a lookup for each one, as sketched below.
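A hypothetical version of that flow, written as the Java query strings you would paste into Talend components (all table and column names here are assumptions):

// tDBRow, run once before the load: recreate the staging table with an
// index on the match column so the join below can use it.
"DROP TABLE IF EXISTS tmp_excel_clients"

"CREATE TABLE tmp_excel_clients (name VARCHAR(255), INDEX idx_name (name))"

// A tFileInputExcel -> tDBOutput flow then bulk-loads the Excel rows into
// tmp_excel_clients, with the output component's commit interval set to
// 10,000 rows as suggested above.

// tDBInput, run after the load: one set-based query replaces the per-row
// lookups. COALESCE reproduces the "ID if matched, 0 if not" behaviour
// from the original question.
"SELECT COALESCE(c.id, 0) AS id, t.name " +
"FROM tmp_excel_clients t LEFT JOIN clients c ON c.name = t.name"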
Anonymous
Not applicable
Author

Indeed vapukov, loading the data to the database and filtering within the db would be FAR quicker. My assumption was that the db should not be edited. In the majority of cases I have worked on, the source db has not been permitted to be modified. However, if it is permitted, loading to the source db would be massively faster than querying for every row in an Excel file.
Anonymous
Not applicable
Author

If the Excel data can be loaded into the database, then why use tMap at all? Instead, do the processing at the db level with the help of a database stored procedure.
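A hypothetical sketch of that approach (the procedure name is an assumption, and it reuses the staging table from the earlier reply):

// Created once on the MySQL side, so the matching logic lives in the db:
//
//   CREATE PROCEDURE match_client_ids()
//   BEGIN
//     SELECT COALESCE(c.id, 0) AS id, t.name
//     FROM tmp_excel_clients t
//     LEFT JOIN clients c ON c.name = t.name;
//   END
//
// Query string for a tDBRow component, so the whole match runs inside the
// database in a single call:
"CALL match_client_ids()"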