_AnonymousUser
Specialist III

Dynamic schema for multiple positional file inputs

Hello!
I have multiple (13, to be exact) positional file inputs, and they all have different schemas: the column counts and field lengths vary, so the positional patterns differ.
Is it possible to create a loop in Integration Studio (Enterprise DQ Professional) that processes each of the files, dynamically/automatically changes the schema to match the file, and writes the contents to an Oracle DB table?
Maybe it is doable via the tSetDynamicSchema component? My goal is to create a job that doesn't need a separate tFileInputPositional/DB output component for each file.
So far I have dabbled with the tFileList component: it locates the files, iterates through them, and writes each one to a temporary table named after the file (the filename is picked from globalMap).
I also have the patterns for each of the files in globalMap, but I haven't been able to create a check that matches a positional file pattern to a filename (tFileList doesn't connect to tJavaRow).
Basically, what I need is a dynamic file input that iterates through every .txt file in a folder, creates a dynamic schema based on the file's content, and writes the content to Oracle database tables.
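For context, the kind of check I have been trying to build would sit in a tJava component on the tFileList Iterate link (tJava doesn't need a row flow, which is why tJavaRow won't connect there). The key scheme "pattern_<basename>" and the current_pattern/current_file names below are just placeholders for however the patterns are actually stored:

// tJava on the tFileList_1 Iterate link: look up the positional pattern
// for the current file from globalMap. The key naming convention here
// ("pattern_" + base filename) is hypothetical; adapt it to however the
// patterns were loaded earlier in the job.
String filePath = (String) globalMap.get("tFileList_1_CURRENT_FILEPATH");
String fileName = (String) globalMap.get("tFileList_1_CURRENT_FILE");

// Strip the extension so "orders.txt" maps to the key "pattern_orders".
String key = "pattern_" + fileName.replaceFirst("\\.txt$", "");
String pattern = (String) globalMap.get(key);

if (pattern == null) {
    // No pattern registered for this file: log it and let the flow skip it.
    System.err.println("No positional pattern found for " + fileName);
} else {
    // Stash the resolved values so downstream components can read them,
    // e.g. a tFileInputPositional whose File name and Pattern fields are
    // set to (String) globalMap.get("current_file") and
    // (String) globalMap.get("current_pattern").
    globalMap.put("current_pattern", pattern);
    globalMap.put("current_file", filePath);
}

Even with that lookup working, though, the component schema itself would still be fixed, which is where I'm stuck and why I'm wondering about a dynamic schema.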
3 Replies
Anonymous
Not applicable

I may be misunderstanding you, but you seem to have 13 files with different schemas, which implies that each file has its own schema and that you do *not* have one file containing two or more schemas.
Could you please clarify that?
thanks,
_AnonymousUser
Specialist III
Author

Hi,
Yes, I have 13 different positional files with 13 different schemas; each file has its own.
I am trying to find a reusable, more dynamic solution, as opposed to 13 separate sets of input/mapping/output components with their schemas in metadata.
Br,
jm
Anonymous
Not applicable

You would need to load these files as full rows (tFileInputFullRow).
If the files have a header line, use it to decide which schema applies, then parse each row accordingly.
I am just thinking aloud, but it is probably doable.
In a similar situation we have used file headers to define the 'file type' and then called a 'child' job to process each one.
The idea you had seems tempting at first, but you will probably find that each file has its own quirks, and these are best dealt with in a separate process, i.e. a job.
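To make that concrete, a small routine could split a full row by a pattern of comma-separated field widths such as "10,5,8" (similar to tFileInputPositional's pattern format, minus its "*" wildcard). Everything here, including the routine and class names, is just a sketch:

package routines;

public class PositionalUtil {

    /**
     * Splits a fixed-width line into trimmed fields.
     * pattern: comma-separated field widths, e.g. "10,5,8".
     */
    public static String[] splitPositional(String line, String pattern) {
        String[] widths = pattern.split(",");
        String[] fields = new String[widths.length];
        int pos = 0;
        for (int i = 0; i < widths.length; i++) {
            int w = Integer.parseInt(widths[i].trim());
            // Guard against short lines: missing trailing fields become empty.
            int end = Math.min(pos + w, line.length());
            fields[i] = (pos < line.length()) ? line.substring(pos, end).trim() : "";
            pos += w;
        }
        return fields;
    }
}

A tJavaRow fed by the full-row input could then call something like PositionalUtil.splitPositional(input_row.line, (String) globalMap.get("current_pattern")) to get the fields; or, as above, the header check can simply decide which child job (tRunJob) handles the file.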
Regards