Hi,
I have a scenario with thousands of CSV files to ingest. They comprise 5 different entities (vheicules, expenses, transportations, etc.). If I use Hive to ingest them into HDFS, I can just put them into 5 different directories and run commands like the ones below. Hive automatically recognizes all the files, and I can then run any selection over them using HiveQL.
CREATE EXTERNAL TABLE entity1 (col1a string, col1b string) LOCATION '/path/to/dir1';
CREATE EXTERNAL TABLE entity2 (col2a string, col2b string)
LOCATION '/path/to/dir2';
Is it possible for QDC to do that? Do we need to create a Job and append it to the same entity, or is there another way to simulate this Hive feature?
Thanks,
Pedro
Hello @pedrobergo, try tweaking your src.file.glob to vheicules*.csv, for example.
Please let me know of any updates.
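For anyone unsure what that pattern will match: src.file.glob uses shell-style wildcard matching, where * matches any run of characters. A minimal sketch of that semantics using Python's fnmatch module (the file names below are made up for illustration):

```python
import fnmatch

# Hypothetical file names for two entities landing in the same source area.
files = [
    "vheicules_2020.csv",
    "vheicules_2021.csv",
    "expenses_2020.csv",
]

# vheicules*.csv selects every CSV whose name starts with "vheicules",
# so all files for that one entity are picked up in a single load.
matched = fnmatch.filter(files, "vheicules*.csv")
print(matched)  # ['vheicules_2020.csv', 'vheicules_2021.csv']
```

So one glob per entity plays the same role as one directory per external table in the Hive setup above.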
Hi master Clever!
It really works.
I put /vheicules/*frota.frota.csv in src.file.glob and it loaded all the files.
Thanks a lot!!
Pedro