Generate new table based on resident

Not applicable

Hi Guys,

I'm sure this is a simple question, but I'm having a mind blank; hopefully someone can help me.

I have an existing table with a description (text) field.  I want to generate a new table that has the UID and the specific words I find in the description field, with one row per word that I'm searching for.

So for example with 1 row:

ID, Text

AA, The quick brown fox jumps over the lazy dog

I would use wildmatch to search for the words of interest; in this case I'm searching for brown, over and lazy.
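
(For reference, wildmatch returns the 1-based index of the first pattern that matches, or 0 if none match, so a single-word test would look something like this:)

wildmatch(Text, '*brown*')   // 1 for the example row
wildmatch(Text, '*purple*')  // 0, no match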

So I want to generate another table, linked on ID, that would now have 3 rows because I found my 3 words:

ID, Wordmatch

AA, brown

AA, over

AA, lazy

Can anyone think of an ingenious way to do this efficiently?

Thanks

Gareth


20 Replies
martinpohl
Partner - Master

Hello,

try this:

File:
LOAD * INLINE [
    ID, Text
    AA, the word is the best
];

Keywords:
load ID,
if(wildmatch(Text, '*word*'), 'word', null()) as Wordmatch  // wildmatch returns 0, not null, when nothing matches
resident File;

load ID,
if(wildmatch(Text, '*best*'), 'best', null()) as Wordmatch
resident File;

(with other keywords, but the result is the same)
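
If you don't want the non-matching rows at all, the same idea works with the test moved into a where clause instead of the if() (a sketch along the same lines, not tested):

Keywords:
load ID, 'word' as Wordmatch
resident File
where wildmatch(Text, '*word*');

load ID, 'best' as Wordmatch
resident File
where wildmatch(Text, '*best*');

// both loads produce the same fields, so they auto-concatenate into Keywords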

Regards

Not applicable
Author

Hi Martin,

Thanks for your suggestion.  That's not a bad idea; you just let them all auto-concatenate into a new Keywords table.

I think this is effective when searching for a few words, but what if my list of words grew much larger? Is there a more efficient way of doing this?

This will definitely work though, so I appreciate the suggestion!

Thanks

Gareth

martinpohl
Partner - Master

Hello Gareth,

then do a loop and define the words in the for each line:

File:
LOAD * INLINE [
    ID, Text
    AA, the word is the best
];
for each Keyword in 'word','best'
Keywords:
load ID,
if(wildmatch(Text, '*$(Keyword)*'), '$(Keyword)', null()) as Wordmatch  // wildmatch returns 0, not null, when nothing matches
resident File;

next;
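
If the word list grows long, it could also be loaded as a table and fed to the loop with FieldValueList, so the keywords don't have to be typed into the script (a sketch, assuming a one-column Words table):

Words:
LOAD * INLINE [
Word
word
best
];

for each Keyword in FieldValueList('Word')
Keywords:
load ID,
'$(Keyword)' as Wordmatch
resident File
where wildmatch(Text, '*$(Keyword)*');
next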

Regards

Not applicable
Author

I've tried your suggestion and I stored my Keyword variable as:

SET Keyword='word;best';

The problem is I only get one row, with Wordmatch as 'word;best'.

I want to have an individual row for each word (like below):

ID, Wordmatch

AA, word

AA, best

Perhaps it is how I have defined my variable for the 'loop'?
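
(I suspect the dollar expansion turned my loop line into something like the following, a single value, so a single iteration and a single row:)

for each Keyword in 'word;best'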

martinpohl
Partner - Master

Using the SET command for this is not a good idea.

List your Keywords like

for each Keyword in 'word', 'best', 'word3', 'word4', ... and so on.

Using a variable, you would also need to configure the quoting (the ' characters) yourself.
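
If the list really has to live in a variable, the quotes can be stored in the variable itself and expanded into the loop (a sketch, untested):

SET vKeywords = 'word','best';

for each Keyword in $(vKeywords)
// same Keywords load as above, using $(Keyword)
next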

Not applicable
Author

Hmm, I tried that just now and still only get one row:

AA 'word','best'

?

martinpohl
Partner - Master

It seems that there is another error in the script.

I get the right result:

[screenshot: Keywords table with one row per matched word]

Not applicable

Hello,

Could this help you?


File:
LOAD * INLINE [
    ID, Text
    AA, the word is the best
    BB, the best word is
];

// no field in common, so this is a cross join:
// every ID is paired with every searched word
join

// Table containing all your searched words
LOAD * INLINE [
SearchedWord
word
best
];

Keywords:
load ID,
if(wildmatch(Text, '*' & SearchedWord & '*'), SearchedWord, null()) as Wordmatch
resident File;

regards,

Beyrem

Not applicable
Author

Hi Beyrem,

This works well, although I get null rows where there is no match.  Ideally I want to create a lookup table that only has IDs where there are matching words, with a single word per row (as in your example).

If I had a long list of words and a really large table of descriptions, this could get unmanageably large.
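
I wonder if moving the test into a where clause would drop the null rows, something like this (untested on my side):

Keywords:
load ID,
SearchedWord as Wordmatch
resident File
where wildmatch(Text, '*' & SearchedWord & '*');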

Thanks

Gareth