Anonymous
Not applicable

Talend migration issue: Java heap space

Hi,
We are migrating and upgrading from Talend 6.2 to 7.1.
For this, we have installed TAC 7.1 + JobServer + CommandLine + Nexus + MySQL 5.7 on a new test VM server.
Next, we imported the old database into the new MySQL server.
We have increased the memory in setenv.sh:

export JAVA_OPTS="$JAVA_OPTS -Xmx10240m -Dfile.encoding=UTF-8"

NB: we are using Java 8.
Then we proceeded with the migration process on the db config page.
We get the error message:

Start Export Db.load
database schema migration failed.
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:596)
at java.lang.StringBuffer.append(StringBuffer.java:367)
at org.hibernate.type.TextType.get(TextType.java:41)
at org.hibernate.type.NullableType.nullSafeGet(NullableType.java:113)
at org.hibernate.type.NullableType.nullSafeGet(NullableType.java:102)
at org.hibernate.type.AbstractType.hydrate(AbstractType.java:81)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2031)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1371)
at org.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:1299)
at org.hibernate.loader.Loader.getRow(Loader.java:1197)
at org.hibernate.loader.Loader.getRowFromResultSet(Loader.java:568)
at org.hibernate.loader.Loader.doQuery(Loader.java:689)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:224)
at org.hibernate.loader.Loader.doList(Loader.java:2144)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2028)
at org.hibernate.loader.Loader.list(Loader.java:2023)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:393)
at org.hibernate.hql.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:338)
at org.hibernate.engine.query.HQLQueryPlan.performList(HQLQueryPlan.java:172)
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1121)
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:79)
at org.eclipse.emf.teneo.hibernate.resource.HibernateResource.loadUsingTopClasses(HibernateResource.java:334)
at org.eclipse.emf.teneo.hibernate.resource.HibernateResource.loadFromStore(HibernateResource.java:322)
at org.eclipse.emf.teneo.hibernate.resource.HibernateResource.loadResource(HibernateResource.java:272)
at org.eclipse.emf.teneo.resource.StoreResource.load(StoreResource.java:277)
at org.talend.teneo.model.TalendDatastore.exportDataStore(TalendDatastore.java:357)
at org.talend.migration.TalendMigrationApplication.call(TalendMigrationApplication.java:260)
at org.talend.migration.TalendMigrationApplication.call(TalendMigrationApplication.java:56)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)

We have tried to adjust CATALINA_OPTS with these parameters in the setenv.sh file.

export CATALINA_OPTS="$CATALINA_OPTS -Xms512m"
export CATALINA_OPTS="$CATALINA_OPTS -Xmx8192m"
export CATALINA_OPTS="$CATALINA_OPTS -XX:MaxPermSize=256m"

but the same error persists.
Any ideas on how to get around the out-of-memory issue?
Thanks for your help
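
(For reference, a consolidated setenv.sh might look like the sketch below. Tomcat's catalina.sh picks up both JAVA_OPTS and CATALINA_OPTS from this file, and on Java 8 the -XX:MaxPermSize option is ignored because PermGen was removed. The heap sizes here are only placeholders.)

# $CATALINA_BASE/bin/setenv.sh -- sketch only, sizes are placeholders
# JAVA_OPTS is applied to every Tomcat command (start, stop, ...):
export JAVA_OPTS="$JAVA_OPTS -Dfile.encoding=UTF-8"
# CATALINA_OPTS is applied only when the server JVM starts, so heap settings fit here:
export CATALINA_OPTS="$CATALINA_OPTS -Xms1g -Xmx10g"
# -XX:MaxPermSize can simply be dropped on Java 8 (PermGen no longer exists)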

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

This is a bug in the migration routine.

The actual bug is that the MySQL table data is read without using streaming mode. Without streaming, the driver caches all the records from the query and thus accumulates a lot of memory.

You can only try to reduce the data. The table taskexecutionhistory is usually the largest table in the TAC database, and you could consider deleting entries older than a particular date.

The other way is to increase the memory as much as possible.
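
(If you go the data-reduction route, a rough sketch with the mysql command-line client is shown below. The database name, user, and especially the date column of taskexecutionhistory are assumptions here -- check the real column name in your TAC schema, and take a backup before deleting anything.)

# back up the TAC database first (db/user names are placeholders)
mysqldump -u tac_user -p tac_db > tac_db_backup.sql

# prune old execution history; the column name "executionDate" is an assumption,
# verify it against the actual taskexecutionhistory table before running this
mysql -u tac_user -p tac_db -e "DELETE FROM taskexecutionhistory WHERE executionDate < '2018-01-01'"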


5 Replies
fdenis
Master

You have to focus on "database schema migration failed."

What did you try to migrate? Which db, which schema?

The array is too small for your data, so can you split your migration by project or ...?

Good luck
Anonymous
Not applicable
Author


@fdenis wrote:
You have to focus on "database schema migration failed."

What did you try to migrate? Which db, which schema?

The array is too small for your data, so can you split your migration by project or ...?

Good luck

Hi Denis,

Thanks for your assistance,

We aim to migrate from Talend version 6.2 to Talend version 7.1.

We use the migration tool of Talend TAC to migrate the MySQL TAC db (metadata).

I think that I can't split the migration.

I tried to increase the -Xmx memory parameter in setenv.sh (the Apache Tomcat parameter file), but in vain.

Any suggestions?

Thanks

 

Anonymous
Not applicable
Author

This is a bug in the migration routine.

The actual bug is that the MySQL table data is read without using streaming mode. Without streaming, the driver caches all the records from the query and thus accumulates a lot of memory.

You can only try to reduce the data. The table taskexecutionhistory is usually the largest table in the TAC database, and you could consider deleting entries older than a particular date.

The other way is to increase the memory as much as possible.

Anonymous
Not applicable
Author


@lli wrote:

This is a bug in the migration routine.

The actual bug is that the MySQL table data is read without using streaming mode. Without streaming, the driver caches all the records from the query and thus accumulates a lot of memory.

You can only try to reduce the data. The table taskexecutionhistory is usually the largest table in the TAC database, and you could consider deleting entries older than a particular date.

The other way is to increase the memory as much as possible.


Hi Jlolling,

Please correct me: to increase memory, we should update the -Xmx value in setenv.sh (the Tomcat parameter file)?

export JAVA_OPTS="$JAVA_OPTS -Xmx20g -Dfile.encoding=UTF-8"
CATALINA_OPTS="-Dbtm.root=$CATALINA_HOME -Dbitronix.tm.configuration=$CATALINA_HOME/conf/btm-config.properties -Djbpm.tsr.jndi.lookup=java:comp/env/TransactionSynchronizationRegistry -Djava.security.auth.login.config=$CATALINA_HOME/webapps/kie-drools-wb/WEB-INF/classes/login.config -Dorg.kie.demo=false -Dorg.uberfire.nio.git.daemon.port=9419 -Dorg.uberfire.nio.git.ssh.enabled=false"
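
(One way to check whether the new -Xmx value actually reaches the running Tomcat/TAC JVM, assuming the JDK's jcmd tool is available on the server:)

# find the Tomcat process id
ps -ef | grep org.apache.catalina.startup.Bootstrap

# print the effective JVM flags; MaxHeapSize shows the real heap limit in bytes
jcmd <pid> VM.flags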

Thanks for your support

Anonymous
Not applicable
Author

Thanks a lot for your help.

I increased the memory and deleted entries from the taskexecutionhistory table, and it works.

All my respect, JLolling.