Qlik is providing these mitigation steps as a temporary measure. A patch will be provided and linked here; customers are advised to move to the patch as soon as it is available.
REM Attunity Compose Java Server configuration/run script
REM e.g. AT_PROD = C:\Program Files\Attunity\Compose\java_server
for %%A in ("%~dp0..") do set AT_PROD=%%~fA
REM list plugins here
SET AT_PLUGIN_LIST=-plugins compose_ctl
REM set data directory based on the name of this script
set AT_DATA_SUFFIX=
for /F "tokens=2 delims=_" %%A in ("%~n0") do set AT_DATA_SUFFIX=%%A
if "%AT_DATA_SUFFIX%" == "" (
set AT_DATA=
) else (
set AT_DATA=-d data_%AT_DATA_SUFFIX%
)
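REM enable remote JVM debugging on 127.0.0.1:5005 when COMPOSE_JAVA_SERVER_DEBUG is true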
if "%COMPOSE_JAVA_SERVER_DEBUG%" == "true" (
set JVM_REMOTE_DEBUG_ARGUMENTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=127.0.0.1:5005
) else (
set JVM_REMOTE_DEBUG_ARGUMENTS=
)
SET AT_JAVA=%AT_PROD%\lib\jre\bin\java.exe
SET AT_EXTERNAL=%AT_PROD%\external
SET AT_LIB=%AT_PROD%\lib
SET AT_PLUGINS=%AT_PROD%\plugins
SET AT_MAIN=com.attunity.infrastructure.server.PluginServer
SET AT_EXTERNAL_JDBC_PATH=%AT_PROD%\jdbc
SET AT_APP_NAME=-DQlikApp=ComposeJavaServer
REM Fix: add the environment variable on the next line to disable Log4j message lookups
SET LOG4J_FORMAT_MSG_NO_LOOKUPS=TRUE
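REM launch the Compose Java server with the JDBC, plugin, external, and lib jar directories on the classpath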
"%AT_JAVA%" %AT_APP_NAME% %JVM_REMOTE_DEBUG_ARGUMENTS% -cp "%AT_EXTERNAL_JDBC_PATH%"/*;"%AT_PLUGINS%"/*;"%AT_EXTERNAL%"/*;"%AT_LIB%"/* %AT_MAIN% %AT_DATA% %AT_PLUGIN_LIST% %*
$ sc stop AttunityComposeForDataLakes
$ cd <installation-root>\Compose\java\external
$ move log4j-core-<version#>.jar ..\log4j-core-<version#>.jar-vulnerable
Download log4j-core-nolookup-<version#>.jar from this page and place it in the same location as the vulnerable jar.
$ sc start AttunityComposeForDataLakes
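To confirm the swap, you can list the log4j-core jars under the Compose java directory; only the nolookup jar should remain in the external folder, with the renamed *-vulnerable file one level up. This check assumes the default installation layout used in the steps above:

$ cd <installation-root>\Compose\java
$ dir /s /b log4j-core*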
Note that if you use a customized Compose for Data Lakes start script, you should make the equivalent edit in that script, as shown below.
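For reference, the essential change is a single line that sets the environment variable before the java.exe invocation; the variable names around it may differ in a customized script:

REM add immediately before the "%AT_JAVA%" launch line (or its equivalent in your script)
SET LOG4J_FORMAT_MSG_NO_LOOKUPS=TRUE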
Customers running Compose for Data Lakes Spark projects with a remote Compose agent on their Hadoop cluster should follow the mitigation for the Windows service, as detailed above. In addition, they will need to apply a separate mitigation for the remote Compose agent running on Hadoop. The certification and details for that mitigation are not yet ready and will be published in the coming days. Monitor the support blog or knowledge base in the Qlik Community for an updated version of this document.
For more information on the Log4j vulnerability, please visit the Support Updates Blog post.