Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
We're happy to help! Here's a breakdown of resources for each type of need.
Support | Professional Services (*)
Reactively fixes technical issues as well as answers narrowly defined specific questions. Handles administrative issues to keep the product up-to-date and functioning. | Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
(*) Reach out to your Account Manager or Customer Success Manager.
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar:
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about Qlik products and solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free monthly webinar series that facilitates knowledge sharing.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
Get the full value of the community.
Register a Qlik ID:
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
Log in to manage and track your active cases in the Case Portal.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical to daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
If you require a support case escalation, you have two options:
When other Support Channels are down for maintenance, please contact us via phone for high severity production-down concerns.
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
When deploying a Data Service Job containing a tRest component to Talend Runtime from TMC, the OSGi service reports a LinkageError: loader constraint violation for class when selecting overriding method javax.ws.rs.core.Response for tRest.
The Job design appears as follows:
tRestRequest --> xxx --> tRest --> xxxx --> tRestResponse
The detailed error messages in the log:
2024-11-07T14:09:40,762 | ERROR | features-2-thread-1 | container.BlueprintContainerImpl 460 | 85 - org.apache.aries.blueprint.core - 1.10.3 | Unable to start container for blueprint bundle talenddev.Job_P2PO_ESB_SuperviserDetailsAdhoc_v5/0.1.10
org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to instantiate components
at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:741) ~[?:?]
Caused by: java.lang.LinkageError: loader constraint violation for class talenddev.job_p2po_esb_superviserdetailsadhoc_v5_0_1.Job_P2PO_ESB_SuperviserDetailsAdhoc_v5$ExceptionMapper4TalendJobRestService: when selecting overriding method 'javax.ws.rs.core.Response talenddev.job_p2po_esb_superviserdetailsadhoc_v5_0_1.Job_P2PO_ESB_SuperviserDetailsAdhoc_v5$ExceptionMapper4TalendJobRestService.toResponse(javax.ws.rs.WebApplicationException)' the class loader org.eclipse.osgi.internal.loader.EquinoxClassLoader @284cb81 of the selected method's type
According to the tRest component documentation, Talend Runtime does not support this component by design.
To build Jobs that need to be deployed to Talend Runtime, it is recommended to use the tRESTClient component, which is best suited for Talend Runtime, even though the tRest component worked prior to v801-R2024-05.
For more information about the tRESTClient component, see tRESTClient.
With the introduction of third-party ODBC connectors for Direct Access Gateway, the stability of the Data Gateway can be impacted if no Data Preview limit is added during the configuration step.
The section to verify under each connection would be "SELECT statement template for Data Preview".
Each driver has a specific syntax. Qlik provides some examples of connection strings in Sample connection strings and syntax.
Verify that all the data connections have a limit configured in the data preview for ODBC Generic driver connections under "SELECT statement template for Data Preview" as documented in Database specific properties.
A PowerShell sample is attached to this article to extract these details. Running the script will generate a CSV file with a column named rowsLimitKeyword_Custom. Review each syntax to confirm it contains a row limit (the keyword varies by data source; common keywords are LIMIT, TOP, etc.).
API endpoint https://qlik.dev/apis/rest/data-connections/
https://<tenant>/api/v1/data-connections?noDatafiles=true&filter=datasourceID eq "DG_GenericDriver"
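As a hedged illustration (separate from the attached PowerShell sample), the Python sketch below calls the data-connections endpoint above and flags Generic ODBC connections whose Data Preview SELECT template appears to lack a row-limit keyword. The tenant URL, API key, and the previewTemplate property name are assumptions; inspect the actual payload returned by your tenant and adjust accordingly.

import requests

TENANT = "https://your-tenant.us.qlikcloud.com"  # assumption: your tenant URL
API_KEY = "<API key with access to data connections>"  # assumption

# Keywords that typically indicate a row limit in a Data Preview SELECT template
LIMIT_KEYWORDS = ("LIMIT", "TOP", "FETCH FIRST", "ROWNUM", "SAMPLE")

def list_generic_odbc_connections():
    """List Direct Access Gateway Generic ODBC connections via the REST API."""
    url = f"{TENANT}/api/v1/data-connections"
    params = {"noDatafiles": "true", "filter": 'datasourceID eq "DG_GenericDriver"'}
    headers = {"Authorization": f"Bearer {API_KEY}"}
    resp = requests.get(url, params=params, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])

def missing_preview_limit(connection):
    """Heuristic check: does the preview SELECT template contain a limit keyword?
    'previewTemplate' is a placeholder key; inspect your tenant's actual payload."""
    template = str(connection.get("previewTemplate", "")).upper()
    return not any(keyword in template for keyword in LIMIT_KEYWORDS)

for conn in list_generic_odbc_connections():
    if missing_preview_limit(conn):
        print(f"Check connection: {conn.get('qName', conn.get('id'))}")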
In general, we suggest using a separate server for development activities; this is highly recommended for customers using third-party drivers, as those can cause abnormal or unexpected behavior outside of our control.
Long-running commands can cause a bottleneck in the Data Gateway. An improvement is planned for Direct Access Gateway 1.7.0, but we recommend verifying the configuration of data connections regardless.
The DirectAccess log will show the duration of commands, for example:
"CommandDurationMs":340734.958,"Method":"post","Url":"http://localhost:5050/metadata/proxyCommand"
QB-28905
The Qlik Sense Engine allows for a Hard Max Limit to be set on memory consumption. This setting requires that the Operating System is configured to support this, as described in the SetProcessWorkingSetSizeEx documentation (QUOTA_LIMITS_HARDWS_MAX_ENABLE parameter).
Before using the Hard Max Limit, familiarize yourself with Microsoft's memory management:
Source: learn.microsoft.com
By default, using the SetProcessWorkingSetSize function to set an application's minimum and maximum working set sizes does not guarantee that the requested memory will be reserved, or that it will remain resident at all times. When an application is idle, or a low-memory situation causes a demand for memory, the operating system can reduce the application's working set below its minimum working set limit. If memory is abundant, the system might allow an application to exceed its maximum working set limit.
The QUOTA_LIMITS_HARDWS_MIN_ENABLE and QUOTA_LIMITS_HARDWS_MAX_ENABLE flags enable you to ensure that limits are enforced.
When you increase the working set size of an application, you are taking away physical memory from the rest of the system. This can degrade the performance of other applications and the system as a whole. It can also lead to failures of operations that require physical memory to be present (for example, creating processes, threads, and kernel pool). Thus, you must use the SetProcessWorkingSetSize function carefully. You must always consider the performance of the whole system when you are designing an application.
After enabling QUOTA_LIMITS_HARDWS_MAX_ENABLE as per Microsoft's guidelines:
See Editing an engine - Qlik Sense for administrators for details.
To note:
Even with the hard limit set, it may still be possible for the host operating system to report memory spikes above the Max memory usage (%).
This is due to how the Qlik Sense Engine memory limit is defined based on the total memory available.
Example:
The working set memory limit is not a hard cap set on the engine itself. It is a setting that determines how much memory the engine allocates and how far it is allowed to go before it starts alerting on a working set beyond the configured parameters.
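As a hypothetical illustration (the figures are assumptions, not taken from a specific deployment): on a server with 256 GB of RAM and Max memory usage (%) set to 90, the engine's working set target works out to roughly 0.90 × 256 GB ≈ 230 GB, yet the operating system may still momentarily report the Engine process above that figure, because the setting governs allocation and alerting rather than acting as an absolute ceiling enforced at every instant.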
QLIK-96872
A Talend REST Job sends a request to a secure REST web service through the Jersey client and was running fine in an older version. When running the same Job in a newer Talend version, it fails with the error: Exception in component
tREST_1 javax.ws.rs.ProcessingException: Already connected at org.glassfish.jersey.client.ClientRuntime.invoke(ClientRuntime.java:312)
It is recommended to use the Talend tHTTPClient component as a replacement for the tRest/tRESTClient components in the Job design.
The tHTTPClient component is available only when you have installed the 8.0.1-R2023-05 Talend Studio monthly update or a later one delivered by Talend. For more information, check with your administrator.
The root cause is a known bug in the Jersey client, listed below, for which there is no robust fix:
https://github.com/eclipse-ee4j/jersey/issues/3000
https://github.com/eclipse-ee4j/jersey/issues/3001
javax-ws-rs-processingexception-while-sending-request-through-jersey
When changing the name of a dataset, the source name still stays the same. This can be seen by uploading a "firstname.qvd" and renaming it to "secondname.qvd".
The dataset's detail will show "firstname.qvd" as the source.
As a consequence:
Trying to load “FROM [lib://DataFiles/secondname.qvd]” will produce a "(Connector error: File not found)" error.
This is not a defect, it's how the product is designed.
A dataset represents a data resource with its properties, such as name. The value of this is that you can use more user-friendly dataset names without having to change the source names, which can be useful when the dataset points to a database table, for instance.
In the future, there is a plan to add the possibility of calling the name of the datasets in the script.
To help Qlik customers manage costs more effectively, Qlik has developed the Qlik Snowflake Monitoring application, designed to provide invaluable insights about your Snowflake costs, usage, inventory, security, performance and contract utilization. This app utilizes Qlik's Associative Engine to connect directly to your Snowflake instance and reveal insights from Snowflake's detailed metadata, offering valuable information that traditional query-based tools and Snowflake's own reports are unable to provide.
Leveraging Qlik Application Automation and Data Alerts, you can:
*Minor configuration is required on first run to create the required data connections.
Content:
This automation template is a fully guided installer/updater for the Qlik Snowflake Monitor. Leverage this automation template to easily install and update this application. The application itself is community-supported; it is provided through Qlik's Open-Source Software GitHub and is therefore subject to Qlik's open-source guidelines and policies.
For more information, refer to the GitHub Repository.
If the monitoring app was installed manually (i.e. not through the application automation installer), the app will not be detected as existing, and the automation will install new copies side by side. Any subsequent executions of the automation will detect the newly installed monitoring application and check its version. This is because the application is tagged with 'QCS - QSM - {App Name}' and 'QCS - QSM - {Version}' during installation through the automation. Manually installed applications will not have these tags and therefore will not be detected.
The Qlik Snowflake Monitor requires two connections, one to your Snowflake instance to feed the data for your analytics, and one REST connection to the qlik-oss repository to run a version check on the monitor.
You will need to create a custom User, Role, and Warehouse on your Snowflake tenant. This ensures the user and role can see the monitoring details and can themselves be monitored.
For Authentication, this setup is defaulted to username & password.
Finally, you need to name the connection as follows:
If you wish to use an alternative authentication method, please follow the documentation accordingly on both Snowflake & Qlik.
The REST connection is used to fetch version details from the GitHub repository. On reload, it looks for the latest released version on GitHub and checks it against the version you have installed. You can later use this in 'Part Three' to create an alert when updates to the application are available. To create a REST connection, the following information is required:
Once these two connections have been set up, you can reload the application. The application has been created to accommodate Snowflake tenants of all sizes. If you have a small tenant, you will find the initial run of the load script can take around 30 minutes, and for larger tenants this can be over an hour or two. Subsequent runs will utilize cached QVDs that update daily to reduce reload times each subsequent day.
If a new release of the application is made, occasionally a full reload of data is required, but generally, if the data schema is unchanged the existing QVDs will be maintained. This is through the use of versions in the names of the QVDs used to store the data.
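To illustrate the version-check logic described above, here is a hedged Python sketch; the repository path and installed-version value are placeholders (assumptions), so substitute the actual qlik-oss repository and the version of your installed app.

import requests

# Hypothetical repository path; substitute the actual qlik-oss repository used by the app.
REPO = "qlik-oss/qlik-snowflake-monitor"
INSTALLED_VERSION = "1.2.0"  # example value; in practice, read this from the installed app

def latest_release_tag(repo):
    """Fetch the tag of the latest published GitHub release for a repository."""
    resp = requests.get(f"https://api.github.com/repos/{repo}/releases/latest", timeout=30)
    resp.raise_for_status()
    return resp.json()["tag_name"]

latest = latest_release_tag(REPO)
if latest.lstrip("v") != INSTALLED_VERSION.lstrip("v"):
    print(f"Update available: installed {INSTALLED_VERSION}, latest {latest}")
else:
    print("The monitoring app is up to date.")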
The application has the following two variables:
To create a new Data Alert for updates to the monitoring app, follow these steps:
The Qlik Snowflake Monitor can be easily installed by following the steps above. If you wish to find out more, check out this Ometis blog post and this Ometis Webinar to get a run-through of the analytics this application can offer.
If you face any issues, please raise an issue through the GitHub repository.
The AS/400 supports a file concept known as multiple-member files, in which one file (or table) can possess several different members. Each member is part of the same file or table and shares the same schema, but the members are uniquely named and have unique data. This is similar to partitioned tables in other RDBMSs, e.g. Oracle, MySQL, etc.
ODBC and OLE DB have no built-in mechanism for accessing multiple members. By default, ODBC always accesses the first member in a multiple-member file. This is a known limitation in the Replicate User Guide: Multi-member (Partitioned) tables are not supported.
To enable ODBC-based applications such as Data Transformation Services (DTS) to access multiple-member files, you need to use the AS/400's SQL ALIAS statement. The ALIAS statement lets you create an alias for each member you need to access. ALIAS is used to access all members in the Qlik Replicate full load stage. For more samples, see Creating and using ALIAS names.
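As a hedged illustration only (not part of this article's attached procedure), the Python sketch below uses pyodbc with the IBM i Access ODBC driver to create an alias for one member and query it. The host, library, file, and member names are placeholders, and the driver name may differ in your environment.

import pyodbc

# Placeholder connection details for the IBM i Access ODBC driver; adjust for your system.
conn = pyodbc.connect(
    "DRIVER={IBM i Access ODBC Driver};SYSTEM=as400.example.com;UID=myuser;PWD=mypassword"
)
cursor = conn.cursor()

# Create an SQL alias that points at one specific member of a multiple-member file.
# Syntax: CREATE ALIAS <library>.<alias> FOR <library>.<file> (<member>)
cursor.execute("CREATE ALIAS MYLIB.ORDERS_M1 FOR MYLIB.ORDERS (MEMBER1)")

# The alias can now be queried like an ordinary table, for example as a full-load source.
for row in cursor.execute("SELECT * FROM MYLIB.ORDERS_M1"):
    print(row)

# Drop the alias when it is no longer needed.
cursor.execute("DROP ALIAS MYLIB.ORDERS_M1")
conn.commit()
conn.close()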
For CDC, all member data changes are recorded in the DB400 journal file, so Qlik Replicate can capture changes for all members. No special settings are needed in the Qlik Replicate CDC stage.
This article describes how to overcome the limitation and set up a Replicate task to replicate multi-member tables.
Support Case: 00049024
This article is specific to Qlik Sense Enterprise on Windows.
Enable Audit Logging:
For more information on audit logging itself, see: How to enable Audit Logging in Qlik Sense Enterprise on Windows.
To read who has deleted an app:
Qlik Sense Enterprise on Windows
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution below may not be provided by Qlik Support.
Beginning November 1, 2022, Qlik is enforcing rate limits to API requests on Qlik Cloud REST APIs. This means any API request originating to a REST API endpoint on Qlik Cloud is subject to rejection if the number of requests to an endpoint exceeds the allowed amount in a specified duration on that endpoint.
API rate limiting is a mechanism for ensuring API and platform performance and scalability. It protects the platform from being overwhelmed by requests made to API endpoints by throttling the number of requests an endpoint will accept before blocking or rejecting more requests from a client.
All REST endpoints in Qlik Cloud have a rate limit tier assignment. Any requests made from the Qlik Sense REST connector, Qlik Application Automation, qlik-cli, any REST client like Postman, or a custom application you create are subject to rate limiting.
Limits are enforced per tier, per user, per tenant. When a rate limit is reached, all endpoints in the same tier are blocked until the retry-after time expires. When you exceed a rate limit, your application receives an HTTP 429 (Too Many Requests) status code response.
Depending on the language, client, and code you've written to interact with Qlik's APIs, you need to adapt it to handle rate limits based upon the APIs you're using. One way to do this is to add code that handles the 429 response by reading the `retry-after` response header and adding a function that throttles your application to wait until the retry period has elapsed, as in the sketch below.
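A minimal sketch of that approach in Python, assuming a bearer-token API key and using the Items endpoint purely as an example; it is illustrative only, not an official client.

import time
import requests

TENANT = "https://your-tenant.us.qlikcloud.com"  # assumption: your tenant URL
API_KEY = "<API key>"                            # assumption

def get_with_rate_limit_retry(path, max_retries=5):
    """GET a Qlik Cloud REST endpoint, backing off when HTTP 429 is returned."""
    url = f"{TENANT}{path}"
    headers = {"Authorization": f"Bearer {API_KEY}"}
    for _ in range(max_retries):
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code != 429:
            return resp
        # Wait for the period indicated by the Retry-After header (in seconds).
        wait_seconds = int(resp.headers.get("Retry-After", "1"))
        time.sleep(wait_seconds)
    raise RuntimeError(f"Still rate limited after {max_retries} attempts: {url}")

# Example usage against an illustrative endpoint.
response = get_with_rate_limit_retry("/api/v1/items")
print(response.status_code)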
Initially, rate limits are going to be enforced on Qlik Cloud REST API endpoints only. It is our intention to add external rate limits for other types of traffic including but not limited to websocket connections in a future release.
Here are the enforcement tiers for the number of requests allowed on an endpoint based on its assigned tiers.
Tier | Limit | Description
Tier 1 | 600+ per minute | Supports majority of requests for data consumption with generous bursting.
Tier 2 | 60+ per minute | Create, update, and delete resource endpoints with occasional bursts of more requests.
Special | Varies | Rate limiting conditions are unique for methods with this tier. Consult the method's documentation to better understand its rate limiting conditions.
We identified these tiers after observing API requests and rate limiting decisions from the beginning of 2022. Additional consideration has been paid to endpoints with heavy usage to make sure the services the APIs call scale to support the anticipated request volume.
API rate limiting is a mechanism for protecting your experience using the Qlik Cloud platform. Here are some reasons why we’re beginning to enforce rate limits on tenants:
Information about Qlik Cloud API rate limits is visible on qlik.dev beginning today, October 11, 2022. In the API reference section for Manage APIs, you can identify the rate limit tier for endpoints you use. Any special tier endpoints will indicate the specific API rate limit on the APIs reference page.
We released a new API Policy for working with Qlik Cloud’s APIs. Please review this page so you can ensure the end users of your solutions receive a pleasant experience interacting with Qlik Cloud.
Qlik Cloud platform features and APIs rely on rate limits to help provide a predictably pleasant experience for users.
The details of how and when rate limiting works differ between features and are not based on customer subscriptions. All customers are subject to the same API rate limit tiers, depending on the API endpoint. See Rate limiting for details.
Examples:
Deploying a Job in TAC fails; the detailed error messages are shown below:
The following configuration has been added to impose constraints on Jobs in order to prevent Denial of Service attacks. High default values have been established, and it is recommended to adjust them according to your specific environment.
Add the following parameters in setenv.sh file([TAC]/apache-tomcat/bin/setenv.sh):
set "JAVA_OPTS=%JAVA_OPTS% -Xmx4096m -Dfile.encoding=UTF-8 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIPPED_ENTRIES=200000 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_SIZE=194679051264 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIP_NAME_LENGTH=3096 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_FOLDER_NAME_LENGTH=1024 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_FILE_NAME_LENGTH=1024 -Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIP_DEPTH=2048"
On Windows, add the following parameters to the Windows Service under "Java Options":
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIPPED_ENTRIES=200000
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_SIZE=194679051264
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIP_NAME_LENGTH=3096
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_FOLDER_NAME_LENGTH=1024
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_UNZIPPED_FILE_NAME_LENGTH=1024
-Dorg.talend.remote.jobserver.commons.config.JobServerConfiguration.MAX_ZIP_DEPTH=2048
To embed a chart object in your web application, add the following to the body of the HTML file. The example sets the ui attribute to analytics/chart and the object-id attribute to a specific visualization, indicating to qlik-embed that you want to embed that visualization from a Qlik Sense application.
[ Fig.1 ]
<qlik-embed
ui="analytics/chart"
app-id="<APP_ID>"
object-id="<CHART_ID>"
></qlik-embed>
Using " ui= analytics/chart"
it should correctly render, for instance, a single KPI object, as per the documentation below : https://qlik.dev/extend/extend-qlik-visualizations/supported-charts/
The following error can occur when trying to render the KPI in your web application:
Failed to load visualization: 'kpi'
This happens because the page's character set is not supported; the only supported character set is "utf-8".
Add the following to the head element of your HTML file:
<meta charset="utf-8">
For usage where the encoding cannot be controlled by the user (such as embedding in a tool, e.g. Microsoft Excel) and this error arises, embed with the "classic/chart" type and the option "iframe=true".
Qlik Cloud
When you use the tHttpClient/tRest/tRestClient component to pull Bloomberg numbers (a URL ending with a .csv extension) over the API, the returned data contains garbled characters.
Use the tFileFetch component to first download the online application/csv file into a local temporary folder, and then combine it with tFileInputDelimited to read the data.
The job design should be: tFileFetch-->onSubjobok-->tFileInputDelimited-->Further processing
Because the target source URL ends with .csv, it should be treated as a download operation for the application/csv type.
For more information about how to retrieve a file from an HTTP website and read data from the fetched file, please refer to this documentation
Relevant resources:
How to Access My Qlik Portal
Managing Your Subscription in My Qlik
Please do the following before requesting the change:
Areas that are particularly sensitive when you remove a user:
You updated the context parameters and republished the Studio artifact to TMC in order to update your Job's context parameters; however, the context parameters are not updated after re-running the task. What is the problem?
To have a TMC task following the context changes of a Studio job, the parameter "Override parameter values with artifact defaults" must be enabled in the Task Configuration.
The reason this is not enabled by default is that task context parameters can be updated directly in TMC, without interacting with Studio, by editing the task as shown below:
When a task is created for the first time, it copies all the values from the artifact. These task values then keep the same value and are not overwritten or replaced by new values in the artifact.
For more information about how to update the new promoted version, please refer to this documentation
updating-job-tasks-with-latest-artifact-version
Qlik Replicate tasks using Oracle as a Source Endpoint fail after installing the Oracle July 2024 patch.
All Qlik Replicate versions older than the 2024.5 SP03 release are affected.
Upgrade to Qlik Replicate 2023.11 SP05, or 2024.5 SP03 or later.
Download the formal builds for 2023.11 and 2024.5 here:
Qlik Replicate 2023.11 SP05: https://files.qlik.com/url/qr2023110860sp05 (expires 1/31/2025)
Qlik Replicate 2024.5 SP03 link: https://files.qlik.com/url/qr2024050563sp03 (expires 1/31/2025)
If you have Qlik Enterprise Manager deployed, upgrade this as well. See Qlik Enterprise Manager fails adding a table to a task with SYS-E-HTTPFAIL, no rest handler for url for download links.
The Oracle July 2024 patch introduced a change to redo events. Qlik has since provided a fix for Qlik Replicate which parses the redo log correctly.
RECOB-8698
Oracle Database 19c Release Update July 2024 Known Issues
As a general reminder, all changes to the environment such as operating system patches, endpoint and driver patches, etc. should be tested in lower environments before promoting to production.
You can manage your Qlik Sense Business subscription in My Qlik, including renewing your subscription, reducing the number of paid seats in your tenant, and cancelling your subscription.
If you are looking to manage your Enterprise SaaS subscription, please contact your account manager or your local Qlik Sales team.
Table of Contents
You can manage your Payment and Billing information for your subscription.
You can increase and reduce the number of users in your Qlik Sense Business subscription using My Qlik. You can also choose to undo the reduction in users any time before the new term begins.
NOTE: The subscription is not reduced to the new number of seats until the beginning of the next subscription term. You will continue to have the same number of seats available until the end of your current subscription term.
Follow the onscreen instructions.
If you decide that you do not want to reduce the number of seats, you can undo the request any time before the new term.
You can cancel your Qlik Sense Business subscription at any time using My Qlik. You can also choose to undo the cancellation any time before the end of the term.
NOTE: The cancellation becomes effective at the end of your subscription term. You will continue to have access to your subscription until the end of the term.
Follow the on-screen instructions.
The subscription tile updates to reflect the cancellation request and displays the date on which the cancellation becomes effective.
If you decide that you do not want to cancel the subscription, you can undo the request any time before the end of the term.
This article describes an example of how to disable custom connectors for a particular user group for Qlik Sense.
The example is provided as is. Further customisation or adaptation to a specific use case scenario can be obtained by engaging Qlik's Professional Services.
Name: DisableCustomConnectors
Description: This custom rule disables Custom Connectors for user group "Sydney"
Resource filter: DataConnection_*
Action: Create
Condition: ((resource.type!="folder" and resource.type!="Custom") and (user.group="Sydney"))
Context: Only in hub

Name: DataConnectionForNormalUsers
Description: This custom rule is for Data Connections except user group "Sydney". Custom Connection creation enabled.
Resource filter: DataConnection_*
Action: Create
Condition: ((resource.type!="folder" and user.group!="Sydney"))
Context: Only in hub
Upgrading Qlik NPrinting fails with:
Conflicting -boot options
An identical error is thrown when uninstalling Qlik NPrinting, even when following the How to uninstall Qlik NPrinting instructions.
The failure occurs during the Messaging service stop and start process.
To resolve the issue with the Messaging service stop and start process, create a variable that sets the correct configuration for the Qlik NPrinting server and allows the problematic initialization code blocking the upgrade to be bypassed: