This space offers a variety of blogs, all written by Qlik employees, covering both product and non-product topics.
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
This blog was created for professors and students using Qlik within academia.
Hear it from your Community Managers! The Community News blog provides updates about the Qlik Community Platform and other news and important announcements.
The Qlik Digest is your essential monthly low-down of the need-to-know product updates, events, and resources from Qlik.
The Qlik Education blog provides information about the latest updates of our courses and programs with the Qlik Education team.
By analyzing data through the application, it is possible to identify consumption patterns that can help optimize resource allocation and reduce waste. The application can assist in evaluating the energy efficiency of business operations. Metrics and reports generated allow for identifying areas where improvements can be implemented to reduce costs and environmental impact.
By identifying and optimizing areas of energy and water consumption, the application can contribute to more efficient resource management, reducing operational costs associated with supplies. The application provides detailed data and reports that support informed decision-making regarding resource management.
Executives and operations managers can use the application to optimize resource usage, reduce operational costs, and enhance overall operational efficiency.
This application is crucial for a company's business as it enables daily monitoring and optimization of resource consumption, leading to cost reduction and improved sustainability practices.
I am pleased to introduce our new Educator Ambassador for 2024, Sumitra Pundlik from Marathwada Mitra Mandal College of Engineering, in the western Indian city of Pune. Sumitra has been an educator for the last 15 years and is currently working as an Assistant Professor at Marathwada Mitra Mandal College of Engineering.
Sumitra teaches various subjects related to Data Analytics, such as DBMS, Advanced DBMS, Machine Learning, and Big Data Analytics. She has also taught R and Python programming, as well as various Business Intelligence and Data Analytics technologies, including Qlik Sense.
Sumitra has been a member of the Qlik Academic Program for the last three years and has successfully earned the Qlik Sense Business Analyst Qualification. After earning this qualification, she supported the learning journey of more than 400 students at academic institutes such as NMIMS Hyderabad, VJIT Hyderabad, and MIT ADT University, and encouraged them to pursue the Qlik Sense Business Analyst Qualification.
According to her, “The Qlik Academic Program provides a great opportunity for students to access many online resources and different qualifications and certifications including Data Literacy, Qlik Sense Business Analyst and Qlik Sense Data Architect. The interactive material of the academic program helps students to excel in the field of data analytics”.
Further she adds, “I have been spreading awareness about the Qlik Academic Program and its benefits to other engineering colleges by conducting expert sessions via social media platforms”
Sumitra says, “Moving ahead, we are keen for internship and job opportunities for our students who have a good understanding of Qlik Sense and have earned the academic program qualifications.”
In Sumitra’s words, “Data literacy is my interest and I want to educate every data science enthusiast about the importance of data literacy.”
She feels it’s her mission to create more awareness about data analytics and encourage use of Qlik Sense. Data is the new fuel, and it is driving the current industry, she further adds.
When Sumitra is not teaching, she likes spending time with her son, Viraj. She is focusing on upskilling herself in recent trends, also taking care of her health and following a workout routine.
We are looking forward to working closely with Sumitra during her tenure as an ambassador and creating more links with universities in Pune and the rest of India.
We all know and love using Insight Advisor right within the Qlik Sense hub or inside Analytics apps, helping us analyze data, create visualizations or build data models.
In this post, we will tap into the Insight Advisor API to leverage its power within a separate web application.
We will create a simple web app that lets us ask natural language questions against our Qlik Sense app and returns a recommended visualization, which we will then render using nebula.js.
Prerequisites:
You will need to grab the following before starting:
Installation
Run npm install to install the dependencies listed in package.json.
Folder structure:
The following sections discuss the main parts of building the web app and calling the API, they are not in any particular order. I will provide the complete code for the project at the end of the post so you can see where everything fits.
1. Connecting to Qlik Cloud
First things first, we need to handle the authentication to Qlik Cloud.
Interactive login process:
async function getQCSHeaders() {
  await qlikLogin(); // enforce tenant login
  const response = await fetch(`${tenantUrl}/api/v1/csrf-token`, {
    mode: 'cors',
    credentials: 'include',
    headers: {
      'qlik-web-integration-id': webIntegrationId,
    },
  });
  const csrfToken = new Map(response.headers).get('qlik-csrf-token');
  return {
    'qlik-web-integration-id': webIntegrationId,
    'qlik-csrf-token': csrfToken,
  };
}
async function qlikLogin() {
  const loggedIn = await fetch(`${tenantUrl}/api/v1/users/me`, {
    mode: 'cors',
    credentials: 'include',
    headers: {
      'qlik-web-integration-id': webIntegrationId,
    },
  });
  if (loggedIn.status !== 200) {
    if (sessionStorage.getItem('tryQlikAuth') === null) {
      sessionStorage.setItem('tryQlikAuth', 1);
      window.location = `${tenantUrl}/login?qlik-web-integration-id=${webIntegrationId}&returnto=${location.href}`;
      return await new Promise((resolve) => setTimeout(resolve, 10000)); // prevents further code execution
    } else {
      sessionStorage.removeItem('tryQlikAuth');
      const message = 'Third-party cookies are not enabled in your browser settings and/or browser mode.';
      alert(message);
      throw new Error(message);
    }
  }
  sessionStorage.removeItem('tryQlikAuth');
  console.log('Logged in!');
  return true;
}
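The sessionStorage flag above gives us a one-shot retry: redirect to the login page once, and surface a cookie error on a second consecutive failure. That guard can be isolated into a small pure function for testing; here is a sketch where the function name, the injected storage object, and the in-memory stand-in are my own, not part of the original code:

```javascript
// Sketch of the one-shot retry guard used in qlikLogin, isolated from the
// browser. shouldAttemptRedirect and memoryStorage are hypothetical helpers.
function shouldAttemptRedirect(storage) {
  if (storage.getItem('tryQlikAuth') === null) {
    storage.setItem('tryQlikAuth', '1');
    return true; // first failure: redirect to the tenant login page
  }
  // second consecutive failure: we already tried logging in,
  // so third-party cookies are likely blocked
  storage.removeItem('tryQlikAuth');
  return false;
}

// A Map-backed stand-in for sessionStorage, for illustration only
function memoryStorage() {
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    setItem: (k, v) => m.set(k, String(v)),
    removeItem: (k) => m.delete(k),
  };
}
```

Keeping the decision separate from `window.location` and `alert` makes the flow easy to unit test without a browser.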
2. Communicating with the Qlik Cloud Engine
(content of the cloud.engine.js file)
We need to open a session using enigma.js to communicate with the Qlik QIX engine.
import enigma from "enigma.js";
const schema = require("enigma.js/schemas/12.1306.0.json");

export default class EngineService {
  constructor(engineUri) {
    this.engineUri = engineUri;
  }

  openEngineSession(headers) {
    const params = Object.keys(headers)
      .map((key) => `${key}=${headers[key]}`)
      .join("&");
    const session = enigma.create({
      schema,
      url: `${this.engineUri}?${params}`,
    });
    session.on("traffic:sent", (data) => console.log("sent:", data));
    session.on("traffic:received", (data) => console.log("received:", data));
    return session;
  }

  async closeEngineSession(session) {
    if (session) {
      await session.close();
      console.log("session closed");
    }
  }

  async getOpenDoc(appId, headers) {
    const session = this.openEngineSession(headers);
    const global = await session.open();
    const doc = await global.openDoc(appId);
    return doc;
  }
}
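The query string that openEngineSession builds by hand from the headers can also be produced with the standard URLSearchParams API. A minimal sketch (the helper name is mine, and note that URLSearchParams additionally URL-encodes values, which the manual join does not):

```javascript
// Equivalent of the Object.keys(...).map(...).join('&') chain in
// openEngineSession, using URLSearchParams (helper name is hypothetical)
function headersToQueryString(headers) {
  return new URLSearchParams(headers).toString();
}
```

For header values like a web integration id and a CSRF token, both approaches produce the same string; the encoding only matters if a value contains reserved URL characters.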
3. Including the Nebula Charts needed and rendering the recommended viz from Insight Advisor
When we eventually get back a recommendation from Insight Advisor, we will use a nebula object to embed it in our web app.
For a full list of available Nebula objects, visit: https://qlik.dev/embed/foundational-knowledge/visualizations/
We need to install “stardust” that contains the main embed function and all the nebula objects we need:
npm install @nebula.js/stardust
then install all objects needed
npm install @nebula.js/sn-scatter-plot
npm install @nebula.js/sn-bar-chart
etc...
import { embed } from '@nebula.js/stardust';
import scatterplot from '@nebula.js/sn-scatter-plot';
etc...
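The imported chart modules are typically collected into a lookup object keyed by chart type, which is what the charts[type] reference in renderHypercubeDef resolves against. Here is a minimal sketch, with plain objects standing in for the real nebula modules and with the key names being my assumption about the chartType strings returned:

```javascript
// Stand-ins for the imported nebula modules (real code would use the
// imports above; these objects and key names are illustrative)
const scatterplot = { name: 'sn-scatter-plot' };
const barchart = { name: 'sn-bar-chart' };

// Lookup keyed by the chart type string Insight Advisor recommends
const charts = {
  scatterplot,
  barchart,
};

// Resolve a recommended chart type, failing loudly on unsupported types
function resolveChart(type) {
  const chart = charts[type];
  if (!chart) throw new Error(`Unsupported chart type: ${type}`);
  return chart;
}
```

Failing loudly on an unknown type is useful here, because Insight Advisor can recommend any chart in its catalog while your bundle only includes the nebula objects you chose to install.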
Inside the rendering function, we will use stardust’s embed method to render the recommended chart type we get from Insight Advisor.
async function fetchRecommendationAndRenderChart(requestPayload) {
  // fetch recommendations for text or metadata
  const recommendations = await getRecommendation(requestPayload);
  const engineUrl = `${tenantUrl.replace('https', 'wss')}/app/${appId}`;
  // fetch rec options which has hypercubeDef
  const recommendation = recommendations.data.recAnalyses[0];
  // get csrf token
  const qcsHeaders = await getQCSHeaders();
  const engineService = new EngineService(engineUrl);
  // get openDoc handle
  const app = await engineService.getOpenDoc(appId, qcsHeaders);
  await renderHypercubeDef(app, recommendation);
}
async function renderHypercubeDef(app, recommendation) {
  const type = recommendation.chartType;
  const nebbie = embed(app, {
    types: [
      {
        name: type,
        load: async () => charts[type],
      },
    ],
  });
  document.querySelector('.curr-selections').innerHTML = '';
  (await nebbie.selections()).mount(document.querySelector('.curr-selections'));
  await nebbie.render({
    type: type,
    element: document.getElementById('chart'),
    properties: { ...recommendation.options },
  });
}
4. Calling the Insight Advisor API for recommendations
You can either call the API with a natural language question or a set of fields and master items with an optional target analysis.
Insight Advisor API endpoints that can be called:
api/v1/apps/{appId}/insight-analyses
api/v1/apps/{appId}/insight-analyses/model
api/v1/apps/{appId}/insight-analyses/actions/recommend
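The recommend endpoint accepts one of two request body shapes: a natural language question, or a set of fields and master items with an optional target analysis. The branching can be sketched as a small builder; the function name and the empty-array defaults are mine:

```javascript
// Builds the recommend request body in one of its two shapes
// (builder name and the [] defaults are hypothetical conveniences)
function buildRecommendPayload(input) {
  if (input.text) {
    // shape 1: a natural language question
    return { text: input.text };
  }
  // shape 2: fields / master items, optionally targeting an analysis type
  return {
    fields: input.fields || [],
    libItems: input.libItems || [],
    targetAnalysis: { id: input.id },
  };
}
```

This mirrors the if/else in getRecommendation below, just without the JSON.stringify step so the shapes are easy to inspect.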
// Getting the recommendation
async function getRecommendation(requestPayload) {
  await qlikLogin(); // make sure you are logged in to your tenant
  // build url to execute recommendation call
  const endpointUrl = `${tenantUrl}/api/v1/apps/${appId}/insight-analyses/actions/recommend`;
  let data = {};
  // generate request payload
  if (requestPayload.text) {
    data = JSON.stringify({
      text: requestPayload.text,
    });
  } else if (requestPayload.fields || requestPayload.libItems) {
    data = JSON.stringify({
      fields: requestPayload.fields,
      libItems: requestPayload.libItems,
      targetAnalysis: { id: requestPayload.id },
    });
  }
  // include the web integration id and CSRF token, plus a JSON content type
  const headers = {
    ...(await getQCSHeaders()),
    'content-type': 'application/json',
  };
  const response = await fetch(endpointUrl, {
    credentials: 'include',
    mode: 'cors',
    method: 'POST',
    headers,
    body: data,
  });
  const recommendationResponse = await response.json();
  return recommendationResponse;
}
Results:
For the complete example that includes calling the API with fields, master items, and a target analysis type, visit qlik.dev post: https://qlik.dev/embed/gen-ai/build-insight-advisor-web-app/
The full code for this post can be found here:
https://github.com/ouadie-limouni/insight-advisor-api
Make sure to change the variables in index.js.
I hope you find this post helpful. Please let me know if you have any questions in the comment section below!
Ouadie
Review data is tricky, as data from one source to another differs in many ways, so careful consideration is needed when making decisions. Weighting the data makes it more accurate, and NPS classification is needed when looking at review data. Sentiment analysis would be the next addition to make the data even more accurate.
Marketing and BI Developers
Hi everyone,
Want to stay a step ahead of important Qlik support issues? Then sign up for our monthly webinar series where you can get first-hand insights from Qlik experts.
The Techspert Talks session from February looked at Upgrading Qlik Replicate Best Practices.
But wait, what is it exactly?
Techspert Talks is a free webinar held on a monthly basis, where you can hear directly from Qlik Techsperts on topics that are relevant to Customers and Partners today.
In this session, we will cover:
Click on this link to see the presentation
Analyze personal usage in LinkedIn network
Get insights about your LinkedIn profile
LinkedIn users who want to explore their activity on LinkedIn
It encourages other LinkedIn users to request their own data and create a personal dashboard
There have been many new capabilities that give developers ways to customize and style an app. In this blog, I will review how the sheet header and toolbar can be toggled on and off and the benefits of each, as well as things to consider. The sheet header and the toolbar both appear at the top of an app. The sheet header, outlined below in yellow, includes the name of the sheet, an optional logo or image, and previous and next sheet navigation arrows.
The toolbar is the row above the sheet header. It includes buttons and links to Notes, Insight Advisor, selections tool, bookmarks, sheets and edit sheet.
The toggle for the sheet header and toolbar can be found in the app options section of an app. Open app options by clicking on the arrow next to the app name at the top center of the app. From there, click on the App options icon on the right.
Once the app options are open, you will find the toggles for Show toolbar and Show sheet header.
One of the main benefits of removing the sheet header and toolbar is gaining more space on the sheet. The space that was used by the sheet header and toolbar becomes area that developers can use for additional filter panes and/or visualizations. Another benefit is that developers can add custom capabilities to replace the Qlik Sense defaults. For example, a developer may want to create their own navigation buttons and have more control over the options that are available to the user. If the sheet(s) are being used to create a PowerPoint presentation, removing the sheet header and toolbar makes the presentation look more polished.
Now let’s discuss some things to consider when removing the sheet header. If the sheet header is removed, alternative sheet navigation should be provided for the user. It is possible to use your keyboard to navigate the sheets, but many people do not know that, so custom navigation should be created by the developer using buttons or links. In the image below, buttons are used.
In the image below, buttons are used again but the highlighted button indicates the sheet the user is on. So, in this example, the developer has replaced the sheet navigation and the sheet title that was included in the removed sheet header.
A sheet title can also be added to a sheet using a Text & image object. The custom navigation can be designed to match a theme or company brand, which gives the developer a lot of flexibility and can give a company’s apps a consistent look and feel.
When the toolbar is toggled off, features are hidden but not removed from the app entirely. This is great, but not all users may be aware of alternative ways to access the features on the toolbar, so it is important to keep this in mind. For example, users can still create or view notes for a visualization by right-clicking on a chart, selecting the ellipsis (…), and then selecting Notes. Another example is that users can still access bookmarks or the sheets in an app via the App Overview. Users can still ask questions via Insight Advisor, so no functionality is lost with the removal of the toolbar. Another thing to consider is that while selections can still be made via filter panes and visualizations, without the selection bar users may not be aware that selections have been made. This is why the developer needs to make sure there are filter panes or some other way for users to know what has been selected. When it comes to selections, buttons can also be used to perform actions such as clearing selections and making selections in a field.
The overall goal is not to make things harder for the user so knowing possible issues and designing for them is smart. While there are benefits in toggling off the sheet header and/or toolbar, developers must consider how this may impact their users and how their users will use the app. The user experience can be just as good with the sheet header and toolbar toggled off if the developer plans well for an intuitive user experience.
Thanks,
Jennell
In previous posts on the Design blog, we've explored various ways of embedding Qlik Sense analytics. These have ranged from straightforward iFrames to more complex approaches like the Capability APIs, as well as more recent tools such as Nebula.js and Enigma.js.
Today, we’re going to take a quick look at a new library from Qlik called qlik-embed. Before diving in, I would like to clarify that this library is currently in public preview, and at the time of writing frequent updates as well as breaking changes are likely (you can read more about that on qlik.dev, or follow the Changelog for updates: https://qlik.dev/changelog).
So what exactly is qlik-embed?
qlik-embed is a library for easily embedding data and analytics interfaces into your web apps while overcoming some of the concerns that usually arise when embedding content from one software application in another, such as third-party cookies, cross-site request forgery, and content security policy.
The library is designed to work with web apps ranging from simple plain HTML pages to modern frameworks like React. That is made easier by the fact that whichever qlik-embed flavor you use, the configuration options, methods, and properties are similar.
If you are already embedding Qlik Sense content into your web applications, you can learn about the various reasons why qlik-embed would be a better solution on qlik.dev (https://qlik.dev/embed/qlik-embed/qlik-embed-introduction#why-qlik-embed-over-capability-api-or-nebulajs)
Web Components:
qlik-embed makes use of web components, which are custom HTML elements in the form of <qlik-embed></qlik-embed> tags that allow you to configure properties of the content you’re embedding.
You can find all supported web-components here:
How to quickly get started?
Before getting started, it’s worth noting that there are several ways to connect qlik-embed web components to Qlik.
More information about Auth can be found here:
- Connect qlik-embed: https://qlik.dev/embed/qlik-embed/connect-qlik-embed
- Best Practices: https://qlik.dev/embed/qlik-embed/qlik-embed-auth-best-practice
You can connect to qlik-embed in these ways:
In this post, we’re going to use an OAuth2 single-page app created in the Qlik Cloud tenant Management Console under OAuth:
Example using HTML Web Components:
Reference page: https://qlik.dev/embed/qlik-embed/qlik-embed-webcomponent-quickstart
First thing we need to do is add a <script> element in the <head> tag to configure the call to the qlik-embed library and set up the attributes relevant to the connection we choose.
<script
  crossorigin="anonymous"
  type="application/javascript"
  src="https://cdn.jsdelivr.net/npm/@qlik/embed-web-components"
  data-host="<QLIK_TENANT_URL>"
  data-client-id="<QLIK_OAUTH2_CLIENT_ID>"
  data-redirect-uri="<WEB_APP_CALLBACK_URI>"
  data-access-token-storage="session"
></script>
web-component:
<qlik-embed ui="classic/app" app-id="<APP_ID>"></qlik-embed>
oauth-callback.html:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <script
      crossorigin="anonymous"
      type="application/javascript"
      data-host="<QLIK_TENANT_URL>"
      src="https://cdn.jsdelivr.net/npm/@qlik/embed-web-components/dist/oauth-callback.js"
    ></script>
  </head>
</html>
You can fork the full example here and change the “Tenant URL” and the rest of the attributes to your own tenant after creating the OAuth SPA config: https://replit.com/@ouadielim/qlik-embed-HTML-Web-Components#index.html
result:
Example using React:
In React, you can use qlik’s embed-react library package: npm install @qlik/embed-react (https://www.npmjs.com/package/@qlik/embed-react)
Then, you can import QlikEmbed and QlikEmbedConfig from @qlik/embed-react. React context is used to pass in the hostConfig, which you configure to point to your Qlik Cloud tenant (host) and your OAuth2 config (clientId). The redirect URI needs to point to a callback page similar to the one we created above for the HTML web components example.
import { QlikEmbed, QlikEmbedConfig } from "@qlik/embed-react";

const hostConfig = {
  host: "<QLIK_CLOUD_TENANT>",
  clientId: "<CLIENT_ID>",
  redirectUri: "https://localhost:5173/oauth-callback.html",
  accessTokenStorage: "session",
  authType: "Oauth2",
};

const appId = "<APP_ID>";
const sheetId = ""; // sheet id or empty string

export default () => (
  <QlikEmbedConfig.Provider value={hostConfig}>
    <div className="container">
      <h1>Qlik Embed with React</h1>
      <div className="selections-bar">
        <QlikEmbed ui="analytics/selections" appId={appId} />
      </div>
      <div className="viz">
        <QlikEmbed ui="classic/app" app={appId} sheet={sheetId} />
      </div>
      <div className="viz">
        <QlikEmbed ui="analytics/chart" appId={appId} objectId="hRZaKk" />
      </div>
    </div>
  </QlikEmbedConfig.Provider>
);
You can clone the full React example here: https://github.com/ouadie-limouni/qlik-embed-react
result:
Limitations?
There are a few limitations to qlik-embed as it continues to develop into a more stable and robust library - you can read more about those on qlik.dev: https://qlik.dev/embed/qlik-embed/qlik-embed-limitations
As I mentioned at the very beginning, qlik-embed is new and evolving quickly. I invite you to test it early to get familiar with it, and stay tuned for more updates and bug fixes as they come out via the Changelog page.
I hope you found this post helpful, please let me know in the comments below if you have any questions!
Thanks
- Ouadie
If a team plays a game at home, they score more and win more often.
It didn't, I did it just for myself 🙂
I showed it just to my Professor at my Business Informatics studies and my colleagues from the group. They liked it.
I used alternate states to enable selecting games played at home on one side of the dashboard and games played away on the other side.
In my last post, I discussed the robust capabilities of Qlik Sense(QS) APIs to build out-of-the-box visual metaphors and ways to integrate them within Qlik’s ecosystem. A natural choice for developers while building QS extensions throughout the years has been the Extension API primarily using vanilla JavaScript, jQuery and AngularJS.
The Extension API consists of methods and properties used to create custom visualization extensions.
Enter… Qlik Sense’s Open Source Solution — Nebula.js!
Nebula.js is a collection of product and framework agnostic JavaScript libraries and APIs that help developers integrate visualizations and mashups on top of the Qlik Associative Engine in QS Desktop, QS Enterprise on Windows, and SaaS editions of Qlik Sense. This tutorial specifically applies to the QS SaaS edition. Nebula.js offers developers an alternative to the Capability APIs that have historically been used to create mashups. The tutorial will focus on developing a new visualization based on a user scenario using Nebula.js and the third-party visualization library D3.js. Our target is to understand how we can leverage Nebula.js to build a QS extension object and bring out-of-the-box visualization capabilities into the SaaS platform. This tutorial does not emphasize the D3.js programming part, but the motivation behind the visualization is discussed.
User scenario: An organization using Qlik Sense has a new requirement to develop a visual representation for understanding a high-dimensional multivariate dataset. Their dataset consists of numerical values, and they want to compare multiple features together to analyze the relationships between them. Based on these requirements, their Data Visualization Engineer presents the ‘Parallel Coordinate plot’.
Parallel coordinate plots (PCP) have proved efficient at visualizing high-dimensional multivariate datasets. In a parallel coordinate plot, each feature is represented as a vertical axis and the values are plotted as a series of lines connected across each axis. Their advantage is that the vertical axes (features) can each have their own scale, as each feature works off a different unit of measurement. PCP provides insights into hidden patterns in data, like similarities and clusters, and allows for more straightforward comparative analysis.
Prerequisites:
Step 1: Use the nebula.js CLI to import the necessary packages. The command scaffolds a project into the /hello folder with the following structure:
Command: npx @nebula.js/cli create hello --picasso none
Step 2: Start the development server by running:
cd hello
npm run start
The command starts a local development server and opens up http://localhost:8080 in your browser. The benefit of having the dev server with Nebula.js is that it provides an interactive way to test and edit your extension without the need to iteratively deploy in QS every time a new change is made.
Step 3: Configure the data structure.
Visualizations in QS are based on a hypercube definition (qHyperCubeDef). Therefore, any new visual object we want to bring into the QS ecosystem needs to have its data structure defined. With Nebula.js, the object-properties.js file allows us to define the structure of our object.
const properties = {
  showTitles: true,
  qHyperCubeDef: {
    qInitialDataFetch: [{ qWidth: 30, qHeight: 200 }],
  },
};
We also need to set a data target in the data.js file so we refer to the right hypercube definition (important to note in case you have multiple qHyperCubeDef objects).
export default {
  targets: [
    {
      path: '/qHyperCubeDef',
    },
  ],
};
Step 4: Developing the visualization extension using Nebula.js and D3.js.
QS Nebula.js specific code:
Now that we have everything ready, we start developing our extension with the custom visualization object using the index.js file from our project.
Note that Nebula.js and its primary package @nebula.js/stardust are built on the concept of custom hooks. This might sound familiar to people working with React.js: hooks emphasize reusable, composable functions rather than classical object-oriented classes and inheritance. The primary hooks that we depend on for developing our extension object are described below:
The method that helps us render our visualization object is the component() function. The component() function is executed every time something related to the object's rendering changes: for example, the theme, data model, data selections, or component state. This function can be compared to the paint() function in the Extension API.
To render our data, we first need to access the layout through the useLayout hook and then use it in combination with the useEffect hook. The hypercube’s qDataPages[0].qMatrix contains all the data (dimensions and measures) used in the QS environment, and we will need to pass this data to our D3.js-based visualization.
component() {
  const element = useElement();
  const layout = useLayout();
  useEffect(() => {
    const qMatrix = layout.qHyperCube.qDataPages[0].qMatrix;
  });
}
To see the data values and understand the structure of the qHyperCube, it is always a good idea to do a console.log(layout). A snippet shows values specific to our use case. Every time a new dimension or measure is added to our extension object, qDataPages[0].qMatrix is updated with those values.
The required dimension values for our chart are then extracted from the hypercube using the qText property from qDataPages[0].qMatrix like below.
var data = qMatrix.map(function (d) {
  return {
    PetalLength: d[0].qText,
    PetalWidth: d[1].qText,
    SepalLength: d[2].qText,
    SepalWidth: d[3].qText,
    Species: d[4].qText,
  };
});
Our next step is to define the width and height of the visualization object, and capture its id. We will use this id to bind it to our element object from the useLayout hook as shown below:
var width = 1000;
var height = 400;
var id = "container_" + layout.qInfo.qId;
const elem_new = `<div id=${id}></div>`;
element.innerHTML = elem_new;
Finally, we make a call to the D3.js function from within the useEffect hook.
viz(data, width, height, id);
D3.js specific code:
The viz() function contains all of our D3.js code that allows us to draw a Parallel coordinate plot. First, we would need to append the SVG to the <div> that contains the id of our QS object, like below.
var svg = d3
  .select("#" + id)
  .append("svg")
  .attr("width", width + margin.left + margin.right)
  .attr("height", height + margin.top + margin.bottom)
  .append("g")
  .attr("transform", "translate(" + margin.left + "," + margin.top + ")");
We then get all of the dimensions except Species to build our x and y axes.
var dimensions = Object.keys(data[0]).filter(function (d) {
  return d != "Species";
});

var y = {};
for (var i in dimensions) {
  var name_new = dimensions[i];
  y[name_new] = d3.scaleLinear().domain([0, 8]).range([height, 0]);
}

var x = d3.scalePoint().range([0, width]).domain(dimensions);
To draw the lines for our Parallel coordinate plot, we will need to build the path function that would take a row from our qHyperCube and return the x and y coordinates of the line.
function path(d) {
  return d3.line()(
    dimensions.map(function (p) {
      return [x(p), y[p](d[p])];
    })
  );
}
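To make the path computation concrete, here is a worked sketch with plain-function stand-ins for the d3 scales (the dimension names follow the iris example above, but the pixel values and stub scales are illustrative only):

```javascript
// Worked example of what path() computes, with stand-ins for
// d3.scalePoint (xStub) and d3.scaleLinear (yStub); values are illustrative
const dims = ['PetalLength', 'PetalWidth'];
const xStub = (name) => dims.indexOf(name) * 100; // axes at x = 0, 100, ...
const yStub = {
  // maps the domain [0, 8] onto [400, 0], like the scales built above
  PetalLength: (v) => 400 - v * 50,
  PetalWidth: (v) => 400 - v * 50,
};

// One row of data -> the [x, y] points the polyline passes through
function linePoints(d) {
  return dims.map((p) => [xStub(p), yStub[p](d[p])]);
}
```

Each row of the hypercube therefore becomes one point per axis, and d3.line() simply connects those points from left to right.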
And finally, we bind everything with our SVG like below:
svg
  .selectAll("myPath")
  .data(data)
  .enter()
  .append("path")
  .attr("class", function (d) {
    return "line " + d.Species;
  })
  .attr("d", path)
  .style("fill", "none")
  .style("stroke", function (d) {
    return color(d.Species);
  })
  .style("opacity", 0.5);
Step 5: Deploying the extension.
To build our project, we use the command below, which generates all QS-readable files and puts them in a folder /hello-ext. This folder can then be compressed (.zip) and uploaded to the Extensions section of the SaaS console to be used within the QS environment.
npm run sense
If you are just getting started with Nebula.js, https://qlik.dev is a great place to review the basics and drill down into related functions.
This project’s source code is made available at: https://github.com/dipankarqlik/Nebula