Video Tutorials
The Basics

Quick Start
Quick tutorial on how to set up a workspace, create your first data pipeline for a sample use case, and monitor the running application.
KNOW MORE
Data Pipeline
Data Pipeline lets you easily create a data pipeline from built-in operators in a drag-and-drop editor.
KNOW MORE
Pipeline Management
StreamAnalytix provides tools such as Import/Export, Pipeline Integration, Versioning, Data Lineage, Pipeline Inspect, and Error Mining to manage application pipelines at every stage of the application lifecycle.
KNOW MORE
Landing Dashboards
Landing Dashboards provide details about Pipelines, Metrics, SAX Web Health, Connections, Alerts, and License Summary.
KNOW MORE
Emitters
Emitters define the destination stage of a pipeline, which can be a NoSQL store, an indexer, a relational database, or a third-party BI tool.
KNOW MORE
Channels
Data access in StreamAnalytix is handled by Channels: built-in drag-and-drop operators that consume data from various data sources such as message queues, transactional databases, log files, and sensors for IoT data.
KNOW MORE
Processors
Processors are built-in operators that process streaming data by performing various transformations and analytical operations.
KNOW MORE
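As a rough illustration of what a processing step does, here is a minimal sketch written directly against Spark Structured Streaming (the engine StreamAnalytix builds on). The socket source, field names, and threshold are assumptions for the example, not product API.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ProcessorSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("processor-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical source: a socket stream of CSV lines such as "sensor-1,34.2"
    val raw = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 9999)
      .load()

    // A processor-style step: parse fields, filter, and derive a new column.
    val parsed = raw.select(
      split($"value", ",").getItem(0).as("id"),
      split($"value", ",").getItem(1).cast("double").as("temperature"))

    val enriched = parsed
      .filter($"temperature" > 30.0)                              // drop cool readings
      .withColumn("temperature_f", $"temperature" * 9 / 5 + 32)   // derived field

    enriched.writeStream
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```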
Know the Super User
The Super User can manage functions that are not available to Admin or Normal users. Only the Super User can manage Workspaces, start/stop default Sub-Systems, and upload/upgrade Licenses.
KNOW MORE
User Roles
A User Role determines the level of permissions assigned to a user to perform a group of tasks.
KNOW MORE

Data Pipeline Deep-Dive

Configure Alerts
Alerts notify you as soon as unwanted activity takes place within the system, or whenever the criteria defined for an alert are satisfied. You can create rule-based alerts with your own criteria at runtime.
KNOW MORE
CEP
Complex Event Processing (CEP) is used in Operational Intelligence (OI) solutions to provide insight into business operations by running query analysis against live feeds and event data.
KNOW MORE
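To make the idea concrete, here is a minimal CEP-style continuous query sketched in Spark Structured Streaming. The rate source, the severity field, the window size, and the alert threshold are all illustrative assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CepSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cep-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Simulated live feed: the built-in rate source emits (timestamp, value);
    // every tenth event is tagged as an ERROR for demonstration.
    val events = spark.readStream
      .format("rate")
      .option("rowsPerSecond", 50)
      .load()
      .withColumn("severity", when($"value" % 10 === 0, "ERROR").otherwise("INFO"))

    // Continuous query: emit a row whenever a 1-minute window sees more
    // than 100 ERROR events -- a simple CEP-style condition.
    val alerts = events
      .withWatermark("timestamp", "2 minutes")
      .groupBy(window($"timestamp", "1 minute"))
      .agg(sum(when($"severity" === "ERROR", 1).otherwise(0)).as("errors"))
      .filter($"errors" > 100)

    alerts.writeStream
      .format("console")
      .outputMode("update")
      .start()
      .awaitTermination()
  }
}
```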
Persistence & Indexing
Persistence allows you to persist data in a NoSQL store such as HBase or Cassandra, and Indexing allows you to index it in Solr or Elasticsearch.
KNOW MORE
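One common way to express this dual write outside the visual editor is Spark's foreachBatch, sketched below. The format strings belong to the public spark-cassandra-connector and elasticsearch-hadoop packages (assumed to be on the classpath); the keyspace, table, node address, and index name are placeholders.

```scala
import org.apache.spark.sql.DataFrame

object PersistAndIndexSketch {
  // Persist each micro-batch to Cassandra and index it in Elasticsearch.
  def persistAndIndex(stream: DataFrame): Unit = {
    stream.writeStream.foreachBatch { (batch: DataFrame, batchId: Long) =>
      batch.write
        .format("org.apache.spark.sql.cassandra")   // NoSQL persistence
        .option("keyspace", "events_ks")
        .option("table", "events")
        .mode("append")
        .save()
      batch.write
        .format("org.elasticsearch.spark.sql")      // full-text indexing
        .option("es.nodes", "localhost:9200")
        .mode("append")
        .save("events-index")
    }.start()
  }
}
```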
Streaming Configuration and Filtering
StreamAnalytix lets you process multiple data streams in parallel and filter message data at runtime.
KNOW MORE
Register Entities
Register Entities allows you to register custom components, i.e., custom parsers, channels, and processors, for use in your pipelines.
KNOW MORE
Pipeline Inspect
Pipeline inspection is a mechanism through which the processing components of a pipeline can be investigated.
KNOW MORE
Data Lineage
Data Lineage provides a complete audit trail of the data from its source to its destination.
KNOW MORE
Scope Variables
Scope Variables allow you to create and use variables in data pipelines according to their scope. If you have a running pipeline and want to update a variable's value at runtime, you can edit it and continue.
KNOW MORE
Transformations
A transformation rule describes how the value of a field in a message should be transformed for use in a predictive model. A transformation variable is associated with a message and can be defined on any field in the message. Once defined, it can be used like any other message field when defining the model.
KNOW MORE
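For intuition, here is a tiny sketch of a transformation rule expressed directly in Spark: a model-ready field is derived from a raw message field and can then be referenced like any other field. The field names ("amount", "amount_log") are invented for the example.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransformationRuleSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("tx-rule-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Two sample messages with a raw "amount" field.
    val messages = Seq((1, 120.0), (2, 4500.0)).toDF("id", "amount")

    // The transformation rule: derive a log-scaled field. Downstream,
    // "amount_log" behaves like any other message field, e.g. as a
    // feature when defining a predictive model.
    val withRule = messages.withColumn("amount_log", log1p($"amount"))
    withRule.show()
  }
}
```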
Pipeline Versioning
Versioning of pipelines enables you to create different versions of the same pipeline and roll back to a previous version for testing purposes.
KNOW MORE

Advanced Features

Monitoring
StreamAnalytix allows you to monitor each component’s performance graphically and define alerts based on graph values. You can monitor both application and system data.
KNOW MORE
RT Dashboards
Dashboards provide a powerful means to explore and analyze real-time data using charts and graphs. They display the status of metrics and key performance indicators for a pipeline. You can bring everything you need to track, however disparate, onto a single screen through a real-time (RT) dashboard.
KNOW MORE
Dynamic CEP
Dynamic CEP allows you to register queries with pre-defined actions (“PUBLISH_TO_RABBITMQ”, “INVOKE_WEBSERVICE_CALL”, and “CUSTOM_ACTION”) that are applied to a running pipeline.
KNOW MORE
Read Data from File
The Log Agent reads file data from remote data sources and ingests it into Kafka or RabbitMQ channels.
KNOW MORE
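Conceptually, the agent does something like the following sketch, written with the standard Kafka producer API. The file path, topic name, and broker address are assumptions for illustration, not agent configuration.

```scala
import java.util.Properties
import scala.io.Source
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object FileToKafkaSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    // Ship each line of a local log file to a Kafka topic, which a
    // Kafka channel in a pipeline can then consume.
    for (line <- Source.fromFile("/var/log/app.log").getLines())
      producer.send(new ProducerRecord[String, String]("logs", line))

    producer.close()
  }
}
```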
Custom Processor
A custom processor allows you to implement your own custom logic in a pipeline. You can also write custom code to ingest data from any data source and build it as a custom channel, then use it in your pipelines or even share it with other workspace users.
KNOW MORE
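The exact SDK interface is covered in the video; purely to illustrate the shape such a component takes, here is a hypothetical sketch. The trait name and method signature are invented for this example and are not the StreamAnalytix API.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._

// Hypothetical contract for a custom component: take a DataFrame in,
// return a transformed DataFrame.
trait CustomProcessor {
  def process(input: DataFrame): DataFrame
}

// Example custom logic: mask an "email" column before it flows downstream.
class MaskEmailProcessor extends CustomProcessor {
  override def process(input: DataFrame): DataFrame =
    input.withColumn("email",
      regexp_replace(col("email"), "(^.).*(@.*$)", "$1***$2"))
}
```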
More on Data Types
While configuring a new message, you need to define the Message Parser Type. Supported Parser Types are Delimited, JSON, AVRO, Regular Expression, and Custom.
KNOW MORE

Resources
Events
4th Marketing Analytics Summit 2020
September 15–16, 2020 | Virtual event
Webinars
Simplify Spark-based ETL workflows on the cloud
Dec 11, 2020 | 11:00 am IST