Topology of streaming data flows

A modern data platform within an organisation must accommodate both traditional use cases, such as micro-services, transactions and data integration, and modern use cases, such as high-volume data sets (IoT, click streams, etc.), machine learning and stream processing. Building on top of Apache Kafka presents both business and technical challenges.
Micro-services bring a whole new paradigm of event-driven architecture, and Apache Kafka is ideal for highly scalable architectures.


Lenses supports you in this paradigm shift:
  • Provides the building blocks to build visual streaming data flows
  • Monitors data pipelines end to end, with alerts so you never miss an SLA
  • Supports multiple data types out of the box, such as AVRO, JSON, XML, CSV and Protobuf
  • Lets you provide custom serializers and deserializers to support custom data sets
  • Exposes custom applications in the Lenses topology so they can leverage its monitoring capabilities
  • Includes the Lenses SQL Engine to build SQL processors and customise connectors in your data pipelines
  • Monitors producer and consumer rates as well as deployment metrics
  • Gives you access to data and lets you debug your apps with an intuitive web interface where you can inspect and add data

Manage & monitor streaming data flows

Connectors

With 100+ open source Kafka connectors available and 30+ enterprise-grade connectors out of the box, seamlessly integrate your data systems.

SQL Processors

Build streaming applications with the Lenses SQL Engine: register SQL processors that are deployed and scaled automatically, including on Kubernetes.

APIs

Every action and capability is exposed via a corresponding REST or WebSocket API, so you can easily integrate with your deployment practices.

CLI

Enable CI/CD, automation, GitOps and promotion of flows across environments with the open source CLI tool, to deliver your apps to production quickly.

Native Clients

Use the open source Python, Go and JavaScript libraries to accelerate your application development timelines.


Lenses Connectors

Stream Reactor is an open source (Apache License, Version 2.0) collection of components built on top of Apache Kafka that provides Kafka Connect compatible connectors to move data between Kafka and popular data stores. Stream Reactor provides source connectors to publish data into Apache Kafka and sink connectors to bring data from Apache Kafka into other systems.

The connectors support KCQL (Kafka Connect Query Language), an open source component of the Lenses SQL Engine that provides an elegant, simple SQL-like syntax for selecting fields and routing data from sources or topics to Kafka or the target system (topic-to-target entity mapping, field selection, auto-creation, auto-evolution, error policies).
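As an illustration, a sink connector's KCQL statement looks roughly like the following; the topic, table and field names are purely illustrative, and the exact keywords available depend on the connector:

    INSERT INTO orders_table SELECT id, customer, total FROM orders AUTOCREATE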

Custom Streaming Applications

The topology view is a powerful feature of Lenses: all connectors and SQL-based Kafka Streams applications are depicted automatically. Starting with Lenses 2.1, you can also bring your own consumer and producer applications into the topology. All you have to do is use the Apache 2.0 licensed topology-client, which provides a simple builder for topologies; the resulting topology description is pushed to a Kafka topic and picked up by Lenses at runtime. In addition, Lenses also supports metrics for Kafka Streams, Akka Streams and Spark Structured Streaming.

Kubernetes integration

Lenses integrates with Kubernetes to deploy your data flows. The Lenses SQL Engine allows you to register SQL processors, which are automatically deployed and scaled on your Kubernetes cluster.
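A minimal sketch of such a SQL processor is shown below; the topic and field names are hypothetical, and the exact syntax varies slightly between Lenses SQL versions:

    INSERT INTO payments_large
    SELECT amount, currency, customerId
    FROM payments
    WHERE amount > 100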

Learn more about Scaling SQL Processors in Kubernetes Mode.

Data Types & Access Data

Lenses provides security and data governance features to allow access to the data for both development and business needs, whilst supporting any data format.

Our platform provides a rich ecosystem to deliver data to other systems and leverages your favourite tools. Lenses integrates with Kafka Connect to move data between your systems, provides a JDBC driver to integrate with BI tools, a Python library for data scientists to work in their favourite notebooks such as Jupyter, and a Redux.js library to get real-time data straight to your frontend application.

Lenses and the Lenses SQL Engine support multiple data types out of the box, while also allowing you to register custom serializers and deserializers. You can browse, query and register processors for AVRO, JSON, Protobuf, XML, CSV and array payloads, and query plain text with regular expressions. For AVRO payloads, Lenses integrates with Schema Registry but adds lineage and role-based access on top, to meet the demands of a corporate environment.
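For example, a snapshot query in the data browsing screen might look like the following (the topic and field names are hypothetical):

    SELECT customerId, amount
    FROM payments
    WHERE currency = 'EUR'
    LIMIT 100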

Producers & Consumers

Monitor your apps like a pro! Lenses provides in-app features to monitor your applications and set up alerts so you never miss an SLA. Moreover, Lenses ships with rich Kafka-specific dashboards to monitor your low-level metrics.


Promotion

When it comes to CI/CD and automation, GitOps, or promoting flows across environments, use the open source Lenses CLI tool to deliver your apps swiftly.


Ready to build your data platform with Lenses?

Start your free trial today