Lenses SQL Streaming Engine

The easy way to create, deploy and scale Kafka-based streaming applications.
Query, filter and transform your data in motion, in batch, or during ingestion, using SQL syntax and semantics.

Lenses SQL is a three-tier engine:
  • view and query live and batch data
  • create real-time streaming processors
  • connect data sources by including SQL within our connectors' configuration

It is the first SQL engine to provide out-of-the-box Kafka Connect and Kubernetes deployment and scaling options for SQL processors, all managed and supported via the Lenses web interface.

Lenses is the only product that provides a visual topology of the stream flow via interactive pipelines. Developers, analysts, data scientists and business users can work together on the same platform, complementing their existing skills with streaming capabilities.


Connect

Kafka Connectors with SQL-enabled filtering at ingestion time, with supported system features

KCQL (Kafka Connect Query Language) is a SQL-like syntax that streamlines the configuration of Kafka Sink Connectors, and is available across many connectors.

You get a logical and concise syntax to define lightweight transformations, with additional DSL to integrate with target systems.

Features
  • Avro and JSON support
  • Topic to target mapping
  • Field selection, extraction and filtering
  • Access to Key, Value and metadata
  • Auto creation & auto evolution
  • Error policies

A Primary Key (PK) has no meaning in Kafka itself, but if we are using SQL to write data into a target RDBMS, our KCQL extensions let us declare one:
INSERT INTO targetDbTable SELECT field1, field2, field3 PK field5
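As a rough illustration (not the connector's actual code), a sink could translate each Kafka record into a parameterized RDBMS upsert keyed on the declared PK fields. The function and table names below are hypothetical:

```python
# Hypothetical sketch: build an upsert statement from a Kafka record
# dict, using the PK fields declared in KCQL for conflict resolution.

def build_upsert(table, record, pk_fields):
    """Return (sql, params) for an upsert of this record."""
    cols = list(record)
    placeholders = ", ".join(["?"] * len(cols))
    # Non-key columns are overwritten when the PK already exists.
    updates = ", ".join(
        f"{c} = excluded.{c}" for c in cols if c not in pk_fields
    )
    sql = (
        f"INSERT INTO {table} ({', '.join(cols)}) "
        f"VALUES ({placeholders}) "
        f"ON CONFLICT ({', '.join(pk_fields)}) DO UPDATE SET {updates}"
    )
    return sql, [record[c] for c in cols]

sql, params = build_upsert(
    "targetDbTable",
    {"field1": 1, "field2": "a", "field3": 2.5, "field5": "key-1"},
    pk_fields=["field5"],
)
```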

Supported extensions for INSERT (or UPSERT):

INSERT INTO   $TARGET
SELECT *|columns           (i.e col1,col2 | col1 AS column1,col2)
FROM   $TOPIC_NAME
  [ PK columns ]
  [ AUTOCREATE ]
  [ AUTOEVOLVE ]
  [ BATCH = N ]
  [ CAPITALIZE ]
  [ INITIALIZE ]
  [ PARTITIONBY cola[,colb] ]
  [ DISTRIBUTEBY cola[,colb] ]
  [ TIMESTAMP cola|sys_current ]
  [ STOREAS   $YOUR_TYPE([key=value, .....]) ]
  [ WITHFORMAT TEXT|AVRO|JSON|BINARY|OBJECT|MAP ]
  [ WITHTYPE $YOUR_TYPE]
  [ WITHCOMPRESSION = SNAPPY|GZIP|LZ4|ZLIB|BZIP2|LZO ]
  [ WITHTAG = $YOUR_TAG]
  [ WITHJMS_SELECTOR = $YOUR_SELECTOR ]
  [ WITHTTL = $YOUR_TTL ]
  [ WITHPIPELINE = $YOUR_PIPELINE]
  [ WITHSUBSCRIPTION = $YOUR_SUBSCRIPTION ]
  [ WITHPARTITIONER = $YOUR_PARTITIONER ]
  [ WITHDOCTYPE = $YOUR_DOC_TYPE ]
  [ INCREMENTALMODE = $YOUR_MODE ]
  [ KEYDELIMITER = $YOUR_DELIMITER ]
  [ WITHINDEX_SUFFIX = $YOUR_SUFFIX ]
  [ WITHCONVERTER = `$YOUR_CONVERTER` ]

The SELECT mode is useful for target systems that do not support the concept of namespaces (key-value stores such as Hazelcast or Redis are flat).

SELECT *|columns
FROM   $TOPIC_NAME
  [ IGNORE columns ]
  [ WITHCONVERTER ]
  [ WITHFORMAT  JSON|AVRO|BINARY ]
  [ WITHGROUP $YOUR_CONSUMER_GROUP ]
  [ WITHPARTITION (partition),[(partition, offset)] ]
  [ STOREAS $YOUR_TYPE([key=value, .....]) ]
  [ SAMPLE $RECORDS_NUMBER EVERY $SLIDE_WINDOW ]
An example

Here is a Redis Connector example:

INSERT INTO sensorTimeseries SELECT sensorID, temperature, humidity FROM sensorsTopic STOREAS SortedSet (score=ts)

The above KCQL instructs the Redis Kafka Sink Connector to select the listed fields and store the time-series data in a Sorted Set, using the field `ts` to score each message.
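To make the scoring behavior concrete, here is a pure-Python stand-in (an assumption, not the connector's implementation) for what the sink effectively does per message, mirroring Redis ZADD and ZRANGEBYSCORE semantics:

```python
# Illustrative sketch: score each sunk message by its `ts` field,
# the way `STOREAS SortedSet (score=ts)` scores entries in Redis.
import json

sorted_set = {}  # member -> score, a stand-in for a Redis sorted set

def sink(message, score_field="ts"):
    fields = {k: message[k] for k in ("sensorID", "temperature", "humidity")}
    member = json.dumps(fields, sort_keys=True)
    sorted_set[member] = message[score_field]  # like ZADD key score member

def range_by_score(lo, hi):
    """ZRANGEBYSCORE equivalent: members with lo <= score <= hi."""
    return [m for m, s in sorted(sorted_set.items(), key=lambda kv: kv[1])
            if lo <= s <= hi]

sink({"sensorID": "s1", "temperature": 21.5, "humidity": 40, "ts": 100})
sink({"sensorID": "s1", "temperature": 22.0, "humidity": 41, "ts": 200})
```

Scoring by `ts` is what makes range queries over time windows cheap on the Redis side.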

Analyse

Data browsing, topics and messages, choose decoders, AVRO supported
Lenses SQL knows how to handle the following payload structures (also known as decoder types): BINARY, JSON, AVRO, STRING, INT, LONG and PROTOBUF (coming soon). The AVRO Schema Registry is also fully supported.

SELECT field2 AS my_field
FROM magicTopic
WHERE _ktype=DOUBLE
      AND _vtype=AVRO
SELECT *
FROM magicTopic
WHERE _ktype=JSON
      AND _vtype=AVRO
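Conceptually, the `_ktype`/`_vtype` hints select a decoder for the raw key and value bytes. A minimal dispatch sketch (assumed behavior, not Lenses internals) for a few of the types:

```python
# Sketch: pick a payload decoder based on the declared decoder type.
import json
import struct

DECODERS = {
    "STRING": lambda b: b.decode("utf-8"),
    "JSON":   lambda b: json.loads(b),
    "LONG":   lambda b: struct.unpack(">q", b)[0],  # big-endian int64
    "INT":    lambda b: struct.unpack(">i", b)[0],  # big-endian int32
    "BINARY": lambda b: b,
}

def decode_record(key_bytes, value_bytes, ktype, vtype):
    """Decode a raw Kafka record using the given decoder types."""
    return DECODERS[ktype](key_bytes), DECODERS[vtype](value_bytes)

key, value = decode_record(
    b"sensor-01",
    json.dumps({"temperature": 21.5}).encode(),
    ktype="STRING",
    vtype="JSON",
)
```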
Access Metadata
SELECT
    field1,
    _offset,
    _ts,
    _partition
FROM magicTopic
Work with Keys
SELECT min_temp AS `min`
FROM sensorsTopic
WHERE _key LIKE '%N01'
      AND _ktype=STRING
      AND _vtype=AVRO
      AND _partition = 0
Nested Fields
SELECT
    details.antenaclass AS antena,
    details.readingperiod AS `interval`
FROM sensorsTopic


Filtering
SELECT *
FROM topicA
WHERE  (a.d.e + b) / c = 100
    AND _offset > 2000
    AND _partition IN (2,6,9)
LIMIT 1000
Project Fields & Functions
SELECT field2 * field3 - abs(field4.field5) AS total,
       _offset AS `_key.offset`
FROM magicTopic
SELECT min_temp AS `min`,
       max_temp AS `max`,
       (min_temp + max_temp) / 2 AS average,
       details.antenaclass AS antena,
       details.readingperiod AS `interval`
FROM sensorsTopic
WHERE _key.id LIKE '%N01'
      AND _ktype=JSON
      AND _vtype=JSON
Time Travelling
SELECT
    device_id,
    temperature
FROM sensorsTopic
WHERE _ts >= '2017-08-01 12:24:00' AND _ts <= '2017-08-01 14:30:00'

Process

Register processors in continuous streams, scale with Kafka Connect & Kubernetes
Lenses compiles SQL into Kafka Streams (KStream) applications that execute continuously, either in-memory or scaled out with high availability and controllable parallelization.

INSERT INTO targetTopic
    SELECT device_id,
           longitude,
           latitude,
           details.altitude AS alt
FROM sourceTopic
WHERE alt > 100
      AND _partition < 7
      AND _vtype=AVRO
      AND _ktype=AVRO
Continuous Joins
Define tables and streams from Kafka topics and apply rules to join them.
WITH
tableX AS (SELECT x, y, z
   FROM topicA
   WHERE _ktype=AVRO
     AND _vtype=AVRO
   GROUP BY SLIDING (1,S)),
streamY AS
  (SELECT STREAM m, n
   FROM topicB
   WHERE x <> 'something'
     AND _ktype=AVRO
     AND _vtype=AVRO)
SELECT X.x,
       X.y,
       Y.m
FROM tableX X
INNER JOIN streamY Y ON X.z = Y.n
GROUP BY SLIDING(2,s)
Windowing
SLIDING, TUMBLING, SESSION, HOP windows supported
SELECT A.m, B.x
FROM topicA A
INNER JOIN topicB B ON A.a=B.b
WHERE A.i <> B.uu
  AND A.m = 10
  AND B.x LIKE '%a%'
  AND A._ktype=AVRO
  AND A._vtype=AVRO
  AND B._ktype=AVRO
  AND B._vtype=AVRO
GROUP BY SLIDING(1,S)
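Of the supported window kinds, tumbling windows are the simplest to picture: each record falls into exactly one fixed-size bucket. A sketch (assuming 1-second windows and epoch-millisecond timestamps):

```python
# Sketch: assign (timestamp_ms, value) records to fixed,
# non-overlapping (tumbling) window buckets keyed by window start.
from collections import defaultdict

def tumbling_windows(records, size_ms=1000):
    windows = defaultdict(list)
    for ts, value in records:
        window_start = ts - (ts % size_ms)  # floor to window boundary
        windows[window_start].append(value)
    return dict(windows)

w = tumbling_windows([(10, "a"), (990, "b"), (1500, "c")])
# records at 10 and 990 share the window starting at 0; 1500 lands in 1000
```

Sliding and hopping windows differ only in that a record can belong to several overlapping buckets, and session windows are bounded by inactivity gaps rather than fixed sizes.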

Ready to Get Started with Lenses?

CONTACT LANDOOP

