Blockchain Source

Download connector:

  • Blockchain Connector for Kafka 1.0
  • Blockchain Connector for Kafka 0.11

A Kafka Connect source connector that hooks into the live WebSocket stream provided by www.blockchain.info, giving a real-time feed of new Bitcoin blocks and transactions. The connector subscribes to notifications on blocks, transactions, or an address, and receives a JSON object describing each transaction or block as the event occurs. This JSON is then pushed via Kafka Connect to a Kafka topic, where it can be consumed by a sink connector or a live stream processor.

Since this is a direct WebSocket connection, the source will only ever use one connector task at any point; spawning more would only produce duplicate data.

Keep in mind that the Blockchain subscription API does not offer an option to replay from a given timestamp. This means that any data emitted while the Connect worker is down is lost.

The source subscribes to unconfirmed transactions. Read more about the live data here.
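To illustrate what the connector consumes, here is a minimal Python sketch of the subscription handshake. The endpoint and op names come from the public blockchain.info WebSocket API documentation, not from the connector's source code, so treat them as assumptions:

```python
import json

# Sketch of the feed this connector consumes. The endpoint and op names
# below are taken from the public blockchain.info WebSocket API docs
# (wss://ws.blockchain.info/inv); they are assumptions, not copied from
# the connector's implementation.
BLOCKCHAIN_WS_URL = "wss://ws.blockchain.info/inv"

def subscription_message(op, address=None):
    """Build the JSON payload that subscribes to one of the feeds."""
    payload = {"op": op}
    if address is not None:
        payload["addr"] = address  # only meaningful for "addr_sub"
    return json.dumps(payload)

# New unconfirmed transactions, new blocks, or activity on one address:
SUB_TRANSACTIONS = subscription_message("unconfirmed_sub")
SUB_BLOCKS = subscription_message("blocks_sub")
SUB_ADDRESS = subscription_message("addr_sub", "<some-bitcoin-address>")

# To consume the raw feed yourself you would open a WebSocket (e.g. with
# the third-party 'websocket-client' package), send one of the payloads
# above, and read JSON events. The connector does the equivalent
# internally and forwards each event to the configured Kafka topic.
```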

Prerequisites

  • Apache Kafka 0.11.x or above
  • Kafka Connect 0.11.x or above

Connector QuickStart

The easiest way to get started with this connector is to try the Lenses Development Environment: a Docker image, ready to run on your laptop, that runs Kafka and Kafka Connect and already has the connector available on the classpath.

Note

Kafka Connect might take 3-4 minutes to start up, as there are 30+ Kafka connectors available and loading each one takes 10-15 seconds.

Select Connectors -> New Connector -> Blockchain and paste the following configuration:

name=blockchain-source
connector.class=com.datamountaineer.streamreactor.connect.blockchain.source.BlockchainSourceConnector
connect.blockchain.source.kafka.topic=blockchains
tasks.max=1
[Image: Kafka Connect UI showing the Blockchain connector]

Within seconds you should be able to view blockchain messages flowing through the Kafka topic blockchains.
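If you are not using the Lenses UI, the same configuration can be submitted straight to the Kafka Connect REST API. A minimal Python sketch, assuming a Connect worker listening on localhost:8083 (adjust the host and port for your setup):

```python
import json
import urllib.request

# Mirror of the properties shown above, expressed as the JSON body the
# Connect REST API expects (PUT /connectors/<name>/config creates the
# connector if it does not exist, or updates it if it does).
connector_name = "blockchain-source"
config = {
    "connector.class": (
        "com.datamountaineer.streamreactor.connect."
        "blockchain.source.BlockchainSourceConnector"
    ),
    "connect.blockchain.source.kafka.topic": "blockchains",
    "tasks.max": "1",
}

def submit(worker_url="http://localhost:8083"):
    """Create or update the connector on a Connect worker."""
    req = urllib.request.Request(
        f"{worker_url}/connectors/{connector_name}/config",
        data=json.dumps(config).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# With a worker running locally, calling submit() returns the connector's
# accepted configuration as JSON.
```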

In the configuration we posted to Kafka Connect, we specified:

  1. The name of the source.
  2. The Source class.
  3. The max number of tasks the connector is allowed to create (1 task only).
  4. The topic to write to.

The target topic should be pre-created with 1 partition, or, if topic auto-creation is enabled, it will be created for you. The messages in the target topic are laid out as follows:

Kafka Description
Key Empty. The connector does not write any bytes to the topic key.
Value The blockchain information in Avro format. An Avro schema blockchains-value is automatically registered in the Schema Registry.
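You can confirm the registered schema by querying the Schema Registry REST API. A short Python sketch, assuming the default TopicNameStrategy (subject = topic name + "-value") and a registry on localhost:8081 (adjust for your setup):

```python
import json
import urllib.request

def value_subject(topic):
    """Subject name for a topic's value under the default TopicNameStrategy."""
    return topic + "-value"

def fetch_latest_schema(topic, registry_url="http://localhost:8081"):
    """Return the latest registered schema for the topic's value.

    Uses the Schema Registry endpoint
    GET /subjects/<subject>/versions/latest.
    """
    url = f"{registry_url}/subjects/{value_subject(topic)}/versions/latest"
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# With the connector running, fetch_latest_schema("blockchains")["schema"]
# returns the Avro schema the connector registered for the topic's value.
```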

Configurations

The Kafka Connect framework requires the following in addition to any connector-specific configurations:

Config Description Type Value
name Name of the connector string  
tasks.max The maximum number of tasks to run (must be 1 for this connector) int 1
connector.class Name of the connector class string com.datamountaineer.streamreactor.connect.blockchain.source.BlockchainSourceConnector

Connector Configurations

Config Description Type Default Optional
connect.progress.enabled Enables logging of how many records have been processed boolean false yes

Kubernetes

Helm charts are provided in our repo; add the repo to your Helm instance and install. We recommend using the Landscaper to manage Helm values, since each connector instance typically has its own deployment.

Add the Helm charts to your Helm instance:

helm repo add landoop https://landoop.github.io/kafka-helm-charts/

Troubleshooting

Please review the FAQs and join our Slack channel.