Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dalelane/ibm-ace-avrodeserialize
Sample IBM App Connect Enterprise Compute node for parsing Apache Avro data
Last synced: 24 days ago
- Host: GitHub
- URL: https://github.com/dalelane/ibm-ace-avrodeserialize
- Owner: dalelane
- License: apache-2.0
- Created: 2021-10-25T19:36:59.000Z (about 3 years ago)
- Default Branch: master
- Last Pushed: 2024-05-13T22:09:03.000Z (6 months ago)
- Last Synced: 2024-05-16T02:05:44.680Z (6 months ago)
- Language: Java
- Homepage: https://dalelane.co.uk/blog/?p=5228
- Size: 6.84 MB
- Stars: 0
- Watchers: 2
- Forks: 2
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# IBM App Connect Enterprise Compute node for parsing Apache Avro data
- [Overview](#overview)
- [Contents](#contents)
- [Details](#details)
- [Dependencies](#dependencies)

## Overview
![msgflow screenshot](./docs/msgflow-screenshot.png)
Sample compute node implementation for processing Kafka messages that have been serialized using Apache Avro schemas from a schema registry.
It turns JSON-encoded or binary-encoded Avro data into a JSON object that can be processed using standard native App Connect transformation nodes. It works both with messages that carry schema IDs in message headers and with messages that embed schema IDs in the payload.
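As background on the payload case: Avro-serialized Kafka messages that embed the schema ID commonly use a wire format of a single magic byte followed by a four-byte big-endian schema ID, then the Avro-encoded data (this is the convention used by Confluent-compatible registries; whether this compute node expects exactly that layout is an assumption, not something stated in this README). A minimal sketch of extracting the schema ID from such a message:

```java
import java.nio.ByteBuffer;

public class AvroWireFormat {

    // Assumed framing: [0x00 magic byte][4-byte big-endian schema ID][Avro payload]
    public static int schemaId(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        byte magic = buf.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Unrecognized magic byte: " + magic);
        }
        // ByteBuffer reads big-endian by default, matching the wire format
        return buf.getInt();
    }

    public static void main(String[] args) {
        // magic byte, schema ID 42, then two payload bytes
        byte[] message = {0x0, 0x0, 0x0, 0x0, 0x2A, 0x10, 0x2};
        System.out.println(schemaId(message)); // prints 42
    }
}
```

The schema ID recovered this way is what a deserializer would use to fetch the matching schema from the registry before decoding the remaining bytes.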
## Contents
- [`AvroDeserialize.java`](./AvroDeserialize.java)
- implementation of the Java compute node
- [`sample-policy.policyxml`](./sample-policy.policyxml)
- example of a policy needed to configure the compute node with details of the schema registry to use

## Details
![msgflow screenshot](./docs/annotated-msgflow-screenshot.png)
| **input terminal** | **format** | **details** |
| ------------------ | ---------- | --------------------------------------------------------- |
| input              | BLOB       | serialized message data retrieved by a KafkaConsumer node |

| **output terminal** | **format** | **details** |
| ------------------- | ---------- | -------------------------------------------------------- |
| out                 | JSON       | JSON object deserialized using an Apache Avro schema     |
| alt                 | BLOB       | messages that could not be deserialized*                 |

### Possible reasons for messages not being de-serializable
- Schema has been deleted from the Schema Registry since the message was produced to the Kafka topic
- Schema Registry is not currently available
- Invalid schema registry credentials provided in the [config policy](#configuration)

## Dependencies
### Configuration
The compute node has a run-time dependency on a policy for providing the configuration information about the Avro schema registry to use.
A [sample policy is provided](./sample-policy.policyxml) and needs to be deployed with any message flows using this compute node.
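For orientation, an App Connect Enterprise user-defined policy is an XML file along the lines of the sketch below. The property names shown here (`endpoint`, `username`, `password`) are invented placeholders for illustration only; consult the shipped [sample-policy.policyxml](./sample-policy.policyxml) for the property names the compute node actually reads.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<policies>
  <policy policyType="UserDefined" policyName="AvroSchemaRegistry" template="UserDefined">
    <!-- placeholder properties; the real names are in sample-policy.policyxml -->
    <endpoint>https://my-schema-registry.example.com</endpoint>
    <username>registry-user</username>
    <password>registry-password</password>
  </policy>
</policies>
```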
### Jars
The compute node implementation has compile-time and run-time dependencies on the Avro, slf4j, and Jackson JSON jars. A [helper script](./download-dependencies.sh) is provided to download these jars from Maven Central.
See the IBM App Connect Enterprise documentation on [Adding Java code dependencies](https://www.ibm.com/docs/en/app-connect/12.0?topic=java-adding-code-dependencies) for guidance on how to add these jars to your App Connect Enterprise server.
### Schema Registry
The compute node implementation is based on schemas from a schema registry, such as the registry that is included with [IBM Event Streams](https://ibm.github.io/event-automation/es/) or run as [a stand-alone open source registry](https://www.apicur.io/registry/).
## Background
https://dalelane.co.uk/blog/?p=5228