Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/hyperledger-labs/hlf-connector
Integrate with Hyperledger Fabric using REST and Kafka with Block and Chaincode Event emitter.
- Host: GitHub
- URL: https://github.com/hyperledger-labs/hlf-connector
- Owner: hyperledger-labs
- License: apache-2.0
- Created: 2022-02-11T16:19:25.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2024-06-02T10:25:08.000Z (8 months ago)
- Last Synced: 2024-06-03T06:56:42.849Z (8 months ago)
- Topics: blockchain, connector, hacktoberfest, hacktoberfest-accepted, hyperledger, hyperledger-fabric, java, kafka, rest-api
- Language: Java
- Homepage:
- Size: 382 KB
- Stars: 12
- Watchers: 6
- Forks: 25
- Open Issues: 13
Metadata Files:
- Readme: README.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Codeowners: CODEOWNERS
# Hyperledger Fabric REST Integration
## Description:-
This artifact provides a mechanism to invoke and query fabric chaincode using a REST-based API interface.
Additionally, it can invoke chaincode asynchronously and publish chaincode events to Kafka/Event-Hub topics.

## Key Features:-
1. Invoke Chaincode with REST.
2. Query Chaincode with REST.
3. Invoke Chaincode with Kafka/Event-Hub.
4. Publish chaincode events from multiple channels to Kafka/Event-Hub.

## Prerequisites:-
1. Fabric 2.x network.
2. Connection Profile YAML file.
3. Wallet (.id) file.
4. Java and Maven are installed.
5. (Optional) Kafka/Event-Hub configuration for invoking chaincode asynchronously.
6. (Optional) Kafka/Event-Hub configuration for publishing chaincode events.

## Running Locally:-
1. Download/clone the repository and build the project using ``mvn clean install``.
2. Create a folder named ``wallet`` in the root directory of the project.
3. If using the [fabric-getting-started](https://github.com/anandbanik/fabric-getting-started) script, note the path to the CA PEM file for Org1, usually located in the ``fabric-getting-started/test-network/organizations/peerOrganizations/org1.example.com/ca`` folder.
4. Open [EnrollAdmin.java](https://github.com/blockchain-wmt/hlf-connector/blob/main/src/test/java/hlf/java/rest/client/util/EnrollAdmin.java), set the ``pemFilePath`` variable to the path noted above, and run it. This creates ``admin.id`` in the wallet folder.
5. Open [RegisterUser](https://github.com/blockchain-wmt/hlf-connector/blob/main/src/test/java/hlf/java/rest/client/util/RegisterUser.java), set the ``pemFilePath`` variable to the path noted above, and run it. This creates ``clientUser.id`` in the wallet folder.
6. Add the ``connection-org1.yaml`` file, located at ``fabric-getting-started/test-network/organizations/peerOrganizations/org1.example.com``, to the wallet folder.
7. Make sure the peer URL and CA URL in ``connection-org1.yaml`` are reachable. If using [fabric-getting-started](https://github.com/anandbanik/fabric-getting-started), change the peer URL in ``connection-org1.yaml`` to ``peer0.org1.example.com:7051`` and the CA URL to ``ca-org1:7054``.
8. Run the ``hlf.java.rest.client.FabricClientBootstrap`` class, or run the built jar file.
9. You can also run it as a container using ``docker-compose up``. If the Fabric network is running locally, make sure the docker-compose.yml file is configured to use the correct Docker network:
```
networks:
default:
external:
name:
```

## Event-driven Design
### Asynchronous Integration to invoke chaincode
This component supports event-based architecture by consuming transactions through Kafka and Azure EventHub.
To configure it, add the following to the application.yml file.
```
kafka:
integration-points:
-
brokerHost:
groupId:
topic:
-
brokerHost:
groupId:
topic:
# For Azure EventHub
jaasConfig: org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb:///;SharedAccessKeyName=;SharedAccessKey=";
# For SOX compliant Kafka Clusters
ssl-enabled: true
security-protocol: SSL
ssl-keystore-location:
ssl-keystore-password:
ssl-truststore-location:
ssl-truststore-password:
ssl-key-password:
```
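Messages consumed from these integration points carry a JSON payload plus routing headers. As a minimal sketch (the header names come from this README; the channel, function, chaincode, and payload values are hypothetical), the producer side of such a message could be assembled like this:

```java
import java.util.Map;

public class InvokeMessageSketch {

    // The three headers the connector reads to route an asynchronous invoke.
    static Map<String, String> invokeHeaders(String channel, String function, String chaincode) {
        return Map.of(
                "channel_name", channel,
                "function_name", function,
                "chaincode_name", chaincode);
    }

    public static void main(String[] args) {
        Map<String, String> headers = invokeHeaders("mychannel", "CreateAsset", "asset-transfer");
        // JSON payload forwarded to the chaincode function as its argument.
        String payload = "{\"assetId\":\"asset-101\",\"owner\":\"alice\"}";
        System.out.println(headers + " -> " + payload);
    }
}
```

In a real producer these entries would be attached as Kafka record headers on the configured topic; the connector uses them to decide which channel, chaincode, and function to target.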
The component accepts a JSON payload and three headers to invoke the chaincode.
The header keys are:
```
1. channel_name
2. function_name
3. chaincode_name
```

### Capture Chaincode events:-
This component supports capturing chaincode events and publishing them to Kafka or Azure EventHub. This can be useful for integrating with an off-chain DB.
To configure it, add the following to the application.yml file.
```
fabric:
events:
enabled: true
chaincode: mychannel1,mychannel2 #Comma-separated list for listening to events from multiple channels
kafka:
event-listener:
brokerHost:
topic:
# For Azure EventHub
jaasConfig: org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb:///;SharedAccessKeyName=;SharedAccessKey=";
# For SOX compliant Kafka Clusters
ssl-enabled: true
security-protocol: SSL
ssl-keystore-location:
ssl-keystore-password:
ssl-truststore-location:
ssl-truststore-password:
ssl-key-password:
```
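For instance, a minimal listener for a single local broker might look like this (the broker host and topic name are hypothetical values, not defaults shipped with the connector):

```
fabric:
  events:
    enabled: true
    chaincode: mychannel1
kafka:
  event-listener:
    brokerHost: localhost:9092
    topic: chaincode-events
```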
The component will send the same JSON payload sent by the chaincode and add the following headers:
```
1. fabric_tx_id
2. event_name
3. channel_name
4. event_type (value: chaincode_event)
```

### Capture Block events:-
This component supports capturing block events and publishing them to Kafka or Azure EventHub. This can be useful for integrating with an off-chain DB where adding events to chaincode is not possible (for example, a Food-Trust anchor channel).
To configure it, add the following to the application.yml file.
```
fabric:
events:
enabled: true
block: mychannel1,mychannel2 #Comma-separated list for listening to events from multiple channels
kafka:
event-listener:
brokerHost:
topic:
# For Azure EventHub
jaasConfig: org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb:///;SharedAccessKeyName=;SharedAccessKey=";
# For SOX compliant Kafka Clusters
ssl-enabled: true
security-protocol: SSL
ssl-keystore-location:
ssl-keystore-password:
ssl-truststore-location:
ssl-truststore-password:
ssl-key-password:
```
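As with chaincode events, a minimal single-channel, single-broker setup might look like this (broker host and topic name are hypothetical):

```
fabric:
  events:
    enabled: true
    block: mychannel1
kafka:
  event-listener:
    brokerHost: localhost:9092
    topic: block-events
```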
The component will send the same JSON payload sent by the chaincode and add the following headers:
```
1. fabric_tx_id
2. channel_name
3. chaincode_name
4. function_name
5. event_type (value: block_event)
```
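Downstream, a consumer can use the ``event_type`` header to tell the two event families apart. A minimal dispatch sketch in plain Java (header names per the lists above; a real consumer would read these from Kafka record headers, and the sample values are hypothetical):

```java
import java.util.Map;

public class EventDispatchSketch {

    // Routes a message based on the event_type header added by the connector.
    static String route(Map<String, String> headers) {
        String type = headers.getOrDefault("event_type", "");
        switch (type) {
            case "chaincode_event":
                return "chaincode:" + headers.get("event_name");
            case "block_event":
                return "block:" + headers.get("function_name");
            default:
                return "unknown";
        }
    }

    public static void main(String[] args) {
        Map<String, String> chaincodeEvent = Map.of(
                "fabric_tx_id", "tx-1",
                "event_name", "AssetCreated",
                "channel_name", "mychannel1",
                "event_type", "chaincode_event");
        System.out.println(route(chaincodeEvent)); // chaincode:AssetCreated
    }
}
```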