# Edge-to-Core Data Pipelines for AI/ML Demo

The Edge-to-Core Data Pipelines for AI/ML solution pattern provides an architecture for scenarios in which edge devices generate image data that must be collected, processed, and stored at the edge before being used to train AI/ML models in the core data center or cloud. A Camel Quarkus component in the demo combines MQTT and HTTP clients (such as IoT devices, cellphones, and third-party clients) with an AI/ML engine to obtain image detection results.
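The exact endpoints and payload formats are defined by the demo services; as a purely hypothetical illustration (the route host and `/detect` path below are placeholders, not the demo's actual API), an HTTP client could submit an image for detection like this:

```sh
# Hypothetical example only: POST a captured image to the edge inference service.
# Replace the placeholder host and path with the route exposed by the demo
# (for example, check `oc get routes` in the demo's edge namespace).
curl -X POST \
  -H "Content-Type: image/jpeg" \
  --data-binary @sample.jpg \
  https://<edge-inference-route>/detect
```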

### Prerequisites

You will require:

- An OpenShift Container Platform cluster running version 4.12 or above with Cluster Admin access.

You can obtain one by ordering from [Red Hat Demo Platform](https://demo.redhat.com/catalog?search=4.12) or by deploying the trial version available at [Try Red Hat OpenShift](https://www.redhat.com/en/technologies/cloud-computing/openshift/try-it).

- Docker (or Podman) installed and running.
In this demonstration, the Ansible playbook that configures the environment runs inside a Linux container image via Docker. Use a recent Docker version; see the [Docker Engine installation documentation](https://docs.docker.com/engine/installation/) for details. A quick verification sketch follows this list.
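Before installing, you can optionally verify the client tooling; the cluster version itself is visible once you are logged in (install step 5 below):

```sh
# Confirm the oc client and container engine are available locally
oc version --client
docker version        # or: podman version

# After logging in to the cluster (install step 5), confirm it reports 4.12 or later
oc get clusterversion
```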

## Install the demo

1. Clone [this](https://github.com/hguerrero/edge-to-cloud-data-pipelines-demo) GitHub repository:

```sh
git clone https://github.com/hguerrero/edge-to-cloud-data-pipelines-demo.git
```

2. Change to the `ansible` directory.

```sh
cd edge-to-cloud-data-pipelines-demo/ansible
```

3. Set the cluster admin user name (replace the placeholder with your own value).

```sh
OCP_USERNAME=<cluster-admin-username>
```

4. Configure the `KUBECONFIG` file to use. Pointing it at a local `kube-demo` file keeps the demo credentials separate from your default kubeconfig and lets the containerized playbook mount them later.

```sh
export KUBECONFIG=./kube-demo
```

5. Log in to your OpenShift cluster with the `oc` command line.

```sh
oc login --username="${OCP_USERNAME}" --server=https://(...):6443 --insecure-skip-tls-verify=true
```

Replace the `--server` URL with your own cluster API endpoint.
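To confirm the login succeeded and that the credentials were written to the local `kube-demo` file, you can run:

```sh
# Show the logged-in user and the API server recorded in ./kube-demo
oc whoami
oc whoami --show-server
```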

6. Run the playbook using one of the following options:

   1. With Ansible (if you have it installed and configured):

      ```sh
      ansible-playbook -i inventory/openshift.yaml -e="ocp_username=${OCP_USERNAME}" ./install.yaml
      ```

   2. With Docker:

      ```sh
      docker run -i -t --rm --entrypoint /usr/local/bin/ansible-playbook \
        -v $PWD:/runner \
        -v $PWD/kube-demo:/home/runner/.kube/config \
        quay.io/agnosticd/ee-multicloud:v0.0.11 \
        -e="ocp_username=${OCP_USERNAME}" \
        ./install.yaml
      ```

   3. With Podman:

      ```sh
      podman run -i -t --rm --entrypoint /usr/local/bin/ansible-playbook \
        -v $PWD:/runner \
        -v $PWD/kube-demo:/home/runner/.kube/config \
        quay.io/agnosticd/ee-multicloud:v0.0.11 \
        -e="ocp_username=${OCP_USERNAME}" \
        ./install.yaml
      ```
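Once the playbook finishes, a quick sanity check is to list the projects it created and confirm their pods are running (the exact project names depend on the demo configuration):

```sh
# List projects created by the playbook
oc get projects

# Inspect the pods in one of the demo projects (replace the placeholder namespace)
oc get pods -n <demo-namespace>
```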

## Run the Demo

### Explore the S3 buckets

To check the content of the buckets populated during the demo, you can use a simple file viewer like the [Filestash S3 Browser](https://www.filestash.app/s3-browser.html).

1. Expand the *Advanced* section.
2. Use the external S3 endpoint and the credentials from the NooBaa account.
3. Click on *Connect*.

You will be able to browse through the buckets and check the images sent by the edge applications.
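If you need to look up the external S3 endpoint and the NooBaa account credentials, one option is to query the cluster directly. The commands below assume the default OpenShift Data Foundation object names (an `s3` route and a `noobaa-admin` secret in the `openshift-storage` namespace); adjust them to match your installation:

```sh
# External S3 endpoint exposed by NooBaa
oc get route s3 -n openshift-storage -o jsonpath='https://{.spec.host}{"\n"}'

# Access and secret keys for the NooBaa admin account
oc get secret noobaa-admin -n openshift-storage -o jsonpath='{.data.AWS_ACCESS_KEY_ID}' | base64 -d && echo
oc get secret noobaa-admin -n openshift-storage -o jsonpath='{.data.AWS_SECRET_ACCESS_KEY}' | base64 -d && echo
```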