https://github.com/hmcts/ccd-definition-store-api
# ccd-case-definition-store-api
[API documentation (Swagger)](https://hmcts.github.io/cnp-api-docs/swagger.html?url=https://hmcts.github.io/cnp-api-docs/specs/ccd-definition-store-api.json)
[License: MIT](https://opensource.org/licenses/MIT)
[Build status (Travis CI)](https://travis-ci.org/hmcts/ccd-definition-store-api)
[Docker Hub](https://hub.docker.com/r/hmcts/ccd-definition-store-api)
[Coverage (Codecov)](https://codecov.io/gh/hmcts/ccd-definition-store-api)
[Code quality (Codacy)](https://www.codacy.com/app/adr1ancho/ccd-definition-store-api?utm_source=github.com&utm_medium=referral&utm_content=hmcts/ccd-definition-store-api&utm_campaign=Badge_Grade)
[Coverage (Codacy)](https://www.codacy.com/app/adr1ancho/ccd-definition-store-api?utm_source=github.com&utm_medium=referral&utm_content=hmcts/ccd-definition-store-api&utm_campaign=Badge_Coverage)
[Known vulnerabilities (Snyk)](https://snyk.io/test/github/hmcts/ccd-definition-store-api)
Validation and persistence of definitions for field types, jurisdictions, case types and associated display elements.
## Overview
Definitions are imported as Excel spreadsheets, which are parsed, persisted, and then exposed as JSON through a REST API.
Spring Boot and Spring Data are used to persist the data in a PostgreSQL database. The database schema is created and maintained by Flyway change sets applied during application startup.
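To illustrate how this wiring typically looks, the sketch below uses standard Spring Boot and Flyway property keys; the values (host, port, database name, migration location) are placeholders for a local setup, not the project's actual configuration:

```yaml
# Hypothetical local configuration sketch — property names are standard
# Spring Boot / Flyway keys; all values here are placeholders.
spring:
  datasource:
    url: jdbc:postgresql://localhost:5451/ccd_definition   # placeholder host/port/db
    username: ${DEFINITION_STORE_DB_USERNAME}
    password: ${DEFINITION_STORE_DB_PASSWORD}
  flyway:
    enabled: true                         # change sets applied at application startup
    locations: classpath:db/migration     # conventional Flyway script location
```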
Moreover, if the feature is enabled, the Elasticsearch cluster is initialised when a definition file is imported. For each case type, an index, an alias, and a mapping are created in Elasticsearch. If `failOnImport` is true, any Elasticsearch initialisation error will prevent the import from succeeding; if false, Elasticsearch errors are simply ignored.

## Getting started
### Prerequisites
- [Open JDK 21](https://openjdk.java.net/)
- [Docker](https://www.docker.com)

#### Environment variables
The following environment variables are required:
| Name | Default | Description |
|------|-----------------------------------------------------------------------------|-------------|
| DEFINITION_STORE_DB_USERNAME | - | Username for database |
| DEFINITION_STORE_DB_PASSWORD | - | Password for database |
| DEFINITION_STORE_DB_USE_SSL | - | Set to `true` to enable SSL; `false` is recommended for local environments. |
| DEFINITION_STORE_IDAM_KEY | - | Definition store's IDAM S2S micro-service secret key. This must match the IDAM instance it's being run against. |
| DEFINITION_STORE_S2S_AUTHORISED_SERVICES | ccd_data,ccd_gw,ccd_admin,jui_webapp,pui_webapp,aac_manage_case_assignment,xui_webapp | Authorised micro-service names for S2S calls |
| IDAM_USER_URL | - | Base URL for IdAM's User API service (idam-app). `http://localhost:4501` for the dockerised local instance or tunneled `dev` instance. |
| IDAM_S2S_URL | - | Base URL for IdAM's S2S API service (service-auth-provider). `http://localhost:4502` for the dockerised local instance or tunneled `dev` instance. |
| USER_PROFILE_HOST | - | Base URL for the User Profile service. `http://localhost:4453` for the dockerised local instance. |
| AZURE_APPLICATIONINSIGHTS_INSTRUMENTATIONKEY | - | Secret for Microsoft Application Insights logging; can be a dummy string locally |

### Building
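For a quick local start, the required variables from the table above can be exported in the shell before building or running. Every value below is a placeholder (the IDAM key in particular must match the IdAM instance you run against):

```shell
# Placeholder values for a local developer setup — adjust to your environment.
export DEFINITION_STORE_DB_USERNAME=ccd             # placeholder
export DEFINITION_STORE_DB_PASSWORD=ccd             # placeholder
export DEFINITION_STORE_DB_USE_SSL=false            # SSL off for local runs
export DEFINITION_STORE_IDAM_KEY=AAAAAAAAAAAAAAAA   # must match your IdAM instance
export IDAM_USER_URL=http://localhost:4501          # dockerised local IdAM
export IDAM_S2S_URL=http://localhost:4502           # dockerised local S2S
export USER_PROFILE_HOST=http://localhost:4453      # dockerised User Profile service
export AZURE_APPLICATIONINSIGHTS_INSTRUMENTATIONKEY=dummy   # dummy string is fine locally
```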
The project uses [Gradle wrapper](https://docs.gradle.org/current/userguide/gradle_wrapper.html).
This project uses [TestContainers](https://www.testcontainers.org/usage/database_containers.html#jdbc-url) for the database testing support.
Docker must be installed on the machine you are running tests on.

To build the project, execute the following command:
```bash
./gradlew clean build
```

### Running
If you want your code to become available to other Docker projects (e.g. for local environment testing), you need to build the image:
```bash
docker-compose build
```

The above will build both the application and database images.
If you want to build only one of them, just specify the name assigned in the Docker Compose file, e.g.:

```bash
docker-compose build ccd-definition-store-api
```

When the project has been packaged in the `target/` directory, you can run it by executing the following command:

```bash
docker-compose up
```

As a result, the following containers will be created and started:
- Database exposing port `5451`
- API exposing port `4451`

#### Handling the database
The database is initialised the first time you run `docker-compose up`, by executing all scripts from the `database` directory.
You don't need to migrate the database manually, since migrations run every time `docker-compose up` is executed.
You can connect to the database at `localhost:5451` with the username and password set in the environment variables.
## Modules
The application is structured as a multi-module project. The modules are:
### repository
Data access layer.
### domain
Domain logic.
### rest-api
Secured RESTful API giving access to part of the domain logic.
### excel-importer
Secured endpoint and specific logic for importing case definition as an Excel spreadsheet.
### application
Spring application entry point and configuration.
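In Gradle terms, a multi-module layout like this is typically declared in `settings.gradle`. The sketch below simply mirrors the module list above and is illustrative only, not the project's actual file:

```groovy
// Illustrative sketch — module names mirror the list above;
// the real settings.gradle may differ.
rootProject.name = 'ccd-definition-store-api'

include 'repository'      // data access layer
include 'domain'          // domain logic
include 'rest-api'        // secured RESTful API
include 'excel-importer'  // Excel definition import
include 'application'     // Spring entry point and configuration
```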
### Functional Tests
The functional tests are located in the `aat` folder. The tests are written using the
befta-fw library. To find out more about the BEFTA Framework, see the repository and its README [here](https://github.com/hmcts/befta-fw).

To run all the functional tests:

```bash
./gradlew functional
```
##### Some Functional Tests

To run both F-105 and F-110:

```bash
./gradlew functional -P tags="@F-105 or @F-110"
```

To run only S-110.1:

```bash
./gradlew functional -P tags="@S-110.1"
```
## LICENSE
This project is licensed under the MIT License - see the [LICENSE](LICENSE.md) file for details.