
Integrations

This page details the integration points between components.

Diagrams

Container diagram


Domain services

These services display or process information about a person that does not belong in the interventions domain.

| API | Responsibility |
| --- | --- |
| hmpps-probation-integration-api | Replacement for community-api, which is slow and bulky. Used for reading personal details, a person's responsible officers, and officer teams. Also used for writing activities (NSIs), appointments, appointment outcomes, and progress notifications. |
| community-api | Reading personal details, a person's responsible officers, and officer teams. Writing activities (NSIs), appointments, appointment outcomes, and progress notifications. The plan is to remove usage of community-api. |
| hmpps-assess-risk-and-needs | Reading risk information. Writing supplementary (redacted) risk information. |
| hmpps-prisoner-search | Reading prisoner information, including prison location and release dates. |
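As a sketch of how a caller might read personal details from one of these APIs, the snippet below builds (but does not send) an authorised request. The base URL, endpoint path, and CRN format are illustrative assumptions, not the API's real contract; the bearer token is assumed to come from hmpps-auth.

```python
from urllib.request import Request

# Hypothetical base URL; the real hosts and paths are defined by
# hmpps-probation-integration-api itself.
BASE_URL = "https://probation-integration.example.gov.uk"

def personal_details_request(crn: str, token: str) -> Request:
    """Build (but do not send) an authorised request to read personal details."""
    return Request(
        f"{BASE_URL}/case/{crn}/personal-details",  # illustrative path
        headers={
            "Authorization": f"Bearer {token}",  # token issued by hmpps-auth
            "Accept": "application/json",
        },
    )

req = personal_details_request("X123456", "dummy-token")
```

Building the request separately from sending it keeps the auth wiring easy to unit-test without network access.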

Platform services

For features shared across many services.

| API | Responsibility |
| --- | --- |
| hmpps-auth | Authenticating and authorising users given their roles and groups. Issuing JWT tokens. |
| token-verification-api | Validating issued tokens. |
| GOV.UK Notify | Sending emails. |
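To illustrate the shape of the JWTs hmpps-auth issues, this sketch decodes a token's base64url payload locally. The claim names are illustrative assumptions, not the exact claims hmpps-auth emits, and no signature is checked: signature validation is token-verification-api's job.

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the base64url payload of a JWT *without* verifying the signature.
    Signature checking belongs to token-verification-api; this only inspects claims."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def b64url(data: dict) -> str:
    """Encode a dict as an unpadded base64url JSON segment."""
    return base64.urlsafe_b64encode(json.dumps(data).encode()).rstrip(b"=").decode()

# A locally built sample token (header.payload.signature); the "authorities"
# claim name is an assumption for illustration.
sample = ".".join([b64url({"alg": "none"}), b64url({"authorities": ["ROLE_PROBATION"]}), "sig"])
claims = jwt_payload(sample)
```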

Scheduled data transfers

We need to transfer our data for further processing. Transfers run as Kubernetes CronJobs:

| Component | Responsibility |
| --- | --- |
| cronjob/data-extractor-analytics | Daily snapshot of raw data, transferred to the Analytical Platform. Used for data cleansing, exploratory analysis and further processing. |
| cronjob/generate-ndmis-performance-report | Daily snapshot of transformed data, transferred to the National Delius Management Information System (NDMIS). Used for business reporting. |
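A minimal sketch of what a daily snapshot job does, with a local directory standing in for the S3 landing bucket. The table name, column names, and file layout are illustrative assumptions; the real extract is defined by the CronJobs above.

```python
import csv
import datetime
import pathlib
import tempfile

def snapshot_table(rows: list[dict], landing_dir: pathlib.Path, table: str) -> pathlib.Path:
    """Write one table as a dated CSV file, mimicking the daily raw-data snapshot.
    `landing_dir` stands in for the Analytical Platform S3 landing bucket."""
    landing_dir.mkdir(parents=True, exist_ok=True)
    out = landing_dir / f"{table}-{datetime.date.today().isoformat()}.csv"
    with out.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
    return out

# Demo with hypothetical data in a throwaway directory.
out = snapshot_table([{"id": 1, "status": "sent"}], pathlib.Path(tempfile.mkdtemp()), "referral")
```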

Static and mapping data

These data sources are either indirectly used or embedded in our applications.

| Dataset | Responsibility |
| --- | --- |
| NSI to contract type mapping | Mapping between intervention contract type and nDelius NSI type identifiers. Maintained in the linked code. |
| Probation office data | List of probation offices, mapped to nDelius location identifiers. The UI application has a file copy. |
| Probation regions | List of probation service regions. The API database has a copy. |
| Police and crime commissioner (PCC) regions | List of the probation-specific PCC regions, used for Commissioned Rehabilitative Services (CRS). The API database has a copy. |
| 🔐 Interventions seed data | Repository used to populate contract, prime provider, subcontractor and intervention data. Used as-is. |
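The NSI-to-contract-type mapping amounts to a lookup table like the sketch below. Every code here is hypothetical: the real contract type and nDelius NSI type identifiers are maintained in the linked mapping code.

```python
# Illustrative only: real contract type and nDelius NSI type identifiers
# live in the linked mapping code, not here.
CONTRACT_TYPE_TO_NSI_TYPE = {
    "ACC": "CRS01",  # hypothetical codes
    "PWB": "CRS02",
}

def nsi_type_for(contract_type: str) -> str:
    """Look up the nDelius NSI type identifier for an intervention contract type."""
    try:
        return CONTRACT_TYPE_TO_NSI_TYPE[contract_type]
    except KeyError:
        raise ValueError(f"no NSI type mapped for contract type {contract_type!r}") from None
```

Failing loudly on an unmapped contract type is safer than writing an activity against the wrong nDelius NSI type.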

Data transformation pipeline

We need to transform our data for business intelligence dashboards.

Resource definitions:

| Component | Responsibility |
| --- | --- |
| Analytical Platform landing bucket policy | Injects the Analytical Platform landing bucket secret into our environment for use by jobs. Sets up permissions so our namespace can write to that bucket. |
| Analytical Platform landing bucket definition | Creates the landing bucket for our data. The aws_arn_for_put_permission is populated from the user_arn value in the secret created above. |
| Modernisation Platform environment | Defines our Modernisation Platform namespace and the GitHub team access privileges. |
| Modernisation Platform environment resources | Defines any resources in the environment. In a sandbox environment, a scheduled job destroys all undefined resources, so resources must be defined here to prevent deletion. |
| Modernisation Platform bucket and its policy | S3 bucket on the Modernisation Platform side that stores the transformed data powering the dashboards. The bucket policy allows the Analytical Platform Lambda to write into it. |
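For a sense of what the Modernisation Platform bucket policy expresses, here is a minimal sketch of an S3 bucket policy granting a Lambda role write access. The account ID, role name, and bucket name are all hypothetical placeholders; the real policy lives in the linked resource definitions.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAnalyticalPlatformLambdaWrite",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/analytical-platform-copy-lambda"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::modernisation-platform-dashboard-data/*"
    }
  ]
}
```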

Dashboard data propagation happens through:

| Step | Component | Responsibility |
| --- | --- | --- |
| 1 | generic-data-analytics-extractor helm chart | Defines a CronJob, added to the service deployment, that creates daily snapshots of the entire database in the Analytical Platform S3 landing bucket. |
| 2 | fact, dimension, data mart definitions | Defines the transformations into fact and dimension tables and data marts using dbt. |
| 3 | dbt transformation pipeline | Executes the defined transformations. |
| 4 | hook to copy to holding area | After a successful transformation, copies the data into a “holding area”, awaiting transfer. |
| 5 | copy to Modernisation Platform | Moves the contents of the “holding area” (on change) from the Analytical Platform to the Modernisation Platform, via a Lambda function triggered on an on-change event. |
| 6 | parse schemas with AWS Glue crawlers | Creates and maintains the Modernisation Platform’s data sources, making transformed data usable by other tools in the Modernisation Platform. |
| 7 | mapped data source | Defines the schemas of the transformed data (in the Modernisation Platform), directly exposing them to AWS Athena and QuickSight. |
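The fact/dimension split performed by the dbt models (steps 2 and 3) can be sketched in plain Python. The real transformations are dbt SQL models, and the column names below are invented for illustration: raw referral rows are split into a dimension table (one row per intervention) and a fact table (one row per referral) joined by a surrogate key.

```python
def build_dimension_and_fact(referrals: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split raw referral rows into a dimension table and a fact table.
    Column names are illustrative, not the real dbt model schema."""
    dim: list[dict] = []
    seen: dict[str, int] = {}
    for row in referrals:
        name = row["intervention"]
        if name not in seen:
            seen[name] = len(seen) + 1  # surrogate key for the dimension row
            dim.append({"intervention_id": seen[name], "intervention_name": name})
    fact = [
        {"referral_id": r["referral_id"], "intervention_id": seen[r["intervention"]]}
        for r in referrals
    ]
    return dim, fact
```

Keeping the surrogate key assignment deterministic makes the snapshot-to-snapshot diffs in the holding area small and easy to audit.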

To access the data:

| Step | Component | Action |
| --- | --- | --- |
| 1 | map datasets from the data source | Manual step in the Modernisation Platform. Each data source should be a “direct query” unless performance problems arise. |
| 2 | create analyses in QuickSight | Manual step in the Modernisation Platform. Each analysis should be specific to the business problem. |
This page was last reviewed on 6 September 2024. It needs to be reviewed again on 6 March 2025 by the page owner #interventions-dev.