Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jimbrig/lossrunAnalyzer
R Package and Shiny App to Analyze Insurance Lossruns
- Host: GitHub
- URL: https://github.com/jimbrig/lossrunAnalyzer
- Owner: jimbrig
- Archived: true
- Created: 2020-01-10T20:01:18.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2020-01-10T20:52:20.000Z (almost 5 years ago)
- Last Synced: 2024-08-13T07:14:15.411Z (4 months ago)
- Topics: actuarial, data-analysis, data-mining, data-science, insurance, r, record-linkage, risk-management, shiny
- Language: R
- Size: 11.7 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.Rmd
Awesome Lists containing this project
- jimsghstars - jimbrig/lossrunAnalyzer - R Package and Shiny App to Analyze Insurance Lossruns (R)
README
---
output: github_document
---

```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
fig.path = "man/figures/README-",
out.width = "100%"
)
```

# lossrunAnalyzer - Analyzing Insurance Claims Data
The goal of **lossrunAnalyzer** is to help actuaries quickly analyze,
diagnose, and summarize lossruns containing individual claims data for
property casualty insurance.

## Badges
[![Lifecycle: experimental](https://img.shields.io/badge/lifecycle-experimental-orange.svg)](https://www.tidyverse.org/lifecycle/#experimental)
[![Project Status: WIP](https://www.repostatus.org/badges/latest/wip.svg)](http://www.repostatus.org/#wip)

## Installation
You can install lossrunAnalyzer from [GitHub](https://github.com/) with:
``` r
# install.packages("devtools")
devtools::install_github("jimbrig/lossrunAnalyzer")
```

## Roadmap
The end-goal of **lossrunAnalyzer** is to provide support for the following:
- Initial Reasonability Checks:
+ Unique claim ID
+ Paid + Case = Incurred
+ Totals = Sum of Splits
+ Report Date >= Loss Date
+ Field Consistency (e.g., States, Status)
- Possible Duplicate Detection
- Occurrence Grouping
- Adding "Working" Fields
+ Retentions / Limits / Deductibles
+ Various "Limited" Amounts
+ ALAE Treatments
+ Years (Policy, Accident, Report, Fiscal, Calendar)
+ Lags (Report, Close, Tenure)
+ Max IBNRs under various scenarios
+ Legal, Lost-Time, Indemnity Support
- Utilizing Lookup / Support Tables
- Record Linkage to Reduce Fuzziness
- Merging Lossruns Across Multiple Evaluations
- Comparing Lossruns to Priors and Addressing Various KPIs / Diagnostic Checks
- Anomaly / Outlier Detection
- Automating Development Comments Explaining Why Values Changed
- Summarizing Data into Triangles
- Performing an AvE (Actual vs. Expected) Analysis Summary
- Checking for Dropped / Missing Claims or New Claims with Old Dates
- Tie Out to Exposures
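To make the "Initial Reasonability Checks" item concrete, here is a minimal sketch in base R. The data frame and its column names (`claim_id`, `paid`, `case_reserve`, `incurred`, `loss_date`, `report_date`) are hypothetical stand-ins, not the package's actual interface:

``` r
# Sketch of the initial reasonability checks on a lossrun data frame.
# All column names are hypothetical.
check_lossrun <- function(lossrun) {
  list(
    # Every claim ID should appear exactly once
    unique_claim_id = !any(duplicated(lossrun$claim_id)),
    # Paid + Case should reconcile to Incurred (to the cent)
    paid_plus_case  = all(abs(lossrun$paid + lossrun$case_reserve -
                                lossrun$incurred) < 0.005),
    # A claim cannot be reported before the loss occurred
    report_ge_loss  = all(lossrun$report_date >= lossrun$loss_date)
  )
}
```

Each element of the returned list is a single `TRUE`/`FALSE`, so a quick `all(unlist(check_lossrun(df)))` flags whether any check failed.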
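The "Adding Working Fields" item (years and lags) could look roughly like the following, again with hypothetical column names and a `close_date` that is `NA` for open claims:

``` r
# Sketch of derived "working" fields: accident/report years and
# report/close lags in days. Column names are hypothetical.
add_working_fields <- function(lossrun) {
  # Accident and report years taken from the date fields
  lossrun$accident_year <- as.integer(format(lossrun$loss_date, "%Y"))
  lossrun$report_year   <- as.integer(format(lossrun$report_date, "%Y"))
  # Lags in days; close_lag remains NA for open claims
  lossrun$report_lag <- as.numeric(lossrun$report_date - lossrun$loss_date)
  lossrun$close_lag  <- as.numeric(lossrun$close_date - lossrun$report_date)
  lossrun
}
```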
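"Summarizing Data into Triangles" can be sketched with base R's `xtabs()`, assuming the lossrun has already been stacked in long format with one row per claim per evaluation (the `accident_year`, `dev_age`, and `incurred` columns are hypothetical):

``` r
# Sketch: roll a long-format lossrun up into a development triangle.
# Rows = accident year, columns = development age (months),
# cells = total incurred at that evaluation.
build_triangle <- function(lossrun) {
  xtabs(incurred ~ accident_year + dev_age, data = lossrun)
}
```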