# awesome-ml-monitoring

A curated list of awesome open source tools and commercial products for monitoring data quality, monitoring model performance, and profiling data 🚀

* [Aporia](https://www.aporia.com/): Observability with customized monitoring and explainability for ML models.
* [Arize](https://github.com/Arize-ai/client_python): An end-to-end ML observability and model monitoring platform.
* [DataProfiler](https://github.com/capitalone/DataProfiler): A Python library designed to make data analysis, monitoring and sensitive data detection easy.
* [Datatile](https://github.com/polyaxon/datatile): A library for managing, summarizing, and visualizing data.
* [Deepchecks](https://github.com/deepchecks/deepchecks): A Python package with test suites for comprehensively validating your machine learning models and data with minimal effort.
* [Evidently](https://github.com/evidentlyai/evidently): Interactive reports to analyze ML models during validation or production monitoring.
* [Fiddler](https://www.fiddler.ai/): Monitor, explain, and analyze your AI in production.
* [Great Expectations](https://github.com/great-expectations/great_expectations): Helps data teams eliminate pipeline debt through data testing, documentation, and profiling.
* [Manifold](https://github.com/uber/manifold): A model-agnostic visual debugging tool for machine learning.
* [Netron](https://github.com/lutzroeder/netron): Visualizer for neural network, deep learning, and machine learning models.
* [Pandas Profiling](https://github.com/pandas-profiling/pandas-profiling): Extends the pandas DataFrame with df.profile_report() for quick data analysis.
* [Pandera](https://github.com/pandera-dev/pandera): A lightweight, flexible, and expressive data validation library for dataframes.
* [Soda Core](https://github.com/sodadata/soda-core): Data profiling, testing, and monitoring for SQL accessible data.
* [Superwise](https://www.superwise.ai): Fully automated, enterprise-grade model observability in a self-service SaaS platform.
* [Whylogs](https://github.com/whylabs/whylogs): The open source standard for data logging. Enables ML monitoring and observability.
* [ydata-quality](https://github.com/ydataai/ydata-quality): Data Quality assessment with one line of code.
* [Yellowbrick](https://github.com/DistrictDataLabs/yellowbrick): Visual analysis and diagnostic tools to facilitate machine learning model selection.
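Several of the tools above (Great Expectations, Pandera, Deepchecks, Soda Core) share one core idea: declare per-column expectations and collect every failure instead of crashing on the first bad record. A minimal, dependency-free sketch of that idea — `check_schema` and the example schema are illustrative, not any listed tool's actual API:

```python
def check_schema(rows, schema):
    """Validate records against {column: (type, predicate)} expectations.
    Returns a list of human-readable failures; an empty list means pass."""
    failures = []
    for i, row in enumerate(rows):
        for col, (typ, pred) in schema.items():
            if col not in row:
                failures.append(f"row {i}: missing column '{col}'")
            elif not isinstance(row[col], typ):
                failures.append(
                    f"row {i}: '{col}' has type "
                    f"{type(row[col]).__name__}, expected {typ.__name__}"
                )
            elif pred is not None and not pred(row[col]):
                failures.append(f"row {i}: '{col}'={row[col]!r} failed check")
    return failures

# Hypothetical schema: price must be a non-negative float, sku a string.
schema = {"price": (float, lambda v: v >= 0), "sku": (str, None)}
```

Real validation libraries add typed schemas, statistical checks, and HTML reports on top, but the pass/fail-with-reasons contract is the same.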
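Profilers such as DataProfiler, Pandas Profiling, Datatile, and ydata-quality automate per-column summaries along the lines of the toy version below (stdlib only; the `profile` function is an illustrative sketch, not any listed library's API):

```python
from statistics import mean, stdev

def profile(rows):
    """Per-column summary: count, missing, distinct, plus numeric stats.
    rows: list of dicts, one per record; None marks a missing value."""
    columns = {k for row in rows for k in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        present = [v for v in values if v is not None]
        numeric = [v for v in present if isinstance(v, (int, float))]
        stats = {
            "count": len(values),
            "missing": len(values) - len(present),
            "distinct": len(set(present)),
        }
        # Only emit numeric stats for fully numeric columns.
        if len(numeric) >= 2 and len(numeric) == len(present):
            stats.update(min=min(numeric), max=max(numeric),
                         mean=mean(numeric), stdev=stdev(numeric))
        report[col] = stats
    return report
```

The real tools go much further (type inference, correlations, sensitive-data detection, rendered reports), but a dict-of-stats like this is the shape of what they compute.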
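Drift monitors such as Evidently and Whylogs compare a production sample against a training-time baseline. One widely used metric for this is the Population Stability Index (PSI); below is a minimal stdlib sketch (the `psi` helper is illustrative, not any listed tool's API), using the common rule of thumb that PSI under 0.1 indicates no significant shift:

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between two numeric samples.
    Rule of thumb: < 0.1 ~ no significant drift, > 0.25 ~ major drift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # guard against a constant baseline

    def hist(sample):
        counts = [0] * bins
        for x in sample:
            i = min(max(int((x - lo) / width), 0), bins - 1)  # clip outliers
            counts[i] += 1
        # Floor each share at a tiny value so log() is defined for empty bins.
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = hist(expected), hist(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

Production monitors wrap checks like this with scheduling, per-feature dashboards, and alerting; the underlying comparison of binned distributions is the same.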