{"id":19746343,"url":"https://github.com/astrolabsoftware/fink-client","last_synced_at":"2025-04-30T08:30:57.070Z","repository":{"id":40394902,"uuid":"197523150","full_name":"astrolabsoftware/fink-client","owner":"astrolabsoftware","description":"Light-weight client to manipulate alerts from Fink","archived":false,"fork":false,"pushed_at":"2024-09-13T08:03:07.000Z","size":29520,"stargazers_count":9,"open_issues_count":16,"forks_count":8,"subscribers_count":4,"default_branch":"master","last_synced_at":"2024-10-13T17:16:59.040Z","etag":null,"topics":["apache-kafka","astronomy","client","dashboard"],"latest_commit_sha":null,"homepage":"https://fink-broker.readthedocs.io/en/latest/broker/services_summary/","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/astrolabsoftware.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-07-18T06:12:45.000Z","updated_at":"2024-09-20T18:54:58.000Z","dependencies_parsed_at":"2024-06-03T13:11:58.573Z","dependency_job_id":"c4048074-645b-4eb9-9a71-8baa06930c38","html_url":"https://github.com/astrolabsoftware/fink-client","commit_stats":{"total_commits":296,"total_committers":6,"mean_commits":"49.333333333333336","dds":"0.21959459459459463","last_synced_commit":"aa8d2818bafc800acd6e64718b9bff56c8cf7102"},"previous_names":[],"tags_count":35,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astrolabsoftware%2Ffink-client","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astrolabsoft
ware%2Ffink-client/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astrolabsoftware%2Ffink-client/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astrolabsoftware%2Ffink-client/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/astrolabsoftware","download_url":"https://codeload.github.com/astrolabsoftware/fink-client/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":224202819,"owners_count":17272807,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["apache-kafka","astronomy","client","dashboard"],"created_at":"2024-11-12T02:14:20.673Z","updated_at":"2024-11-12T02:14:21.167Z","avatar_url":"https://github.com/astrolabsoftware.png","language":"Jupyter Notebook","readme":"[![pypi](https://img.shields.io/pypi/v/fink-client.svg)](https://pypi.python.org/pypi/fink-client)\n[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=fink-client\u0026metric=alert_status)](https://sonarcloud.io/dashboard?id=fink-client)\n[![Maintainability 
Rating](https://sonarcloud.io/api/project_badges/measure?project=fink-client\u0026metric=sqale_rating)](https://sonarcloud.io/dashboard?id=fink-client)\n[![Sentinel](https://github.com/astrolabsoftware/fink-client/workflows/Sentinel/badge.svg)](https://github.com/astrolabsoftware/fink-client/actions?query=workflow%3ASentinel)\n[![codecov](https://codecov.io/gh/astrolabsoftware/fink-client/branch/master/graph/badge.svg)](https://codecov.io/gh/astrolabsoftware/fink-client)\n[![Documentation Status](https://readthedocs.org/projects/fink-broker/badge/?version=latest)](https://fink-broker.readthedocs.io/en/latest/?badge=latest)\n\n# Fink client\n\n`fink-client` is a lightweight package to manipulate catalogs and alerts issued by the [fink broker](https://github.com/astrolabsoftware/fink-broker) programmatically. It is used by two major Fink services: Livestream and Data Transfer.\n\n## Installation\n\n`fink_client` requires Python 3.9 or later.\n\n### Install with pip\n\n```bash\npip install fink-client --upgrade\n```\n\n### Use or develop in a controlled environment\n\nFor development, we recommend using a virtual environment:\n\n```bash\ngit clone https://github.com/astrolabsoftware/fink-client.git\ncd fink-client\npython -m venv .fc_env\nsource .fc_env/bin/activate\npip install -r requirements.txt\npip install .\n```\n\n## Registration\n\nTo connect to Fink and poll alerts, you first need credentials:\n\n1. Subscribe to one or more Fink streams by filling in this [form](https://forms.gle/2td4jysT4e9pkf889).\n2. Once the form is processed, we will send you your credentials. Register them on your machine by running:\n  ```\n  fink_client_register -username \u003cUSERNAME\u003e -group_id \u003cGROUP_ID\u003e ...\n  ```\n\n## Livestream usage\n\nOnce you have your credentials, you are ready to poll streams! 
You can easily access the documentation using `-h` or `--help`:\n\n```bash\nfink_consumer -h\nusage: fink_consumer [-h] [--display] [--display_statistics] [-limit LIMIT]\n                     [--available_topics] [--save] [-outdir OUTDIR]\n                     [-schema SCHEMA] [--dump_schema] [-start_at START_AT]\n\nKafka consumer to listen and archive Fink streams from the Livestream service\n\noptional arguments:\n  -h, --help            show this help message and exit\n  --display             If specified, print on screen information about\n                        incoming alert.\n  --display_statistics  If specified, print on screen information about queues,\n                        and exit.\n  -limit LIMIT          If specified, download only `limit` alerts. Default is\n                        None.\n  --available_topics    If specified, print on screen information about\n                        available topics.\n  --save                If specified, save alert data on disk (Avro). See also\n                        -outdir.\n  -outdir OUTDIR        Folder to store incoming alerts if --save is set. It\n                        must exist.\n  -schema SCHEMA        Avro schema to decode the incoming alerts. 
Default is\n                        None (version taken from each alert)\n  --dump_schema         If specified, save the schema on disk (json file)\n  -start_at START_AT    If specified, reset offsets to 0 (`earliest`) or empty\n                        queue (`latest`).\n```\n\nYou can also inspect an alert stored on disk:\n\n```bash\nfink_alert_viewer -h\nusage: fink_alert_viewer [-h] [-filename FILENAME]\n\nDisplay cutouts and lightcurve from a ZTF alert\n\noptional arguments:\n  -h, --help          show this help message and exit\n  -filename FILENAME  Path to an alert data file (avro format)\n```\n\nMore information at [docs/livestream](https://fink-broker.readthedocs.io/en/latest/services/livestream).\n\n## Data Transfer usage\n\nIf you requested data using the [Data Transfer service](https://fink-portal.org/download), you can easily poll your stream using:\n\n```bash\nusage: fink_datatransfer.py [-h] [-topic TOPIC] [-limit LIMIT] [-outdir OUTDIR] [-partitionby PARTITIONBY] [-batchsize BATCHSIZE] [-nconsumers NCONSUMERS]\n                            [-maxtimeout MAXTIMEOUT] [-number_partitions NUMBER_PARTITIONS] [--restart_from_beginning] [--verbose]\n\nKafka consumer to listen and archive Fink streams from the data transfer service\n\noptional arguments:\n  -h, --help            show this help message and exit\n  -topic TOPIC          Topic name for the stream that contains the data.\n  -limit LIMIT          If specified, download only `limit` alerts from the stream. Default is None, that is, download all alerts.\n  -outdir OUTDIR        Folder to store incoming alerts. It will be created if it does not exist.\n  -partitionby PARTITIONBY\n                        Partition data by `time` (year=YYYY/month=MM/day=DD), or `finkclass` (finkclass=CLASS), or `tnsclass` (tnsclass=CLASS). `classId` is\n                        also available for ELASTiCC data. Default is time.\n  -batchsize BATCHSIZE  Maximum number of alerts within the `maxtimeout` (see conf). 
Default is 1000 alerts.\n  -nconsumers NCONSUMERS\n                        Number of parallel consumers to use. Default (-1) is the number of logical CPUs in the system.\n  -maxtimeout MAXTIMEOUT\n                        Overwrite the default timeout (in seconds) from user configuration. Default is None.\n  -number_partitions NUMBER_PARTITIONS\n                        Number of partitions for the topic in the remote Kafka cluster. Do not touch unless you know what you are doing. Default is 10\n                        (Fink Kafka cluster)\n  --restart_from_beginning\n                        If specified, restart downloading from the 1st alert in the stream. Default is False.\n  --verbose             If specified, print on screen information about the consuming.\n```\n\nMore information at [docs/datatransfer](https://fink-broker.readthedocs.io/en/latest/services/data_transfer/).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fastrolabsoftware%2Ffink-client","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fastrolabsoftware%2Ffink-client","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fastrolabsoftware%2Ffink-client/lists"}