{"id":26716409,"url":"https://github.com/processing/processing-contributions","last_synced_at":"2025-04-14T01:25:10.759Z","repository":{"id":271546775,"uuid":"880326144","full_name":"processing/processing-contributions","owner":"processing","description":"This repo holds the list of user contributed libraries, tools, modes, and examples and the scripts to convert this list to the appropriate format for the PDE Contribution Manager and the website.","archived":false,"fork":false,"pushed_at":"2025-04-12T04:49:42.000Z","size":578,"stargazers_count":3,"open_issues_count":5,"forks_count":5,"subscribers_count":6,"default_branch":"main","last_synced_at":"2025-04-12T05:30:22.586Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/processing.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-10-29T14:23:23.000Z","updated_at":"2025-04-12T04:49:45.000Z","dependencies_parsed_at":null,"dependency_job_id":"9f89b0e4-bb4d-49c9-bc22-a98f516296ec","html_url":"https://github.com/processing/processing-contributions","commit_stats":null,"previous_names":["processing/processing-contributions"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/processing%2Fprocessing-contributions","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/processing%2Fprocessing-contributions/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/p
rocessing%2Fprocessing-contributions/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/processing%2Fprocessing-contributions/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/processing","download_url":"https://codeload.github.com/processing/processing-contributions/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248805825,"owners_count":21164395,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-03-27T15:27:04.585Z","updated_at":"2025-04-14T01:25:10.717Z","avatar_url":"https://github.com/processing.png","language":"Python","readme":"# Processing Contributions\n\nThis repository contains the list of Processing libraries, tools, modes, and examples contributed by the community. Contributions added here will appear in the Contribution Manager and on the Processing.org website.\n\n\u003e [!TIP]\n\u003e Ready to publish your contribution? 
Submit it through the [GitHub issue forms for new contributions](https://github.com/processing/processing-contributions/issues/new/choose).\n\n## Technical information\n\nAll contributions are stored in a contributions database file in YAML format, `contributions.yaml`.\nConsumers of this data are the Processing website and the Processing application.\n\n![Example contribution object showing the key-value pairs, as well as new fields like ‘status’ and ‘override’](https://github.com/user-attachments/assets/6005b31f-167a-435d-9087-d7f1b57a220e)\n\nWithin the `scripts` folder are scripts for parsing and validating the data from the \nproperties file. These are used by a GitHub Action to process new contributions and add them to the\ndatabase. The `issue_to_pr.yml` workflow is triggered by a new issue for registering a new contribution.\nIt will then retrieve the properties file provided in the issue, parse and validate it, and then, if valid,\nadd the new information to the `contributions.yaml` database file in a new pull request.\n\n### Data structure\nEach entry in `contributions.yaml` contains the fields found in the properties file:\n* The fields from the `library.properties` file are: `name`, `version`, `prettyVersion`, \n`minRevision`, `maxRevision`, `authors`, `url`, `type`, `categories`, `sentence`, `paragraph`. These\nfields will also be in the database, with the values taken from the library's properties text file. If\nany of these values should be overridden, please read below about the `override` field.\n\nAdditional fields are:\n* `id`: integer id. \n* `source`: url of the properties file, from the published library\n* `download`: url of the zip file containing the library\n* `status` - Possible values are \n   * `DEPRECATED` - Libraries that seem to be permanently down, or have been deprecated. \n   These are libraries that are commented out of `source.conf`. 
This is manually set.\n   * `BROKEN` - libraries whose properties file cannot be retrieved, but that we will still check. \n   These are libraries listed in `skipped.conf`.\n   * `VALID` - libraries that are valid and available\n* `override` - an object whose field values replace the corresponding existing field values. For example, libraries in the `broken.conf` file are outdated, and we want to cap the\n`maxRevision` to `228`. This cap can be applied by setting `override` to {`maxRevision`: `228`}.\n* `log` - Any notes of explanation, such as why a library was labeled `BROKEN`.\n* The following fields are included, but the data is not comprehensive; completing it would require pulling data from the archives. \n   * `previousVersions` - a list of previous `prettyVersion` values. This will be added whenever a \n   library is updated. To have complete data for this field will require some detective work into the archives.\n   * `dateAdded` - Date the library was added to contributions. This will be added whenever a new library is\n   added. To have complete data for this field will require some detective work into the archives.\n   * `lastUpdated` - Date the library was last updated in the repo. This will be added whenever a library is\n   updated. To have complete data for this field will require waiting for all libraries to be updated, or\n   will require some detective work into the archives.\n\n### Scripts\nThe `scripts` folder contains Python scripts for parsing, validating, and processing the database \nfile and properties files.\n\n* `add_new_contribution_to_yaml.py`: command-line script that is called only by the \n`issue_to_pr` GitHub workflow. 
It takes two arguments:\n  * contribution type - such as `library`, `examples`, `tool`, or `mode`.\n  * source url - the url for the properties file in the published library to be added as a new contribution.\n* `fetch_updates.py`: script that can be called from the command line to update a specified contribution (by id)\nor the full contributions database file. It updates entries by retrieving the content at the `source` url.\nIf the version has changed, it will overwrite the previous entry in the database file.\n* `parse_and_validate_properties_txt.py`: tools for parsing and validating properties text files. The `issue_to_pr`\nGitHub workflow calls this from the command line. If the data is valid, this will set an environment variable in \nthe workflow environment to the contribution object.\n* `to_contribs_txt.py`: processes the `contributions.yaml` database file to `pde/contribs.txt` for consumption\nby the contribution manager in the Processing application.\n* `to_sources_jsons.py`: processes the `contributions.yaml` database file to individual json files in the `sources` \nfolder, for consumption by the Processing website.\n* `utils.py`: utility functions used by multiple script files.\n\n\n### Outputs\n\nAt this time, the website requires a folder of json files, where each json file is a separate \ncontribution. 
These files are created from the database using the script `scripts/to_sources_jsons.py`.\n\nThe Processing application's contribution manager reads in a `contribs.txt` file.\nThis file is created from the database using the script `scripts/to_contribs_txt.py`.\n\n\n## Contributors\n\nThis repository was created by Claudine Chen ([@mingness](https://github.com/mingness)) as part of the \n2024 New Beginnings (pr05) Grant from the [Processing Foundation](https://github.com/processing), to simplify the\nworkflows for libraries, tools, and modes, mentored by Stef Tervelde ([@Stefterv](https://github.com/stefterv)).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprocessing%2Fprocessing-contributions","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fprocessing%2Fprocessing-contributions","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprocessing%2Fprocessing-contributions/lists"}