Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
- Host: GitHub
- URL: https://github.com/luke100000/immersiveapi
- Owner: Luke100000
- Created: 2023-08-10T13:21:58.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2024-09-16T10:41:33.000Z (about 2 months ago)
- Last Synced: 2024-09-16T12:20:21.068Z (about 2 months ago)
- Language: Python
- Size: 3.47 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# ImmersiveAPI
A diverse set of API endpoints for various projects, avoiding the overhead of creating new repos, deployments, etc.

## Structure
* `modules` - Each directory represents a module and contains at least a `.py` file with the same name, containing the initializer.
* `common` - Shared code between modules
* `cache` - Persistent cache for data, each module should have its own subdirectory
* `temp` - Temporary files, each module should clean up after the task is done
* `data` - Static data
* `config.toml` / `default_config.toml` - Used to toggle modules and pass additional config

## Module
Each module is a group of **self-contained, async, thread-safe** endpoints.
```py
from main import Configurator


def init(configurator: Configurator):
    configurator.register("Name", "Description")

    @configurator.post("/v1/your_module/your_endpoint")
    async def your_endpoint():
        return {"hello": "world"}
```

## Heavy lifting
For heavy background work, use the worker process pool.
Either use the primary executor, shared among all endpoints:
```py
from common.worker import get_primary_executor


def generate_text():
    return "Hello, world!"


async def your_endpoint():
    result = await get_primary_executor().submit(
        0,  # priority
        generate_text,
        # args...
        # kwargs...
    )
    return result
```

Or create a private executor for your module:
```py
from common.worker import Executor

executor = Executor(1)
```

## Not process-safe
Do not launch the server with multiple workers: not all operations are process-safe, and the ML endpoints in particular would blow up memory usage. Use (background) workers instead.
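To illustrate the `submit(priority, fn, ...)` pattern used above, here is a minimal, hypothetical sketch of a priority-aware executor. The class name and `submit` signature follow the README's usage; everything else (the dispatcher thread, the heap-backed queue, and the use of a thread pool instead of the process pool the real `common.worker` presumably uses) is illustrative only:

```python
import asyncio
import heapq
import itertools
import threading
from concurrent.futures import ThreadPoolExecutor


class Executor:
    """Sketch of a priority-aware executor: lower numbers run first.

    A dispatcher thread drains a heap-backed queue into a worker pool.
    Threads keep the sketch self-contained; the real module presumably
    uses a process pool for CPU-heavy (e.g. ML) work.
    """

    def __init__(self, workers: int):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._queue = []  # heap of (priority, seq, job)
        self._seq = itertools.count()  # tie-breaker for equal priorities
        self._cond = threading.Condition()
        threading.Thread(target=self._dispatch, daemon=True).start()

    async def submit(self, priority, fn, *args, **kwargs):
        loop = asyncio.get_running_loop()
        future = loop.create_future()
        with self._cond:
            job = (fn, args, kwargs, loop, future)
            heapq.heappush(self._queue, (priority, next(self._seq), job))
            self._cond.notify()
        return await future

    def _dispatch(self):
        while True:
            with self._cond:
                while not self._queue:
                    self._cond.wait()
                _, _, (fn, args, kwargs, loop, future) = heapq.heappop(self._queue)

            def run(fn=fn, args=args, kwargs=kwargs, loop=loop, future=future):
                try:
                    loop.call_soon_threadsafe(future.set_result, fn(*args, **kwargs))
                except Exception as exc:  # hand errors back to the awaiting coroutine
                    loop.call_soon_threadsafe(future.set_exception, exc)

            self._pool.submit(run)


def generate_text():
    return "Hello, world!"


async def main():
    executor = Executor(1)
    return await executor.submit(0, generate_text)


if __name__ == "__main__":
    print(asyncio.run(main()))
```

The key design point is that `submit` stays awaitable: the worker hands its result back to the event loop via `call_soon_threadsafe`, so endpoints never block the loop while heavy work runs.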