# sl-experiment

A Python library that provides tools to acquire, manage, and preprocess scientific data in the Sun (NeuroAI) lab.

![PyPI - Version](https://img.shields.io/pypi/v/sl-experiment)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/sl-experiment)
[![uv](https://tinyurl.com/uvbadge)](https://github.com/astral-sh/uv)
[![Ruff](https://tinyurl.com/ruffbadge)](https://github.com/astral-sh/ruff)
![type-checked: mypy](https://img.shields.io/badge/type--checked-mypy-blue?style=flat-square&logo=python)
![PyPI - License](https://img.shields.io/pypi/l/sl-experiment)
![PyPI - Status](https://img.shields.io/pypi/status/sl-experiment)
![PyPI - Wheel](https://img.shields.io/pypi/wheel/sl-experiment)
___

## Detailed Description

This library functions as the central hub for collecting and preprocessing the data shared by all individual Sun lab
projects. To do so, it exposes the API for interfacing with the hardware making up the overall Mesoscope-VR
(Virtual Reality) system used in the lab and for working with the data collected via this hardware. Primarily, this
involves specializing various general-purpose libraries, released as part of the 'Ataraxis' science-automation
project, to work with the specific hardware implementations used in the lab.

This library is explicitly designed to work with the specific hardware and data handling strategies used in the Sun
lab and will likely not work in other contexts without extensive modification. It is made public to serve as a
real-world example of how to use 'Ataraxis' libraries to acquire and preprocess scientific data.

Currently, the Mesoscope-VR system consists of three major parts:
1. The [2P-Random-Access-Mesoscope (2P-RAM)](https://elifesciences.org/articles/14472), assembled by
   [Thor Labs](https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=10646) and controlled by
   [ScanImage](https://www.mbfbioscience.com/products/scanimage/) software. The Mesoscope control and data acquisition
   are performed by a dedicated computer referred to as the 'ScanImagePC' or 'Mesoscope PC.'
2. The [Unity game engine](https://unity.com/products/unity-engine) running the Virtual Reality game world used in all
   experiments to control the task environment and resolve the task logic. The virtual environment runs on the main
   data acquisition computer, referred to as the 'VRPC.'
3. The [microcontroller-powered](https://github.com/Sun-Lab-NBB/sl-micro-controllers) hardware that allows
   bidirectionally interfacing with the Virtual Reality world and collecting non-visual animal behavior data. This
   hardware, as well as the dedicated camera hardware used to record visual behavior data, is controlled through the
   'VRPC.'
___

## Table of Contents

- [Dependencies](#dependencies)
- [Installation](#installation)
- [System Assembly](#system-assembly)
- [Usage](#usage)
- [API Documentation](#api-documentation)
- [Recovering from Interruptions](#recovering-from-interruptions)
- [Versioning](#versioning)
- [Authors](#authors)
- [License](#license)
- [Acknowledgments](#acknowledgments)
___

## Dependencies

### Main Dependency
- ***Linux*** operating system. While the library may also work on Windows and macOS, it has been explicitly written
  for and tested on the mainline [6.11 kernel](https://kernelnewbies.org/Linux_6.11) and the Ubuntu 24.10 distribution
  of the GNU/Linux operating system.

### Software Dependencies
***Note!*** This list only includes external dependencies that are required to run the library, in addition to all
dependencies automatically installed from pip / conda as part of library installation. The dependencies below have to
be installed and configured on the **VRPC** before calling runtime commands via the command-line interface (CLI)
exposed by this library.

- [MQTT broker](https://mosquitto.org/). The broker should be running locally with the **default** IP (127.0.0.1) and
  Port (1883) configuration.
- [FFMPEG](https://www.ffmpeg.org/download.html).
  As a minimum, the version of FFMPEG should support the H265 and H264 codecs with hardware acceleration (Nvidia GPU).
  It is typically safe to use the latest available version.
- [MvImpactAcquire](https://assets-2.balluff.com/mvIMPACT_Acquire/) GenTL producer. This library is used with version
  **2.9.2**, which is freely distributed. Higher GenTL producer versions will likely work too, but they require
  purchasing a license.
- [Zaber Launcher](https://software.zaber.com/zaber-launcher/download). Use the latest available release.
- [Unity Game Engine](https://unity.com/products/unity-engine). Use the latest available release.
---

### Hardware Dependencies

**Note!** These dependencies only apply to the 'VRPC,' the main PC that runs the data acquisition and
preprocessing pipelines. Hardware dependencies for the ScanImagePC are determined by ThorLabs.

- [Nvidia GPU](https://www.nvidia.com/en-us/). This library uses GPU hardware acceleration to encode acquired video
  data. Any Nvidia GPU with hardware encoding chip(s) should work as expected. The library was tested with an RTX
  4090.
- A CPU with at least 12, preferably 16, physical cores. This library has been tested with the
  [AMD Ryzen 9 7950X CPU](https://www.amd.com/en/products/processors/desktops/ryzen/7000-series/amd-ryzen-9-7950x.html).
  It is recommended to use CPUs with 'full' cores, instead of modern Intel designs with 'e' and 'p' cores, for
  predictable performance of all library components.
- A 10-Gigabit capable motherboard or Ethernet adapter, such as the [X550-T2](https://shorturl.at/fLLe9). Primarily,
  this is required for the high-quality machine vision camera used to record videos of the animal’s face. We also use
  10-Gigabit lines for transferring the data between the PCs used in the data acquisition process and the destinations
  used for long-term data storage (see the [data management section](#data-structure-and-management)).
___

## Installation

### Source

Note: installation from source is ***highly discouraged*** for everyone who is not an active project developer.

1. Download this repository to your local machine using your preferred method, such as Git-cloning. Use one
   of the stable releases from [GitHub](https://github.com/Sun-Lab-NBB/sl-experiment/releases).
2. Unpack the downloaded zip archive and note the path to the binary wheel (`.whl`) file contained in the archive.
3. Run ```python -m pip install WHEEL_PATH```, replacing 'WHEEL_PATH' with the path to the wheel file, to install the
   wheel into the active python environment.

### pip
Use the following command to install the library using pip: ```pip install sl-experiment```.
___

## System Assembly

The Mesoscope-VR system consists of multiple interdependent components. We are constantly making minor changes to the
system to optimize its performance and facilitate novel experiments and projects carried out in the lab. Treat this
section as a general system composition guide, but consult our publications over this section for instructions on
building the specific system implementations used for various projects.

Physical assembly and mounting of ***all*** hardware components mentioned in the specific subsections below is
discussed in the [main Mesoscope-VR assembly section](#mesoscope-vr-assembly).

### Zaber Motors
All brain activity recordings with the mesoscope require the animal to be head-fixed. To orient head-fixed animals on
the Virtual Reality treadmill (running wheel) and promote task performance, we use two groups of motors controlled
through Zaber motor controllers.
The first group, the **HeadBar**, is used to position the animal’s head in the Z, Pitch, and Roll axes. Together with
the movement axes of the Mesoscope, this allows for the wide range of motions necessary to promote good animal running
behavior and brain activity data collection. The second group of motors, the **LickPort**, controls the position of
the water delivery port (and sensor) in the X, Y, and Z axes. This is used to ensure all animals have comfortable
access to the water delivery tube, regardless of their head position.

The current snapshot of the Zaber motor configurations used in the lab, alongside the motor parts list and electrical
wiring instructions, is available
[here](https://drive.google.com/drive/folders/1SL75KE3S2vuR9TTkxe6N4wvrYdK-Zmxn?usp=drive_link).

**Warning!** Zaber motors have to be configured correctly to work with this library. To (re)configure the motors to
work with the library, apply the setting snapshots from the link above via the
[Zaber Launcher](https://software.zaber.com/zaber-launcher/download) software. Make sure you read the instructions in
the 'Applying Zaber Configuration' document for the correct application procedure.

**Although this is highly discouraged, you can also edit the motor settings manually.** To configure the motors
to work with this library, you need to overwrite the non-volatile User Data of each motor device (controller) with
the data expected by this library. See the [API documentation](https://sl-experiment-api-docs.netlify.app/) for the
**ZaberSettings** class to learn more about the settings used by this library. See the source code of the
[zaber_bindings.py](/src/sl_experiment/zaber_bindings.py) module to learn how these settings are used during runtime.

### Behavior Cameras
To record the animal’s behavior, we use a group of three cameras. The **face_camera** is a high-end machine-vision
camera used to record the animal’s face at approximately 3-MegaPixel resolution.
The **left_camera** and **right_camera** are 1080P security cameras used to record the body of the animal. Only the
data recorded by the **face_camera** is currently used during data processing and analysis. We use custom
[ataraxis-video-system](https://github.com/Sun-Lab-NBB/ataraxis-video-system) bindings to interface with and record
the frames acquired by all cameras.

Specific information about the components used by the camera systems, as well as the snapshot of the configuration
parameters used by the **face_camera**, is available
[here](https://drive.google.com/drive/folders/1l9dLT2s1ysdA3lLpYfLT1gQlTXotq79l?usp=sharing).

### MicroControllers
To interface with all components of the Mesoscope-VR system **other** than the cameras and Zaber motors, we use
Teensy 4.1 microcontrollers running specialized
[ataraxis-micro-controller](https://github.com/Sun-Lab-NBB/ataraxis-micro-controller) code. Currently, we use three
isolated microcontroller systems: **Actor**, **Sensor**, and **Encoder**.

For instructions on assembling and wiring the electronic components used in each microcontroller system, as well as
the code running on each microcontroller, see the
[microcontroller repository](https://github.com/Sun-Lab-NBB/sl-micro-controllers).

### Unity Game World
The task environment used in Sun lab experiments is rendered and controlled by the Unity game engine. To make Unity
work with this library, each project-specific Unity task must use the bindings and assets released as part of our
[GIMBL-tasks repository](https://github.com/Sun-Lab-NBB/GIMBL-tasks). Follow the instructions from that repository to
set up the Unity game engine to run Sun lab experiment tasks.

**Note!** This library does not contain tools to initialize the Unity game engine. The desired Virtual Reality task
has to be started ***manually*** before initializing the main experiment runtime through this library.
The main Unity repository contains more details about starting the virtual reality tasks when running experiments.

### Google Sheets API Integration

This library is statically configured to interact with various Google Sheet files used in the Sun lab. Currently, this
includes two files: the **surgery log** and the **water restriction log**. Primarily, this part of the library is
designed as a convenience feature for lab members and as a way to back up and store all project-related data in the
same place.

#### Setting up Google Sheets API Access

**If you already have a Google Sheets API service account, skip to the next section.** Typically, we use the same
service account for all projects and log files.

1. Log into the [Google Cloud Console](https://shorturl.at/qiDYc).
2. Create a new project.
3. Navigate to APIs & Services > Library and enable the Google Sheets API for the project.
4. Under IAM & Admin > Service Accounts, create a service account. This will generate a service account ID in the
   format of `your-service-account@gserviceaccount.com`.
5. Select Manage Keys from the Actions menu and, if a key does not already exist, create a new key and download the
   private key in JSON format. This key is then used to access the Google Sheets files.

#### Adding Google Sheets Access to the Service Account
To access the **surgery log** and the **water restriction log** Google Sheets as part of this library runtime, create
and share these log files with the email of the service account created above. The service account requires **Viewer**
access to the **surgery log** file and **Editor** access to the **water restriction log** file.

**Note!** This feature expects that both log files are formatted according to the available Sun lab templates.
Otherwise, the parsing algorithm will not behave as expected, leading to runtime failure.

### Mesoscope-VR Assembly
***This section is currently a placeholder.
Since we are actively working on the final Mesoscope-VR design, it will be
populated once we have a final design implementation.***

___

## Data Structure and Management

The library defines a fixed structure for storing all acquired data, which uses a 4-level directory tree hierarchy:
**root**, **project**, **animal**, and **session**.

Currently, our pipeline uses a total of four computers when working with data. The **VRPC** and the **ScanImagePC**
are used to acquire and preprocess the data. After preprocessing, the data is moved to the **BioHPC server** and the
**Synology NAS** for long-term storage and processing. All data movement is performed over 10-Gigabit local networks
within the lab and the broader Cornell infrastructure.

***Critical!*** Although this library primarily operates the VRPC, it expects the root data directories of all other
PCs used for data acquisition or storage in the lab to be **mounted to the VRPC filesystem using the SMB3 protocol**.
In turn, this allows the library to maintain the same data hierarchies across all storage machines.

Generally, the library tries to maintain at least two copies of the data for long-term storage: one on the NAS and
the other on the BioHPC server. Moreover, until the `sl-purge` command (see below) is used to clear the VRPC storage,
an additional copy of the acquired data is also stored on the VRPC for each recorded session. While this design
achieves high data integrity (and redundancy), we **highly encourage** all lab members to manually back up critical
data to external SSD / HDD drives.

### Root Directory
When a training, experiment, or maintenance runtime command from this library is called for the first time, the
library asks the user to provide the path to the root project directory on the VRPC. The data for all projects is
stored in that directory from this point on. This directory is referred to as the local **root** directory.
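As an aside, the first-run persistence logic described above can be sketched roughly as follows. The file name,
location, and JSON layout in this sketch are entirely hypothetical — the library's actual service-directory mechanism
may differ:

```python
import json
from pathlib import Path
from typing import Optional


def load_root(config_file: Path) -> Optional[Path]:
    """Returns the persisted root directory, or None if this is the first ever runtime call."""
    if config_file.exists():
        return Path(json.loads(config_file.read_text())["root"])
    return None


def save_root(config_file: Path, root: Path) -> None:
    """Persists the user-provided root directory so that all future calls reuse the same path."""
    config_file.parent.mkdir(parents=True, exist_ok=True)
    config_file.write_text(json.dumps({"root": str(root)}))
```

Conceptually, the `sl-replace-root` command described under [Usage](#usage) rewrites this kind of stored path.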
Moreover, each project can be configured with paths to the root directories on all other computers used for data
acquisition or storage (see below). However, it is expected that all projects in the lab use the same root directories
on all computers.

### Project directory
When a new --project (-p) argument value is provided to any runtime command, the library generates a new **project**
directory under the static **root** directory. The project directory uses the project name provided via the command
argument as its name. As part of this process, a **configuration** subdirectory is also created under the
**project** directory.

***Critical!*** Inside the **configuration** subdirectory, the library automatically creates a
**project_configuration.yaml** file. Open that file with a text editor and edit its fields to specify the
project configuration. Review the [API documentation](https://sl-experiment-api-docs.netlify.app/) for the
**ProjectConfiguration** class to learn more about the purpose of each configuration file field.

Together with the **project_configuration.yaml**, the library also creates an example **default_experiment.yaml**
file. Each experiment that needs to be carried out as part of the project needs a dedicated .yaml file, named
after the experiment. For example, to run the 'default_experiment,' the library uses the configurations stored in
the 'default_experiment.yaml' file. You can use the default_experiment.yaml as an example for writing additional
experiment configurations.
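For orientation only, an experiment configuration file might look something like the sketch below. **Every field name
here is invented for illustration** — the authoritative schema is defined by the **ExperimentConfiguration** and
**ExperimentState** classes documented in the API reference:

```yaml
# default_experiment.yaml -- illustrative sketch only; these field names are
# hypothetical and do NOT reflect the real schema used by the library.
experiment_states:
  baseline:
    duration_s: 300        # hypothetical: time spent in this state
    vr_task_active: false  # hypothetical: whether the Unity task logic runs
  task:
    duration_s: 1200
    vr_task_active: true
```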
Review the [API documentation](https://sl-experiment-api-docs.netlify.app/) for the
**ExperimentConfiguration** and **ExperimentState** classes to learn more about the purpose of each field inside the
experiment configuration .yaml file.

### Animal directory
When a new --animal (-a) argument value is provided to any runtime command, the library generates a new **animal**
directory under the **root** and **project** directory combination. The directory uses the ID of the animal, provided
via the command argument, as its name.

Under each animal directory, two additional directories are created. First, the **persistent_data** directory, which
is used to store the information that has to stay on the VRPC when the acquired data is transferred from the VRPC to
other destinations. Second, the **metadata** directory, which is used to store information that does not change
between sessions, such as the information about the surgical procedures performed on the animal.

### Session directory
When any training or experiment runtime command is called, a new session directory is created under the **root**,
**project**, and **animal** directory combination. The session name is derived from the current UTC timestamp,
accurate to microseconds. Together with other runtime controls, this makes it impossible to have sessions with
duplicate names and ensures all sessions can always be sorted chronologically.

Since the same directory tree is reused for data processing, all data acquired by this library is stored under the
**raw_data** subdirectory, generated for each session. Overall, an example path to the acquired data can therefore
look like this: `/media/Data/Experiments/TestMice/666/2025-11-11-05-03-234123/raw_data`.
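The microsecond-accurate UTC session naming can be illustrated with the short sketch below (the function name is ours,
and the exact field layout the library uses — compare the example path above — may differ):

```python
from datetime import datetime, timezone


def make_session_name() -> str:
    """Derives a session name from the current UTC time, accurate to microseconds.

    Zero-padded, most-significant-field-first timestamps compare lexicographically in
    chronological order, which is what makes session directories sortable by name.
    """
    return datetime.now(timezone.utc).strftime("%Y-%m-%d-%H-%M-%S-%f")


earlier = make_session_name()
later = make_session_name()
```

Because a later timestamp always compares greater (or equal) as a plain string, sorting session names yields
chronological order.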
Our data processing pipelines generate new files and subdirectories under the **processed_data** directory using the
same **root**, **project**, **animal**, and **session** combination, e.g.
`server/sun_data/TestMice/666/2025-11-11-05-03-234123/processed_data`.

### Raw Data contents
After acquisition and preprocessing, the **raw_data** folder will contain the following files and subdirectories:
1. **zaber_positions.yaml**: Stores the snapshot of the HeadBar and LickPort motor positions taken at the end of the
   runtime session.
2. **hardware_configuration.yaml**: Stores the snapshot of some of the configuration parameters used by the hardware
   module interfaces during runtime.
3. **session_data.yaml**: Stores the paths and other data-management-related information used during runtime to save
   and preprocess the acquired data.
4. **session_descriptor.yaml**: Stores session-type-specific information, such as the training task parameters or
   experimenter notes. For experiment runtimes, this file is co-opted to store the Mesoscope objective positions.
5. **ax_checksum.txt**: Stores an xxHash-128 checksum used to verify data integrity when it is transferred to the
   long-term storage destination.
6. **behavior_data_log**: Stores compressed .npz log files. All non-video data acquired by the VRPC during runtime is
   stored in these log files. This includes all messages sent or received by each microcontroller and the timestamps
   for the frames acquired by each camera.
7. **camera_frames**: Stores the behavior videos recorded by each of the cameras.
8. **mesoscope_frames**: Stores all Mesoscope-acquired data (frames, motion estimation files, etc.). This directory
   will be empty for training sessions, as they do not acquire Mesoscope data.

### ScanImagePC

The ScanImagePC uses a modified directory structure. First, under its **root** directory, there has to be a
**mesoscope_frames** directory, where ***all*** ScanImage data is saved during each session runtime. During
preprocessing, the library automatically empties this directory, allowing the same directory to be (re)used by all
experiment sessions.

Under the same **root** directory, the library also creates a **persistent_data** directory. That directory follows
the same hierarchy (**project** and **animal**) as the VRPC. Like the VRPC’s **persistent_data** directory, it is used
to keep the data that should not be removed from the ScanImagePC, even after all data acquired for a particular
session is moved over for long-term storage.

**Note!** For each runtime that uses the mesoscope, the library requires the user to generate a screenshot of the
cranial window and the dot-alignment window. The easiest way to make this work is to reconfigure the default Windows
screenshot directory (the ScanImagePC uses the Windows OS) to be the root ScanImagePC data directory. This way,
hitting 'Windows+PrtSc' will automatically generate the .png screenshot under the ScanImagePC root directory, which is
the expected location used by this library.

---

## Usage

All user-facing library functionality is realized through a set of Command-Line Interface (CLI) commands automatically
exposed when the library is pip-installed into a python environment. Some of these commands take additional arguments
that allow further configuring their runtime.
Use the `--help` argument when calling any of the commands described below to
see the list of supported arguments, together with their descriptions and default values.

To use any of the commands described below, activate the python environment where the library is installed, e.g.,
with `conda activate myenv`, and type one of the commands described below.

***Warning!*** Most commands described below use the terminal to communicate important runtime information to the user
or to request user feedback. **Make sure you carefully read every message printed to the terminal during runtime.**
Failure to do so may damage the equipment or harm the animal.

### sl-crc
This command takes in a string value and returns the CRC-32 XFER checksum of the input string. This is used to
generate a numeric checksum for each Zaber device by check-summing its label (name). This checksum should be stored
under User Setting 0. During runtime, it is used to ensure that each controller has been properly configured to work
with this library by comparing the checksum loaded from User Setting 0 to the checksum generated from the device’s
label.

### sl-devices
This command is used during initial system configuration to discover the USB ports assigned to all Zaber devices. This
is used when updating the project_configuration.yaml files that, amongst other information, communicate the USB ports
used by various Mesoscope-VR system components during runtime.

### sl-replace-root
This command is used to replace the path to the **root** directory on the VRPC (where all projects are saved), which
is stored in a user-specific default directory. When one of the main runtime commands from this library is used for
the **first ever time**, the library asks the user to define a directory where to save all projects. All future calls
to this library use the same path and assume the projects are stored in that directory.
Since the path is stored in a typically hidden service directory, this command simplifies finding and replacing the
path if this need ever arises.

### sl-maintain-vr
This command is typically used twice during each experiment or training day. First, it is used at the beginning of the
day to prepare the Mesoscope-VR system for runtime by filling the water delivery system and, if necessary, replacing
the running-wheel surface wrap. Second, it is used at the end of each day to empty the water delivery system.

This runtime is also co-opted to check the cranial windows of newly implanted animals to determine whether they should
be included in a project. To do so, the command allows changing the position of the HeadBar and LickPort manipulators
and generating a snapshot of the Mesoscope and Zaber positions, as well as the screenshot of the cranial window.

***Note!*** Since this runtime fulfills multiple functions, it uses an 'input'-based terminal interface to accept
further commands during runtime. To prevent visual bugs, the input does not print anything to the terminal and appears
as a blank new line. If you see a blank new line with no terminal activity, this indicates that the system is ready
to accept one of the supported commands. All supported commands are printed to the terminal as part of the runtime
initialization.

#### Supported vr-maintenance commands
1.  `open`. Opens the water delivery valve.
2.  `close`. Closes the water delivery valve.
3.  `close_10`. Closes the water delivery valve after a 10-second delay.
4.  `reference`. Triggers 200 valve pulses, with each pulse calibrated to deliver 5 uL of water. This command is used
    to check whether the valve calibration data stored in the project_configuration.yaml of the project specified when
    calling the runtime command is accurate. This is done at the beginning of each training or experiment day. The
    reference runtime should overall dispense ~1 ml of water.
5.  `calibrate_15`. Runs 200 valve pulses, keeping the valve open for 15 milliseconds during each pulse. This is used
    to generate valve calibration data.
6.  `calibrate_30`. Same as above, but uses 30-millisecond pulses.
7.  `calibrate_45`. Same as above, but uses 45-millisecond pulses.
8.  `calibrate_60`. Same as above, but uses 60-millisecond pulses.
9.  `lock`. Locks the running wheel (engages the running-wheel brake).
10. `unlock`. Unlocks the running wheel (disengages the running-wheel brake).
11. `maintain`. Moves the HeadBar and LickPort to the predefined VR maintenance position stored inside the
    non-volatile Zaber device memory.
12. `mount`. Moves the HeadBar and LickPort to the predefined animal mounting position stored inside the non-volatile
    Zaber device memory. This is used when checking the cranial windows of newly implanted animals.
13. `image`. Moves the HeadBar and LickPort to the predefined brain imaging position stored inside the non-volatile
    Zaber device memory. This is used when checking the cranial windows of newly implanted animals.
14. `snapshot`. Generates a snapshot of the Zaber motor positions and the Mesoscope positions, as well as the
    screenshot of the cranial window. This saves the system configuration for the checked animal so that it can be
    reused during future training and experiment runtimes.

### sl-lick-train
Runs a single lick-training session. All animals in the Sun lab undergo a two-stage training protocol before they
start participating in project-specific experiments. The first phase of the training protocol is lick training, where
the animals are trained to operate the lick-tube while being head-fixed. This training is carried out for 2 days.

### sl-run-train
Runs a single run-training session. The second phase of the Sun lab training protocol is run training, where the
animals run on the wheel treadmill while being head-fixed to get water rewards.
This training is carried out for the 5 days following the lick-training.

### sl-experiment
Runs a single experiment session. Each project has to define one or more experiment configurations that can be
executed via this command. Every experiment configuration may be associated with a unique Unity VR task, which has to
be activated independently of running this command. See the [project directory notes](#project-directory) to learn
about the experiment configuration files used by this command.

**Critical!** Since this library does not have a way of starting the Unity game engine or the ScanImage software, both
have to be initialized **manually** before running the sl-experiment command. See the main
[Unity repository](https://github.com/Sun-Lab-NBB/GIMBL-tasks) for details on starting experiment task runtimes. To
prepare the ScanImage software for runtime, enable 'External Triggers' and configure the system to take **start** and
**stop** triggers from the ports wired to the Actor microcontroller, as described in our
[microcontroller repository](https://github.com/Sun-Lab-NBB/sl-micro-controllers). Then, hit 'Loop' to 'arm' the
system to start frame acquisition when it receives the 'start' TTL trigger from this library.

### sl-process
This command can be called to preprocess the target training or experiment session data folder. Typically, this
library calls the preprocessing pipeline as part of the runtime command, so there is no need to use this command
separately. However, if the runtime or preprocessing is unexpectedly interrupted, call this command to ensure the
target session is preprocessed and transferred to the long-term storage destinations.

### sl-purge
To maximize data integrity, this library does not automatically delete redundant data from the ScanImagePC or the
VRPC, even if the data has been safely backed up to long-term storage destinations.
This command discovers all redundant data
marked for deletion by various Sun lab pipelines and deletes it from the ScanImagePC or the VRPC.

***Critical!*** This command has to be called at least weekly to prevent running out of disk space on the ScanImagePC
and the VRPC.

---

## API Documentation

See the [API documentation](https://sl-experiment-api-docs.netlify.app/) for a
detailed description of the methods and classes exposed by the components of this library.
___

## Recovering from Interruptions
While it is not typical for the data acquisition or preprocessing pipelines to fail during runtime, it is not
impossible. The library can recover from or gracefully terminate the runtime for most code-generated errors, so this is
usually not a concern. However, if a major interruption (e.g., a power outage) occurs or the ScanImagePC encounters an
interruption, manual intervention is typically required before the VRPC can run new data acquisition or preprocessing
runtimes.

### Data acquisition interruption

***Critical!*** If you encounter an interruption during data acquisition (a training or experiment runtime), it is
impossible to resume the interrupted session. Moreover, since this library acts independently of the ScanImage software
managing the Mesoscope, you will need to manually shut down the other acquisition process. If the VRPC is interrupted,
terminate Mesoscope data acquisition via the ScanImage software. If the Mesoscope is interrupted, use 'ESC+Q' to
terminate the VRPC data acquisition.

If the VRPC is interrupted during data acquisition, follow these instructions:
1. If the session involved mesoscope imaging, shut down the Mesoscope acquisition process and make sure all required
   files (frame stacks, motion estimator data, cranial window screenshot) have been generated and saved to the
   **mesoscope_frames** folder.
2. Remove the animal from the Mesoscope-VR system.
3. Use Zaber Launcher to **manually move the HeadBarRoll axis to a positive angle** (> 0 degrees). This is
   critical! If this is not done, the motor will not be able to home during the next session and will instead collide
   with the movement guard, at best damaging the motor and, at worst, the Mesoscope or the animal.
4. Go into the 'Device Settings' tab of the Zaber Launcher, click on each Device tab (NOT motor!), and navigate to its
   User Data section. Then **flip Setting 1 from 0 to 1**. Without this, the library will refuse to operate the Zaber
   motors.
5. If the session involved mesoscope imaging, **rename the mesoscope_frames folder to prepend the session name, using
   an underscore to separate the folder name from the session name**, for example, from mesoscope_frames →
   2025-11-11-05-03-234123_mesoscope_frames. Critical! If this is not done, the library may **delete** any leftover
   mesoscope files during the next runtime and will not be able to properly preprocess the frames for the interrupted
   session during the next step.
6. Call the `sl-process` command and provide it with the path to the session directory of the interrupted session. This
   will preprocess and transfer all collected data to the long-term storage destinations. This way, you can preserve
   any data acquired before the interruption and prepare the system for running the next session.

***Note!*** If the interruption occurs on the ScanImagePC (Mesoscope) and you use the 'ESC+Q' combination, there is
no need to do any of the steps above. ESC+Q executes a 'graceful' VRPC interruption process that automatically
runs the correct shutdown sequence and data preprocessing.

### Data preprocessing interruption
To recover from an error encountered during preprocessing, call the `sl-process` command and provide it with the path
to the session directory of the interrupted session.
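As a minimal sketch of this recovery call, the snippet below builds and runs the `sl-process` invocation. The session path is an illustrative assumption, and the sketch assumes the CLI accepts the session directory as a single argument, as described above; check the library documentation for the exact syntax. The command is only executed when the CLI is actually installed.

```python
# Sketch of recovering an interrupted session with sl-process.
# ASSUMPTION: the session path below is hypothetical; substitute the real
# path of the interrupted session on your VRPC.
import shutil
import subprocess
from pathlib import Path

# Path to the session directory of the interrupted session (hypothetical).
session_directory = Path("/data/example_project/example_animal/2025-11-11-05-03-234123")

command = ["sl-process", str(session_directory)]

if shutil.which("sl-process") is not None:
    # Re-runs preprocessing and transfers data to long-term storage.
    subprocess.run(command, check=True)
else:
    print("sl-process is not on PATH; the command would be:", " ".join(command))
```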
The preprocessing pipeline should automatically resume the
interrupted runtime.

---

## Versioning

We use [semantic versioning](https://semver.org/) for this project. For the available versions, see the
[tags on this repository](https://github.com/Sun-Lab-NBB/sl-experiment/tags).

---

## Authors

- Ivan Kondratyev ([Inkaros](https://github.com/Inkaros))
- Kushaan Gupta ([kushaangupta](https://github.com/kushaangupta))
- Natalie Yeung
- Katlynn Ryu ([katlynn-ryu](https://github.com/KatlynnRyu))
- Jasmine Si

___

## License

This project is licensed under the GPL-3.0 license: see the [LICENSE](LICENSE) file for details.
___

## Acknowledgments

- All Sun lab [members](https://neuroai.github.io/sunlab/people) for providing the inspiration and comments during the
  development of this library.
- The creators of all the other projects used in our development automation pipelines and source code
  ([see pyproject.toml](pyproject.toml)).

---