{"id":13455906,"url":"https://github.com/danielmiessler/fabric","last_synced_at":"2026-04-15T18:01:29.014Z","repository":{"id":215378951,"uuid":"738733003","full_name":"danielmiessler/Fabric","owner":"danielmiessler","description":"Fabric is an open-source framework for augmenting humans using AI. It provides a modular system for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.","archived":false,"fork":false,"pushed_at":"2026-04-09T06:56:18.000Z","size":150927,"stargazers_count":40495,"open_issues_count":32,"forks_count":4035,"subscribers_count":393,"default_branch":"main","last_synced_at":"2026-04-09T08:27:03.430Z","etag":null,"topics":["ai","augmentation","flourishing","life","work"],"latest_commit_sha":null,"homepage":"https://danielmiessler.com/p/fabric-origin-story","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/danielmiessler.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"docs/CONTRIBUTING.md","funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":"docs/CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"docs/SECURITY.md","support":"docs/SUPPORT.md","governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null},"funding":{"github":["danielmiessler","ksylvan"],"buy_me_a_coffee":"kayvansylvan"}},"created_at":"2024-01-03T23:18:31.000Z","updated_at":"2026-04-09T08:11:18.000Z","dependencies_parsed_at":"2025-12-18T12:08:54.274Z","dependency_job_id":null,"html_url":"https://github.com/danielmiessler/Fabric","commit_stats":null,"previous_names":["danielmiessler/fabric"],"tags_count":417,"template":false,"template_full_name":null,"purl":"p
kg:github/danielmiessler/Fabric","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/danielmiessler%2FFabric","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/danielmiessler%2FFabric/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/danielmiessler%2FFabric/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/danielmiessler%2FFabric/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/danielmiessler","download_url":"https://codeload.github.com/danielmiessler/Fabric/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/danielmiessler%2FFabric/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31853279,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-15T15:24:51.572Z","status":"ssl_error","status_checked_at":"2026-04-15T15:24:39.138Z","response_time":63,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","augmentation","flourishing","life","work"],"created_at":"2024-07-31T08:01:13.294Z","updated_at":"2026-04-15T18:01:28.884Z","avatar_url":"https://github.com/danielmiessler.png","language":"Go","readme":"\u003cdiv align=\"center\"\u003e\n    \u003ca href=\"https://go.warp.dev/fabric\" target=\"_blank\"\u003e\n        \u003csup\u003eSpecial thanks to:\u003c/sup\u003e\n    
    \u003cbr\u003e\n        \u003cimg alt=\"Warp sponsorship\" width=\"400\" src=\"https://raw.githubusercontent.com/warpdotdev/brand-assets/refs/heads/main/Github/Sponsor/Warp-Github-LG-02.png\"\u003e\n        \u003cbr\u003e\n        \u003cb\u003eWarp, built for coding with multiple AI agents\u003c/b\u003e\n        \u003cbr\u003e\n        \u003csup\u003eAvailable for macOS, Linux and Windows\u003c/sup\u003e\n    \u003c/a\u003e\n\u003c/div\u003e\n\n\u003cbr\u003e\n\n\u003cdiv align=\"center\"\u003e\n\n\u003cimg src=\"./docs/images/fabric-logo-gif.gif\" alt=\"fabriclogo\" width=\"400\" height=\"400\"/\u003e\n\n# `fabric`\n\n[![Static Badge](https://img.shields.io/badge/mission-human_flourishing_via_AI_augmentation-purple)](https://github.com/danielmiessler/fabric)\n\u003cbr /\u003e\n[![GitHub top language](https://img.shields.io/github/languages/top/danielmiessler/fabric)](https://github.com/danielmiessler/fabric)\n[![GitHub last commit](https://img.shields.io/github/last-commit/danielmiessler/fabric)](https://github.com/danielmiessler/fabric/commits/main)\n[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)\n[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/danielmiessler/fabric)\n\n\u003cdiv align=\"center\"\u003e\n\u003ch4\u003e\u003ccode\u003efabric\u003c/code\u003e is an open-source framework for augmenting humans using AI.\u003c/h4\u003e\n\u003c/div\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cstrong\u003eEnglish\u003c/strong\u003e ·\n  \u003ca href=\"README.zh.md\"\u003e中文\u003c/a\u003e\n\u003c/p\u003e\n\n![Screenshot of fabric](./docs/images/fabric-summarize.png)\n\n\u003c/div\u003e\n\n[Updates](#updates) •\n[What and Why](#what-and-why) •\n[Philosophy](#philosophy) •\n[Installation](#installation) •\n[Usage](#usage) •\n[REST API](#rest-api-server) •\n[Examples](#examples) •\n[Just Use the Patterns](#just-use-the-patterns) •\n[Custom Patterns](#custom-patterns) •\n[Helper 
Apps](#helper-apps) •\n[Meta](#meta)\n\n\u003c/div\u003e\n\n## What and why\n\nSince the start of modern AI in late 2022, we've seen an **_extraordinary_** number of AI applications for accomplishing tasks. There are thousands of websites, chatbots, mobile apps, and other interfaces for using all the different AI out there.\n\nIt's all really exciting and powerful, but _it's not easy to integrate this functionality into our lives._\n\n\u003cdiv align=\"center\"\u003e\n\u003ch4\u003eIn other words, AI doesn't have a capabilities problem—it has an \u003cem\u003eintegration\u003c/em\u003e problem.\u003c/h4\u003e\n\u003c/div\u003e\n\n**Fabric was created to address this by collecting and organizing the fundamental units of AI—the prompts themselves!**\n\nFabric organizes prompts by real-world task, allowing people to create, collect, and organize their most important AI solutions in a single place for use in their favorite tools. And if you're command-line focused, you can use Fabric itself as the interface!\n\n## Updates\n\nFor a deep dive into Fabric and its internals, read the documentation in the [docs folder](https://github.com/danielmiessler/Fabric/tree/main/docs). 
There is\nalso the extremely useful and regularly updated [DeepWiki](https://deepwiki.com/danielmiessler/Fabric) for Fabric.\n\n\u003cdetails\u003e\n\u003csummary\u003eClick to view recent updates\u003c/summary\u003e\n\nDear Users,\n\nWe've been doing so many exciting things here at Fabric that I wanted to share a quick summary to give you a sense of our development velocity!\n\nBelow are the **new features and capabilities** we've added (newest first):\n\n### Recent Major Features\n\n- [v1.4.437](https://github.com/danielmiessler/fabric/releases/tag/v1.4.437) (March 16, 2026) — **OpenAI Codex Plugin**: Fabric now supports using OpenAI Codex (with your OpenAI subscription) as a backend!\n- [v1.4.417](https://github.com/danielmiessler/fabric/releases/tag/v1.4.417) (Feb 21, 2026) — **Azure AI Gateway Plugin**: Added Azure AI Gateway plugin supporting multiple backends (AWS Bedrock, Azure OpenAI, Google Vertex AI) through a unified Azure APIM Gateway with shared subscription key authentication.\n- [v1.4.416](https://github.com/danielmiessler/fabric/releases/tag/v1.4.416) (Feb 21, 2026) — **Azure Entra ID Authentication**: Added Azure Entra ID authentication plugin with shared Azure utilities, Entra ID/MSAL support, and extracted common Azure logic into a reusable `azurecommon` package.\n- [v1.4.380](https://github.com/danielmiessler/fabric/releases/tag/v1.4.380) (Jan 15, 2026) — **Microsoft 365 Copilot Integration**: Added support for corporate Microsoft 365 Copilot, enabling enterprise users to leverage AI grounded in their organization's Microsoft 365 data (emails, documents, meetings).\n- [v1.4.378](https://github.com/danielmiessler/fabric/releases/tag/v1.4.378) (Jan 14, 2026) — **Digital Ocean GenAI Support**: Added support for Digital Ocean GenAI, along with a [guide for how to use it](./docs/DigitalOcean-Agents-Setup.md).\n- [v1.4.356](https://github.com/danielmiessler/fabric/releases/tag/v1.4.356) (Dec 22, 2025) — **Complete Internationalization**: Full i18n 
support for setup prompts across all 10 languages with intelligent environment variable handling—making Fabric truly accessible worldwide while maintaining configuration consistency.\n- [v1.4.350](https://github.com/danielmiessler/fabric/releases/tag/v1.4.350) (Dec 18, 2025) — **Interactive API Documentation**: Adds Swagger/OpenAPI UI at `/swagger/index.html` with comprehensive REST API documentation, enhanced developer guides, and improved endpoint discoverability for easier integration.\n- [v1.4.338](https://github.com/danielmiessler/fabric/releases/tag/v1.4.338) (Dec 4, 2025) — Add Abacus vendor support for Chat-LLM\n  models (see [RouteLLM APIs](https://abacus.ai/app/route-llm-apis)).\n- [v1.4.337](https://github.com/danielmiessler/fabric/releases/tag/v1.4.337) (Dec 4, 2025) — Add \"Z AI\" vendor support. See the [Z AI overview](https://docs.z.ai/guides/overview/overview) page for more details.\n- [v1.4.334](https://github.com/danielmiessler/fabric/releases/tag/v1.4.334) (Nov 26, 2025) — **Claude Opus 4.5**: Updates the Anthropic SDK to the latest and adds the new [Claude Opus 4.5](https://www.anthropic.com/news/claude-opus-4-5) to the available models.\n- [v1.4.331](https://github.com/danielmiessler/fabric/releases/tag/v1.4.331) (Nov 23, 2025) — **Support for GitHub Models**: Adds support for using GitHub Models.\n- [v1.4.322](https://github.com/danielmiessler/fabric/releases/tag/v1.4.322) (Nov 5, 2025) — **Interactive HTML Concept Maps and Claude Sonnet 4.5**: Adds `create_conceptmap` pattern for visual knowledge representation using Vis.js, introduces WELLNESS category with psychological analysis patterns, and upgrades to Claude Sonnet 4.5\n- [v1.4.317](https://github.com/danielmiessler/fabric/releases/tag/v1.4.317) (Sep 21, 2025) — **Portuguese Language Variants**: Adds BCP 47 locale normalization with support for Brazilian Portuguese (pt-BR) and European Portuguese (pt-PT) with intelligent fallback chains\n- 
[v1.4.314](https://github.com/danielmiessler/fabric/releases/tag/v1.4.314) (Sep 17, 2025) — **Azure OpenAI Migration**: Migrates to official `openai-go/azure` SDK with improved authentication and default API version support\n- [v1.4.311](https://github.com/danielmiessler/fabric/releases/tag/v1.4.311) (Sep 13, 2025) — **More internationalization support**: Adds de (German), fa (Persian / Farsi), fr (French), it (Italian),\n  ja (Japanese), pt (Portuguese), zh (Chinese)\n- [v1.4.309](https://github.com/danielmiessler/fabric/releases/tag/v1.4.309) (Sep 9, 2025) — **Comprehensive internationalization support**: Includes English and Spanish locale files.\n- [v1.4.303](https://github.com/danielmiessler/fabric/releases/tag/v1.4.303) (Aug 29, 2025) — **New Binary Releases**: Linux ARM and Windows ARM targets. You can run Fabric on the Raspberry Pi and on your Windows Surface!\n- [v1.4.294](https://github.com/danielmiessler/fabric/releases/tag/v1.4.294) (Aug 20, 2025) — **Venice AI Support**: Added the Venice AI provider. Venice is a privacy-first, open-source AI provider. 
See their [\"About Venice\"](https://docs.venice.ai/overview/about-venice) page for details.\n- [v1.4.291](https://github.com/danielmiessler/fabric/releases/tag/v1.4.291) (Aug 18, 2025) — **Speech To Text**: Add OpenAI speech-to-text support with `--transcribe-file`, `--transcribe-model`, and `--split-media-file` flags.\n\nThese features represent our commitment to making Fabric the most powerful and flexible AI augmentation framework available!\n\n\u003c/details\u003e\n\n## Intro videos\n\nKeep in mind that many of these were recorded when Fabric was Python-based, so remember to use the current [install instructions](#installation) below.\n\n- [Network Chuck](https://www.youtube.com/watch?v=UbDyjIIGaxQ)\n- [David Bombal](https://www.youtube.com/watch?v=vF-MQmVxnCs)\n- [My Own Intro to the Tool](https://www.youtube.com/watch?v=wPEyyigh10g)\n- [More Fabric YouTube Videos](https://www.youtube.com/results?search_query=fabric+ai)\n\n## Navigation\n\n- [`fabric`](#fabric)\n  - [What and why](#what-and-why)\n  - [Updates](#updates)\n    - [Recent Major Features](#recent-major-features)\n  - [Intro videos](#intro-videos)\n  - [Navigation](#navigation)\n  - [Changelog](#changelog)\n  - [Philosophy](#philosophy)\n    - [Breaking problems into components](#breaking-problems-into-components)\n    - [Too many prompts](#too-many-prompts)\n  - [Installation](#installation)\n    - [One-Line Install (Recommended)](#one-line-install-recommended)\n    - [Manual Binary Downloads](#manual-binary-downloads)\n    - [Using package managers](#using-package-managers)\n      - [macOS (Homebrew)](#macos-homebrew)\n      - [Arch Linux (AUR)](#arch-linux-aur)\n      - [Windows](#windows)\n    - [From Source](#from-source)\n    - [Docker](#docker)\n    - [Environment Variables](#environment-variables)\n    - [Setup](#setup)\n    - [Supported AI Providers](#supported-ai-providers)\n    - [Per-Pattern Model Mapping](#per-pattern-model-mapping)\n    - [Add aliases for all 
patterns](#add-aliases-for-all-patterns)\n      - [Save your files in markdown using aliases](#save-your-files-in-markdown-using-aliases)\n    - [Migration](#migration)\n    - [Upgrading](#upgrading)\n    - [Shell Completions](#shell-completions)\n      - [Quick install (no clone required)](#quick-install-no-clone-required)\n      - [Zsh Completion](#zsh-completion)\n      - [Bash Completion](#bash-completion)\n      - [Fish Completion](#fish-completion)\n  - [Usage](#usage)\n    - [Debug Levels](#debug-levels)\n    - [Dry Run Mode](#dry-run-mode)\n    - [Extensions](#extensions)\n  - [REST API Server](#rest-api-server)\n    - [Ollama Compatibility Mode](#ollama-compatibility-mode)\n  - [Our approach to prompting](#our-approach-to-prompting)\n  - [Examples](#examples)\n  - [Just use the Patterns](#just-use-the-patterns)\n    - [Prompt Strategies](#prompt-strategies)\n      - [Available Strategies](#available-strategies)\n  - [Custom Patterns](#custom-patterns)\n    - [Setting Up Custom Patterns](#setting-up-custom-patterns)\n    - [Using Custom Patterns](#using-custom-patterns)\n    - [How It Works](#how-it-works)\n  - [Helper Apps](#helper-apps)\n    - [`to_pdf`](#to_pdf)\n    - [`to_pdf` Installation](#to_pdf-installation)\n    - [`code2context`](#code2context)\n    - [`generate_changelog`](#generate_changelog)\n  - [pbpaste](#pbpaste)\n  - [Web Interface (Fabric Web App)](#web-interface-fabric-web-app)\n  - [Meta](#meta)\n    - [Primary contributors](#primary-contributors)\n    - [Contributors](#contributors)\n  - [💜 Support This Project](#-support-this-project)\n\n\u003cbr /\u003e\n\n## Changelog\n\nFabric is evolving rapidly.\n\nStay current with the latest features by reviewing the [CHANGELOG](./CHANGELOG.md) for all recent changes.\n\n## Philosophy\n\n\u003e AI isn't a thing; it's a _magnifier_ of a thing. 
And that thing is **human creativity**.\n\nWe believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the **human** problems we want to solve.\n\n### Breaking problems into components\n\nOur approach is to break problems into individual pieces and then apply AI to them one at a time. See below for some examples.\n\n\u003cimg width=\"2078\" alt=\"augmented_challenges\" src=\"https://github.com/danielmiessler/fabric/assets/50654/31997394-85a9-40c2-879b-b347e4701f06\"\u003e\n\n### Too many prompts\n\nPrompts are good for this, but the biggest challenge I faced in 2023—which still exists today—is **the sheer number of AI prompts out there**. We all have prompts that are useful, but it's hard to discover new ones, know if they are good or not, _and manage different versions of the ones we like_.\n\nOne of `fabric`'s primary features is helping people collect and integrate prompts, which we call _Patterns_, into various parts of their lives.\n\nFabric has Patterns for all sorts of life and work activities, including:\n\n- Extracting the most interesting parts of YouTube videos and podcasts\n- Writing an essay in your own voice with just an idea as an input\n- Summarizing opaque academic papers\n- Creating perfectly matched AI art prompts for a piece of writing\n- Rating the quality of content to see if you want to read/watch the whole thing\n- Getting summaries of long, boring content\n- Explaining code to you\n- Turning bad documentation into usable documentation\n- Creating social media posts from any content input\n- And a million more…\n\n## Installation\n\n### One-Line Install (Recommended)\n\n**Unix/Linux/macOS:**\n\n```bash\ncurl -fsSL https://raw.githubusercontent.com/danielmiessler/fabric/main/scripts/installer/install.sh | bash\n```\n\n**Windows PowerShell:**\n\n```powershell\niwr -useb https://raw.githubusercontent.com/danielmiessler/fabric/main/scripts/installer/install.ps1 | iex\n```\n\n\u003e See 
[scripts/installer/README.md](./scripts/installer/README.md) for custom installation options and troubleshooting.\n\n### Manual Binary Downloads\n\nThe latest release binary archives and their expected SHA256 hashes can be found at \u003chttps://github.com/danielmiessler/fabric/releases/latest\u003e\n\n### Using package managers\n\n**NOTE:** using Homebrew or the Arch Linux package managers makes `fabric` available as `fabric-ai`, so add\nthe following alias to your shell startup files to account for this:\n\n```bash\nalias fabric='fabric-ai'\n```\n\n#### macOS (Homebrew)\n\n`brew install fabric-ai`\n\n#### Arch Linux (AUR)\n\n`yay -S fabric-ai`\n\n#### Windows\n\nUse the official Microsoft supported `Winget` tool:\n\n`winget install danielmiessler.Fabric`\n\n### From Source\n\nTo install Fabric, [make sure Go is installed](https://go.dev/doc/install), and then run the following command.\n\n```bash\n# Install Fabric directly from the repo\ngo install github.com/danielmiessler/fabric/cmd/fabric@latest\n```\n\n### Docker\n\nRun Fabric using pre-built Docker images:\n\n```bash\n# Use latest image from Docker Hub\ndocker run --rm -it kayvan/fabric:latest --version\n\n# Use specific version from GHCR\ndocker run --rm -it ghcr.io/ksylvan/fabric:v1.4.305 --version\n\n# Run setup (first time)\nmkdir -p $HOME/.fabric-config\ndocker run --rm -it -v $HOME/.fabric-config:/home/appuser/.config/fabric kayvan/fabric:latest --setup\n\n# Use Fabric with your patterns\ndocker run --rm -it -v $HOME/.fabric-config:/home/appuser/.config/fabric kayvan/fabric:latest -p summarize\n\n# Run the REST API server (see REST API Server section)\ndocker run --rm -it -p 8080:8080 -v $HOME/.fabric-config:/home/appuser/.config/fabric kayvan/fabric:latest --serve\n```\n\n**Images available at:**\n\n- Docker Hub: [kayvan/fabric](https://hub.docker.com/repository/docker/kayvan/fabric/general)\n- GHCR: [ksylvan/fabric](https://github.com/ksylvan/fabric/pkgs/container/fabric)\n\nSee 
[scripts/docker/README.md](./scripts/docker/README.md) for building custom images and advanced configuration.\n\n### Environment Variables\n\nYou may need to set some environment variables in your `~/.bashrc` on Linux or your `~/.zshrc` on macOS to be able to run the `fabric` command. Here is an example of what you can add:\n\nFor Intel-based Macs or Linux:\n\n```bash\n# Golang environment variables\nexport GOROOT=/usr/local/go\nexport GOPATH=$HOME/go\n\n# Update PATH to include GOPATH and GOROOT binaries\nexport PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH\n```\n\nFor Apple Silicon-based Macs:\n\n```bash\n# Golang environment variables\nexport GOROOT=$(brew --prefix go)/libexec\nexport GOPATH=$HOME/go\nexport PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH\n```\n\n### Setup\n\nNow run the following command:\n\n```bash\n# Run the setup to set up your directories and keys\nfabric --setup\n```\n\nIf everything works, you're good to go.\n\n### Supported AI Providers\n\nFabric supports a wide range of AI providers:\n\n**Native Integrations:**\n\n- OpenAI\n- OpenAI Codex (ChatGPT/Codex subscription OAuth via private backend)\n- Anthropic (Claude)\n- Google Gemini\n- Ollama (local models)\n- Azure OpenAI\n- Amazon Bedrock\n- Vertex AI\n- LM Studio\n- Perplexity\n\n**OpenAI-Compatible Providers:**\n\n- Abacus\n- AIML\n- Cerebras\n- DeepSeek\n- DigitalOcean\n- GitHub Models\n- GrokAI\n- Groq\n- Langdock\n- LiteLLM\n- MiniMax\n- Mistral\n- Novita AI\n- OpenRouter\n- SiliconCloud\n- Together\n- Venice AI\n- Z AI\n\nRun `fabric --setup` to configure your preferred provider(s), or use `fabric --listvendors` to see all available vendors.\n\n### Per-Pattern Model Mapping\n\nYou can configure specific models for individual patterns using environment variables like `FABRIC_MODEL_PATTERN_NAME=vendor|model`.\n\nThis makes it easy to maintain per-pattern model mappings in your shell startup files.\n\n### Add aliases for all patterns\n\nIn order to add aliases for all 
your patterns and use them directly as commands (for example, `summarize` instead of `fabric --pattern summarize`), you can add the following to your `.zshrc` or `.bashrc` file. You can also optionally set the `FABRIC_ALIAS_PREFIX` environment variable beforehand if you'd prefer all the fabric aliases to start with the same prefix.\n\n```bash\n# Loop through all files in the ~/.config/fabric/patterns directory\nfor pattern_file in $HOME/.config/fabric/patterns/*; do\n    # Get the base name of the file (i.e., remove the directory path)\n    pattern_name=\"$(basename \"$pattern_file\")\"\n    alias_name=\"${FABRIC_ALIAS_PREFIX:-}${pattern_name}\"\n\n    # Create an alias in the form: alias pattern_name=\"fabric --pattern pattern_name\"\n    alias_command=\"alias $alias_name='fabric --pattern $pattern_name'\"\n\n    # Evaluate the alias command to add it to the current shell\n    eval \"$alias_command\"\ndone\n\nyt() {\n    if [ \"$#\" -eq 0 ] || [ \"$#\" -gt 2 ]; then\n        echo \"Usage: yt [-t | --timestamps] youtube-link\"\n        echo \"Use the '-t' flag to get the transcript with timestamps.\"\n        return 1\n    fi\n\n    transcript_flag=\"--transcript\"\n    if [ \"$1\" = \"-t\" ] || [ \"$1\" = \"--timestamps\" ]; then\n        transcript_flag=\"--transcript-with-timestamps\"\n        shift\n    fi\n    local video_link=\"$1\"\n    fabric -y \"$video_link\" $transcript_flag\n}\n```\n\nYou can add the below code for the equivalent aliases inside PowerShell by running `notepad $PROFILE` inside a PowerShell window:\n\n```powershell\n# Path to the patterns directory\n$patternsPath = Join-Path $HOME \".config/fabric/patterns\"\nforeach ($patternDir in Get-ChildItem -Path $patternsPath -Directory) {\n    # Prepend FABRIC_ALIAS_PREFIX if set; otherwise use empty string\n    $prefix = $env:FABRIC_ALIAS_PREFIX ?? 
''\n    $patternName = \"$($patternDir.Name)\"\n    $aliasName = \"$prefix$patternName\"\n    # Dynamically define a function for each pattern\n    $functionDefinition = @\"\nfunction $aliasName {\n    [CmdletBinding()]\n    param(\n        [Parameter(ValueFromPipeline = `$true)]\n        [string] `$InputObject,\n\n        [Parameter(ValueFromRemainingArguments = `$true)]\n        [String[]] `$patternArgs\n    )\n\n    begin {\n        # Initialize an array to collect pipeline input\n        `$collector = @()\n    }\n\n    process {\n        # Collect pipeline input objects\n        if (`$InputObject) {\n            `$collector += `$InputObject\n        }\n    }\n\n    end {\n        # Join all pipeline input into a single string, separated by newlines\n        `$pipelineContent = `$collector -join \"`n\"\n\n        # If there's pipeline input, include it in the call to fabric\n        if (`$pipelineContent) {\n            `$pipelineContent | fabric --pattern $patternName `$patternArgs\n        } else {\n            # No pipeline input; just call fabric with the additional args\n            fabric --pattern $patternName `$patternArgs\n        }\n    }\n}\n\"@\n    # Add the function to the current session\n    Invoke-Expression $functionDefinition\n}\n\n# Define the 'yt' function as well\nfunction yt {\n    [CmdletBinding()]\n    param(\n        [Parameter()]\n        [Alias(\"timestamps\")]\n        [switch]$t,\n\n        [Parameter(Position = 0, ValueFromPipeline = $true)]\n        [string]$videoLink\n    )\n\n    begin {\n        $transcriptFlag = \"--transcript\"\n        if ($t) {\n            $transcriptFlag = \"--transcript-with-timestamps\"\n        }\n    }\n\n    process {\n        if (-not $videoLink) {\n            Write-Error \"Usage: yt [-t | --timestamps] youtube-link\"\n            return\n        }\n    }\n\n    end {\n        if ($videoLink) {\n            # Execute and allow output to flow through the pipeline\n            fabric -y $videoLink 
$transcriptFlag\n        }\n    }\n}\n```\n\nThis also creates a `yt` alias that allows you to use `yt https://www.youtube.com/watch?v=4b0iet22VIk` to get transcripts, comments, and metadata.\n\n#### Save your files in markdown using aliases\n\nIf, in addition to the above aliases, you would like the option to save the output to your favorite markdown note vault, such as Obsidian, then add the following to your `.zshrc` or `.bashrc` file instead:\n\n```bash\n# Define the base directory for Obsidian notes\nobsidian_base=\"/path/to/obsidian\"\n\n# Loop through all files in the ~/.config/fabric/patterns directory\nfor pattern_file in ~/.config/fabric/patterns/*; do\n    # Get the base name of the file (i.e., remove the directory path)\n    pattern_name=$(basename \"$pattern_file\")\n\n    # Remove any existing alias with the same name\n    unalias \"$pattern_name\" 2\u003e/dev/null\n\n    # Define a function dynamically for each pattern\n    eval \"\n    $pattern_name() {\n        local title=\\$1\n        local date_stamp=\\$(date +'%Y-%m-%d')\n        local output_path=\\\"\\$obsidian_base/\\${date_stamp}-\\${title}.md\\\"\n\n        # Check if a title was provided\n        if [ -n \\\"\\$title\\\" ]; then\n            # If a title is provided, use the output path\n            fabric --pattern \\\"$pattern_name\\\" -o \\\"\\$output_path\\\"\n        else\n            # If no title is provided, use --stream\n            fabric --pattern \\\"$pattern_name\\\" --stream\n        fi\n    }\n    \"\ndone\n```\n\nThis lets you use the patterns as aliases, as above, for example `summarize` instead of `fabric --pattern summarize --stream`. However, if you pass an extra argument, such as `summarize \"my_article_title\"`, the output is saved under the directory you set in `obsidian_base=\"/path/to/obsidian\"` as `YYYY-MM-DD-my_article_title.md`, with the date generated automatically.\nYou can tweak the date 
format by adjusting the `date_stamp` format string.\n\n### Migration\n\nIf you have the Legacy (Python) version installed and want to migrate to the Go version, here's how you do it. It's basically two steps: 1) uninstall the Python version, and 2) install the Go version.\n\n```bash\n# Uninstall Legacy Fabric\npipx uninstall fabric\n\n# Clear any old Fabric aliases\n# (check your .bashrc, .zshrc, etc.)\n\n# Install the Go version\ngo install github.com/danielmiessler/fabric/cmd/fabric@latest\n\n# Run setup for the new version. Important because things have changed\nfabric --setup\n```\n\nThen [set your environment variables](#environment-variables) as shown above.\n\n### Upgrading\n\nThe great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.\n\n```bash\ngo install github.com/danielmiessler/fabric/cmd/fabric@latest\n```\n\n### Shell Completions\n\nFabric provides shell completion scripts for Zsh, Bash, and Fish\nshells, making it easier to use the CLI by providing tab completion\nfor commands and options.\n\n#### Quick install (no clone required)\n\nYou can install completions directly via a one-liner:\n\n```bash\ncurl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh | sh\n```\n\nOptional variants:\n\n```bash\n# Dry-run (see actions without changing your system)\ncurl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh | sh -s -- --dry-run\n\n# Override the download source (advanced)\nFABRIC_COMPLETIONS_BASE_URL=\"https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions\" \\\n    sh -c \"$(curl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh)\"\n```\n\n#### Zsh Completion\n\nTo enable Zsh completion:\n\n```bash\n# Copy the completion file to a directory in your 
$fpath\nmkdir -p ~/.zsh/completions\ncp completions/_fabric ~/.zsh/completions/\n\n# Add the directory to fpath in your .zshrc before compinit\necho 'fpath=(~/.zsh/completions $fpath)' \u003e\u003e ~/.zshrc\necho 'autoload -Uz compinit \u0026\u0026 compinit' \u003e\u003e ~/.zshrc\n```\n\n#### Bash Completion\n\nTo enable Bash completion:\n\n```bash\n# Source the completion script in your .bashrc\necho 'source /path/to/fabric/completions/fabric.bash' \u003e\u003e ~/.bashrc\n\n# Or copy to the system-wide bash completion directory\nsudo cp completions/fabric.bash /etc/bash_completion.d/\n```\n\n#### Fish Completion\n\nTo enable Fish completion:\n\n```bash\n# Copy the completion file to the fish completions directory\nmkdir -p ~/.config/fish/completions\ncp completions/fabric.fish ~/.config/fish/completions/\n```\n\n## Usage\n\nOnce you have it all set up, here's how to use it.\n\n```bash\nfabric -h\n```\n\n```plaintext\nUsage:\n  fabric [OPTIONS]\n\nApplication Options:\n  -p, --pattern=                    Choose a pattern from the available patterns\n  -v, --variable=                   Values for pattern variables, e.g. -v=#role:expert -v=#points:30\n  -C, --context=                    Choose a context from the available contexts\n      --session=                    Choose a session from the available sessions\n  -a, --attachment=                 Attachment path or URL (e.g. for OpenAI image recognition messages)\n  -S, --setup                       Run setup for all reconfigurable parts of fabric\n  -t, --temperature=                Set temperature (default: 0.7)\n  -T, --topp=                       Set top P (default: 0.9)\n  -s, --stream                      Stream\n  -P, --presencepenalty=            Set presence penalty (default: 0.0)\n  -r, --raw                         Use the defaults of the model without sending chat options\n                                    (temperature, top_p, etc.). 
Only affects OpenAI-compatible providers.\n                                    Anthropic models always use smart parameter selection to comply with\n                                    model-specific requirements.\n  -F, --frequencypenalty=           Set frequency penalty (default: 0.0)\n  -l, --listpatterns                List all patterns\n  -L, --listmodels                  List all available models\n  -x, --listcontexts                List all contexts\n  -X, --listsessions                List all sessions\n  -U, --updatepatterns              Update patterns\n  -c, --copy                        Copy to clipboard\n  -m, --model=                      Choose model\n  -V, --vendor=                     Specify vendor for chosen model (e.g., -V \"LM Studio\" -m openai/gpt-oss-20b)\n      --modelContextLength=         Model context length (only affects ollama)\n  -o, --output=                     Output to file\n      --output-session              Output the entire session (also a temporary one) to the output file\n  -n, --latest=                     Number of latest patterns to list (default: 0)\n  -d, --changeDefaultModel          Change default model\n  -y, --youtube=                    YouTube video or playlist \"URL\" to grab transcript, comments from it\n                                    and send to chat or print it out to the console and store it in the\n                                    output file\n      --playlist                    Prefer playlist over video if both ids are present in the URL\n      --transcript                  Grab transcript from YouTube video and send to chat (it is used by\n                                    default).\n      --transcript-with-timestamps  Grab transcript from YouTube video with timestamps and send to chat\n      --visual                      Extract visual data from video using OCR and FFmpeg\n      --visual-sensitivity          Tolerance for FFmpeg scene detection (0.0 - 1.0)\n      --visual-fps                  
Extract a specific number of frames per second instead of using scene detection\n      --comments                    Grab comments from YouTube video and send to chat\n      --metadata                    Output video metadata\n  -g, --language=                   Specify the Language Code for the chat, e.g. -g=en -g=zh\n  -u, --scrape_url=                 Scrape website URL to markdown using Jina AI\n  -q, --scrape_question=            Search question using Jina AI\n  -e, --seed=                       Seed to be used for LLM generation\n  -w, --wipecontext=                Wipe context\n  -W, --wipesession=                Wipe session\n      --printcontext=               Print context\n      --printsession=               Print session\n      --readability                 Convert HTML input into a clean, readable view\n      --input-has-vars              Apply variables to user input\n      --no-variable-replacement     Disable pattern variable replacement\n      --dry-run                     Show what would be sent to the model without actually sending it\n      --serve                       Serve the Fabric REST API\n      --serveOllama                 Serve the Fabric REST API with ollama endpoints\n      --address=                    The address to bind the REST API (default: :8080)\n      --api-key=                    API key used to secure server routes\n      --config=                     Path to YAML config file\n      --version                     Print current version\n      --listextensions              List all registered extensions\n      --addextension=               Register a new extension from config file path\n      --rmextension=                Remove a registered extension by name\n      --strategy=                   Choose a strategy from the available strategies\n      --liststrategies              List all strategies\n      --listvendors                 List all vendors\n      --shell-complete-list         Output raw list without 
headers/formatting (for shell completion)\n      --search                      Enable web search tool for supported models (Anthropic, OpenAI, Gemini)\n      --search-location=            Set location for web search results (e.g., 'America/Los_Angeles')\n      --image-file=                 Save generated image to specified file path (e.g., 'output.png')\n      --image-size=                 Image dimensions: 1024x1024, 1536x1024, 1024x1536, auto (default: auto)\n      --image-quality=              Image quality: low, medium, high, auto (default: auto)\n      --image-compression=          Compression level 0-100 for JPEG/WebP formats (default: not set)\n      --image-background=           Background type: opaque, transparent (default: opaque, only for\n                                    PNG/WebP)\n      --suppress-think              Suppress text enclosed in thinking tags\n      --think-start-tag=            Start tag for thinking sections (default: \u003cthink\u003e)\n      --think-end-tag=              End tag for thinking sections (default: \u003c/think\u003e)\n      --disable-responses-api       Disable OpenAI Responses API (default: false)\n      --voice=                      TTS voice name for supported models (e.g., Kore, Charon, Puck)\n                                    (default: Kore)\n      --list-gemini-voices          List all available Gemini TTS voices\n      --notification                Send desktop notification when command completes\n      --notification-command=       Custom command to run for notifications (overrides built-in\n                                    notifications)\n      --yt-dlp-args=                Additional arguments to pass to yt-dlp (e.g. 
'--cookies-from-browser brave')\n      --thinking=                   Set reasoning/thinking level (e.g., off, low, medium, high, or\n                                    numeric tokens for Anthropic or Google Gemini)\n      --show-metadata               Print metadata (input/output tokens) to stderr\n      --debug=                      Set debug level (0: off, 1: basic, 2: detailed, 3: trace)\nHelp Options:\n  -h, --help                        Show this help message\n```\n\n### Debug Levels\n\nUse the `--debug` flag to control runtime logging:\n\n- `0`: off (default)\n- `1`: basic debug info\n- `2`: detailed debugging\n- `3`: trace level\n\n### Dry Run Mode\n\nUse `--dry-run` to preview what would be sent to the AI model without making an API call:\n\n```bash\necho \"test input\" | fabric --dry-run -p summarize\n```\n\nThis is useful for debugging patterns, checking prompt construction, and verifying input formatting before using API credits.\n\n### Extensions\n\nFabric supports extensions that can be called within patterns. See the [Extension Guide](internal/plugins/template/Examples/README.md) for complete documentation.\n\n**Important:** Extensions only work within pattern files, not via direct stdin. See the guide for details and examples.\n\n## REST API Server\n\nFabric includes a built-in REST API server that exposes all core functionality over HTTP. Start the server with:\n\n```bash\nfabric --serve\n```\n\nThe server provides endpoints for:\n\n- Chat completions with streaming responses\n- Pattern management (create, read, update, delete)\n- Context and session management\n- Model and vendor listing\n- YouTube transcript extraction\n- Configuration management\n\nFor complete endpoint documentation, authentication setup, and usage examples, see [REST API Documentation](docs/rest-api.md).\n\n### Ollama Compatibility Mode\n\nFabric can serve as a drop-in replacement for Ollama by exposing Ollama-compatible API endpoints. 
Start the server with:\n\n```bash\nfabric --serve --serveOllama\n```\n\nThis enables the following Ollama-compatible endpoints:\n\n- `GET /api/tags` - List available patterns as models\n- `POST /api/chat` - Chat completions\n- `GET /api/version` - Server version\n\nApplications configured to use the Ollama API can point to your Fabric server instead, allowing you to use any of Fabric's supported AI providers through the Ollama interface. Patterns appear as models (e.g., `summarize:latest`).\n\n## Our approach to prompting\n\nFabric _Patterns_ are different from most prompts you'll see.\n\n- **First, we use `Markdown` to help ensure maximum readability and editability**. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. _Importantly, this also includes the AI you're sending it to!_\n\nHere's an example of a Fabric Pattern.\n\n```bash\nhttps://github.com/danielmiessler/Fabric/blob/main/data/patterns/extract_wisdom/system.md\n```\n\n\u003cimg width=\"1461\" alt=\"pattern-example\" src=\"https://github.com/danielmiessler/fabric/assets/50654/b910c551-9263-405f-9735-71ca69bbab6d\"\u003e\n\n- **Next, we are extremely clear in our instructions**, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.\n\n- **And finally, we tend to use the System section of the prompt almost exclusively**. In over a year of being heads-down with this stuff, we've just seen more efficacy from doing that. If that changes, or we're shown data that says otherwise, we will adjust.\n\n## Examples\n\n\u003e The following examples use the macOS `pbpaste` command to paste from the clipboard. See the [pbpaste](#pbpaste) section below for Windows and Linux alternatives.\n\nNow let's look at some things you can do with Fabric.\n\n1. Run the `summarize` Pattern based on input from `stdin`. In this case, the body of an article.\n\n    ```bash\n    pbpaste | fabric --pattern summarize\n    ```\n\n2. 
Run the `analyze_claims` Pattern with the `--stream` option to get immediate and streaming results.\n\n    ```bash\n    pbpaste | fabric --stream --pattern analyze_claims\n    ```\n\n3. Run the `extract_wisdom` Pattern with the `--stream` option to get immediate and streaming results from any YouTube video (much like in the original introduction video).\n\n    ```bash\n    fabric -y \"https://youtube.com/watch?v=uXs-zPc63kM\" --stream --pattern extract_wisdom\n    ```\n\n4. Create your own Patterns: write the pattern in a `.md` file and save it to `~/.config/fabric/patterns/[yourpatternname]`.\n\n5. Run the `analyze_claims` Pattern on a website. Fabric uses Jina AI to scrape the URL into markdown format before sending it to the model.\n\n    ```bash\n    fabric -u https://github.com/danielmiessler/fabric/ -p analyze_claims\n    ```\n\n## Just use the Patterns\n\n\u003cimg width=\"1173\" alt=\"fabric-patterns-screenshot\" src=\"https://github.com/danielmiessler/fabric/assets/50654/9186a044-652b-4673-89f7-71cf066f32d8\"\u003e\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\nIf you're not looking to do anything fancy and you just want a lot of great prompts, you can navigate to the [`/patterns`](https://github.com/danielmiessler/fabric/tree/main/data/patterns) directory and start exploring!\n\nWe hope that even if you used nothing else from Fabric, the Patterns by themselves will make the project useful.\n\nYou can use any of the Patterns you see there in any AI application that you have, whether that's ChatGPT or some other app or website. 
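Since a Pattern is just a Markdown file saved under `~/.config/fabric/patterns/` (see step 4 above), you can sketch one by hand and use it both in Fabric and anywhere else. A minimal illustration, where the `my_summarizer` name and the prompt text are invented for this example:

```bash
# Hypothetical example: scaffold a minimal Pattern by hand.
# The name "my_summarizer" and the prompt text are placeholders.
PATTERN_DIR="$HOME/.config/fabric/patterns/my_summarizer"
mkdir -p "$PATTERN_DIR"

cat > "$PATTERN_DIR/system.md" <<'EOF'
# IDENTITY and PURPOSE

You are an expert at summarizing long-form content.

# OUTPUT

Produce a concise, 5-sentence summary of the input.
EOF

# The Pattern is now usable as: fabric --pattern my_summarizer
# or can be pasted into any chat UI.
cat "$PATTERN_DIR/system.md"
```

The same Markdown conventions used by the built-in Patterns (clear headed sections, explicit output instructions) apply here as well.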
Our plan and prediction is that people will soon be sharing many more than those we've published, and they will be way better than ours.\n\nThe wisdom of crowds for the win.\n\n### Prompt Strategies\n\nFabric also implements prompt strategies like \"Chain of Thought\" or \"Chain of Draft\", which can\nbe used in addition to the basic patterns.\n\nSee the [Thinking Faster by Writing Less](https://arxiv.org/pdf/2502.18600) paper and\nthe [Thought Generation section of Learn Prompting](https://learnprompting.org/docs/advanced/thought_generation/introduction) for examples of prompt strategies.\n\nEach strategy is available as a small `json` file in the [`/strategies`](https://github.com/danielmiessler/fabric/tree/main/data/strategies) directory.\n\nA strategy's prompt modification is applied to the system prompt and passed on to the\nLLM in the chat session.\n\nUse `fabric -S` and select the option to install the strategies in your `~/.config/fabric` directory.\n\n#### Available Strategies\n\nFabric includes several prompt strategies:\n\n- `cot` - Chain-of-Thought: Step-by-step reasoning\n- `cod` - Chain-of-Draft: Iterative drafting with minimal notes (5 words max per step)\n- `tot` - Tree-of-Thought: Generate multiple reasoning paths and select the best one\n- `aot` - Atom-of-Thought: Break problems into smallest independent atomic sub-problems\n- `ltm` - Least-to-Most: Solve problems from easiest to hardest sub-problems\n- `self-consistent` - Self-Consistency: Multiple reasoning paths with consensus\n- `self-refine` - Self-Refinement: Answer, critique, and refine\n- `reflexion` - Reflexion: Answer, critique briefly, and provide refined answer\n- `standard` - Standard: Direct answer without explanation\n\nUse the `--strategy` flag to apply a strategy:\n\n```bash\necho \"Analyze this code\" | fabric --strategy cot -p analyze_code\n```\n\nList all available strategies with:\n\n```bash\nfabric --liststrategies\n```\n\nStrategies are stored as JSON files in 
`~/.config/fabric/strategies/`. See the default strategies for the format specification.\n\n## Custom Patterns\n\nYou may want to use Fabric to create your own custom Patterns, but not share them with others. No problem!\n\nFabric supports a dedicated custom patterns directory that keeps your personal patterns separate from the built-in ones. This means your custom patterns won't be overwritten when you update Fabric's built-in patterns.\n\n### Setting Up Custom Patterns\n\n1. Run the Fabric setup:\n\n   ```bash\n   fabric --setup\n   ```\n\n2. Select the \"Custom Patterns\" option from the Tools menu and enter your desired directory path (e.g., `~/my-custom-patterns`).\n\n3. Fabric will automatically create the directory if it does not exist.\n\n### Using Custom Patterns\n\n1. Create your custom pattern directory structure:\n\n   ```bash\n   mkdir -p ~/my-custom-patterns/my-analyzer\n   ```\n\n2. Create your pattern file:\n\n   ```bash\n   echo \"You are an expert analyzer of ...\" \u003e ~/my-custom-patterns/my-analyzer/system.md\n   ```\n\n3. Use your custom pattern:\n\n   ```bash\n   fabric --pattern my-analyzer \"analyze this text\"\n   ```\n\n### How It Works\n\n- **Priority System**: Custom patterns take precedence over built-in patterns with the same name\n- **Seamless Integration**: Custom patterns appear in `fabric --listpatterns` alongside built-in ones\n- **Update Safe**: Your custom patterns are never affected by `fabric --updatepatterns`\n- **Private by Default**: Custom patterns remain private unless you explicitly share them\n\n## Helper Apps\n\nFabric also makes use of some core helper apps (tools) to make it easier to integrate with your various workflows. Here are some examples:\n\n### `to_pdf`\n\n`to_pdf` is a helper command that converts LaTeX files to PDF format. 
You can use it like this:\n\n```bash\nto_pdf input.tex\n```\n\nThis will create a PDF file from the input LaTeX file in the same directory.\n\nYou can also use it with stdin, which works perfectly with the `write_latex` pattern:\n\n```bash\necho \"ai security primer\" | fabric --pattern write_latex | to_pdf\n```\n\nThis will create a PDF file named `output.pdf` in the current directory.\n\n### `to_pdf` Installation\n\nInstall `to_pdf` the same way you install Fabric, just with a different package path:\n\n```bash\ngo install github.com/danielmiessler/fabric/cmd/to_pdf@latest\n```\n\nMake sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as `to_pdf` requires `pdflatex` to be available in your system's PATH.\n\n### `code2context`\n\n`code2context` is used in conjunction with the `create_coding_feature` pattern.\nIt generates a `json` representation of a directory of code that can be fed into an AI model\nwith instructions to create a new feature or edit the code in a specified way.\n\nSee [the Create Coding Feature Pattern README](./data/patterns/create_coding_feature/README.md) for details.\n\nInstall it first using:\n\n```bash\ngo install github.com/danielmiessler/fabric/cmd/code2context@latest\n```\n\n### `generate_changelog`\n\n`generate_changelog` generates changelogs from git commit history and GitHub pull requests. 
It walks through your repository's git history, extracts PR information, and produces well-formatted markdown changelogs.\n\n```bash\ngenerate_changelog --help\n```\n\nFeatures include SQLite caching for fast incremental updates, GitHub GraphQL API integration for efficient PR fetching, and optional AI-enhanced summaries using Fabric.\n\nInstall it using:\n\n```bash\ngo install github.com/danielmiessler/fabric/cmd/generate_changelog@latest\n```\n\nSee the [generate_changelog README](./cmd/generate_changelog/README.md) for detailed usage and options.\n\n## pbpaste\n\nThe [examples](#examples) use the macOS program `pbpaste` to paste content from the clipboard to pipe into `fabric` as the input. `pbpaste` is not available on Windows or Linux, but there are alternatives.\n\nOn Windows, you can use the PowerShell command `Get-Clipboard` from a PowerShell command prompt. If you like, you can also alias it to `pbpaste`. If you are using classic PowerShell, edit the file `~\\Documents\\WindowsPowerShell\\profile.ps1`, or if you are using PowerShell Core, edit `~\\Documents\\PowerShell\\profile.ps1`, and add the alias:\n\n```powershell\nSet-Alias pbpaste Get-Clipboard\n```\n\nOn Linux, you can use `xclip -selection clipboard -o` to paste from the clipboard. You will likely need to install `xclip` with your package manager. For Debian-based systems including Ubuntu:\n\n```sh\nsudo apt update\nsudo apt install xclip -y\n```\n\nYou can also create an alias by editing `~/.bashrc` or `~/.zshrc` and adding the alias:\n\n```sh\nalias pbpaste='xclip -selection clipboard -o'\n```\n\n## Web Interface (Fabric Web App)\n\nFabric now includes a built-in web interface that provides a GUI alternative to the command-line interface. 
Refer to [Web App README](/web/README.md) for installation instructions and an overview of features.\n\n## Meta\n\n\u003e [!NOTE]\n\u003e Special thanks to the following people for their inspiration and contributions!\n\n- _Jonathan Dunn_ for being the absolute MVP dev on the project, including spearheading the new Go version, as well as the GUI! All this while also being a full-time medical doctor!\n- _Caleb Sima_ for pushing me over the edge of whether to make this a public project or not.\n- _Eugen Eisler_ and _Frederick Ros_ for their invaluable contributions to the Go version.\n- _David Peters_ for his work on the web interface.\n- _Joel Parish_ for super useful input on the project's GitHub directory structure.\n- _Joseph Thacker_ for the idea of a `-c` context flag that adds pre-created context in the `~/.config/fabric/` directory to all Pattern queries.\n- _Jason Haddix_ for the idea of a stitch (chained Pattern) to filter content using a local model before sending on to a cloud model, i.e., cleaning customer data using `llama2` before sending on to `gpt-4` for analysis.\n- _Andre Guerra_ for assisting with numerous components to make things simpler and more maintainable.\n\n### Primary contributors\n\n\u003ca href=\"https://github.com/danielmiessler\"\u003e\u003cimg src=\"https://avatars.githubusercontent.com/u/50654?v=4\" title=\"Daniel Miessler\" width=\"50\" height=\"50\" alt=\"Daniel Miessler\"\u003e\u003c/a\u003e\n\u003ca href=\"https://github.com/xssdoctor\"\u003e\u003cimg src=\"https://avatars.githubusercontent.com/u/9218431?v=4\" title=\"Jonathan Dunn\" width=\"50\" height=\"50\" alt=\"Jonathan Dunn\"\u003e\u003c/a\u003e\n\u003ca href=\"https://github.com/sbehrens\"\u003e\u003cimg src=\"https://avatars.githubusercontent.com/u/688589?v=4\" title=\"Scott Behrens\" width=\"50\" height=\"50\" alt=\"Scott Behrens\"\u003e\u003c/a\u003e\n\u003ca href=\"https://github.com/agu3rra\"\u003e\u003cimg src=\"https://avatars.githubusercontent.com/u/10410523?v=4\" 
title=\"Andre Guerra\" width=\"50\" height=\"50\" alt=\"Andre Guerra\"\u003e\u003c/a\u003e\n\n### Contributors\n\n\u003ca href=\"https://github.com/danielmiessler/fabric/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=danielmiessler/fabric\" alt=\"contrib.rocks\" /\u003e\n\u003c/a\u003e\n\nMade with [contrib.rocks](https://contrib.rocks).\n\n`fabric` was created by \u003ca href=\"https://danielmiessler.com/subscribe\" target=\"_blank\"\u003eDaniel Miessler\u003c/a\u003e in January of 2024.\n\u003cbr /\u003e\u003cbr /\u003e\n\u003ca href=\"https://twitter.com/intent/user?screen_name=danielmiessler\"\u003e![X (formerly Twitter) Follow](https://img.shields.io/twitter/follow/danielmiessler)\u003c/a\u003e\n\n## 💜 Support This Project\n\n\u003cdiv align=\"center\"\u003e\n\n\u003cimg src=\"https://img.shields.io/badge/Sponsor-❤️-EA4AAA?style=for-the-badge\u0026logo=github-sponsors\u0026logoColor=white\" alt=\"Sponsor\"\u003e\n\n**I spend hundreds of hours a year on open source. If you'd like to help support this project, you can [sponsor me here](https://github.com/sponsors/danielmiessler). 
🙏🏼**\n\n\u003c/div\u003e\n