# DevOps Assistant 🤖

The DevOps Assistant is an AI-powered tool designed to help DevOps engineers and system administrators automate tasks, execute commands on remote servers, and generate accurate Bash commands using a local LLM (Large Language Model). It integrates with Streamlit for a user-friendly interface and uses SQLite for command history and caching.

---

## Features ✨

- **SSH Integration**: Connect to remote servers securely via SSH.
- **AI-Powered Command Generation**: Use a local LLM (e.g., Ollama) to generate accurate Bash commands.
- **Command Execution**: Execute commands on remote servers and view results in real time.
- **Command History**: Store and retrieve past commands and responses for future reference.
- **Caching Mechanism**: Cache frequently used commands to improve response times.
- **User Authentication**: Secure access with optional user authentication.
- **Streamlit UI**: Intuitive and interactive web-based interface.

---

## Prerequisites 📊

Before running the DevOps Assistant, ensure you have the following installed:

- **Python 3.8+**: [Download Python](https://www.python.org/downloads/)
- **Ollama**: A local LLM server. [Install Ollama](https://ollama.ai)
- **Streamlit**: For the web interface.
- **Paramiko**: For SSH connections.
- **SQLite3**: For database storage (included with Python).

---

## Installation 🛠️

1. **Clone the repository**:
   ```bash
   git clone https://github.com/Rhul27/DevOpsAssistant.git
   cd DevOpsAssistant
   ```

2. **Install dependencies**:
   ```bash
   pip install -r requirements.txt
   ```

3. **Set up Ollama**:

   Install Ollama and start the server:
   ```bash
   ollama serve
   ```

   Download a model (e.g., llama3.2):
   ```bash
   ollama pull llama3.2
   ```

4. **Run the Streamlit app**:
   ```bash
   streamlit run main.py
   ```

5. **Access the app**:
   Open your browser and navigate to [http://localhost:8501](http://localhost:8501).

---

## Usage 🚀

### Connect to the Server

1. Enter the server's IP address, username, and password in the sidebar.
2. Click **"Connect to Server"**.

### Connect to the LLM Model

1. Select a model from the dropdown in the sidebar.
2. Click **"Connect to LLM Model"**.

### Ask a Question

1. Enter your question in the main input box (e.g., "How do I check disk usage on Linux?").
2. Click **"Submit"** to get a response.

### View Command History

All executed commands and responses are stored in the database and displayed in the **Command History** section.

---

## Folder Structure 🗂️

```
DevOpsAssistant/
├── Core/                     # Core functionality
│   ├── func.py               # SSH, LLM, and command execution
│   ├── database.py           # Database operations
│   ├── auth.py               # User authentication
│   └── utils.py              # Utility functions
├── models/                   # Data models
│   └── command_history.py    # SQLite model for command history
├── main.py                   # Streamlit app entry point
├── requirements.txt          # Python dependencies
└── devops_assistant.db       # SQLite database file
```

---

## Configuration ⚙️

- **Ollama Server URL**: Defaults to `http://localhost:11434`. Update it in the sidebar if needed.
- **Default Model**: Set to `llama3.2`. Change it in `func.py` if required.
- **SSH Timeout**: Defaults to 10 seconds. Adjust it in `func.py`.

---

## Contributing 🤝

Contributions are welcome! To contribute:

1. Fork the repository.
2. Create a new branch:
   ```bash
   git checkout -b feature/YourFeatureName
   ```
3. Commit your changes:
   ```bash
   git commit -m 'Add some feature'
   ```
4. Push to the branch:
   ```bash
   git push origin feature/YourFeatureName
   ```
5. Open a pull request.

---

## License 📝

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

---

## Acknowledgments 🙏

- **Ollama**: For providing the local LLM server.
- **Streamlit**: For the easy-to-use web interface.
- **Paramiko**: For SSH connectivity.
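
---

## Example: Command Generation via Ollama (illustrative) 💡

As a sketch of the AI-powered command generation described above: Ollama's standard `/api/generate` endpoint on `http://localhost:11434` can be asked for a bare Bash command. The helper names and prompt wording here are illustrative assumptions, not code from `Core/func.py`.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama server URL (see Configuration)


def build_payload(question: str, model: str = "llama3.2") -> dict:
    """Build a non-streaming /api/generate request asking for a single Bash command."""
    prompt = (
        "You are a DevOps assistant. Reply with a single Bash command only, "
        "no explanation.\nTask: " + question
    )
    return {"model": model, "prompt": prompt, "stream": False}


def generate_command(question: str, model: str = "llama3.2") -> str:
    """POST the payload to the local Ollama server and return the model's reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_payload(question, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["response"].strip()
```

With `ollama serve` running and `llama3.2` pulled, `generate_command("check disk usage")` returns whatever command the model suggests; treat the reply as untrusted input and review it before execution.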
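
## Example: SQLite History and Caching (illustrative) 💡

The command-history and caching features can be sketched with Python's built-in `sqlite3`. The table name, columns, and upsert strategy below are assumptions for illustration; the repository's actual schema lives in `models/command_history.py` and `Core/database.py`.

```python
import sqlite3


def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open the database and create the history table if missing.

    ':memory:' keeps this demo self-contained; the app uses devops_assistant.db."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS command_history (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               question TEXT UNIQUE,
               command TEXT,
               response TEXT,
               created_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn


def cached_command(conn: sqlite3.Connection, question: str):
    """Return a previously generated command for this question, or None on a cache miss."""
    row = conn.execute(
        "SELECT command FROM command_history WHERE question = ?", (question,)
    ).fetchone()
    return row[0] if row else None


def save(conn: sqlite3.Connection, question: str, command: str, response: str = "") -> None:
    """Insert the generated command, or refresh the cached one for a repeated question."""
    conn.execute(
        "INSERT INTO command_history (question, command, response) VALUES (?, ?, ?) "
        "ON CONFLICT(question) DO UPDATE SET command = excluded.command",
        (question, command, response),
    )
    conn.commit()
```

Checking `cached_command` before calling the LLM is what lets repeated questions skip a model round-trip and return instantly.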
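
## Example: Remote Execution over SSH (illustrative) 💡

The remote-execution step can be sketched with Paramiko. The function names and the output-formatting helper are illustrative, not the repository's actual `Core/func.py` code; Paramiko is imported lazily so the sketch loads even where it isn't installed.

```python
def run_remote(host: str, username: str, password: str, command: str,
               timeout: float = 10.0):
    """Run one command over SSH and return (stdout, stderr, exit_status)."""
    import paramiko  # deferred so the module imports without paramiko present

    client = paramiko.SSHClient()
    # Auto-accepting unknown host keys is for demos only; pin keys in production.
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=username, password=password, timeout=timeout)
    try:
        _stdin, stdout, stderr = client.exec_command(command, timeout=timeout)
        out = stdout.read().decode()
        err = stderr.read().decode()
        status = stdout.channel.recv_exit_status()
        return out, err, status
    finally:
        client.close()


def format_result(out: str, err: str, status: int) -> str:
    """Render a result the way a UI might: stdout on success, stderr tagged on failure."""
    return out if status == 0 else f"[exit {status}] {err.strip()}"
```

A 10-second connect timeout matches the default noted in Configuration; raise it for slow links.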