{"id":17445624,"url":"https://github.com/celvoxes/thinkr","last_synced_at":"2025-04-19T13:43:34.968Z","repository":{"id":257874492,"uuid":"861324924","full_name":"CelVoxes/thinkR","owner":"CelVoxes","description":"gpt-o1 like chain of thoughts with local LLMs in R","archived":false,"fork":false,"pushed_at":"2024-10-15T16:07:41.000Z","size":478,"stargazers_count":32,"open_issues_count":0,"forks_count":3,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-03-29T08:22:33.953Z","etag":null,"topics":["chainofthought","cot","llm","ollama","r"],"latest_commit_sha":null,"homepage":"","language":"R","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/CelVoxes.png","metadata":{"files":{"readme":"README.Rmd","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-09-22T15:44:32.000Z","updated_at":"2025-03-18T20:43:35.000Z","dependencies_parsed_at":"2024-10-19T20:59:07.163Z","dependency_job_id":null,"html_url":"https://github.com/CelVoxes/thinkR","commit_stats":null,"previous_names":["celvoxes/thinkr"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/CelVoxes%2FthinkR","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/CelVoxes%2FthinkR/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/CelVoxes%2FthinkR/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/CelVoxes%2FthinkR/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/CelVoxes","download_url":"https://codeload.github.com/CelVoxes/thinkR/t
ar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":249705381,"owners_count":21313351,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chainofthought","cot","llm","ollama","r"],"created_at":"2024-10-17T18:06:22.602Z","updated_at":"2025-04-19T13:43:34.936Z","avatar_url":"https://github.com/CelVoxes.png","language":"R","readme":"---\ntitle: \"thinkR\"\noutput: github_document\n---\n\n\n![](logo.webp)\n\nthinkR is an R package that enables o1-like chain-of-thought reasoning with local LLMs via Ollama.\n\n## Installation\n\nTo install thinkR, use the following command:\n```{r eval=FALSE}\ndevtools::install_github(\"eonurk/thinkR\")\n```\n\n## Usage\n\n#### Step 1: Install Ollama\n\nDownload [`Ollama`](https://ollama.com/).\n\n#### Step 2: Choose Your Model\n\nSelect your preferred model. For instance, to run the Llama 3.1 model, use the following terminal command:\n\n```{bash eval=FALSE}\nollama run llama3.1\n```\n\nThis starts a local server, which you can verify by visiting http://localhost:11434/. 
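\n\nYou can run the same check from inside R (a minimal sketch; uses only base R and assumes Ollama is listening on its default port):\n\n```{r eval=FALSE}\n# Read the server's landing page; it should say \"Ollama is running\"\ncat(readLines(\"http://localhost:11434/\", warn = FALSE), sep = \"\\n\")\n```\n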
The page should display \"Ollama is running\".\n\n#### Step 3: Think!\n\n\u003e Q: How many 'R's are in strawberry?\n```{r cache=TRUE, eval=FALSE}\nlibrary(thinkR)\n\n## Usage example\nollama \u003c- OllamaHandler$new(model = \"llama3.1\")\nresult \u003c- generate_response(\"How many 'R's are in strawberry?\", ollama)\n```\n\n\u003cdetails\u003e\u003csummary\u003eThinking...\u003c/summary\u003e\n```{r echo=FALSE}\nresult \u003c- readRDS(\"llama3_1.results.rds\")\n\ncat(paste0(\n    sapply(result$steps, function(m) {\n        if (!is.null(m$title) \u0026\u0026 !is.null(m$content) \u0026\u0026 !is.null(m$thinking_time)) {\n            sprintf(\"%s\\n%s\\nTime: %s s\\n\\n\", m$title, m$content, m$thinking_time)\n        }\n    }, USE.NAMES = FALSE)\n))\n```\n\u003c/details\u003e\n\n```{r echo=FALSE}\ntotal_thinking_time \u003c- result$total_thinking_time\ncat(paste0(\"Total thinking time: \", sprintf(\"%.2f\", total_thinking_time), \" s\"))\n```\n\n\u003e A: Therefore, the final answer is 3!\n\n\u003cbr\u003e\n\n\u003e [!NOTE]\\\n\u003e This output is cherry-picked from llama3.1... because 70b was a lot to download for my internet connection. \n\n## Acknowledgments\n\nCursor is a great tool \u003c3\n\n### Credits\n- [g1](https://github.com/bklieger-groq/g1)\n- [multi1](https://github.com/tcsenpai/multi1)\n\n## License\nThis project is licensed under the MIT License.\n\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcelvoxes%2Fthinkr","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcelvoxes%2Fthinkr","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcelvoxes%2Fthinkr/lists"}