{"id":22111729,"url":"https://github.com/agentmaker/neural-renderer-paddle","last_synced_at":"2026-03-10T14:32:09.556Z","repository":{"id":112830264,"uuid":"436687949","full_name":"AgentMaker/neural-renderer-paddle","owner":"AgentMaker","description":"A PaddlePaddle version of Neural Renderer, refer to its PyTorch version","archived":false,"fork":false,"pushed_at":"2021-12-11T07:56:29.000Z","size":2840,"stargazers_count":13,"open_issues_count":0,"forks_count":2,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-07-25T11:29:43.463Z","etag":null,"topics":["deep-learning","neural-renderer","paddlepaddle"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/AgentMaker.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-12-09T16:36:52.000Z","updated_at":"2022-07-12T08:01:31.000Z","dependencies_parsed_at":null,"dependency_job_id":"29875edc-085d-427c-9ba2-91ea469f8ceb","html_url":"https://github.com/AgentMaker/neural-renderer-paddle","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"purl":"pkg:github/AgentMaker/neural-renderer-paddle","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AgentMaker%2Fneural-renderer-paddle","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AgentMaker%2Fneural-renderer-paddle/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AgentMaker%2Fneural-renderer-paddle/releases","manifests_url":"https://repos.ecosyst
e.ms/api/v1/hosts/GitHub/repositories/AgentMaker%2Fneural-renderer-paddle/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/AgentMaker","download_url":"https://codeload.github.com/AgentMaker/neural-renderer-paddle/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AgentMaker%2Fneural-renderer-paddle/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":30337210,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-10T12:41:07.687Z","status":"ssl_error","status_checked_at":"2026-03-10T12:41:06.728Z","response_time":106,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","neural-renderer","paddlepaddle"],"created_at":"2024-12-01T10:51:50.273Z","updated_at":"2026-03-10T14:32:09.548Z","avatar_url":"https://github.com/AgentMaker.png","language":"Python","readme":"# Neural 3D Mesh Renderer in PaddlePaddle\n\nA PaddlePaddle version of [Neural Renderer](https://github.com/hiroharu-kato/neural_renderer), based on its [PyTorch version](https://github.com/daniilidis-group/neural_renderer).\n\n## Install\n\nRun:\n```sh\npip install neural-renderer-paddle\n```\n\n## Usage\n\nSee the `examples` folder for usage.\n\n## Note\n\nThe unittest module is not compatible with PaddlePaddle. 
If you want to run the tests, run:\n```sh\ngit clone \u003cTHIS REPO\u003e neural_renderer_paddle\npip install neural_renderer_paddle/\ncd neural_renderer_paddle/tests\npython -m unittest test_*.py\n```\n\n---\n\nOriginal repo README:\n\n# Neural 3D Mesh Renderer (CVPR 2018)\n\nThis repo contains a PyTorch implementation of the paper [Neural 3D Mesh Renderer](http://hiroharu-kato.com/projects_en/neural_renderer.html) by Hiroharu Kato, Yoshitaka Ushiku, and Tatsuya Harada.\nIt is a port of the [original Chainer implementation](https://github.com/hiroharu-kato/neural_renderer) released by the authors.\nCurrently the API is the same as in the original implementation with some small additions (e.g. rendering with a general 3x4 camera matrix, lens distortion coefficients, etc.). However, it is possible that it will change in the future.\n\nThe library is fully functional and passes all the test cases supplied by the authors of the original library.\nDetailed documentation will be added in the near future.\n## Requirements\nPython 2.7+ and PyTorch 0.4.0.\n\nThe code has been tested only with PyTorch 0.4.0; there are no guarantees that it is compatible with older versions.\nCurrently the library supports both Python 3 and Python 2.\n\n**Note**: In some newer PyTorch versions you might see compilation errors involving AT_ASSERT. In these cases you can use the version of the code in the branch *at_assert_fix*. 
These changes will be merged into master in the near future.\n## Installation\nYou can install the package by running\n```\npip install neural_renderer_pytorch\n```\nSince running install.py requires PyTorch, make sure to install PyTorch before running the above command.\n## Running examples\n```\npython ./examples/example1.py\npython ./examples/example2.py\npython ./examples/example3.py\npython ./examples/example4.py\n```\n\n\n## Example 1: Drawing an object from multiple viewpoints\n\n![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example1.gif)\n\n## Example 2: Optimizing vertices\n\nTransforming the silhouette of a teapot into a rectangle. The loss function is the difference between the rendered image and the reference image.\n\nReference image, optimization, and the result.\n\n![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example2_ref.png) ![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example2_optimization.gif) ![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example2_result.gif)\n\n## Example 3: Optimizing textures\n\nMatching the color of a teapot with a reference image.\n\nReference image, result.\n\n![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example3_ref.png) ![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example3_result.gif)\n\n## Example 4: Finding camera parameters\n\nThe derivative of images with respect to camera pose can be computed through this renderer. 
In this example, the position of the camera is optimized by gradient descent.\n\nFrom left to right: reference image, initial state, and optimization process.\n\n![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example4_ref.png) ![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example4_init.png) ![](https://raw.githubusercontent.com/hiroharu-kato/neural_renderer/master/examples/data/example4_result.gif)\n\n\n## Citation\n\n```\n@InProceedings{kato2018renderer,\n    title={Neural 3D Mesh Renderer},\n    author={Kato, Hiroharu and Ushiku, Yoshitaka and Harada, Tatsuya},\n    booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},\n    year={2018}\n}\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fagentmaker%2Fneural-renderer-paddle","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fagentmaker%2Fneural-renderer-paddle","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fagentmaker%2Fneural-renderer-paddle/lists"}