{"id":13435204,"url":"https://github.com/getnamo/TensorFlow-Unreal","last_synced_at":"2025-03-18T02:31:18.290Z","repository":{"id":38354953,"uuid":"75088875","full_name":"getnamo/TensorFlow-Unreal","owner":"getnamo","description":"TensorFlow plugin for the Unreal Engine.","archived":false,"fork":false,"pushed_at":"2024-08-30T11:45:10.000Z","size":509,"stargazers_count":1152,"open_issues_count":36,"forks_count":211,"subscribers_count":86,"default_branch":"master","last_synced_at":"2024-09-10T16:15:54.880Z","etag":null,"topics":["blueprint","machine-learning","python","tensorflow","ue4","ue5","unreal-engine"],"latest_commit_sha":null,"homepage":"","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/getnamo.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2016-11-29T14:26:00.000Z","updated_at":"2024-09-03T18:45:04.000Z","dependencies_parsed_at":"2024-01-16T21:46:47.293Z","dependency_job_id":"796e9317-d10c-4262-a5f8-8c19011c7780","html_url":"https://github.com/getnamo/TensorFlow-Unreal","commit_stats":null,"previous_names":["getnamo/tensorflow-ue4"],"tags_count":29,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/getnamo%2FTensorFlow-Unreal","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/getnamo%2FTensorFlow-Unreal/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/getnamo%2FTensorFlow-Unreal/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/getnamo%2FTensorFlow-U
nreal/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/getnamo","download_url":"https://codeload.github.com/getnamo/TensorFlow-Unreal/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":221704678,"owners_count":16866811,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["blueprint","machine-learning","python","tensorflow","ue4","ue5","unreal-engine"],"created_at":"2024-07-31T03:00:33.817Z","updated_at":"2024-10-27T16:32:06.910Z","avatar_url":"https://github.com/getnamo.png","language":"C++","readme":"# TensorFlow Unreal Plugin\n\n[![GitHub release](https://img.shields.io/github/release/getnamo/TensorFlow-Unreal/all.svg)](https://github.com/getnamo/TensorFlow-Unreal/releases)\n[![Github All Releases](https://img.shields.io/github/downloads/getnamo/TensorFlow-Unreal/total.svg)](https://github.com/getnamo/TensorFlow-Unreal/releases)\n\n[\u003cimg src=\"https://i.imgur.com/qn73w1u.png\" alt=\"badge\" width=\"100\"/\u003e](https://www.unrealengine.com/en-US/blog/epic-megagrants-reaches-13-million-milestone-in-2019)\n\n[Unreal Engine](https://www.unrealengine.com) plugin for [TensorFlow](https://www.tensorflow.org/). Enables training and implementing state of the art [machine learning](https://en.wikipedia.org/wiki/Machine_learning) algorithms for your unreal projects. \n\nThis plugin contains C++, Blueprint and python scripts that encapsulate TensorFlow operations as an _Actor Component_. 
It depends on an [UnrealEnginePython](https://github.com/getnamo/UnrealEnginePython) plugin fork and the [SocketIO Client](https://github.com/getnamo/socketio-client-ue4) plugin; these are always included in [binary releases](https://github.com/getnamo/TensorFlow-Unreal/releases) so no manual external downloading is necessary. See the [Note on Dependencies section](https://github.com/getnamo/TensorFlow-Unreal#note-on-dependencies) for details on implementation and architecture.\n\nSee the [unreal forum thread](https://forums.unrealengine.com/community/work-in-progress/1357673-tensorflow) for discussions.\n\n[Discord Server](https://discord.gg/qfJUyxaW4s)\n\n## Issues and Limitations\n\nThere is currently only a working build for the Windows platform. Be careful where you place your project, as you may hit the [240 char filepath limit with your python dependencies](https://github.com/getnamo/TensorFlow-Unreal/issues/36).\n\nA near-future refactor to open up dev environments and native support (WIP):\nhttps://github.com/getnamo/TensorFlow-Unreal/issues/53\n\n- Machine Learning Remote - https://github.com/getnamo/MachineLearningRemote-Unreal\n- Tensorflow Native - Inference focused (WIP) https://github.com/getnamo/TensorFlowNative-Unreal\n\nTensorflow UnrealEnginePython platform issues:\n\n- [Linux issue#13 tracking](https://github.com/getnamo/TensorFlow-Unreal/issues/13)\n\n- [Android issue#11 tracking](https://github.com/getnamo/TensorFlow-Unreal/issues/11) - will likely be superseded by tf native \n\n- [Mac OS issue#10 tracking](https://github.com/getnamo/TensorFlow-Unreal/issues/10) - will likely be superseded by ml remote\n\nIf you have ideas or fixes, consider contributing! 
See https://github.com/getnamo/TensorFlow-Unreal/issues for current issues.\n\n\n\n## Installation \u0026 Setup\n\n 1.\t(GPU only) [Install CUDA and cuDNN pre-requisites](https://www.tensorflow.org/install/gpu#windows_setup) if you're using compatible GPUs (NVIDIA)\n 2.\t[Download Latest Release](https://github.com/getnamo/TensorFlow-Unreal/releases) and choose the CPU or GPU download version if supported.\n 3.\tCreate a new project or choose an existing one.\n 4.\tBrowse to your project folder (typically found at _Documents/Unreal Project/{Your Project Root}_)\n\n![copy plugins](http://i.imgur.com/Dktr6JK.png)\n \n 5.\tCopy the *Plugins* folder into your Project root.\n 6.\tLaunch your project.\n 7.\t(Optional) All plugins should be enabled by default; you can confirm via Edit-\u003ePlugins. Scroll down to Project and you should see three plugins: TensorFlow under Computing, Socket.IO Client under Networking, and UnrealEnginePython under Scripting Languages. Click Enabled for any that are disabled, then restart the editor and open your project again.\n 8.\tWait for the TensorFlow dependencies to be automatically installed. The plugin will auto-resolve any dependencies listed in [Content/Scripts/upymodule.json](https://github.com/getnamo/TensorFlow-Unreal/blob/master/Content/Scripts/upymodule.json) using pip. Note that this step may take a few minutes depending on your internet connection speed, and you will see nothing in the output log window until it has fully completed.\n \n![image](https://user-images.githubusercontent.com/542365/36981363-e88aa2ec-2084-11e8-828c-e5a526cda67b.png)\n \n 9. 
Once you see an output similar to this (specific packages will change with each version of tensorflow), the plugin is ready to use.\n \n### Note on Git Cloning\n\nUsing full [plugin binary releases](https://github.com/getnamo/TensorFlow-Unreal/releases) is recommended; this allows you to follow the [installation instructions as written](https://github.com/getnamo/TensorFlow-Unreal#installation--setup) and get up to speed quickly.\n\nIf you instead wish to git clone and sync to the master repository manually, then it is expected that you [download the latest python binary dependency release](https://github.com/getnamo/UnrealEnginePython/releases) for UnrealEnginePython. This contains an embedded python build; select the *BinariesOnly-.7z* file from Downloads and drag the plugins folder into your project root. With that step complete, your cloned repository should work as expected; all other dependencies will be pulled via pip on first launch.\n\n## Examples\n\n[![mnist spawn samples](http://i.imgur.com/kvsLXvF.gif)](https://github.com/getnamo/TensorFlow-Unreal-examples)\n\n*Basic MNIST softmax classifier trained on begin play with sample training inputs streamed to the editor during training. When fully trained, UTexture2D (1-3) samples are tested for prediction.*\n\nAn example project is found at [https://github.com/getnamo/TensorFlow-Unreal-examples](https://github.com/getnamo/TensorFlow-Unreal-examples).\n\nThe repository has basic examples for general TensorFlow control and different MNIST classification examples with UE4 UTexture2D input for prediction. The repository should expand as more plug-and-play examples are made. 
Consider contributing samples via pull requests!\n\nIt is also the main repository where development is tracked for this plugin and all of its dependencies.\n\n## Python API\n\nYou can either train directly or use a trained model inside UE4.\n\nTo start, add your python script file to _{Project Root Folder}/Content/Scripts_.\n\nWrap your TensorFlow python code by subclassing TFPluginAPI.\n\n#### MySubClass(TFPluginAPI)\n\nImport ```tensorflow```, ```unreal_engine``` and ```TFPluginAPI``` in your module file, then subclass the TFPluginAPI class with the following functions.\n\n```python\nimport tensorflow as tf\nimport unreal_engine as ue\nfrom TFPluginAPI import TFPluginAPI\n\nclass ExampleAPI(TFPluginAPI):\n\n\t#expected optional api: setup your model for training\n\tdef onSetup(self):\n\t\tpass\n\t\t\n\t#expected optional api: parse input object and return a result object, which will be converted to json for UE4\n\tdef onJsonInput(self, jsonInput):\n\t\tresult = {}\n\t\treturn result\n\n\t#expected optional api: start training your network\n\tdef onBeginTraining(self):\n\t\tpass\n    \n#NOTE: this is a module function, not a class function. Change your CLASSNAME to reflect your class\n#required function to get our api\ndef getApi():\n\t#return CLASSNAME.getInstance()\n\treturn ExampleAPI.getInstance()\n```\n\nNote the ```getApi()``` module function, which needs to return a matching instance of your defined class. The rest of the functionality depends on what API you wish to use for your use case. At the moment the plugin supports input/output from UE4 via JSON encoding.\n\nIf you wish to train in UE4, implement your logic in ```onBeginTraining()``` and ensure you check for ```self.shouldStop``` after each batch/epoch to handle early exit requests from the user, e.g. when _EndPlay_ occurs or when you manually call ```StopTraining``` on the tensorflow component. 
You will also receive an optional ```onStopTraining``` callback when the user stops your training session.\n\nIf you have a trained model, simply set up your model / load it from disk, omit the training function, and forward your evaluation/input via the ```onJsonInput(jsonArgs)``` callback. See the [mnistSaveLoad.py example](https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistSaveLoad.py) for how to train a network once, save the model, and reload it on setup so that you skip retraining it every time.\n\nNote that both ```onBeginTraining()``` and ```onSetup()``` are called asynchronously by default. If you use a high-level library such as Keras, you may need to store your *tf.Session* and *tf.Graph* separately and set them as the defaults with ```with self.session.as_default():``` and ```with self.graph.as_default():``` when evaluating, since calls will generally be made from separate threads.\n\n\nBelow is a very basic example of using tensorflow to add or subtract values passed in as ```{\"a\":\u003cfloat number or array\u003e, \"b\":\u003cfloat number or array\u003e}```.\n\n```python\nimport tensorflow as tf\nimport unreal_engine as ue\nfrom TFPluginAPI import TFPluginAPI\n\nclass ExampleAPI(TFPluginAPI):\n\n\t#expected optional api: setup your model for training\n\tdef onSetup(self):\n\t\tself.sess = tf.InteractiveSession()\n\n\t\tself.a = tf.placeholder(tf.float32)\n\t\tself.b = tf.placeholder(tf.float32)\n\n\t\t#operation\n\t\tself.c = self.a + self.b\n\t\tpass\n\t\t\n\t#expected optional api: json input as a python object, get a and b values as a feed_dict\n\tdef onJsonInput(self, jsonInput):\n\t\t\n\t\t#show our input in the log\n\t\tprint(jsonInput)\n\n\t\t#map our passed values to our input placeholders\n\t\tfeed_dict = {self.a: jsonInput['a'], self.b: jsonInput['b']}\n\n\t\t#run the calculation and obtain a result\n\t\trawResult = self.sess.run(self.c, feed_dict)\n\t\t\n\t\t#convert to array and embed the answer as 'c' field in a 
python object\n\t\treturn {'c':rawResult.tolist()}\n\n\t#custom function to change the operation type\n\tdef changeOperation(self, type):\n\t\tif(type == '+'):\n\t\t\tself.c = self.a + self.b\n\n\t\telif(type == '-'):\n\t\t\tself.c = self.a - self.b\n\n\n\t#expected optional api: We don't do any training in this example\n\tdef onBeginTraining(self):\n\t\tpass\n    \n#NOTE: this is a module function, not a class function. Change your CLASSNAME to reflect your class\n#required function to get our api\ndef getApi():\n\t#return CLASSNAME.getInstance()\n\treturn ExampleAPI.getInstance()\n```\n\nA full example using MNIST can be seen here: https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistSimple.py\n\nA full example using a save/load setup can be seen here: https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistSaveLoad.py\n\nAnother full example using the Keras API can be found here: https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistKerasCNN.py. Note the Keras callback used to stop training after the current batch completes; this cancels training on early gameplay exit, e.g. EndPlay.\n\n#### Asynchronous Events to Tensorflow Component\n\nIf you need to stream data to blueprint, e.g. during training, you can use the ```self.callEvent()``` API. 
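\n\nAs an illustrative sketch (the ```'TrainingProgress'``` event name, payload and epoch loop here are placeholders, not plugin-defined; ```self.callEvent``` and ```self.shouldStop``` are the plugin APIs described above), a training loop might stream progress like this:\n\n```python\n\t#inside your TFPluginAPI subclass\n\tdef onBeginTraining(self):\n\t\tfor epoch in range(10):\n\t\t\t#honor early exit requests e.g. EndPlay or StopTraining\n\t\t\tif self.shouldStop:\n\t\t\t\tbreak\n\t\t\t#...run one epoch of training here...\n\t\t\t#emit a json event that blueprint can filter by name\n\t\t\tself.callEvent('TrainingProgress', {'epoch': epoch}, True)\n```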
\n\n##### String Format\nThe format is ```self.callEvent('EventName', 'MyString')```\n\n##### Json Format\nThe format is ```self.callEvent('EventName', PythonObject, True)```\n\nExample use case in [mnistSpawnSamples.py](https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistSpawnSamples.py#L87) where sample training images are emitted to unreal for preview.\n\n## Blueprint API\n\n### Load your python module from your TensorflowComponent\nOnce you've [written your python module](https://github.com/getnamo/TensorFlow-Unreal#mysubclasstfpluginapi), select your TensorflowComponent inside your actor blueprint\n\n![select component](http://i.imgur.com/f9Syql1.png)\n\nand change the TensorFlowModule name to reflect your _filename_ without the .py extension, e.g. if your python file is _ExampleAPI.py_ it would look like this\n\n![change module name](http://i.imgur.com/mpzymgd.png)\n\nOptionally disable the verbose python log and change other toggles such as training on _BeginPlay_ or disabling multithreading (not recommended).\n\n### Training\n\nBy default the _onBeginTraining()_ function will get called on the component's _BeginPlay_. You can optionally untick this option and call _Begin Training_ manually.\n\n![manual train](http://i.imgur.com/YM3KZwy.png)\n\n### Sending Json inputs to your model for e.g. prediction\n\nYou control what type of data you forward to your python module; the only limitation of the current API is that it should be JSON-formatted.\n\n#### Basic Json String\nIn the simplest case you can send e.g. a basic json string ```{\"MyString\":\"SomeValue\"}``` constructed using SIOJson like so\n\n![send json string](http://i.imgur.com/xizBrpt.png)\n\n#### Any UStruct Example\n\nSIOJson supports completely user-defined structs, even ones only defined in blueprint. It's highly recommended to use such structs for a convenient way to organize your data and to reliably decode it on the python side. 
Below is an example where we send a custom bp struct and encode it straight to JSON.\n\n![send custom struct](http://i.imgur.com/Ova2xzf.png)\n\nwith the struct defined in blueprint as\n\n![custom struct definition](http://i.imgur.com/hg3qlSK.png)\n\nYou can also interweave structs, even common unreal types, so feel free to mix and match both of the above methods. In this particular example we interweave a 3D vector in a json object we defined. The sent input should now be ```{\"SomeVector\":{\"x\":1.0,\"y\":2.3,\"z\":4.3}}```\n\n![send struct](http://i.imgur.com/NJ48M70.png)\n\n\n#### Special convenience case: UTexture2D\n\nA convenience function wraps a UTexture2D into a json object with ```{\"pixels\":[\u003c1D array of pixels\u003e], \"size\":{\"x\":\u003cimage width\u003e,\"y\":\u003cimage height\u003e}}``` which you can reshape using numpy.\n\n![send texture](http://i.imgur.com/vSq2xea.png)\n\nNote that this will currently convert an image into full-alpha greyscale. If you need color texture inputs, use your own custom method or make a pull request.\n\n#### Custom functions\n\nIf you need to call python functions from blueprint that the current API doesn't support, you can do so by using the ```CallCustomFunction``` method on the _TensorflowComponent_. You specify the function name and pass in a string as arguments. The function runs on the game thread and will return immediately with an expected string value. 
For both arguments and return values, JSON encoding is recommended but optional.\n\n![custom function call](http://i.imgur.com/ejBs8cI.png)\n\nExample custom function call passing in a string argument to [```changeOperation```](https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/addExample.py#L31) in [addExample.py](https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/addExample.py)\n\n### Handling Tensorflow Events\n\nSelect your _Tensorflow Component_ from your actor blueprint and then click + to subscribe to the chosen event in the event graph. \n\n![events](http://i.imgur.com/2q7486k.png)\n\nThe current API supports the following events:\n\n#### On Input Results\n\nCalled when _onJsonInput()_ completes in your python module. The returned data is a json string of the data you return at the end of the function.\n\n![onresults](http://i.imgur.com/lLbtmVt.png)\n\nNormally you'd want to convert this string into an _SIOJsonObject_ so you can use your results data in blueprint. It is also typical to have a prediction field attached to this object for e.g. classification tasks.\n\nIf you have a regular return format, consider making your own custom bp struct and filling its values from the json string like this\n\n![fill struct from json](http://i.imgur.com/IBWYzw9.png)\n\nNote that the function will only fill fields that have matching names and ignore all other struct fields. 
This means you can safely fill a partial struct from a json string that has more fields than the struct defines.\n\n#### On Training Complete\n\nWhen the _onBeginTraining()_ call is complete, you receive this event with ```{'elapsed':\u003ctime taken\u003e}``` json, optionally with additional return data passed in from your function.\n\n![ontraining](http://i.imgur.com/XiZhH04.png)\n\n#### On Event\n\nIf you use [```self.callEvent()```](https://github.com/getnamo/TensorFlow-Unreal#asynchronous-events-to-tensorflow-component) you will receive this event dispatch. You can filter your event types by the event name and then do whatever you need to with the data passed in.\n\n![onevent](http://i.imgur.com/ny0aEZv.png)\n\nFor example, [mnistSpawnSamples.py](https://github.com/getnamo/TensorFlow-Unreal-examples/blob/master/Content/Scripts/mnistSpawnSamples.py#L121) uses ```self.callEvent()``` to asynchronously stream training images, which we filter by checking for ```'PixelEvent'```\n\n## Blueprint Utilities\n\n### Conversion\nA large portion of the plugin capability comes from its ability to convert data types. See [TensorflowBlueprintLibrary.h](https://github.com/getnamo/TensorFlow-Unreal/blob/master/Source/TensorFlow/Public/TensorFlowBlueprintLibrary.h) for full declarations and code comments.\n\n#### UTexture2D to float array (grayscale)\n\nConvert a UTexture2D as grayscale to a 1D float array; the size is obtained from the texture.\n\n_Blueprint_\n\n```\nToGrayScaleFloatArray (Texture2D)\n```\n\n_C++_\n```c++\nstatic TArray\u003cfloat\u003e Conv_GreyScaleTexture2DToFloatArray(UTexture2D* InTexture);\n```\n\n\n#### UTexture2D to float array\n\nConvert a UTexture2D to a 1D float array; the size is obtained from the texture. Expects four 1-byte values per pixel, e.g. 
RGBA.\n\n_Blueprint_\n\n```\nToFloatArray (Texture2D)\n```\n\n_C++_\n```c++\nstatic TArray\u003cfloat\u003e Conv_Texture2DToFloatArray(UTexture2D* InTexture);\n```\n\n#### Invert Float Array\n\nInvert values in a given float array (1-\u003e0, 0-\u003e1) on a 0-1 scale.\n\n_Blueprint_ \n\n```\nInvertFloatArray\n```\n\n_C++_\n```c++\nstatic TArray\u003cfloat\u003e InvertFloatArray(const TArray\u003cfloat\u003e\u0026 InFloatArray);\n```\n\n#### Float array to UTexture2D\n\nConvert a 4-value-per-pixel float array to a UTexture2D with the specified size; if the size is unknown (0,0), a square array is assumed.\n\n_Blueprint_ \n\n```\nToTexture2D (Float Array)\n```\n\n_C++_\n```c++\nstatic UTexture2D* Conv_FloatArrayToTexture2D(const TArray\u003cfloat\u003e\u0026 InFloatArray, const FVector2D Size = FVector2D(0,0));\n```\n\n\n#### Float array (Grayscale) to UTexture2D\n\nConvert a 1-value-per-pixel float array to a UTexture2D with the specified size; if the size is unknown (0,0), a square array is assumed.\n\n_Blueprint_ \n\n```\nToTexture2D (Grayscale Array)\n```\n\n_C++_\n```c++\nstatic UTexture2D* Conv_FloatArrayToTexture2D(const TArray\u003cfloat\u003e\u0026 InFloatArray, const FVector2D Size = FVector2D(0,0));\n```\n\n#### ToTexture2D (Render Target 2D)\n\nConvert a UTextureRenderTarget2D to a UTexture2D.\n\n_Blueprint_ \n\n```\nToTexture2D (Render Target 2D)\n```\n\n_C++_\n```c++\nstatic UTexture2D* Conv_RenderTargetTextureToTexture2D(UTextureRenderTarget2D* InTexture);\n```\n\n#### ToFloatArray (bytes)\n\nConvert a byte array into a float array, normalized by the passed-in scale.\n\n_Blueprint_ \n\n```\nToFloatArray (bytes)\n```\n\n_C++_\n```c++\nstatic TArray\u003cfloat\u003e Conv_ByteToFloatArray(const TArray\u003cuint8\u003e\u0026 InByteArray, float Scale = 1.f);\n```\n\n## TF Audio Capture Component\n\nA C++ component that uses the Windows API to capture and stream microphone audio without the need for an online subsystem. 
See https://github.com/getnamo/TensorFlow-Unreal/blob/master/Source/TFAudioCapture/Public/TFAudioCaptureComponent.h for details on the API. \n\nThis component is intended for native speech recognition once TensorFlow examples mature.\n\n## File Utility Component\n\nA simple blueprint wrapper to save and load bytes from file. Allows you to easily flush, e.g., captured audio for later use. See https://github.com/getnamo/TensorFlow-Unreal/blob/master/Source/CoreUtility/Public/FileUtilityComponent.h for details on the API.\n\n## Use pip to manage your dependencies in the python console\n\nThe plugin uses a pip wrapper script that uses a subprocess to avoid blocking behavior. Simply import it using\n\n```import upypip as pip```\n\nin your script and then type e.g.\n\n```pip.list()```\nwhich should very shortly list all your installed python modules.\n\n```\nPackage        Version  \n-------------- ---------\nabsl-py        0.1.10   \nastor          0.6.2    \nbleach         1.5.0    \ngast           0.2.0    \ngrpcio         1.10.0   \nhtml5lib       0.9999999\nMarkdown       2.6.11   \nnumpy          1.14.1   \npip            9.0.1    \nprotobuf       3.5.1    \nsetuptools     38.5.1   \nsix            1.11.0   \ntensorboard    1.6.0    \ntensorflow     1.6.0    \ntensorflow-gpu 1.6.0    \ntermcolor      1.1.0    \nWerkzeug       0.14.1   \nwheel          0.30.0   \n```\n\nIf you'd like to add another module, call the install function, e.g. 
if you wanted to upgrade to the GPU version you could simply type\n\n```pip.install('tensorflow-gpu')```\n\nor you can go back to a clean slate with \n\n```pip.uninstallAll()```\n\nwhich should leave you with just the basics\n\n```\nPackage    Version\n---------- -------\npip        9.0.1  \nsetuptools 38.5.1 \nwheel      0.30.0 \n```\n\nSee [upypip.py](https://github.com/getnamo/UnrealEnginePython/blob/master/Content/Scripts/upypip.py) for all the available commands.\n\n## Note on Dependencies\nDepends on an [UnrealEnginePython](https://github.com/getnamo/UnrealEnginePython) plugin fork and the [SocketIO Client](https://github.com/getnamo/socketio-client-ue4) plugin. Both of these and an embedded python build are included in every [release](https://github.com/getnamo/TensorFlow-Unreal/releases) so you don't need to manually include anything; just drag and drop the *Plugins* folder into your project from any release.\n\n### Architecture and Purpose\n\n![architecture](http://i.imgur.com/8bUiCbM.png)\n\n#### UnrealEnginePython\nBased on the wonderful work by [20tab](https://github.com/20tab/UnrealEnginePython), the UnrealEnginePython plugin fork contains changes to enable multi-threading, python script plugin encapsulation and automatic dependency resolution via pip. Simply specifying tensorflow as a _pythonModule_ dependency in https://github.com/getnamo/TensorFlow-Unreal/blob/master/Content/Scripts/upymodule.json makes the editor auto-resolve the dependency on first run. The multi-threading support contains a callback system allowing long-duration operations to happen on a background thread (e.g. training) with callbacks received on your game thread. This enables TensorFlow to work without noticeably impacting the game thread.\n\n#### SocketIO Client\nSocketIO Client is used for easy conversion between native engine types (BP or C++ structs and variables) and python objects via JSON. 
Can optionally be used to connect to a real-time web service via [socket.io](https://socket.io/).\n\n## Packaging\n\n#### Note on Blueprint Only projects\nYou will need to convert your blueprint-only project to mixed (bp and C++) before packaging. Follow these instructions to do that: https://allarsblog.com/2015/11/04/converting-bp-project-to-cpp/\n\n#### Extra step\nSince v0.10.0 the plugin should package correctly, but you will need to run the packaged build once to pull the dependencies. You can optionally copy them manually from ```{Project Root}/Plugins/UnrealEnginePython/Binaries/Win64/Lib/site-packages``` to ```{Packaged Root}/{Project Name}/Plugins/UnrealEnginePython/Binaries/Win64/Lib/site-packages``` in the packaged build.\n\nWhen you first launch your packaged project there may be a black screen for a while (~2 min) as it reinstalls pip and _pulls the dependencies_ for the first time. You can then reload the map after a few minutes or just restart (check your packaged log to see when it's ready). Each time after that the project should load quickly. Note that you can zip up and move the packaged project, with all its dependencies, to another computer; it will have a ~20 sec boot-up on first run as it re-installs pip to the correct location, but it won't have to pull the pip dependencies, saving most of the waiting, and bootup is quick each time after that.\n\n## Troubleshooting / Help\n\n### I see pip errors after upgrading the tensorflow version\n\nDelete ```Plugins\\UnrealEnginePython\\Binaries\\Win64\\Lib\\site-packages``` and restart the project.\n\n### No module named 'tensorflow'\n\nOn first run you may see this message in your python console\n\n![no tensorflow](http://i.imgur.com/oed8Hhq.png)\n\nWait until pip installs your dependencies fully; this may take ~3-5 min. 
When the dependencies have installed, it should look something like this\n\n![installed](http://i.imgur.com/s8WDu7M.png)\n\nAfter you see this, go ahead and close your editor and re-launch the project. Once the project has relaunched, the error should no longer appear.\n\n### 2-3 sec hitch on first begin play\n\nThis is due to python importing tensorflow on begin play and loading all the DLLs. This is currently unavoidable and only happens once per editor launch.\n\n### Issue not listed?\n\nPost your issue to https://github.com/getnamo/TensorFlow-Unreal/issues\n\n## [License](https://github.com/getnamo/TensorFlow-Unreal/blob/master/LICENSE)\nPlugin - [MIT](https://opensource.org/licenses/MIT)\n\nTensorFlow and TensorFlow Icon - [Apache 2.0](http://www.apache.org/licenses/LICENSE-2.0)\n\n\n\n","funding_links":[],"categories":["Assets"],"sub_categories":["Artificial Intelligence"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgetnamo%2FTensorFlow-Unreal","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgetnamo%2FTensorFlow-Unreal","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgetnamo%2FTensorFlow-Unreal/lists"}