# MachineLearningRemote Unreal Plugin
A Machine Learning (ML) plugin for the Unreal Engine, encapsulating calls to remote python servers running python ML libraries like Tensorflow or Pytorch. Depends on the server complement repository: https://github.com/getnamo/ml-remote-server.

[![GitHub release](https://img.shields.io/github/release/getnamo/MachineLearningRemote-Unreal.svg)](https://github.com/getnamo/MachineLearningRemote-Unreal/releases)
[![Github All Releases](https://img.shields.io/github/downloads/getnamo/MachineLearningRemote-Unreal/total.svg)](https://github.com/getnamo/MachineLearningRemote-Unreal/releases)


It should have the same API as [tensorflow-ue4](https://github.com/getnamo/tensorflow-ue4), but with the freedom to run a host server on the platform of your choice (e.g. remote win/linux/mac instances) and without a hard bind to the tensorflow library.

## Unreal Machine Learning Plugin Variants

Want to run tensorflow or pytorch on a remote (or local) python server?

- https://github.com/getnamo/MachineLearningRemote-Unreal

Want to use the tensorflow python API with a python instance embedded in your unreal engine project?

- https://github.com/getnamo/TensorFlow-Unreal

Want native tensorflow inference? (WIP)

- https://github.com/getnamo/TensorFlowNative-Unreal

## Quick Install & Setup

1. Install and set up https://github.com/getnamo/ml-remote-server on your target backend (can be a local folder), or use the one embedded in the plugin.
2. Download the [Latest Release](https://github.com/getnamo/MachineLearningRemote-Unreal/releases).
3. Create a new project or choose an existing one.
4. Browse to your project folder (typically found at Documents/Unreal Projects/{Your Project Root}).
5. Copy the Plugins folder into your project root.
6. The plugin should now be ready to use. Remember to start up your [server](https://github.com/getnamo/ml-remote-server) when using this plugin.

## How to use

### Blueprint API

Add a ```MachineLearningRemote``` component to an actor of choice

![](https://i.imgur.com/Mx3gNAi.png)

Change the server endpoint and ```DefaultScript``` to fit your use case. ```DefaultScript``` is the file name of your ML script, which is placed in your *<[server](https://github.com/getnamo/ml-remote-server)>/scripts* folder. See https://github.com/getnamo/MachineLearningRemote-Unreal#python-api for example scripts.

![](https://i.imgur.com/R3YVPtm.png)

When your script loads, ```on_setup``` is called, and if ```self.should_train_on_start``` is true, ```on_begin_training``` is called as well.
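The setup/training lifecycle above can be sketched standalone. Note that `MLPluginAPI` below is a simplified stand-in for the base class provided by the [server](https://github.com/getnamo/ml-remote-server) repository (stubbed here only so the sketch runs outside the server; treating `should_train_on_start` as a plain attribute defaulting to true is also an assumption):

```python
#Simplified stand-in for the server's MLPluginAPI base class (assumption,
#stubbed only so this lifecycle sketch can run standalone).
class MLPluginAPI:
	_instance = None

	def __init__(self):
		self.should_train_on_start = True

	@classmethod
	def get_instance(cls):
		#singleton access, mirroring the get_instance() used by the real scripts
		if cls._instance is None:
			cls._instance = cls()
		return cls._instance

class LifecycleExample(MLPluginAPI):
	def on_setup(self):
		self.ready = True

	def on_begin_training(self):
		self.trained = True

#roughly what happens on connect: setup first,
#then training only if requested
api = LifecycleExample.get_instance()
api.on_setup()
if api.should_train_on_start:
	api.on_begin_training()
```
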
When your script has trained or is otherwise ready, you can send inputs to it using ```SendSIOJsonInput``` or one of the other variants (string/raw).

![](https://i.imgur.com/WjmFLAu.png)

Your inputs will be processed on the script side and any value you return from there will be sent back and returned in ```ResultData``` as a *USIOJsonValue* in your latent callback.

#### Other input variants

See https://github.com/getnamo/MachineLearningRemote-Unreal/blob/master/Source/MachineLearningRemote/Public/MachineLearningRemoteComponent.h for all variants.

#### Custom Function

Change the ```FunctionName``` parameter in ```SendSIOJsonInput``` to call a different function name in your script. This name will be used verbatim.

### Python API
These scripts should be placed in your *<[server](https://github.com/getnamo/ml-remote-server)>/scripts* folder. If a matching script is set in your ```MachineLearningRemote```->```DefaultScript``` property, it should load on connect.

Keep in mind that ```tensorflow``` is optional and used as an illustrative example of ML; you can use any other valid python library, e.g. pytorch, instead without issue.

See https://github.com/getnamo/ml-remote-server/tree/master/scripts for additional examples.
See https://github.com/getnamo/TensorFlow-Unreal#python-api for more detailed API examples.

#### empty_example

Bare bones API example.

```python
import tensorflow as tf
from mlpluginapi import MLPluginAPI

class ExampleAPI(MLPluginAPI):

	#optional api: setup your model for training
	def on_setup(self):
		pass

	#optional api: parse input object and return a result object, which will be converted to json for UE4
	def on_json_input(self, json_input):
		result = {}
		return result

	#optional api: start training your network
	def on_begin_training(self):
		pass


#NOTE: this is a module function, not a class function. Change your CLASSNAME to reflect your class
#required function to get our api
def get_api():
	#return CLASSNAME.get_instance()
	return ExampleAPI.get_instance()
```
#### add_example

Super basic example showing how to add using the tensorflow library.

```python
import tensorflow as tf
import unreal_engine as ue #for remote logging only, this is a proxy import to enable same functionality as local variants
from mlpluginapi import MLPluginAPI

class ExampleAPI(MLPluginAPI):

	#expected optional api: setup your model for training
	def on_setup(self):
		self.sess = tf.InteractiveSession()
		#self.graph = tf.get_default_graph()

		self.a = tf.placeholder(tf.float32)
		self.b = tf.placeholder(tf.float32)

		#operation
		self.c = self.a + self.b

		ue.log('setup complete')

	#expected optional api: parse input object and return a result object, which will be converted to json for UE4
	def on_json_input(self, json_input):

		ue.log(json_input)

		feed_dict = {self.a: json_input['a'], self.b: json_input['b']}

		raw_result = self.sess.run(self.c, feed_dict)

		ue.log('raw result: ' + str(raw_result))

		return {'c': raw_result.tolist()}

	#custom function to change the op
	def change_operation(self, op_type):
		if op_type == '+':
			self.c = self.a + self.b

		elif op_type == '-':
			self.c = self.a - self.b
		ue.log('operation changed to ' + op_type)

	#expected optional api: start training your network
	def on_begin_training(self):
		pass

#NOTE: this is a module function, not a class function.
#Change your CLASSNAME to reflect your class
#required function to get our api
def get_api():
	#return CLASSNAME.get_instance()
	return ExampleAPI.get_instance()
```

#### mnist_simple

One of the most basic ML examples, using tensorflow to train a softmax mnist recognizer.

```python
#Converted to ue4 use from: https://www.tensorflow.org/get_started/mnist/beginners
#mnist_softmax.py: https://github.com/tensorflow/tensorflow/blob/r1.1/tensorflow/examples/tutorials/mnist/mnist_softmax.py

# Import data
from tensorflow.examples.tutorials.mnist import input_data

import tensorflow as tf
import unreal_engine as ue
from mlpluginapi import MLPluginAPI

import operator

class MnistSimple(MLPluginAPI):

	#expected api: stored model and session, json inputs
	def on_json_input(self, json_input):
		#expect an image struct in json format
		pixel_array = json_input['pixels']
		ue.log('image len: ' + str(len(pixel_array)))

		#embed the input image pixels as 'x'
		feed_dict = {self.model['x']: [pixel_array]}

		result = self.sess.run(self.model['y'], feed_dict)

		#convert our raw result to a prediction
		index, value = max(enumerate(result[0]), key=operator.itemgetter(1))

		ue.log('max: ' + str(value) + ' at: ' + str(index))

		#set the prediction result in our json
		json_input['prediction'] = index

		return json_input

	#expected api: no params forwarded for training? TBC
	def on_begin_training(self):

		ue.log("starting mnist simple training")

		self.scripts_path = ue.get_content_dir() + "Scripts"
		self.data_dir = self.scripts_path + '/dataset/mnist'

		mnist = input_data.read_data_sets(self.data_dir)

		# Create the model
		x = tf.placeholder(tf.float32, [None, 784])
		W = tf.Variable(tf.zeros([784, 10]))
		b = tf.Variable(tf.zeros([10]))
		y = tf.matmul(x, W) + b

		# Define loss and optimizer
		y_ = tf.placeholder(tf.int64, [None])

		# The raw formulation of cross-entropy,
		#
		#   tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(tf.nn.softmax(y)),
		#                                 reduction_indices=[1]))
		#
		# can be numerically unstable.
		#
		# So here we use tf.losses.sparse_softmax_cross_entropy on the raw
		# outputs of 'y', and then average across the batch.
		cross_entropy = tf.losses.sparse_softmax_cross_entropy(labels=y_, logits=y)
		train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

		#update session for this thread
		self.sess = tf.InteractiveSession()
		tf.global_variables_initializer().run()

		# Train
		for i in range(1000):
			batch_xs, batch_ys = mnist.train.next_batch(100)
			self.sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
			if i % 100 == 0:
				ue.log(str(i))
				if self.should_stop:
					ue.log('early break')
					break

		# Test trained model
		correct_prediction = tf.equal(tf.argmax(y, 1), y_)
		accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
		final_accuracy = self.sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels})
		ue.log('final training accuracy: ' + str(final_accuracy))

		#return trained model
		self.model = {'x': x, 'y': y, 'W': W, 'b': b}

		#store optional summary information
		self.summary = {'x': str(x), 'y': str(y), 'W': str(W), 'b': str(b)}

		self.stored['summary'] = self.summary
		return self.stored

#required function to get our api
def get_api():
	#return CLASSNAME.get_instance()
	return MnistSimple.get_instance()
```


### C++ API

Available since 0.3.1.

Same as the blueprint API except for one additional callback variant. Use the lambda overloaded functions, e.g. assuming you have a component defined as
```c++
UMachineLearningRemoteComponent* MLComponent; //NB: this needs to be allocated with NewObject or CreateDefaultSubobject
```

#### SendRawInput
```c++
//Let's say you want to send some raw data
TArray<float> InputData;
//... fill

MLComponent->SendRawInput(InputData, [this](TArray<float>& ResultData)
{
	//Now we got our results back, do something with them here
}, FunctionName);
```

#### SendStringInput

Keep in mind that if you're using USIOJConvert utilities you'll need to add *SIOJson* and *Json* as dependency modules in your project build.cs.

```c#
PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore", "Json", "SIOJson" });
```

Sending just a string

```c++
FString InputString = TEXT("Some Data");

MLComponent->SendStringInput(InputString, [this](const FString& ResultData)
{
	//e.g. just print the result
	UE_LOG(LogTemp, Log, TEXT("Got some results: %s"), *ResultData);
}, FunctionName);
```

A custom JsonObject

```c++
//Make an object {"myKey":"myValue"}
TSharedPtr<FJsonObject> JsonObject = MakeShareable(new FJsonObject);
JsonObject->SetStringField(TEXT("myKey"), TEXT("myValue"));
FString InputString = USIOJConvert::ToJsonString(JsonObject);

MLComponent->SendStringInput(InputString, [this](const FString& ResultData)
{
	//assuming you got a json string response we could query it, e.g.
	//assume {"someNumber":5}
	TSharedPtr<FJsonObject> JsonObject = USIOJConvert::ToJsonObject(ResultData);
	double MyNumber = JsonObject->GetNumberField("someNumber");

	//do something with your number result
}, FunctionName);
```

Structs via Json

```c++
//Let's say you want to send some struct data in json format

USTRUCT()
struct FTestCppStruct
{
	GENERATED_BODY()

	UPROPERTY()
	int32 Index;

	UPROPERTY()
	float SomeNumber;

	UPROPERTY()
	FString Name;
};

//...

FTestCppStruct TestStruct;
TestStruct.Name = TEXT("George");
TestStruct.Index = 5;
TestStruct.SomeNumber = 5.123f;
FString StructJsonString = USIOJConvert::ToJsonString(USIOJConvert::ToJsonObject(FTestCppStruct::StaticStruct(), &TestStruct));

//In this example we're using the same struct type for the result, but you could use a different one or custom Json
FTestCppStruct ResultStruct;

MLComponent->SendStringInput(StructJsonString, [this, &ResultStruct](const FString& ResultData)
{
	//do something with the result, e.g. say we have another struct of the same type and we'd like to fill it with the results
	USIOJConvert::JsonObjectToUStruct(USIOJConvert::ToJsonObject(ResultData), FTestCppStruct::StaticStruct(), &ResultStruct);
}, FunctionName);
```
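The struct round trip above needs a matching function on the script side. As a sketch, a hypothetical `struct_example` script might look like the following. Assumptions are flagged inline: the json keys are assumed to mirror the UPROPERTY names, the payload is assumed to arrive already parsed as a dict (as with the json input variant), and `MLPluginAPI` is stubbed as a minimal stand-in so the snippet runs outside the server:

```python
#Minimal stand-in for the server's MLPluginAPI base class (assumption,
#stubbed only so this sketch runs standalone).
class MLPluginAPI:
	_instance = None

	@classmethod
	def get_instance(cls):
		if cls._instance is None:
			cls._instance = cls()
		return cls._instance

class StructExample(MLPluginAPI):
	#receives a dict made from FTestCppStruct json and returns the same shape,
	#so JsonObjectToUStruct can fill the result struct back on the UE side
	def on_json_input(self, json_input):
		result = dict(json_input)
		result['SomeNumber'] = result['SomeNumber'] * 2  #stand-in for real processing
		return result

#required function to get our api
def get_api():
	return StructExample.get_instance()
```

Because the reply keeps the same keys as the input struct, the lambda in the C++ snippet above can deserialize it straight back into `ResultStruct`.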