{"id":20430696,"url":"https://github.com/isislab-unisa/sof","last_synced_at":"2025-04-12T20:33:44.175Z","repository":{"id":25538483,"uuid":"28971107","full_name":"isislab-unisa/sof","owner":"isislab-unisa","description":"Simulation Optimization and exploration Framework on the cloud: SOF","archived":false,"fork":false,"pushed_at":"2022-10-03T21:24:35.000Z","size":257074,"stargazers_count":9,"open_issues_count":1,"forks_count":4,"subscribers_count":7,"default_branch":"master","last_synced_at":"2025-03-26T14:50:25.918Z","etag":null,"topics":["agent-based-simulation","hadoop","java","mapreduce","optimization-process","simulation-model","simulation-optimization","sof"],"latest_commit_sha":null,"homepage":"","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/isislab-unisa.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2015-01-08T15:06:17.000Z","updated_at":"2022-07-05T07:45:02.000Z","dependencies_parsed_at":"2023-01-14T07:00:30.955Z","dependency_job_id":null,"html_url":"https://github.com/isislab-unisa/sof","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/isislab-unisa%2Fsof","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/isislab-unisa%2Fsof/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/isislab-unisa%2Fsof/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/isislab-unisa%2Fsof/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/isislab-unisa","download_url":"https://codeload.git
hub.com/isislab-unisa/sof/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248629866,"owners_count":21136329,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agent-based-simulation","hadoop","java","mapreduce","optimization-process","simulation-model","simulation-optimization","sof"],"created_at":"2024-11-15T08:08:31.477Z","updated_at":"2025-04-12T20:33:44.145Z","avatar_url":"https://github.com/isislab-unisa.png","language":"Java","readme":"SOF: Zero Configuration Simulation Optimization Framework on the Cloud\n==================\n\nSimulation models are becoming an increasingly popular tool for the analysis and optimization of complex real systems in different fields. Finding an optimal system design requires performing a large parameter sweep. Hence, the model tuning process is extremely demanding from a computational point of view, as it requires careful, time-consuming, complex orchestration of coordinated executions. Here we present the design of SOF (Simulation Optimization and exploration framework on the cloud), a framework that exploits the computing power of a cloud computational environment in order to realize effective and efficient simulation optimization strategies. \n\nSOF offers several attractive features: firstly, SOF requires \"**zero configuration**\" as it does not require _any_ additional software installed on the remote node (only standard [Apache Hadoop](http://hadoop.apache.org/) and SSH access are sufficient). 
Secondly, SOF is transparent to the user, since the user is totally unaware that the system operates in a distributed environment. Finally, SOF is highly customizable and programmable, since it enables running different simulation toolkits and/or exploiting diverse programming languages -- provided that the hosting platform supports them -- under two different simulation optimization scenarios, as developed by the modeler.\n\nThe framework core has been fully developed and is available under the Apache License. It has been tested and validated on several private platforms, such as a dedicated cluster of workstations, as well as on public platforms, including the Hortonworks Data Platform ([Hortonworks](http://hortonworks.com/)). \n\nSOF was designed in [ISISLab](http://www.isislab.it) and allows the simulation modeller to run simulations and collect results in two kinds of scenarios, parameter space exploration (PSE) and simulation optimization (SO), treating the computational resources as available for a non-fixed time and subject to failure. \n\nSOF was designed to manage three kinds of simulation engines: [MASON](http://cs.gmu.edu/~eclab/projects/mason/), [NetLogo](https://ccl.northwestern.edu/netlogo/) and a generic simulator. SOF provides some software facilities for the first two simulators, such as automatic simulation input setting and automatic output generation (not provided for the generic simulator, for obvious reasons). The generic simulator must be an executable compliant with the cluster machine used.\n\nSOF is a framework to exploit simulation optimization on a Hadoop cluster. SOF is divided into two main functional blocks: core and client. The core component provides all the functionality needed to write Java-based client applications. 
The client is a command line Java application that showcases the features of the core component and allows executing PSE and SO processes on an [Apache Hadoop](http://hadoop.apache.org/) cluster.\n\nThe SOF system presents two main entities: the SOF client and the remote host machine on which Hadoop is installed, also named the Hadoop master node, shown respectively on the left and on the right of the figure below.\n\n![alt tag](https://raw.githubusercontent.com/isislab-unisa/sof/master/architecture/softwarearchitecture.png)\n\n\nThe SOF architecture is divided into three main software blocks: a user frontend, the SOF application for running and managing simulations on the Hadoop infrastructure, used only on the client side; the Hadoop layer, which encloses the software and libraries provided by the Hadoop infrastructure, used on the remote side; and the SOF core, the main software block, composed of six functional blocks that are used on both the client and the remote side.\n\n#### SOF System workflow\nSOF was designed to execute simulation optimization and parameter space exploration processes on Apache Hadoop. In order to execute a simulation optimization process the user must provide a well-formatted input:\n* the simulation executable, a MASON/NetLogo model or an executable file;\n* the selection and evaluation functions, written in any language supported by the cluster machine (in this case the user must also define the interpreter program path for languages like Python, Groovy etc.);\n* the domain/input/output/evaluate format for the parameters of the simulation.\n\n![alt tag](https://raw.githubusercontent.com/isislab-unisa/sof/master/architecture/workflowarchitecture.png)\n\nA SOF process, shown in the figure, consists of many optimization loops in which simulations are executed on a set of inputs (generated by executing the selection function program) in order to generate the output set. The output set is evaluated using the evaluation function program. 
At the end, the selection program is used again to generate a new input set for the next optimization loop (the process ends when the selection function program does not generate a new input set). With this computational schema it is possible to realize many of the simulation optimization algorithms available in the literature.\n\n\n## System Requirements\n* [Apache Hadoop](http://hadoop.apache.org/) version 2.4.0 or greater on a Linux-based cluster.\n* Java Runtime Environment version 7 or greater.\n* An account on the cluster over SSH.\n\n- - -\n\n\n## Compiling the SOF core library and SOF application from src/ to target/ by Apache Maven\n\nIf you would like to add features to the library, you will have to change the code in `src/` and then compile the library using Maven, in the project folder:\n    \n        $ mvn compile\n        $ mvn package\n\nAfter that you have updated `SOF.jar` and `SOF-RUNNER.jar` in the folder `SOF-resources`. These files are runnable JAR files: the former with `SOF.java` as the main class in the `MANIFEST` and the latter with `SOF-RUNNER.java`. Both classes are located in the package `it.isislab.sof.core.engine.hadoop.sshclient.utils.simulation.executor`.\n\nTo release the final build you must run the command:\n    \n       $ mvn package\n    \nThe output files will be in `target/`:\n\t\n\t\t.\n\t\t├── SOF-1.0-library.jar\n\t\t├── SOF-client-shell.jar\n\t\t├── SOF-client-ui.jar\n\t\t├── archive-tmp\n\t\t├── classes\n\t\t├── examples\n\t\t├── generated-sources\n\t\t├── lib\n\t\t├── maven-archiver\n\t\t├── maven-status\n\t\t└── SOF-resources\n\t\t\n# SOF on a test environment\n\nTo easily test the SOF environment you can set up a virtual machine with the Apache Hadoop infrastructure on your local machine. A popular example is the **Hortonworks Data Platform (HDP)**, an open source Apache Hadoop data platform, architected for the enterprise, developed by [Hortonworks](http://hortonworks.com/). 
The HDP 2.2 Sandbox is provided as a self-contained virtual machine with Hadoop 2.6.0. No data center, cloud service or internet connection is needed in order to run the SOF framework.\n\nDownload the virtual machine from the Hortonworks [download page](http://hortonworks.com/products/releases/hdp-2-2/#install), and install your preferred available virtual machine in order to set up the environment.\n\n### How to execute SOF using the VirtualBox HDP 2.2 Sandbox\n\nAfter you have installed [VirtualBox](https://www.virtualbox.org/) and downloaded the virtual machine for VirtualBox, follow these steps:\n\n* Execute the Hortonworks Sandbox 2.2;\n* Boot the virtual machine (follow the instructions on the Hortonworks site);\n* Usually the user credentials are login **root** and password **hadoop** (please check this on the Hortonworks site).\n\nAfterwards you must enable the network in VirtualBox [Settings-\u003e Network-\u003e Enable Network Adapter, choose Bridged Adapter for the \"Attached To\" option]. Reboot the machine and check the connection availability.\n\nBoth SOF clients need some system configuration parameters:\n\n* *Node IP address* [`-h`]: IP of the Hadoop master node (in this scenario, the IP address of the virtual machine; you can find it by running the command `ifconfig`);\n* *Hadoop home directory* [`-bindir`]: the folder that contains the bin directory of the Hadoop infrastructure, where you can find all the Hadoop commands, in this case `/usr/`;   \n* *Home directory* [`-homedir`]: the folder where SOF creates its temporary directory on the remote machine (Hadoop master node), in this case the virtual machine home like `/root`;  \n* *Java bin directory* [`-javabindir`]: the folder that contains the `/bin` directory of the Java installation, in this case `/usr/bin/`;\n* *SOF home directory* [`-sofhomedir`]: the SOF installation folder on the HDFS, in this case `/user/root/`. 
\n\n###### Examples: \n\n* **SOF Simple Java Client** change the parameter settings in the Java class and run it (see the [Getting Started SOF Java Client](https://github.com/isislab-unisa/sof/blob/master/README.md#getting-started-sof-simple-java-client) section).\n\n* **SOF Shell Client** `$  java -jar SOF-Client.jar -h 192.168.0.2  -bindir /usr/  -homedir /root/ -javabindir /usr/bin/ -sofhomedir /user/root/` (see the [Getting Started SOF Shell Client](https://github.com/isislab-unisa/sof/blob/master/README.md#getting-started-sof-shell-client) section).\n     \n* **SOF GUI Client** provides the parameter settings in the GUI (see the [Getting Started SOF GUI Client](https://github.com/isislab-unisa/sof/blob/master/README.md#getting-started-sof-gui-client) section).\n\n## Define Model Parameters Domain-Input-Output-Rating (Evaluate): SOF XML schemas\n\nSOF supports two execution modes, as mentioned above: PSE and SO.\nIn PSE mode the input to the simulation and the output must be in XML, compliant with the input/output schemas. 
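\n\nFor illustration only, a PSE input file compliant with such a schema might look like the minimal sketch below. The element and attribute names here are hypothetical assumptions (borrowing the `initial-trees` variable from the NetLogo Fire example used later); the authoritative structure is the one defined by the linked input.xsd schema.\n\n```xml\n\u003c!-- Hypothetical sketch of a PSE input file: one name/value entry per simulation parameter. --\u003e\n\u003c!-- Element and attribute names are assumptions; see input.xsd for the real structure. --\u003e\n\u003cinput\u003e\n  \u003cparam name=\"initial-trees\" value=\"100\"/\u003e\n  \u003cparam name=\"max-steps\" value=\"500\"/\u003e\n\u003c/input\u003e\n```\n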
In SO mode the user does not provide the input files but must declare the parameters domain in XML using the domain schema.\n\nThe SOF parameter XML schemas are the following:\n\n* **XML Domain**\nAt [this](https://github.com/isislab-unisa/sof/blob/master/xml/schema/domain.xsd) link you can find the domain XML schema for the simulation model parameters.\n\n- - -\n\n* **XML Input**\nAt [this](https://github.com/isislab-unisa/sof/blob/master/xml/schema/input.xsd) link you can find the input XML schema for the simulation model parameters.\n\n- - -\n\n* **XML Output**\nAt [this](https://github.com/isislab-unisa/sof/blob/master/xml/schema/output.xsd) link you can find the output XML schema for the simulation model parameters.\n\n- - -\n\n* **XML Ratings**\nAt [this](https://github.com/isislab-unisa/sof/blob/master/xml/schema/ratings.xsd) link you can find the ratings XML schema for the simulation model parameters.\n\n- - -\n## Define Evaluation Function\n\nYou have to write your own evaluation function in order to define a SO simulation process. You can write this program in any programming language supported by the cluster, as explained in the previous sections. In the following example we show how to write an evaluation function in Java. \n\nBelow we show how to print all output parameters defined in the output.xml file of the NetLogo Fire example and how to create and print new output parameters. We define a Hashtable (a container) of items. Each item has a key (the name of a variable defined in the output.xml file) and a value (the output value returned by the simulation step). 
We print all these parameters and, at the end, in the last section of the code, we show how to create a new parameter for our NetLogo Fire example and how to print it in a SOF-legal format.\n\n```java\npackage it.isislab.sof.example.function.evaluation;\n\nimport java.io.File;\nimport java.io.IOException;\nimport java.util.Hashtable;\nimport javax.xml.parsers.DocumentBuilder;\nimport javax.xml.parsers.DocumentBuilderFactory;\nimport javax.xml.parsers.ParserConfigurationException;\nimport org.w3c.dom.Document;\nimport org.w3c.dom.NamedNodeMap;\nimport org.w3c.dom.Node;\nimport org.w3c.dom.NodeList;\nimport org.xml.sax.SAXException;\n\n/**\n * An example of an evaluation function for our NetLogo Fire example in Java. \n * At the end of a simulation step, the evaluation program extracts and prints \n * all parameters contained in the output.xml file with their associated values from the simulation. \n */\n \npublic class EvaluationFunctionExample{\n\n\tpublic static void main(String[] args) throws SAXException, ParserConfigurationException {\n\n\t\tDocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();\n\t\tDocumentBuilder dBuilder = dbFactory.newDocumentBuilder();\n\t\tDocument doc;\n\n\t\t/*\n\t\t * This code inserts into a Hashtable\u003cVariableName,VariableValue\u003e all output parameters of the simulation.\n\t\t */\t\t\n\t\ttry {\n\t\t\tdoc = dBuilder.parse(new File(args[0]));\n\n\t\t\tNodeList params=doc.getElementsByTagName(\"param\");\n\t\t\tHashtable\u003cString/*variableName*/, String/*variableValue*/\u003e simulationOutputValues=new Hashtable\u003cString, String\u003e();\n\t\t\tint paramSize=params.getLength();\n\t\t\t\n\t\t\tfor(int j=0; j\u003cparamSize; j++){\n\n\t\t\t\tNode d= params.item(j);\n\t\t\t\tNamedNodeMap attrbsj=d.getAttributes();\n\n\t\t\t\tint attrbsSize= attrbsj.getLength();\n\t\t\t\tfor(int k=0; k\u003cattrbsSize; k++){\n\t\t\t\t\tNode f=attrbsj.item(k);\n\t\t\t\t\t// map the attribute value (the variable name) to the parameter's text content\n\t\t\t\t\tNodeList list=d.getChildNodes();\n\t\t\t\t\tsimulationOutputValues.put(f.getNodeValue(), list.item(1).getTextContent()); 
\n\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t/**\n\t\t\t * PRINT THE OUTPUTS OF THE SIMULATION\n\t\t\t * FORMAT  variableName:variableValue;\n\t\t\t */\n\t\t\tfor(String key : simulationOutputValues.keySet()){\n\t\t\t\tSystem.out.println(key+\":\"+simulationOutputValues.get(key)+\";\");\n\t\t\t}\n\n\t\t\t/**\n\t\t\t * You can define new output parameters by using the simulation values contained in the\n\t\t\t * Hashtable\u003cvariableName,variableValue\u003e,\n\t\t\t * calling simulationOutputValues.get(\"variableName\"), where variableName is the name of the variable.\n\t\t\t * You can find the variable name and its type in the output.xml file that you created.\n\t\t\t *\n\t\t\t * Example\n\t\t\t * At the end of a simulation step of the NetLogo Fire example I want to calculate the percentage of burned trees. \n\t\t\t * You must:  \n\t\t\t *   -Define the name of the new parameter. 
For example burnedPercentage \n\t\t\t * \t -Calculate the new parameter in Java code (you must parse it into the right Java type) \n\t\t\t *   -Print the new parameter in the correct SOF Evaluator File Format \"ParameterName:ParameterValue;\" \n\t\t\t */  \n\n\t\t\t/* For example, I want to calculate the burned trees percentage at the end of a simulation step; \n\t\t\t * I called this parameter burnedPercentage */\n\t\t\t\n\t\t\t/* Calculate the new parameter burnedPercentage */\n\t\t\tdouble percentage= (Double.parseDouble(simulationOutputValues.get(\"burned-trees\"))*100)/Double.parseDouble(simulationOutputValues.get(\"initial-trees\"));\n\t\t\t\n\t\t\t/* Print 'burnedPercentage:percentage;' in the correct format, where percentage is the value */\n\t\t\tSystem.out.printf(\"burnedPercentage:%.2f;\",percentage);\t\n\n\t\t} catch (IOException e) {\n\t\t\tSystem.err.println(\"Error occurred \"+e.getMessage());\n\t\t}\n\n\t}\n\n}\n```\n\n## Getting Started SOF Client\n#### Getting Started _**SOF Simple Java Client**_\n\nHere is a minimal example of defining a client application using the SOF core. The program creates a new simulation job in SO mode, submits the job to the system and waits until the process is finished.\n\nAfter building the project with Maven (`mvn package`), you are able to run the example in the class `SOFCoreSimpleApplication.java`.\n\t\nThis simple application shows some SOF core features: \n*\tcreate a new simulation optimization process;\n*\tsubmit the process; \n*\twait until the simulation optimization process ends.\n\t\nAll the files of a simulation optimization example are available in the project folder `examples/netlogo-aids`. This example uses a NetLogo simulation named aids.nlogo, which is based on a simple propagation model of the AIDS disease. 
The optimization process used is defined by the selection and evaluation function files (respectively `examples/netlogo-aids/selection.jar` and `examples/netlogo-aids/evaluation.jar`); this toy optimization experiment runs until all agents are sick.\n\nBelow we show a code example of a toy SOF application ([link](https://github.com/isislab-unisa/sof/blob/master/src/main/java/it/isislab/sof/client/application/SOFCoreSimpleApplication.java)).\n\n```java\npackage it.isislab.sof.client.application;\n\nimport it.isislab.sof.core.engine.hadoop.sshclient.connection.SOFManager;\nimport it.isislab.sof.core.engine.hadoop.sshclient.utils.environment.EnvironmentSession;\nimport it.isislab.sof.core.engine.hadoop.sshclient.utils.simulation.Simulation;\nimport it.isislab.sof.core.engine.hadoop.sshclient.utils.simulation.Simulations;\nimport java.io.File;\nimport java.io.FileInputStream;\nimport com.jcraft.jsch.SftpException;\n\npublic class SOFCoreSimpleApplication {\n\n\tpublic static int PORT=22;\n\tpublic static String host= \"127.0.0.1\";\n\tpublic static String pstring=\"password\";\n\tpublic static String bindir=\"/isis/hadoop-2.4.0\";  \n\tpublic static String homedir=\"/isis/\"; \n\tpublic static String javabindir =\"/usr/local/java/bin/\";\n\tpublic static String name=\"isis\";\n\tpublic static String sofhomedir=\"/\";\n\n\tpublic static  String toolkit=\"netlogo\";\n\tpublic static String simulation_name=\"aids\";\n\tpublic static String domain_pathname=\"examples-sim-aids/domain.xml\";\n\tpublic static String bashCommandForRunnableFunctionSelection=\"/usr/local/java/bin/java\";\n\tpublic static String bashCommandForRunnableFunctionEvaluate=\"/usr/local/java/bin/java\";\n\tpublic static String output_description_filename=\"examples-sim-aids/output.xml\";\n\tpublic static String executable_selection_function_filename=\"examples-sim-aids/selection.jar\";\n\tpublic static String executable_rating_function_filename=\"examples-sim-aids/evaluate.jar\";\n\tpublic 
static String description_simulation=\"this is a simple simulation optimization process for the AIDS NetLogo simulation example\";\n\tpublic static String executable_simulation_filename=\"examples-sim-aids/aids.nlogo\";\n\n\t/**\n\t * @param args\n\t * @throws SftpException \n\t */\n\n\tpublic static EnvironmentSession session;\n\n\tpublic static void main(String[] args) throws SftpException{\n\n\t\tSimulations sims=null;\n\t\ttry {\n\n\t\t\tSOFManager.setFileSystem(bindir,System.getProperty(\"user.dir\"), sofhomedir, homedir, javabindir ,name);\n\t\t\tif ((session=SOFManager.connect(name, host, pstring, bindir,PORT,\n\t\t\t\t\tnew FileInputStream(System.getProperty(\"user.dir\")+File.separator+\"SOF-resources\"+File.separator+\"SOF.jar\"),\n\t\t\t\t\tnew FileInputStream(System.getProperty(\"user.dir\")+File.separator+\"SOF-resources\"+File.separator+\"SOF-RUNNER.jar\")\n\t\t\t\t\t))!=null)\n\t\t\t{\n\t\t\t\tSystem.out.println(\"Connected. Type \\\"help\\\", \\\"usage \u003ccommand\u003e\\\" or \\\"license\\\" for more information.\");\n\n\t\t\t}else{\n\t\t\t\tSystem.err.println(\"Login correct, but there are problems in the Hadoop environment; please contact your Hadoop admin.\");\n\t\t\t\tSystem.exit(-1);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tSystem.err.println(\"Login error. Check your credentials and the ip:port of your server and try again .. 
\");\n\n\t\t}\n\t\t//CREATE SIMULATION FROM EXAMPLE IN SO MODE\n\t\ttry {\n\t\t\tSOFManager.makeSimulationFolderForLoop(session, toolkit, simulation_name, domain_pathname, bashCommandForRunnableFunctionSelection,bashCommandForRunnableFunctionEvaluate, \n\t\t\t\t\toutput_description_filename, executable_selection_function_filename, executable_rating_function_filename, description_simulation, executable_simulation_filename,\"\"/*param exacutable param for generic mode, not required for netlogo and mason*/);\n\n\t\t} catch (Exception e) {\n\t\t\t// TODO Auto-generated catch block\n\t\t\te.printStackTrace();\n\t\t}\n\n\t\tSystem.out.println(\"SIMULATION AVAILABLE LIST: \");\n\t\tsims = SOFManager.getSimulationsData(session);\n\t\tif(sims == null){\n\t\t\tSystem.err.println(\"No such simulations.\");\n\t\t}\n\t\tSystem.out.println(\"******************************************************\");\n\n\t\tfor(int i=1; i\u003c=sims.getSimulations().size(); i++){\n\t\t\tint simID= i-1;\n\t\t\tSimulation s = sims.getSimulations().get(simID);\n\t\t\tSystem.err.println(\"sim-id:\"+i+\") name: \"+s.getName()+\" state: \"+s.getState()+\" time: \"+s.getCreationTime()+\" id: \"+s.getId()+\"\\n\");\n\t\t}\n\n\t\tSystem.out.println(\"******************************************************\");\n\n\t\tSystem.out.println(\"Start the simulation with sim-id \"+(sims.getSimulations().size()));\n\t\tsims = SOFManager.getSimulationsData(session);\n\n\n\t\tSimulation s = sims.getSimulations().get(sims.getSimulations().size()-1);\n\t\tif(s == null){\n\t\t\tSystem.err.println(\"No such simulation with ID \"+sims.getSimulations().size());\n\t\t\tSystem.exit(-1);\n\t\t}\n\n\t\tSOFManager.runAsynchronousSimulation(session,s);\n\n\t\tSystem.out.println(\"Waiting for simulation ends.\");\n\t\tSimulation sim=null;\n\n\n\t\tdo{\n\t\t\tsims = SOFManager.getSimulationsData(session);\n\t\t\tsim = 
sims.getSimulations().get(sims.getSimulations().size()-1);\n\n\n\t\t}while(!(sim.getState().equals(Simulation.FINISHED)));\n\t\tSystem.exit(0);\n\n\t}\n}\n```\n\n\n\n#### Getting Started **_SOF GUI Client_**\n\nThe SOF framework provides a Java GUI client available in the release (`SOF-client-ui.jar, SOFClientUI.java`):\n\n    $  java -jar SOF-client-ui.jar\n\n\n#### Getting Started _**SOF Shell Client**_\nThe SOF framework provides a Java command line client available in the release (`SOF-client-shell.jar, SOFCoreSimpleApplication.java`):\n\n    $  java -jar SOF-client-shell.jar\n\nThis client application uses SSH to connect to the Hadoop cluster. The application parameters are the following:\n* `-h HOST NAME` cluster master node IP address. The default value is `localhost (127.0.0.1)`;\n* `-port PORT NUMBER` listening port for the SSH process on the cluster. The default value is `22`;\n* `-bindir PATH BIN DIRECTORY` the bin installation path (absolute) of Hadoop. The default value is `/bin`;\n* `-homedir PATH DIRECTORY` the home directory of the user on the master node. The default value is `~/temp`;\n* `-javabindir PATH JAVA BIN DIRECTORY` the bin installation path of the Java Virtual Machine. The default value is `/usr/bin`;\n* `-sofhomedir USER SOF HOME DIRECTORY` the Hadoop Distributed File System directory which will be the root directory for the SOF application. The default value is `/`.\n\nUsage:\n\n     $  java -jar SOF-Client.jar -h 192.168.0.1 -port 1022 -bindir /home/hadoop -homedir /home/user -javabindir /home/java/bin -sofhomedir /home/user/app/SOFtmp\n     \n     \nAfter login, the command shell looks like:\n    \n    sof$ 18:00:01 \u003e\u003e\u003e\n    \n\n---\n\n#### SOF Client commands overview\n* **`help`** shows the name and a brief usage description of the SOF commands. \n\n- - -\n\n* **`exit`** exits from the SOF application and disconnects the user.  \n\n- - -\n\n* **`createsimulation`** creates a simple simulation in parameter space exploration mode.  
This command takes the following input parameters:  \n    - ``model`` mason-netlogo-generic\n    - ``simulation name``\n    - ``input XML absolute path``\n    - ``output XML absolute path``\n    - ``brief simulation description``\n    - ``absolute path of the bin file for the simulation executable model``\n        * usage ``createsimulation netlogo mysim /home/pippo/input.xml /home/pippo/output.xml \"the description\" /home/pippo/mysim.nlogo``\n\n- - -\n\n*  **`createsimulationloop`**  creates a simulation in simulation optimization mode. This command takes the following input parameters:\n    - ``model`` mason-netlogo-generic\n    - ``simulation name``\n    - ``domain XML absolute path``\n    - ``absolute path for the bin command to exec the selection and evaluation``\n    - ``output XML absolute path``\n    -  ``absolute path for the selection function file``\n    -  ``absolute path for the evaluate function file``\n    - ``brief simulation description``\n    - ``absolute path of the bin file for the simulation executable model``\n        * usage ``createsimulationloop mason mysim /home/pippo/domain.xml /bin/java  /home/pippo/output.xml /home/pippo/selection_function.jar /home/pippo/evaluate_function.jar \"my description\" /home/pippo/mysim.jar``\n\n- - -\n\n* **``getsimulations``** prints states and data for all simulations.\n\nReturns for each simulation the following information:   \n- `simulation hdfs identifier` an alphanumeric identifier associated with the simulation. Note: this is the identifier used to identify a simulation on the distributed file system.  
\n- ``simulation name``\n- ``simulation author`` \n- ``creation time of simulation``\n- ``the simulation description``\n- ``loop status list``:\n    * ``created``, the simulation has been created but is not running yet.\n    * ``running``, the simulation is running.\n    * ``finished``, the simulation finished correctly.\n    * ``aborted``, the simulation did not finish correctly: the process was aborted by the system or the user.\n\n- - -\n\n* **`list`** prints a list of all simulations.  \n\nReturns for each simulation the following information:\n- `simulation identifier` an integer identifier associated with the simulation. Note: this is the simulation identifier to use in all commands that refer to a simulation. \n- `simulation name`\n- ``status of simulation``:\n    * ``created``, the simulation has been created but is not running yet.\n    * ``running``, the simulation is running.\n    * ``finished``, the simulation finished correctly.\n    * ``aborted``, the simulation did not finish correctly: the process was aborted by the system or the user.\n- `creation time of simulation`\n- `simulation identifier on the hadoop file system`\n\n- - -\n\n* **`start`** starts a simulation execution, or restarts it from the last loop if you have stopped the simulation (see the stop command). This command takes the following input parameters: \n    - `simulation identifier` an integer identifier associated with the simulation. Note: this is the simulation identifier to use in all commands that refer to a simulation (given by the list command). \n        * usage `start x` where x is your simulation identifier.\n\n- - -\n\n* **`stop`** stops a simulation execution (you can use the start command to restart it). This command takes the following input parameters:\n    - `simulation identifier` an integer identifier associated with the simulation. Note: this is the simulation identifier to use in all commands that refer to a simulation (given by the list command). 
\n        * usage `stop x` where x is your simulation identifier.\n\n- - -\n\n* **`getsimulation`** shows all the information of a simulation and its loops. This command takes the following input parameters: \n    - `simulation identifier` an integer identifier associated with the simulation. Note: this is the simulation identifier to use in all commands that refer to a simulation (given by the list command). \n    \nThe command returns the following information:  \n* `simulation hdfs identifier` an alphanumeric identifier associated with the simulation. Note: this is the identifier used to identify a simulation on the distributed file system. \n* `simulation name`\n* `simulation author` \n* `creation time of simulation`\n* `the simulation description`\n* ``status of simulation``:\n    - ``created``, the simulation has been created but is not running yet.\n    - `running`, the simulation is running.\n    - ``finished``, the simulation finished correctly.\n    - ``aborted``, the simulation did not finish correctly: the process was aborted by the system or the user.\n* `number of loops executed by the process`\n\n- - -\n\n* **`getresult`** downloads all data of a simulation in a zip archive. This command takes the following input parameters: \n    - `simulation identifier` an integer identifier associated with the simulation. Note: this is the simulation identifier to use in all commands that refer to a simulation (given by the list command).\n    - `localpath` absolute local path where the archive will be saved. If it isn't specified, the archive is downloaded to the current directory.\n     * usage ``getresult x /home`` downloads the data of the simulation with identifier x into the `/home` directory.\n    \n- - -\n\n* **`kill`** stops the SOF process of a particular simulation. This command takes the following input parameters: \n    - `simulation identifier` an integer identifier associated with the simulation. 
Note: this is the simulation identifier to use in all commands that refer to a simulation (given by the list command).\n     * usage ``kill x`` interrupts the process of the simulation with identifier x.\n    \n- - -\n\n* **`makexml`** this command starts a tool to generate the XML files for the SOF process:\n    - ``input.xml``  contains the input parameters (name of variable and initial value) of the simulation\n    - ``output.xml``  contains the output parameters (name of variable and value) of the simulation\n    - ``domain.xml`` domain file for the input parameters (only in SO mode)\n\n**XML Scope**\nThis command takes the following input parameters:\n   * - ``help`` shows the command list \n   * - ``list`` shows the list corresponding to the given XML kind [input, output, domain]\n   * - ``new``  generates a new [input, output, domain] XML file\n   * - ``remove`` removes the given element\n   * - ``generatexml`` generates the XML file of the corresponding XML kind in the given directory\n   * - ``exit`` goes back to the previous SOF shell scope\n\n**Parameters Scope**\n   * - ``help`` shows the command list \n   * - ``add``: adds a new parameter for the following entities: \n       - `Simulation` add [-author | -name | -description | -toolkit] value\n       - `input` add [-string | -double | -long] varName value\n       - `output` add [-string | -double | -long] varName\n       - `domain` add [ -discrete varName min max increment | -continuous varName min max increment | -list string varName | -list double varName]\n   * - ``remove`` removes the given element\n   * - ``list`` shows the list corresponding to the given XML kind [input, output, domain]\n   * - ``exit`` goes back to the previous SOF shell scope\n\n- - -\n\n\n\n### License\nCopyright ISISLab, 2017 Università degli Studi di Salerno.\n\nLicensed under the Apache License, Version 2.0 (the \"License\"); You may not use this file except in compliance with the 
License. You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fisislab-unisa%2Fsof","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fisislab-unisa%2Fsof","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fisislab-unisa%2Fsof/lists"}