{"id":13471071,"url":"https://github.com/perwendel/spark","last_synced_at":"2025-05-14T08:02:08.003Z","repository":{"id":1467376,"uuid":"1705960","full_name":"perwendel/spark","owner":"perwendel","description":"A simple expressive web framework for java. Spark has a kotlin DSL https://github.com/perwendel/spark-kotlin","archived":false,"fork":false,"pushed_at":"2023-10-08T09:20:37.000Z","size":2503,"stargazers_count":9662,"open_issues_count":260,"forks_count":1572,"subscribers_count":398,"default_branch":"master","last_synced_at":"2025-05-07T07:01:47.501Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/perwendel.png","metadata":{"files":{"readme":"README.md","changelog":"changeset/2.9.3-changeset.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null}},"created_at":"2011-05-05T11:52:05.000Z","updated_at":"2025-05-06T05:47:47.000Z","dependencies_parsed_at":"2024-02-06T10:50:55.239Z","dependency_job_id":"07c6f6a3-6c42-4389-b930-4e7bfe6c1052","html_url":"https://github.com/perwendel/spark","commit_stats":{"total_commits":790,"total_committers":159,"mean_commits":4.968553459119497,"dds":0.6126582278481012,"last_synced_commit":"1973e402f5d4c1442ad34a1d38ed0758079f7773"},"previous_names":[],"tags_count":23,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/perwendel%2Fspark","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/perwendel%2Fspark/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/perwendel%2Fspark/releases","manifests_url":"https://rep
os.ecosyste.ms/api/v1/hosts/GitHub/repositories/perwendel%2Fspark/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/perwendel","download_url":"https://codeload.github.com/perwendel/spark/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254077499,"owners_count":22010730,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-07-31T16:00:39.330Z","updated_at":"2025-05-14T08:02:07.951Z","avatar_url":"https://github.com/perwendel.png","language":"Java","readme":"[![](https://img.shields.io/travis/perwendel/spark.svg)](https://travis-ci.org/perwendel/spark)\n[![](https://img.shields.io/github/license/perwendel/spark.svg)](./LICENSE)\n[![](https://img.shields.io/maven-central/v/com.sparkjava/spark-core.svg)](http://mvnrepository.com/artifact/com.sparkjava/spark-core)\n\nSpark - a tiny web framework for Java 8\n==============================================\n\n**Spark 2.9.4 is out!!**\n```xml\n\u003cdependency\u003e\n    \u003cgroupId\u003ecom.sparkjava\u003c/groupId\u003e\n    \u003cartifactId\u003espark-core\u003c/artifactId\u003e\n    \u003cversion\u003e2.9.4\u003c/version\u003e\n\u003c/dependency\u003e\n```\n\nSponsor the project here https://github.com/sponsors/perwendel\n\nFor documentation please go to: http://sparkjava.com/documentation\n\nFor usage questions, please use [stack overflow with the “spark-java” tag](http://stackoverflow.com/questions/tagged/spark-java) \n\nJavadoc: http://javadoc.io/doc/com.sparkjava/spark-core\n\nWhen committing to the project please use 
Spark format configured in https://github.com/perwendel/spark/blob/master/config/spark_formatter_intellij.xml\n\nGetting started\n---------------\n\n```xml\n\u003cdependency\u003e\n    \u003cgroupId\u003ecom.sparkjava\u003c/groupId\u003e\n    \u003cartifactId\u003espark-core\u003c/artifactId\u003e\n    \u003cversion\u003e2.9.4\u003c/version\u003e\n\u003c/dependency\u003e\n```\n\n```java\nimport static spark.Spark.*;\n\npublic class HelloWorld {\n    public static void main(String[] arg){\n        get(\"/hello\", (request, response) -\u003e \"Hello World!\");\n    }\n}\n```\n\nView at: http://localhost:4567/hello\n\n\nCheck out and try the examples in the source code.\nYou can also check out the javadoc. After getting the source from\n[github](https://github.com/perwendel/spark) run: \n\n    mvn javadoc:javadoc\n\nThe result is put in /target/site/apidocs\n\nExamples\n---------\n\nSimple example showing some basic functionality\n\n```java\nimport static spark.Spark.*;\n\n/**\n * A simple example just showing some basic functionality\n */\npublic class SimpleExample {\n\n    public static void main(String[] args) {\n\n        //  port(5678); \u003c- Uncomment this if you want spark to listen to port 5678 instead of the default 4567\n\n        get(\"/hello\", (request, response) -\u003e \"Hello World!\");\n\n        post(\"/hello\", (request, response) -\u003e\n            \"Hello World: \" + request.body()\n        );\n\n        get(\"/private\", (request, response) -\u003e {\n            response.status(401);\n            return \"Go Away!!!\";\n        });\n\n        get(\"/users/:name\", (request, response) -\u003e \"Selected user: \" + request.params(\":name\"));\n\n        get(\"/news/:section\", (request, response) -\u003e {\n            response.type(\"text/xml\");\n            return \"\u003c?xml version=\\\"1.0\\\" encoding=\\\"UTF-8\\\"?\u003e\u003cnews\u003e\" + request.params(\"section\") + \"\u003c/news\u003e\";\n        });\n\n        
get(\"/protected\", (request, response) -\u003e {\n            halt(403, \"I don't think so!!!\");\n            return null;\n        });\n\n        get(\"/redirect\", (request, response) -\u003e {\n            response.redirect(\"/news/world\");\n            return null;\n        });\n\n        get(\"/\", (request, response) -\u003e \"root\");\n    }\n}\n\n```\n\n-------------------------------\n\nA simple CRUD example showing how to create, get, update and delete book resources\n\n```java\nimport static spark.Spark.*;\n\nimport java.util.HashMap;\nimport java.util.Map;\nimport java.util.Random;\n\n/**\n * A simple CRUD example showing how to create, get, update and delete book resources.\n */\npublic class Books {\n\n    /**\n     * Map holding the books\n     */\n    private static Map\u003cString, Book\u003e books = new HashMap\u003cString, Book\u003e();\n\n    public static void main(String[] args) {\n        final Random random = new Random();\n\n        // Creates a new book resource, will return the ID of the created resource\n        // author and title are sent in the post body as x-www-form-urlencoded values e.g. 
author=Foo\u0026title=Bar\n        // you get them by using request.queryParams(\"valuename\")\n        post(\"/books\", (request, response) -\u003e {\n            String author = request.queryParams(\"author\");\n            String title = request.queryParams(\"title\");\n            Book book = new Book(author, title);\n\n            int id = random.nextInt(Integer.MAX_VALUE);\n            books.put(String.valueOf(id), book);\n\n            response.status(201); // 201 Created\n            return id;\n        });\n\n        // Gets the book resource for the provided id\n        get(\"/books/:id\", (request, response) -\u003e {\n            Book book = books.get(request.params(\":id\"));\n            if (book != null) {\n                return \"Title: \" + book.getTitle() + \", Author: \" + book.getAuthor();\n            } else {\n                response.status(404); // 404 Not found\n                return \"Book not found\";\n            }\n        });\n\n        // Updates the book resource for the provided id with new information\n        // author and title are sent in the request body as x-www-form-urlencoded values e.g. 
author=Foo\u0026title=Bar\n        // you get them by using request.queryParams(\"valuename\")\n        put(\"/books/:id\", (request, response) -\u003e {\n            String id = request.params(\":id\");\n            Book book = books.get(id);\n            if (book != null) {\n                String newAuthor = request.queryParams(\"author\");\n                String newTitle = request.queryParams(\"title\");\n                if (newAuthor != null) {\n                    book.setAuthor(newAuthor);\n                }\n                if (newTitle != null) {\n                    book.setTitle(newTitle);\n                }\n                return \"Book with id '\" + id + \"' updated\";\n            } else {\n                response.status(404); // 404 Not found\n                return \"Book not found\";\n            }\n        });\n\n        // Deletes the book resource for the provided id\n        delete(\"/books/:id\", (request, response) -\u003e {\n            String id = request.params(\":id\");\n            Book book = books.remove(id);\n            if (book != null) {\n                return \"Book with id '\" + id + \"' deleted\";\n            } else {\n                response.status(404); // 404 Not found\n                return \"Book not found\";\n            }\n        });\n\n        // Gets all available book resources (ids)\n        get(\"/books\", (request, response) -\u003e {\n            String ids = \"\";\n            for (String id : books.keySet()) {\n                ids += id + \" \";\n            }\n            return ids;\n        });\n    }\n\n    public static class Book {\n\n        public String author, title;\n\n        public Book(String author, String title) {\n            this.author = author;\n            this.title = title;\n        }\n\n        public String getAuthor() {\n            return author;\n        }\n\n        public void setAuthor(String author) {\n            this.author = author;\n        }\n\n        public String 
getTitle() {\n            return title;\n        }\n\n        public void setTitle(String title) {\n            this.title = title;\n        }\n    }\n}\n```\n\n---------------------------------\n\nExample showing a very simple (and stupid) authentication filter that is executed before all other resources\n\n```java\nimport static spark.Spark.*;\n\nimport java.util.HashMap;\nimport java.util.Map;\n\n/**\n * Example showing a very simple (and stupid) authentication filter that is\n * executed before all other resources.\n *\n * When requesting the resource with e.g.\n *     http://localhost:4567/hello?user=some\u0026password=guy\n * the filter will stop the execution and the client will get a 401 UNAUTHORIZED with the content 'You are not welcome here'\n *\n * When requesting the resource with e.g.\n *     http://localhost:4567/hello?user=foo\u0026password=bar\n * the filter will accept the request and the request will continue to the /hello route.\n *\n * Note: There is a second \"before filter\" that adds a header to the response\n * Note: There is also an \"after filter\" that adds a header to the response\n */\npublic class FilterExample {\n\n    private static Map\u003cString, String\u003e usernamePasswords = new HashMap\u003cString, String\u003e();\n\n    public static void main(String[] args) {\n\n        usernamePasswords.put(\"foo\", \"bar\");\n        usernamePasswords.put(\"admin\", \"admin\");\n\n        before((request, response) -\u003e {\n            String user = request.queryParams(\"user\");\n            String password = request.queryParams(\"password\");\n\n            String dbPassword = usernamePasswords.get(user);\n            if (!(password != null \u0026\u0026 password.equals(dbPassword))) {\n                halt(401, \"You are not welcome here!!!\");\n            }\n        });\n\n        before(\"/hello\", (request, response) -\u003e response.header(\"Foo\", \"Set by second before filter\"));\n\n        get(\"/hello\", (request, response) 
-\u003e \"Hello World!\");\n\n        after(\"/hello\", (request, response) -\u003e response.header(\"spark\", \"added by after-filter\"));\n\n        afterAfter(\"/hello\", (request, response) -\u003e response.header(\"finally\", \"executed even if an exception is thrown\"));\n\n        afterAfter((request, response) -\u003e response.header(\"finally\", \"executed after any route even if an exception is thrown\"));\n    }\n}\n```\n\n---------------------------------\n\nExample showing how to use attributes\n\n```java\nimport static spark.Spark.after;\nimport static spark.Spark.get;\n\n/**\n * Example showing the use of attributes\n */\npublic class FilterExampleAttributes {\n\n    public static void main(String[] args) {\n        get(\"/hi\", (request, response) -\u003e {\n            request.attribute(\"foo\", \"bar\");\n            return null;\n        });\n\n        after(\"/hi\", (request, response) -\u003e {\n            for (String attr : request.attributes()) {\n                System.out.println(\"attr: \" + attr);\n            }\n        });\n\n        after(\"/hi\", (request, response) -\u003e {\n            Object foo = request.attribute(\"foo\");\n            response.body(asXml(\"foo\", foo));\n        });\n    }\n\n    private static String asXml(String name, Object value) {\n        return \"\u003c?xml version=\\\"1.0\\\" encoding=\\\"UTF-8\\\"?\u003e\u003c\" + name + \"\u003e\" + value + \"\u003c/\" + name + \"\u003e\";\n    }\n}\n```\n\n\n---------------------------------\n\nExample showing how to serve static resources\n\n```java\nimport static spark.Spark.*;\n\npublic class StaticResources {\n\n    public static void main(String[] args) {\n\n        // Will serve all static files under \"/public\" on the classpath if the route isn't consumed by other routes.\n        // When using Maven, the \"/public\" folder is assumed to be in \"/main/resources\"\n        staticFileLocation(\"/public\");\n\n        get(\"/hello\", (request, response) -\u003e \"Hello 
World!\");\n    }\n}\n```\n---------------------------------\n\nExample showing how to define content depending on accept type\n\n```java\nimport static spark.Spark.*;\n\npublic class JsonAcceptTypeExample {\n\n    public static void main(String[] args) {\n\n        // Running curl -i -H \"Accept: application/json\" http://localhost:4567/hello returns the JSON message.\n        // Running curl -i -H \"Accept: text/html\" http://localhost:4567/hello returns an HTTP 404 error.\n        get(\"/hello\", \"application/json\", (request, response) -\u003e \"{\\\"message\\\": \\\"Hello World\\\"}\");\n    }\n}\n```\n---------------------------------\n\nExample showing how to render a view from a template. Note that we are using the `ModelAndView` class for setting the model object and the name/location of the template.\n\nFirst we define a class which handles and renders output depending on the template engine used, in this case [FreeMarker](http://freemarker.incubator.apache.org/).\n\n\n```java\npublic class FreeMarkerTemplateEngine extends TemplateEngine {\n\n    private Configuration configuration;\n\n    protected FreeMarkerTemplateEngine() {\n        this.configuration = createFreemarkerConfiguration();\n    }\n\n    @Override\n    public String render(ModelAndView modelAndView) {\n        try {\n            StringWriter stringWriter = new StringWriter();\n\n            Template template = configuration.getTemplate(modelAndView.getViewName());\n            template.process(modelAndView.getModel(), stringWriter);\n\n            return stringWriter.toString();\n        } catch (IOException e) {\n            throw new IllegalArgumentException(e);\n        } catch (TemplateException e) {\n            throw new IllegalArgumentException(e);\n        }\n    }\n\n    private Configuration createFreemarkerConfiguration() {\n        Configuration retVal = new Configuration();\n        retVal.setClassForTemplateLoading(FreeMarkerTemplateEngine.class, \"freemarker\");\n        return retVal;\n    
}\n}\n```\n\nThen we can use it to generate our content. Note how we set the model data and the view name; because we are using FreeMarker, a `Map` of attributes and the name of the template are required:\n\n```java\npublic class FreeMarkerExample {\n\n    public static void main(String[] args) {\n\n        get(\"/hello\", (request, response) -\u003e {\n            Map\u003cString, Object\u003e attributes = new HashMap\u003c\u003e();\n            attributes.put(\"message\", \"Hello FreeMarker World\");\n\n            // The hello.ftl file is located in directory:\n            // src/test/resources/spark/examples/templateview/freemarker\n            return modelAndView(attributes, \"hello.ftl\");\n        }, new FreeMarkerTemplateEngine());\n    }\n}\n```\n\n---------------------------------\n\nExample of using a ResponseTransformer.\n\nFirst we define the transformer class, in this case one which transforms an object to JSON using the Gson API.\n\n```java\npublic class JsonTransformer implements ResponseTransformer {\n\n    private Gson gson = new Gson();\n\n    @Override\n    public String render(Object model) {\n        return gson.toJson(model);\n    }\n}\n```\n\nAnd then the code which returns a simple POJO to be transformed to JSON:\n\n```java\npublic class TransformerExample {\n\n    public static void main(String[] args) {\n        get(\"/hello\", \"application/json\", (request, response) -\u003e {\n            return new MyMessage(\"Hello World\");\n        }, new JsonTransformer());\n    }\n}\n```\n\nDebugging\n------------------\nSee [Spark-debug-tools](https://github.com/perwendel/spark-debug-tools) as a separate module.\n","funding_links":["https://github.com/sponsors/perwendel"],"categories":["Java","Uncategorized","I. Development","HarmonyOS","开发框架","Micro Frameworks inspired by Sinatra (Other Languages)","\u003ca name=\"Java\"\u003e\u003c/a\u003eJava"],"sub_categories":["Uncategorized","2. 
Web development","Windows Manager"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fperwendel%2Fspark","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fperwendel%2Fspark","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fperwendel%2Fspark/lists"}