{"id":13487477,"url":"https://github.com/apache/linkis","last_synced_at":"2026-01-20T19:10:13.637Z","repository":{"id":36961341,"uuid":"198368711","full_name":"apache/linkis","owner":"apache","description":"Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.","archived":false,"fork":false,"pushed_at":"2025-12-31T13:46:34.000Z","size":92222,"stargazers_count":3414,"open_issues_count":166,"forks_count":1172,"subscribers_count":262,"default_branch":"master","last_synced_at":"2026-01-14T17:15:04.296Z","etag":null,"topics":["application-manager","context-service","engine","hive","hive-table","impala","jdbc","jobserver","linkis","livy","presto","pyspark","resource-manager","rest-api","scriptis","spark","sql","storage","thrift-server","udf"],"latest_commit_sha":null,"homepage":"https://linkis.apache.org/","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/apache.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":"NOTICE","maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2019-07-23T06:41:51.000Z","updated_at":"2026-01-13T04:07:50.000Z","dependencies_parsed_at":"2023-01-17T08:45:48.151Z","dependency_job_id":"ea265e3c-8173-4f78-a8fb-8e4e517a3f3c","html_url":"https://github.com/apache/linkis","commit_stats":{"total_commits":3493,"total_committers":216,"mean_commits":"16.171296296296298","dds":0.7993129115373605,"last_synced_commit":"3647c318edf574704
08a31aa5965a0c32f017e7e"},"previous_names":["webankfintech/linkis","apache/incubator-linkis"],"tags_count":40,"template":false,"template_full_name":null,"purl":"pkg:github/apache/linkis","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apache%2Flinkis","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apache%2Flinkis/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apache%2Flinkis/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apache%2Flinkis/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/apache","download_url":"https://codeload.github.com/apache/linkis/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/apache%2Flinkis/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28609836,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-20T18:56:40.769Z","status":"ssl_error","status_checked_at":"2026-01-20T18:54:26.653Z","response_time":117,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["application-manager","context-service","engine","hive","hive-table","impala","jdbc","jobserver","linkis","livy","presto","pyspark","resource-manager","rest-api","scriptis","spark","sql","storage","thrift-server","udf"],"created_at":"2024-07-31T18:00:59.858Z","updated_at":"2026-01-20T19:10:12.531Z","avatar_url":"https://github.com/apache.png","language":"Java","readme":"\u003ch2 align=\"center\"\u003e\n  Apache Linkis\n\u003c/h2\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cstrong\u003e Linkis builds a computation middleware layer to facilitate connection,\n    governance and orchestration between the upper applications and the underlying data engines. 
\u003c/strong\u003e\n\u003c/p\u003e\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://linkis.apache.org/\"\u003eApache Linkis | Website\u003c/a\u003e\n\u003c/p\u003e\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://deepwiki.com/apache/linkis\"\u003eApache Linkis | DeepWiki\u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://linkis.apache.org/docs/latest/introduction/\" \u003e\u003c!--\n    --\u003e\u003cimg src=\"https://img.shields.io/badge/document-English-blue.svg\" alt=\"EN docs\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n  \u003ca href=\"https://linkis.apache.org/zh-CN/docs/latest/introduction/\"\u003e\u003c!--\n    --\u003e\u003cimg src=\"https://img.shields.io/badge/文档-简体中文-blue.svg\" alt=\"简体中文文档\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n    \u003ca target=\"_blank\" href=\"https://search.maven.org/search?q=g:org.apache.linkis%20AND%20a:linkis\"\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/maven-central/v/org.apache.linkis/linkis.svg?label=maven%20central\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n    \u003ca target=\"_blank\" href=\"https://github.com/apache/linkis/blob/master/LICENSE\"\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/badge/License-Apache%202.0-blue.svg?label=license\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n    \u003ca target=\"_blank\" href=\"https://www.oracle.com/technetwork/java/javase/downloads/index.html\"\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/badge/JDK-8-green.svg\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n    \u003ca target=\"_blank\" href=\"https://github.com/apache/linkis/actions\"\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://github.com/apache/linkis/actions/workflows//build-backend.yml/badge.svg\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n   \u003ca target=\"_blank\" 
href='https://github.com/apache/linkis'\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/github/forks/apache/linkis.svg\" alt=\"github forks\"/\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n   \u003ca target=\"_blank\" href='https://github.com/apache/linkis'\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/github/stars/apache/linkis.svg\" alt=\"github stars\"/\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n   \u003ca target=\"_blank\" href='https://github.com/apache/linkis'\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://img.shields.io/github/contributors/apache/linkis.svg\" alt=\"github contributors\"/\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n   \u003ca target=\"_blank\" href=\"https://badges.toozhao.com/stats/01G7TRNN1PH9PMSCYWDF3EK4QT\"\u003e\u003c!--\n      --\u003e\u003cimg src=\"https://badges.toozhao.com/badges/01G7TRNN1PH9PMSCYWDF3EK4QT/green.svg\" /\u003e\u003c!--\n    --\u003e\u003c/a\u003e\n\n\u003c/p\u003e\n\u003cbr/\u003e\n\n---\n[English](README.md) | [中文](README_CN.md)\n\n# Introduction\n\n Linkis builds a layer of computation middleware between upper applications and underlying engines. By using standard interfaces such as REST/WS/JDBC provided by Linkis, the upper applications can easily access the underlying engines such as MySQL/Spark/Hive/Presto/Flink, etc., and achieve the intercommunication of user resources like unified variables, scripts, UDFs, functions and resource files at the same time.\n\nAs a computation middleware, Linkis provides powerful connectivity, reuse, orchestration, expansion, and governance capabilities. 
By decoupling the application layer and the engine layer, it simplifies the complex network call relationships, thereby reducing overall complexity and saving development and maintenance costs.\n\nSince its first release in 2019, Linkis has accumulated more than **700** trial companies and **1000+** sandbox trial users, spanning diverse industries from finance, banking and telecommunications to manufacturing and internet companies. Many companies already use Linkis as the unified entry point to the underlying computation and storage engines of their big data platforms.\n\nApache Linkis | DeepWiki : https://deepwiki.com/apache/linkis\n\n![linkis-intro-01](https://user-images.githubusercontent.com/7869972/148767375-aeb11b93-16ca-46d7-a30e-92fbefe2bd5e.png)\n\n![linkis-intro-03](https://user-images.githubusercontent.com/7869972/148767380-c34f44b2-9320-4633-9ec8-662701f41d15.png)\n\n# Features\n\n- **Support for diverse underlying computation and storage engines** : Spark, Hive, Python, Shell, Flink, JDBC, Pipeline, Sqoop, OpenLooKeng, Presto, ElasticSearch, Trino, SeaTunnel, etc.;\n\n- **Support for diverse languages** : SparkSQL, HiveSQL, Python, Shell, Pyspark, Scala, JSON and Java;\n\n- **Powerful computing governance capability** : provides task routing, load balancing, multi-tenancy, traffic control, resource control and other capabilities based on multi-level labels;\n\n- **Full-stack computation/storage engine support** : the ability to receive, execute and manage tasks and requests for various compute and storage engines, including offline batch tasks, interactive query tasks, real-time streaming tasks and data lake tasks;\n\n- **Unified context service** : associates and manages user and system resource files (JAR, ZIP, Properties, etc.), result sets, parameter variables, functions, UDFs, etc. across users, systems and computing engines; set once, automatically referenced everywhere;\n\n- **Unified materials** : provides 
system- and user-level material management; materials can be shared and circulated across users and systems;\n\n- **Unified data source management** : provides the ability to add, delete, query and modify information about Hive, ElasticSearch, MySQL, Kafka, MongoDB and other data sources, with version control, connection testing, and querying of the corresponding data sources' metadata;\n\n- **Error code capability** : provides error codes and solutions for common task errors, making it convenient for users to locate problems by themselves;\n\n# Engine Type\n\n| **Engine name** | **Supported underlying component versions\u003cbr/\u003e(default dependency version)** | **Linkis Version Requirements** | **Included in Release Package By Default** | **Description** |\n| :--- | :--- | :--- | :--- | :--- |\n| Spark | Apache \u003e= 2.0.0, \u003cbr/\u003eCDH \u003e= 5.4.0, \u003cbr/\u003e(default Apache Spark 3.2.1) | \\\u003e=1.0.3 | Yes | Spark EngineConn, supports SQL, Scala, Pyspark and R code |\n| Hive | Apache \u003e= 1.0.0, \u003cbr/\u003eCDH \u003e= 5.4.0, \u003cbr/\u003e(default Apache Hive 3.1.3) | \\\u003e=1.0.3 | Yes | Hive EngineConn, supports HiveQL code
|\n| Python | Python \u003e= 2.6, \u003cbr/\u003e(default Python2*) | \\\u003e=1.0.3 | Yes | Python EngineConn, supports Python code |\n| Shell | Bash \u003e= 2.0 | \\\u003e=1.0.3 | Yes | Shell EngineConn, supports Bash shell code |\n| JDBC | MySQL \u003e= 5.0, Hive \u003e=1.2.1, \u003cbr/\u003e(default Hive-jdbc 2.3.4) | \\\u003e=1.0.3 | No | JDBC EngineConn, already supports ClickHouse, DB2, DM, Greenplum, Kingbase, MySQL, Oracle, PostgreSQL and SQLServer, and can be quickly extended to support other databases such as SQLite |\n| Flink | Flink \u003e= 1.12.2, \u003cbr/\u003e(default Apache Flink 1.12.2) | \\\u003e=1.0.2 | No | Flink EngineConn, supports FlinkSQL code, and also supports starting a new Yarn application in the form of a Flink Jar Application |\n| Pipeline | - | \\\u003e=1.0.2 | No | Pipeline EngineConn, supports file import and export |\n| openLooKeng | openLooKeng \u003e= 1.5.0, \u003cbr/\u003e(default openLooKeng 1.5.0)
| \\\u003e=1.1.1 | No | openLooKeng EngineConn, supports querying the openLooKeng data virtualization engine with SQL |\n| Sqoop | Sqoop \u003e= 1.4.6, \u003cbr/\u003e(default Apache Sqoop 1.4.6) | \\\u003e=1.1.2 | No | Sqoop EngineConn, supports the Sqoop data migration tool |\n| Presto | Presto \u003e= 0.180 | \\\u003e=1.2.0 | No | Presto EngineConn, supports Presto SQL code |\n| ElasticSearch | ElasticSearch \u003e=6.0 | \\\u003e=1.2.0 | No | ElasticSearch EngineConn, supports SQL and DSL code |\n| Trino | Trino \u003e=371 | \u003e=1.3.1 | No | Trino EngineConn, supports Trino SQL code |\n| Seatunnel | Seatunnel \u003e=2.1.2 | \u003e=1.3.1 | No | Seatunnel EngineConn, supports Seatunnel SQL code
|\n\n# Download\n\nPlease go to the [Linkis Releases Page](https://linkis.apache.org/download/main) to download a compiled distribution or a source code package of Linkis.\n\n# Compile and Deploy\n\n\u003e For more detailed guidance see:\n\u003e- [[Backend Compile]](https://linkis.apache.org/docs/latest/development/build)\n\u003e- [[Management Console Build]](https://linkis.apache.org/docs/latest/development/build-console)\n\n```shell\n\n# Note: If you want to use `-Dlinkis.build.web=true` to build the linkis-web image, you need to compile linkis-web first.\n\n## compile backend\n### Mac OS/Linux\n\n# 1. When compiling for the first time, execute the following command first\n./mvnw -N install\n\n# 2. make the linkis distribution package\n# - Option 1: make the linkis distribution package only\n./mvnw clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true\n\n# - Option 2: make the linkis distribution package and docker image\n#   - Option 2.1: image without mysql jdbc jars\n./mvnw clean install -Pdocker -Dmaven.javadoc.skip=true -Dmaven.test.skip=true\n#   - Option 2.2: image with mysql jdbc jars\n./mvnw clean install -Pdocker -Dmaven.javadoc.skip=true -Dmaven.test.skip=true -Dlinkis.build.with.jdbc=true\n\n# - Option 3: linkis distribution package and docker image (web included)\n./mvnw clean install -Pdocker -Dmaven.javadoc.skip=true -Dmaven.test.skip=true -Dlinkis.build.web=true\n\n# - Option 4: linkis distribution package and docker image (web and ldh (hadoop all-in-one for test) included)\n./mvnw clean install -Pdocker -Dmaven.javadoc.skip=true -Dmaven.test.skip=true -Dlinkis.build.web=true -Dlinkis.build.ldh=true -Dlinkis.build.with.jdbc=true\n\n### Windows\nmvnw.cmd -N install\nmvnw.cmd clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true\n\n## compile web\ncd linkis/linkis-web\nnpm install\nnpm run build\n```\n\n### Bundled with MySQL JDBC Driver\nDue to the
MySQL licensing restrictions, the MySQL Java Database Connectivity (JDBC) driver is not bundled with the\nofficially released Linkis image by default. However, at the current stage, Linkis still relies on this library to work properly.\nTo solve this problem, we provide a script that helps you create a custom image with the MySQL JDBC driver from the official\nLinkis image; the image created by this tool is tagged as `linkis:with-jdbc` by default.\n\n```shell\n$\u003e LINKIS_IMAGE=linkis:1.3.1\n$\u003e ./linkis-dist/docker/scripts/make-linkis-image-with-mysql-jdbc.sh\n```\n\nPlease refer to [Quick Deployment](https://linkis.apache.org/docs/latest/deployment/deploy-quick/) to do the deployment.\n\n# Examples and Guidance\n- [User Manual](https://linkis.apache.org/docs/latest/user-guide/how-to-use)\n- [Engine Usage Documents](https://linkis.apache.org/docs/latest/engine-usage/overview)\n- [API Documents](https://linkis.apache.org/docs/latest/api/overview)\n\n# Documentation \u0026 Video\n\n- The documentation of Linkis is in the [Linkis-Website Git Repository](https://github.com/apache/linkis-website)\n- Meetup videos are on [Bilibili](https://space.bilibili.com/598542776?from=search\u0026seid=14344213924133040656)\n\n# Architecture\nLinkis services can be divided into three categories: computation governance services, public enhancement services and microservice governance services.\n- The computation governance services support the three major stages of processing a task/request: submission -\u003e preparation -\u003e execution\n- The public enhancement services include the material library service, context service, and data source service\n- The microservice governance services include Spring Cloud Gateway, Eureka and OpenFeign\n\nBelow is the Linkis architecture diagram.
You can find more detailed architecture docs in [Linkis-Doc/Architecture](https://linkis.apache.org/docs/latest/architecture/overview).\n![architecture](https://user-images.githubusercontent.com/7869972/148767383-f87e84ba-5baa-4125-8b6e-d0aa4f7d3a66.png)\n\n# Contributing\n\nContributions are always welcome; we need more contributors to build Linkis together, whether through code, documentation, or other support that helps the community.\nFor code and documentation contributions, please follow the [contribution guide](https://linkis.apache.org/community/how-to-contribute).\n\n# Contact Us\n\n- For any questions or suggestions, please submit an [issue](https://github.com/apache/linkis/issues).\n- By mail: [dev@linkis.apache.org](mailto:dev@linkis.apache.org)\n- You can scan the QR code below to join our WeChat group for a more immediate response\n\n\u003cimg src=\"https://linkis.apache.org/Images/wedatasphere_contact_01.png\" width=\"256\"/\u003e\n\n# Who is Using Linkis\n\nWe opened an issue [[Who is Using Linkis]](https://github.com/apache/linkis/issues/23) for users to give feedback and record who is using Linkis.\nSince its first release in 2019, Linkis has accumulated more than **700** trial companies and **1000+** sandbox trial users, spanning diverse industries from finance, banking and telecommunications to manufacturing and internet companies.\n","funding_links":[],"categories":["Java","大数据"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fapache%2Flinkis","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fapache%2Flinkis","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fapache%2Flinkis/lists"}