{"id":15038073,"url":"https://github.com/dropbox/pyhive","last_synced_at":"2025-05-14T20:04:49.393Z","repository":{"id":13736791,"uuid":"16431132","full_name":"dropbox/PyHive","owner":"dropbox","description":"Python interface to Hive and Presto. 🐝","archived":false,"fork":false,"pushed_at":"2024-08-07T19:52:11.000Z","size":411,"stargazers_count":1678,"open_issues_count":220,"forks_count":551,"subscribers_count":60,"default_branch":"master","last_synced_at":"2025-04-03T02:57:05.614Z","etag":null,"topics":["dbapi","hive","hiveserver2","presto","python","sqlalchemy"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dropbox.png","metadata":{"files":{"readme":"README.rst","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2014-02-01T09:05:07.000Z","updated_at":"2025-03-18T17:31:02.000Z","dependencies_parsed_at":"2023-01-13T17:35:55.828Z","dependency_job_id":"a8d88e89-3332-4346-aa36-6dbbe20d69af","html_url":"https://github.com/dropbox/PyHive","commit_stats":{"total_commits":179,"total_committers":50,"mean_commits":3.58,"dds":0.541899441340782,"last_synced_commit":"3547bd6cccf963a033928b73c5ed498684335c39"},"previous_names":[],"tags_count":19,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dropbox%2FPyHive","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dropbox%2FPyHive/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dropbox%2FPyHive/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/Git
Hub/repositories/dropbox%2FPyHive/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dropbox","download_url":"https://codeload.github.com/dropbox/PyHive/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248161255,"owners_count":21057553,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["dbapi","hive","hiveserver2","presto","python","sqlalchemy"],"created_at":"2024-09-24T20:37:00.055Z","updated_at":"2025-04-10T04:53:56.994Z","avatar_url":"https://github.com/dropbox.png","language":"Python","readme":"========================================================\nPyHive project has been donated to Apache Kyuubi\n========================================================\n\nYou can follow its development and report any issues you are experiencing here: https://github.com/apache/kyuubi/tree/master/python/pyhive\n\nLegacy notes / instructions\n===========================\n\nPyHive\n**********\n\nPyHive is a collection of Python `DB-API \u003chttp://www.python.org/dev/peps/pep-0249/\u003e`_ and\n`SQLAlchemy \u003chttp://www.sqlalchemy.org/\u003e`_ interfaces for `Presto \u003chttp://prestodb.io/\u003e`_,\n`Hive \u003chttp://hive.apache.org/\u003e`_ and `Trino \u003chttps://trino.io/\u003e`_.\n\nUsage\n**********\n\nDB-API\n------\n.. 
code-block:: python\n\n    from pyhive import presto  # or import hive or import trino\n    cursor = presto.connect('localhost').cursor()  # or use hive.connect or use trino.connect\n    cursor.execute('SELECT * FROM my_awesome_data LIMIT 10')\n    print(cursor.fetchone())\n    print(cursor.fetchall())\n\nDB-API (asynchronous)\n---------------------\n.. code-block:: python\n\n    from pyhive import hive\n    from TCLIService.ttypes import TOperationState\n    cursor = hive.connect('localhost').cursor()\n    cursor.execute('SELECT * FROM my_awesome_data LIMIT 10', async=True)\n\n    status = cursor.poll().operationState\n    while status in (TOperationState.INITIALIZED_STATE, TOperationState.RUNNING_STATE):\n        logs = cursor.fetch_logs()\n        for message in logs:\n            print(message)\n\n        # If needed, an asynchronous query can be cancelled at any time with:\n        # cursor.cancel()\n\n        status = cursor.poll().operationState\n\n    print(cursor.fetchall())\n\nIn Python 3.7 `async` became a keyword; you can use `async_` instead:\n\n.. code-block:: python\n\n    cursor.execute('SELECT * FROM my_awesome_data LIMIT 10', async_=True)\n\n\nSQLAlchemy\n----------\nFirst install this package to register it with SQLAlchemy; see ``entry_points`` in ``setup.py``.\n\n.. 
code-block:: python\n\n    from sqlalchemy import *\n    from sqlalchemy.engine import create_engine\n    from sqlalchemy.schema import *\n    # Presto\n    engine = create_engine('presto://localhost:8080/hive/default')\n    # Trino\n    engine = create_engine('trino+pyhive://localhost:8080/hive/default')\n    # Hive\n    engine = create_engine('hive://localhost:10000/default')\n\n    # SQLAlchemy \u003c 2.0\n    logs = Table('my_awesome_data', MetaData(bind=engine), autoload=True)\n    print(select([func.count('*')], from_obj=logs).scalar())\n\n    # Hive + HTTPS + LDAP or basic Auth\n    engine = create_engine('hive+https://username:password@localhost:10000/')\n    logs = Table('my_awesome_data', MetaData(bind=engine), autoload=True)\n    print(select([func.count('*')], from_obj=logs).scalar())\n\n    # SQLAlchemy \u003e= 2.0\n    metadata_obj = MetaData()\n    books = Table(\"books\", metadata_obj, Column(\"id\", Integer), Column(\"title\", String), Column(\"primary_author\", String))\n    metadata_obj.create_all(engine)\n    inspector = inspect(engine)\n    inspector.get_columns('books')\n\n    with engine.connect() as con:\n        data = [{ \"id\": 1, \"title\": \"The Hobbit\", \"primary_author\": \"Tolkien\" },\n                { \"id\": 2, \"title\": \"The Silmarillion\", \"primary_author\": \"Tolkien\" }]\n        con.execute(books.insert(), data)\n        result = con.execute(text(\"select * from books\"))\n        print(result.fetchall())\n\nNote: query generation functionality is not exhaustive or fully tested, but there should be no\nproblem with raw SQL.\n\nPassing session configuration\n-----------------------------\n\n.. 
code-block:: python\n\n    # DB-API\n    hive.connect('localhost', configuration={'hive.exec.reducers.max': '123'})\n    presto.connect('localhost', session_props={'query_max_run_time': '1234m'})\n    trino.connect('localhost', session_props={'query_max_run_time': '1234m'})\n    # SQLAlchemy\n    create_engine(\n        'presto://user@host:443/hive',\n        connect_args={'protocol': 'https',\n                      'session_props': {'query_max_run_time': '1234m'}}\n    )\n    create_engine(\n        'trino+pyhive://user@host:443/hive',\n        connect_args={'protocol': 'https',\n                      'session_props': {'query_max_run_time': '1234m'}}\n    )\n    create_engine(\n        'hive://user@host:10000/database',\n        connect_args={'configuration': {'hive.exec.reducers.max': '123'}},\n    )\n    # SQLAlchemy with LDAP\n    create_engine(\n        'hive://user:password@host:10000/database',\n        connect_args={'auth': 'LDAP'},\n    )\n\nRequirements\n************\n\nInstall using\n\n- ``pip install 'pyhive[hive]'`` or ``pip install 'pyhive[hive_pure_sasl]'`` for the Hive interface\n- ``pip install 'pyhive[presto]'`` for the Presto interface\n- ``pip install 'pyhive[trino]'`` for the Trino interface\n\nNote: the ``'pyhive[hive]'`` extra uses `sasl \u003chttps://pypi.org/project/sasl/\u003e`_, which doesn't support Python 3.11; see this `github issue \u003chttps://github.com/cloudera/python-sasl/issues/30\u003e`_.\nHence PyHive also supports `pure-sasl \u003chttps://pypi.org/project/pure-sasl/\u003e`_ via the additional extra ``'pyhive[hive_pure_sasl]'``, which supports Python 3.11.\n\nPyHive works with\n\n- Python 2.7 / Python 3\n- For Presto: `Presto installation \u003chttps://prestodb.io/docs/current/installation.html\u003e`_\n- For Trino: `Trino installation \u003chttps://trino.io/docs/current/installation.html\u003e`_\n- For Hive: `HiveServer2 \u003chttps://cwiki.apache.org/confluence/display/Hive/Setting+up+HiveServer2\u003e`_ 
daemon\n\nChangelog\n*********\nSee https://github.com/dropbox/PyHive/releases.\n\nContributing\n************\n- Please fill out the Dropbox Contributor License Agreement at https://opensource.dropbox.com/cla/ and note this in your pull request.\n- Changes must come with tests, with the exception of trivial things like fixing comments. See .travis.yml for the test environment setup.\n- Notes on project scope:\n\n  - This project is intended to be a minimal Hive/Presto client that does that one thing and nothing else.\n    Features that can be implemented on top of PyHive, such as integration with your favorite data analysis library, are likely out of scope.\n  - We prefer having a small number of generic features over a large number of specialized, inflexible features.\n    For example, the Presto code takes an arbitrary ``requests_session`` argument for customizing HTTP calls, as opposed to having a separate parameter/branch for each ``requests`` option.\n\nTips for test environment setup\n****************************************\nYou can set up a test environment by following ``.travis.yml`` in this repository. It uses `Cloudera's CDH 5 \u003chttps://docs.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh_download_510.html\u003e`_, which requires a username and password for download.\nIt may not be feasible for everyone to get those credentials. 
Hence, below are alternative instructions to set up a test environment.\n\nYou can clone `this repository \u003chttps://github.com/big-data-europe/docker-hive/blob/master/docker-compose.yml\u003e`_, which has a Docker Compose setup for Presto and Hive.\nYou can add the lines below to its ``docker-compose.yml`` to start Trino in the same environment::\n\n    trino:\n        image: trinodb/trino:351\n        ports:\n            - \"18080:18080\"\n        volumes:\n            - ./trino:/etc/trino\n\nNote: the ``./trino`` volume defined above is the `trino config from the PyHive repository \u003chttps://github.com/dropbox/PyHive/tree/master/scripts/travis-conf/trino\u003e`_.\n\nThen run::\n\n    docker-compose up -d\n\nTesting\n*******\n.. image:: https://travis-ci.org/dropbox/PyHive.svg\n    :target: https://travis-ci.org/dropbox/PyHive\n.. image:: http://codecov.io/github/dropbox/PyHive/coverage.svg?branch=master\n    :target: http://codecov.io/github/dropbox/PyHive?branch=master\n\nRun the following in an environment with Hive/Presto::\n\n    ./scripts/make_test_tables.sh\n    virtualenv --no-site-packages env\n    source env/bin/activate\n    pip install -e .\n    pip install -r dev_requirements.txt\n    py.test\n\nWARNING: This drops/creates tables named ``one_row``, ``one_row_complex``, and ``many_rows``, plus a\ndatabase called ``pyhive_test_database``.\n\nUpdating TCLIService\n********************\n\nThe TCLIService module is autogenerated using a ``TCLIService.thrift`` file. To update it, the\n``generate.py`` file can be used: ``python generate.py \u003cTCLIServiceURL\u003e``. When left blank, the\nversion for Hive 2.3 will be downloaded.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdropbox%2Fpyhive","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdropbox%2Fpyhive","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdropbox%2Fpyhive/lists"}