{"id":30318245,"url":"https://github.com/deepmind/dnc","last_synced_at":"2025-08-17T20:09:50.173Z","repository":{"id":48485355,"uuid":"86463017","full_name":"google-deepmind/dnc","owner":"google-deepmind","description":"A TensorFlow implementation of the Differentiable Neural Computer.","archived":false,"fork":false,"pushed_at":"2021-07-23T08:02:12.000Z","size":73,"stargazers_count":2516,"open_issues_count":9,"forks_count":443,"subscribers_count":163,"default_branch":"master","last_synced_at":"2025-08-11T10:26:48.610Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/google-deepmind.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-03-28T13:24:38.000Z","updated_at":"2025-08-06T21:39:08.000Z","dependencies_parsed_at":"2022-09-24T16:15:00.641Z","dependency_job_id":null,"html_url":"https://github.com/google-deepmind/dnc","commit_stats":null,"previous_names":["google-deepmind/dnc","deepmind/dnc"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/google-deepmind/dnc","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fdnc","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fdnc/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fdnc/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fdnc/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/google-deepmind","download_url":"https://codeload.github.com
/google-deepmind/dnc/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/google-deepmind%2Fdnc/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":270899582,"owners_count":24664720,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-17T02:00:09.016Z","response_time":129,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-08-17T20:04:24.996Z","updated_at":"2025-08-17T20:09:50.165Z","avatar_url":"https://github.com/google-deepmind.png","language":"Python","readme":"# Differentiable Neural Computer (DNC)\n\nThis package provides an implementation of the Differentiable Neural Computer,\nas [published in Nature](\nhttps://www.nature.com/articles/nature20101.epdf?author_access_token=ImTXBI8aWbYxYQ51Plys8NRgN0jAjWel9jnR3ZoTv0MggmpDmwljGswxVdeocYSurJ3hxupzWuRNeGvvXnoO8o4jTJcnAyhGuZzXJ1GEaD-Z7E6X_a9R-xqJ9TfJWBqz).\n\nAny publication that discloses findings arising from using this source code must\ncite “Hybrid computing using a neural network with dynamic external memory\",\nNature 538, 471–476 (October 2016) doi:10.1038/nature20101.\n\n## Introduction\n\nThe Differentiable Neural Computer is a recurrent neural network. 
At each\ntimestep, it has state consisting of the current memory contents (and auxiliary\ninformation such as memory usage), and maps input at time `t` to output at time\n`t`. It is implemented as a collection of `RNNCore` modules, which allow\nplugging together the different modules to experiment with variations on the\narchitecture.\n\n*   The *access* module is where the main DNC logic happens, as this is where\n    memory is written to and read from. At every timestep, the input to an\n    access module is a vector passed from the `controller`, and its output is\n    the contents read from memory. It uses two further `RNNCore`s:\n    `TemporalLinkage` which tracks the order of memory writes, and `Freeness`\n    which tracks which memory locations have been written to and not yet\n    subsequently \"freed\". These are both defined in `addressing.py`.\n\n*   The *controller* module \"controls\" memory access. Typically, it is just a\n    feedforward or (possibly deep) LSTM network, whose inputs are the inputs to\n    the overall recurrent network at that time, concatenated with the read\n    memory output from the access module from the previous timestep.\n\n*   The *dnc* simply wraps the access module and the control module, and forms\n    the basic `RNNCore` unit of the overall architecture. This is defined in\n    `dnc.py`.\n\n![DNC architecture](images/dnc_model.png)\n\n## Train\nThe `DNC` requires an installation of [TensorFlow](https://www.tensorflow.org/)\nand [Sonnet](https://github.com/deepmind/sonnet). 
An example training script is\nprovided for the algorithmic task of repeatedly copying a given input string.\nThis can be executed from a python interpreter:\n\n```shell\n$ ipython train.py\n```\n\nYou can specify training options, including parameters to the model\nand optimizer, via flags:\n\n```shell\n$ python train.py --memory_size=64 --num_bits=8 --max_length=3\n\n# Or with ipython:\n$ ipython train.py -- --memory_size=64 --num_bits=8 --max_length=3\n```\n\nPeriodically saving, or 'checkpointing', the model is disabled by default. To\nenable, use the `checkpoint_interval` flag. E.g. `--checkpoint_interval=10000`\nwill ensure a checkpoint is created every `10,000` steps. The model will be\ncheckpointed to `/tmp/tf/dnc/` by default. From there training can be resumed.\nTo specify an alternate checkpoint directory, use the `checkpoint_dir` flag.\nNote: ensure that `/tmp/tf/dnc/` is deleted before training is resumed with\ndifferent model parameters, to avoid shape inconsistency errors.\n\nMore generally, the `DNC` class found within `dnc.py` can be used as a standard\nTensorFlow rnn core and unrolled with TensorFlow rnn ops, such as\n`tf.nn.dynamic_rnn` on any sequential task.\n\nDisclaimer: This is not an official Google product\n","funding_links":[],"categories":["Software","Python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdeepmind%2Fdnc","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdeepmind%2Fdnc","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdeepmind%2Fdnc/lists"}