Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/yu4u/dnn-watermark
Implementation of "Embedding Watermarks into Deep Neural Networks," in Proc. of ICMR'17.
- Host: GitHub
- URL: https://github.com/yu4u/dnn-watermark
- Owner: yu4u
- Created: 2017-01-15T16:58:37.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2022-07-28T07:18:28.000Z (over 2 years ago)
- Last Synced: 2024-06-16T00:37:29.104Z (5 months ago)
- Topics: deep-learning, deep-neural-networks, keras, watermak
- Language: Python
- Homepage: https://arxiv.org/abs/1701.04082
- Size: 26.4 KB
- Stars: 116
- Watchers: 8
- Forks: 43
- Open Issues: 9
Metadata Files:
- Readme: README.md
README
Embedding Watermarks into Deep Neural Networks
====
This code is the implementation of "Embedding Watermarks into Deep Neural Networks" [1]. It embeds a digital watermark into a deep neural network while training the host network; the embedding is achieved by a parameter regularizer.

This README will be updated later with more details.
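As described in the paper, the regularizer projects the mean of a chosen convolution kernel with a secret random matrix X and adds the binary cross-entropy between the sigmoid of that projection and the desired watermark bits to the task loss. The following is a minimal NumPy sketch of that term, not the repository's Keras code; the function name, the `(h, w, in_ch, out_ch)` kernel layout, and the default `scale` are illustrative assumptions.

```python
import numpy as np

def embedding_regularizer(conv_weights, proj_matrix, signature, scale=0.01):
    """Watermark-embedding regularizer E_R(w) from [1] (illustrative sketch).

    conv_weights: conv kernel, assumed shape (h, w, in_ch, out_ch)
    proj_matrix:  secret random embedding matrix X, shape (T, h * w * in_ch)
    signature:    target watermark bits b in {0, 1}, shape (T,)
    scale:        regularization strength (lambda in the paper)
    """
    # Average the kernel over its output filters and flatten it into a vector w.
    w = conv_weights.mean(axis=3).reshape(-1)
    # Project with X and squash: y_j = sigmoid(X_j . w).
    y = 1.0 / (1.0 + np.exp(-proj_matrix.dot(w)))
    # Binary cross-entropy between the projection and the desired bits.
    eps = 1e-7
    return -scale * np.sum(signature * np.log(y + eps)
                           + (1 - signature) * np.log(1 - y + eps))
```

Because this term is differentiable in the host weights, it is simply added to the training loss and ordinary gradient descent embeds the watermark as a side effect of training; the `DIM256` and `SCALE0.01` fragments in the result file names below suggest a 256-bit signature and lambda = 0.01.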
## Requirements
Keras 1.1.2 (<1.2.0), tensorflow 0.12.1 (<1.0.0), numpy, matplotlib, pandas

**[CAUTION]**
We found that custom regularizers have been deprecated in the latest versions of Keras, as discussed [here](https://github.com/fchollet/keras/pull/4703):

> Custom regularizers may no longer work.

Therefore, please use the old versions of Keras and TensorFlow (Keras 1.1.2 does not work on TensorFlow >= 1.0):

```sh
pip install keras==1.1.2
pip install tensorflow==0.12.1
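# or, for GPU support: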
pip install tensorflow-gpu==0.12.1
```

## Usage
Embed a watermark while training a host network:

```sh
# train the host network while embedding a watermark
python train_wrn.py config/train_random_min.json

# extract the embedded watermark
python utility/wmark_validate.py result/wrn_WTYPE_random_DIM256_SCALE0.01_N1K4B64EPOCH3_TBLK1.weight result/wrn_WTYPE_random_DIM256_SCALE0.01_N1K4B64EPOCH3_TBLK1_layer7_w.npy result/random
```

Train the host network *without* embedding:
```sh
# train the host network without embedding
python train_wrn.py config/train_non_min.json

# extract the embedded watermark (meaningless because no watermark was embedded)
python utility/wmark_validate.py result/wrn_WTYPE_random_DIM256_SCALE0.01_N1K4B64EPOCH3_TBLK0.weight result/wrn_WTYPE_random_DIM256_SCALE0.01_N1K4B64EPOCH3_TBLK1_layer7_w.npy result/non

# visualize the embedded watermark
python utility/draw_histogram_signature.py config/draw_histogram_non.json hist_signature_non.png
```

Extracted watermarks from the embedded host network and the non-embedded networks:
![](images/hist_signature_non.png)
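For reference, extraction reverses the embedding: the stored projection matrix is applied to the mean of the trained kernel and each output is thresholded at 0.5 to recover a bit. The sketch below mirrors that description from the paper rather than the exact `utility/wmark_validate.py` script; shapes and names are illustrative assumptions.

```python
import numpy as np

def extract_watermark(conv_weights, proj_matrix):
    """Recover watermark bits from a trained conv kernel (illustrative sketch).

    conv_weights: trained kernel, assumed shape (h, w, in_ch, out_ch)
    proj_matrix:  the same random matrix X used during embedding
    """
    w = conv_weights.mean(axis=3).reshape(-1)       # mean over filters, flattened
    y = 1.0 / (1.0 + np.exp(-proj_matrix.dot(w)))   # sigmoid(X . w)
    return (y > 0.5).astype(np.int32)               # threshold to bits

def bit_error_rate(extracted_bits, signature):
    """Fraction of recovered bits that differ from the embedded signature."""
    return float(np.mean(extracted_bits != signature))
```

For a watermarked network the bit error rate should be close to zero, while for the non-embedded network the recovered bits are essentially random; the histograms above visualize exactly this difference in the projected values.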
## License
All code is provided for research purposes only and without any warranty.
When using any code in this project, we would appreciate it if you could refer to this project.

## References
[1] Y. Uchida, Y. Nagai, S. Sakazawa, and S. Satoh, "Embedding Watermarks into Deep Neural Networks," ICMR, 2017.