https://github.com/ashwinpn/binary-classification-using-tensorflow



# Binary-Classification-using-Tensorflow

## Terminology

* Batch Size
A hyperparameter that defines the number of training samples the algorithm works through before the internal model parameters are updated.
* Epoch
One complete pass of the learning algorithm through the entire training dataset; the number of epochs is the number of such passes.
* Perceptrons
Simple neural-network units, usually used as building blocks for classification tasks.
* Stochastic Gradient Descent
Gradient descent with batch_size = 1, so the parameters are updated after every sample.
* Batch Gradient Descent
Gradient descent where the batch size equals the size of the entire training set, so the parameters are updated once per epoch.
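
The relationship between batch size and the number of parameter updates per epoch can be sketched as follows (the dataset size of 60,000 is assumed purely for illustration, e.g. MNIST):

```python
import math

n_samples = 60000  # assumed training-set size, for illustration only

def updates_per_epoch(n_samples, batch_size):
    """Number of parameter updates performed in one pass over the data."""
    return math.ceil(n_samples / batch_size)

print(updates_per_epoch(n_samples, 32))         # mini-batch: 1875 updates per epoch
print(updates_per_epoch(n_samples, 1))          # stochastic GD: one update per sample
print(updates_per_epoch(n_samples, n_samples))  # batch GD: one update per epoch
```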

## Important considerations

* What should be the batch_size?
* Gradient descent vs stochastic gradient descent vs mini-batch gradient descent
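
As a rough illustration of how these variants differ, here is a minimal NumPy sketch of the same one-parameter regression trained with batch, mini-batch, and stochastic gradient descent (all data and hyperparameters below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise
X = rng.normal(size=(256, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=256)

def train(batch_size, epochs=50, lr=0.1):
    """Fit a single weight w by gradient descent on 0.5*(w*x - y)^2."""
    w = 0.0
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = np.mean((w * X[b, 0] - y[b]) * X[b, 0])  # d/dw of the squared error
            w -= lr * grad
    return w

# batch_size = n -> batch GD; = 1 -> SGD; in between -> mini-batch
print(train(batch_size=256), train(batch_size=32), train(batch_size=1))
```

All three recover a weight close to the true slope of 2; the main trade-off is updates per epoch versus gradient noise.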

## Common questions / issues
* Can we use np.mean instead of tf.reduce_mean? Is there a difference?
The numerical result is the same. However, np.mean evaluates immediately on a NumPy array, whereas tf.reduce_mean builds a graph op whose value (in TensorFlow 1.x) is only computed when run inside a session.

Try this:

```
import numpy as np
import tensorflow as tf

c = np.array([[3., 4.], [5., 6.], [6., 7.]])
print(np.mean(c, 1))          # evaluated immediately

mean = tf.reduce_mean(c, 1)   # graph op; needs a session to evaluate
with tf.Session() as sess:
    result = sess.run(mean)
    print(result)
```
Output:

```
[ 3.5 5.5 6.5]
[ 3.5 5.5 6.5]
```

## Results
| Batch size | Cost | Accuracy|
|------------|---------------|---------|
| 32 | 0.079790163 | 0.97696 |
| 100 | 0.073474707 | 0.98096 |
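
For reference, accuracy of the kind reported above is just the mean of correct predictions over the test set; a minimal NumPy sketch (the probabilities and one-hot labels below are made up):

```python
import numpy as np

# Hypothetical softmax outputs and one-hot labels, for illustration only
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = np.array([[1, 0], [0, 1], [0, 1]])

correct = np.argmax(probs, 1) == np.argmax(labels, 1)
accuracy = np.mean(correct)   # 2 of 3 predictions correct
print(accuracy)
```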


## Checklist

- [x] Classification (normally distributed clusters)
- [ ] Plot the learning curve of model validation error against training time
- [ ] Tune batch size and learning rate