Swift autograd library
https://github.com/alexdremov/deepswift
- Host: GitHub
- URL: https://github.com/alexdremov/deepswift
- Owner: alexdremov
- Created: 2021-07-17T17:26:18.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2021-07-20T13:37:04.000Z (over 3 years ago)
- Last Synced: 2024-11-16T15:26:36.889Z (2 months ago)
- Topics: algorithm, autograd, backpropagation, graph, matrix, ml, swift
- Language: Swift
- Homepage:
- Size: 68.4 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# DeepSwift
[![codecov](https://codecov.io/gh/AlexRoar/DeepSwift/branch/main/graph/badge.svg?token=yq5czbVKrC)](https://codecov.io/gh/AlexRoar/DeepSwift)

Build dynamic computational graphs with forward and backward propagation functionality.
## Example
Simple SGD optimization:
Find the minimum of `sum_{elements} |log(1 + x^2)|`:
```swift
import DeepSwift

let variable = Input(Matrix.random(dim: Shape(row: 5, col: 2), generator: {Float.random(in: -1...1)}), name: "x")
let function = (variable.pow(2) + ConstNode(1)).log().abs()
var lossGraph = function.sum()
let lr = 0.01
var loss = try! lossGraph.forward()
for _ in 0..<500 {
    loss = try! lossGraph.forward()            // recompute the loss
    print(loss)
    try! lossGraph.backward()                  // backpropagate gradients
    variable.update(variable.value - lr * variable.grad!)  // SGD step
}
assert(loss == Matrix(0))
```
## Matrix operations

### Broadcasting

Element-wise operations support broadcasting similarly to numpy:

```swift
(Matrix<1, 5> -broadcasted-> Matrix<5, 5>) * Matrix<5, 5> = Matrix<5, 5>
(Matrix<1, 1> -broadcasted-> Matrix<5, 5>) * Matrix<5, 5> = Matrix<5, 5>
(Matrix<5, 1> -broadcasted-> Matrix<5, 5>) * Matrix<5, 5> = Matrix<5, 5>
```
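Below is a minimal runnable sketch of broadcasting in practice. It assumes the array-literal `Matrix` initializer used in the Hadamard example further down; the shapes shown are illustrative.

```swift
import DeepSwift

// Hypothetical sketch: a 1x3 row is broadcast across a 3x3 matrix before
// the element-wise product is taken.
let row: Matrix = [[1, 2, 3]]       // shape 1x3
let square: Matrix = [
    [1, 1, 1],
    [2, 2, 2],
    [3, 3, 3]
]                                   // shape 3x3
let product = row * square          // row is broadcast to 3x3 first
print(product)                      // expected: [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
```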
### Element-wise multiplication (Hadamard product)
```swift
let a: Matrix = [
[1, 2, 3],
[4, 5, 6],
[7, 8, 9]
]
let b: Matrix = [
[-1, 2, -3],
[4, -5, 6],
[-7, 8, -9]
]
a * b == Matrix(internalData: [
[1 * -1, 2 * 2 , 3 * -3],
[4 * 4 , 5 * -5, 6 * 6 ],
[7 * -7, 8 * 8 , 9 * -9],
])
```
Supported in the same manner (see the sketch after this list):
- Addition
- Subtraction
- Division
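
As a brief sketch, the same element-wise semantics (and broadcasting rules) apply to these operators; the matrices here reuse the values from the Hadamard example above.

```swift
import DeepSwift

// Element-wise addition, subtraction, and division on equally shaped matrices.
let a: Matrix = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
]
let b: Matrix = [
    [-1, 2, -3],
    [4, -5, 6],
    [-7, 8, -9]
]
let sum = a + b         // [[0, 4, 0], [8, 0, 12], [0, 16, 0]]
let difference = a - b  // [[2, 0, 6], [0, 10, 0], [14, 0, 18]]
let quotient = a / b    // [[-1, 1, -1], [1, -1, 1], [-1, 1, -1]]
```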

## Graph
The computational graph has several restrictions:
- Scalar values are 1x1 matrices
- N-dimensional tensors are not supported
- The graph must be directed and acyclic. Make sure there are no cycles; if the graph contains a cycle, forward and backward propagation will run indefinitely.

### Building elements
The graph consists of several types of elements: variables (`Input()`), constants (`ConstNode()`), and operations (`+`, `-`, `/`, `*`, `**`, and functions).
Simple example:
```swift
let x = Input(Matrix(5), name:"Input variable")var graph:Graph = x * x + 2 * x + 5
// Integer literals are transformed to ConstNodes
print(try? graph.forward().as(Int.self) == Matrix(5 * 5 + 2 * 5 + 5))
x.update(0)
print(try? graph.forward().as(Int.self) == Matrix(0 * 0 + 2 * 0 + 5))
```
### Functions
Almost all commonly needed functions are implemented, and an interface for introducing new functions is provided.

Supported element-wise functions (a usage sketch follows this list):
- tanh
- sigmoid
- abs
- pow
- sum
- reduceMean
- reduceSum
- log
- ReLU
- ELU
- LeReLU
- exp
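
As a usage sketch, these element-wise functions can be chained the same way as `.log()`, `.abs()`, and `.sum()` in the SGD example above. The exact method names used here (`sigmoid()`, `tanh()`, `reduceMean()`) are assumptions and may differ in the actual API.

```swift
import DeepSwift

// Hypothetical sketch: chaining element-wise functions on an Input node.
let x = Input(Matrix.random(dim: Shape(row: 3, col: 3),
                            generator: { Float.random(in: -1...1) }),
              name: "x")
var graph: Graph = x.sigmoid().tanh().reduceMean()

let value = try? graph.forward()   // forward pass evaluates the chained functions
try? graph.backward()              // backward pass populates x.grad
print(value ?? Matrix(0), x.grad ?? Matrix(0))
```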

### Computing gradient

The gradient is computed using backpropagation. A forward pass is required before running the backward pass:
```swift
let x = Input(Matrix(5), name:"Input variable")var graph:Graph = x * x + 2 * x + 5
// Integer literals are transformed to ConstNodes
try? graph.forward()
try? graph.backward()
print(x.grad!.as(Int.self) == Matrix(2 * 5 + 2))
// d/dx(x^2 + 2 * x + 5) = 2 * x + 2
```

Partial derivatives are supported:
```swift
let x = Input(Matrix(5), name:"x")
let y = Input(Matrix(7), name:"y")var graph:Graph = x * y + ConstNode(2) * (x + y) + ConstNode(5) * y
// Integer literals are transformed to ConstNodes
try? graph.forward()
try? graph.backward()
print(x.grad!.as(Int.self) == Matrix(7 + 2))
print(y.grad!.as(Int.self) == Matrix(5 + 2 + 5))
// d/dx(x * y + 2 * (x + y) + 5 * y) = y + 2
// d/dy(x * y + 2 * (x + y) + 5 * y) = x + 2 + 5
```