
# CustomVision

Swift SDK for Microsoft's [Custom Vision Service](https://www.customvision.ai)

## Features

- [x] [Custom Vision Training API 3.0](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.0)
- [x] [Custom Vision Prediction API 3.0](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Prediction_3.0)
- [x] Work with `UIKit` & `Foundation` objects like `UIImage`
- [x] Export & Download `CoreML` models for use offline
- [ ] Sample App

## Requirements

- iOS 11.0+ / Mac OS X 10.11+ / tvOS 9.0+ / watchOS 2.0+
- Xcode 9.3+
- Swift 4.1+
- [Custom Vision](https://www.customvision.ai/) Account

## Installation

### Carthage

[Carthage](https://github.com/Carthage/Carthage) is a decentralized dependency manager that builds your dependencies and provides you with binary frameworks.

You can install Carthage with [Homebrew](http://brew.sh/) using the following command:

```bash
$ brew update
...
$ brew install carthage
```

To integrate CustomVision into your Xcode project using Carthage, specify it in your [Cartfile](https://github.com/Carthage/Carthage/blob/master/Documentation/Artifacts.md#cartfile):

```cartfile
github "colbylwilliams/CustomVision"
```

Run `carthage update` to build the framework and drag the built `CustomVision.framework` into your Xcode project.

### CocoaPods

[CocoaPods](http://cocoapods.org) is a dependency manager for Cocoa projects.
You can install it with the following command:

```bash
[sudo] gem install cocoapods
```

> CocoaPods 1.3+ is required.

To integrate CustomVision into your project, specify it in your [Podfile](http://guides.cocoapods.org/using/the-podfile.html):

```ruby
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '11.0'
use_frameworks!

pod 'CustomVision', '~> 1.0'
```

Then, run the following command:

```bash
$ pod install
...
```

### Swift Package Manager

_Coming soon_
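In the meantime, a hypothetical sketch of what a `Package.swift` dependency declaration could look like once SPM support ships; the version requirement below is an assumption, not a published release tag:

```swift
// swift-tools-version:5.0
// Hypothetical manifest — SPM support is not yet published for CustomVision,
// so the version requirement here is an assumption.
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        .package(url: "https://github.com/colbylwilliams/CustomVision.git", from: "1.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["CustomVision"])
    ]
)
```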

## Usage

To get started using CustomVision, you need to provide the SDK with your [Training Key](https://www.customvision.ai/projects#/settings) and [Prediction Key](https://www.customvision.ai/projects#/settings).

If you're working with a single [project](https://www.customvision.ai/projects), you can also provide a default Project ID that will be used for every project operation _(instead of passing it in as a parameter every time)_.

There are two ways to provide the Training Key, Prediction Key, and Project ID: programmatically, or by adding them to a plist file.

### Programmatically

The simplest way to provide these values and start using the SDK is to set the values programmatically:

```swift
CustomVisionClient.defaultProjectId = "CUSTOM_VISION_PROJECT_ID"
CustomVisionClient.subscriptionRegion = "CUSTOM_VISION_SUBSCRIPTION_REGION"
CustomVisionClient.shared.trainingKey = "CUSTOM_VISION_TRAINING_KEY"
CustomVisionClient.shared.predictionKey = "CUSTOM_VISION_PREDICTION_KEY"

CustomVisionClient.shared.getIterations { r in
    // r.resource is [Iteration]
}
```

### Plist File

Alternatively, you can provide these values in your project's `Info.plist`, a separate [`CustomVision.plist`](https://github.com/colbylwilliams/CustomVision/blob/master/CustomVision/CustomVision.plist), or provide the name of your own plist file to use.

Simply add the `CustomVisionTrainingKey`, `CustomVisionPredictionKey`, `CustomVisionProjectId`, and `CustomVisionSubscriptionRegion` keys and provide your Training Key, Prediction Key, and default Project ID respectively.

**_Note: This method is provided for convenience when quickly developing samples; shipping these values in a plist is not recommended for production apps._**

#### Info.plist

```plist
...
<key>CFBundleName</key>
<string>$(PRODUCT_NAME)</string>
<key>CustomVisionProjectId</key>
<string>CUSTOM_VISION_PROJECT_ID</string>
<key>CustomVisionTrainingKey</key>
<string>CUSTOM_VISION_TRAINING_KEY</string>
<key>CustomVisionPredictionKey</key>
<string>CUSTOM_VISION_PREDICTION_KEY</string>
<key>CustomVisionSubscriptionRegion</key>
<string>CUSTOM_VISION_SUBSCRIPTION_REGION</string>
...
```

#### CustomVision.plist

Or add a [`CustomVision.plist`](https://github.com/colbylwilliams/CustomVision/blob/master/CustomVision/CustomVision.plist) file.

```plist
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>CustomVisionProjectId</key>
    <string>CUSTOM_VISION_PROJECT_ID</string>
    <key>CustomVisionTrainingKey</key>
    <string>CUSTOM_VISION_TRAINING_KEY</string>
    <key>CustomVisionPredictionKey</key>
    <string>CUSTOM_VISION_PREDICTION_KEY</string>
    <key>CustomVisionSubscriptionRegion</key>
    <string>CUSTOM_VISION_SUBSCRIPTION_REGION</string>
</dict>
</plist>
```

#### Named plist

Finally, you can add the `CustomVisionTrainingKey`, `CustomVisionPredictionKey`, `CustomVisionProjectId`, and `CustomVisionSubscriptionRegion` key/values to any plist in your project's **main bundle** and provide the name of the plist:

```swift
CustomVisionClient.shared.getKeysFrom(plistNamed: "SuperDuperDope")
```
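For illustration, reading these key/values out of a plist comes down to standard `PropertyListSerialization` parsing. The sketch below uses plain Foundation only; the helper name `readCustomVisionKeys` is hypothetical and not part of the SDK:

```swift
import Foundation

// Hypothetical helper illustrating how key/values can be read from plist
// data with plain Foundation; not part of the CustomVision SDK.
func readCustomVisionKeys(from data: Data) -> [String: String] {
    let plist = try? PropertyListSerialization.propertyList(from: data, options: [], format: nil)
    return plist as? [String: String] ?? [:]
}

// A minimal plist carrying two of the keys the SDK looks for.
let plistXML = """
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>CustomVisionTrainingKey</key>
    <string>CUSTOM_VISION_TRAINING_KEY</string>
    <key>CustomVisionProjectId</key>
    <string>CUSTOM_VISION_PROJECT_ID</string>
</dict>
</plist>
"""

let keys = readCustomVisionKeys(from: plistXML.data(using: .utf8)!)
print(keys["CustomVisionTrainingKey"] ?? "missing")
```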

## Training Images

The CustomVision SDK adds several convenience functions to make uploading new training images to your project as easy as possible.

This example demonstrates creating a new Tag in the Custom Vision project, then uploading several new training images to the project, tagging each with the newly created tag:

```swift
let client = CustomVisionClient.shared

let tag = "New Tag" // doesn't exist in project yet
let images: [UIImage] = // several UIImages

client.createImages(from: images, withNewTagNamed: tag) { r in
    // r.resource is ImageCreateSummary
}
```

## Export & Download CoreML models

One of the most useful features of this SDK is the ability to re-train your Project, export the newly trained model (Iteration), download it to the device's file system, and compile the model on-device for use with `CoreML`.

```swift
func updateUser(message: String) {
    // update user
}

let client = CustomVisionClient.shared

client.trainAndDownloadCoreMLModel(withName: "myModel", progressUpdate: updateUser) { (success, message) in
    // handle completion
}
```

Once the compiled model has been persisted to the device's file system (above), you can get the URL of the model using the client's `getModelUrl()` func:

```swift
if let url = client.getModelUrl(),
   let myModel = try? MLModel(contentsOf: url) {
    // use myModel with CoreML
}
```

## License

Licensed under the MIT License. See [LICENSE](LICENSE) for details.