https://github.com/alephao/coremlexample
An example of CoreML using a pre-trained VGG16 model
- Host: GitHub
- URL: https://github.com/alephao/coremlexample
- Owner: alephao
- Created: 2017-06-07T00:03:26.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2017-06-23T02:54:05.000Z (over 8 years ago)
- Last Synced: 2025-10-05T03:00:01.038Z (5 months ago)
- Topics: coreml, ios11, machine-learning, swift, xcode9
- Language: Swift
- Homepage:
- Size: 15.6 KB
- Stars: 37
- Watchers: 4
- Forks: 6
- Open Issues: 0
Metadata Files:
- Readme: README.md
## CoreMLExample
In this example we use AVFoundation to continuously capture image data from the back camera, and we try to detect the dominant objects in each frame using a pre-trained VGG16 model.
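The pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual source: the class and queue names are assumptions, and `VGG16` stands in for the class Xcode generates from the downloaded `VGG16.mlmodel` file.

```swift
import AVFoundation
import CoreML
import Vision

// Sketch: capture frames from the back camera and classify each one
// with the Core ML model via the Vision framework (iOS 11 APIs).
final class CameraClassifier: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    // VGG16 is the model class Xcode generates from VGG16.mlmodel.
    private lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: VGG16().model)
        return VNCoreMLRequest(model: model) { request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first
            else { return }
            print("\(top.identifier) (\(top.confidence))")
        }
    }()

    func start() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: camera)
        else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called for every frame delivered by the camera.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer).perform([request])
    }
}
```

The completion handler runs once per frame, so a real app would typically throttle classification or skip frames to keep the UI responsive.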
## Setup
To run this project, you need a pre-trained VGG16 model. It is not included in the repository because the file is larger than 100 MB. To download it from Apple's website, run `setup.sh` in the root folder:
```shell
git clone https://github.com/alaphao/CoreMLExample.git
cd CoreMLExample
./setup.sh
```
If you prefer, you can [download the model here](https://docs-assets.developer.apple.com/coreml/models/VGG16.mlmodel) and move it to the `CoreMLExample` folder.
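The manual alternative amounts to fetching the model and moving it into place; a hedged sketch (the destination folder comes from the instructions above, and the exact path inside the Xcode project may differ):

```shell
# Download the pre-trained model from Apple (file is several hundred MB)
curl -L -O https://docs-assets.developer.apple.com/coreml/models/VGG16.mlmodel

# Move it into the folder where the Xcode project expects it
mv VGG16.mlmodel CoreMLExample/
```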
## Requirements
* Xcode 9 beta
* Swift 4
* iOS 11
## Useful Links
* [Welcoming Core ML](https://medium.com/towards-data-science/welcoming-core-ml-8ba325227a28)
* [Integrating a Core ML Model into Your App](https://developer.apple.com/documentation/coreml/integrating_a_core_ml_model_into_your_app)