Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/FridayTechnologies/PrototypeKit
A Swift package to make prototyping machine learning experiences for Apple Platforms more accessible to early developers.
- Host: GitHub
- URL: https://github.com/FridayTechnologies/PrototypeKit
- Owner: FridayTechnologies
- License: apache-2.0
- Created: 2024-02-01T10:47:44.000Z (9 months ago)
- Default Branch: master
- Last Pushed: 2024-06-06T16:38:52.000Z (5 months ago)
- Last Synced: 2024-07-30T20:59:04.473Z (3 months ago)
- Topics: coreml, createml, ios, ipados, machine-learning, macos, swift, visionos
- Language: Swift
- Homepage:
- Size: 3.95 MB
- Stars: 6
- Watchers: 0
- Forks: 1
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING.md
  - License: LICENSE.md
  - Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
README
# PrototypeKit
[![Swift](https://github.com/FridayTechnologies/PrototypeKit/actions/workflows/swift.yml/badge.svg)](https://github.com/FridayTechnologies/PrototypeKit/actions/workflows/swift.yml)
(Ironically, a prototype itself...) 😅
**Status**: Work In Progress
## Goals 🥅
- Make it easier to prototype basic Machine Learning apps with SwiftUI
- Provide an easy interface for commonly built views to assist with prototyping and idea validation
- Effectively a wrapper around the more complex APIs, providing a simpler interface (perhaps not all the same functionality, but enough to get you started and inspired!)

# Examples
Here are a few basic examples you can use today.
## Camera Tasks
### Start Here
1. Ensure you have created your Xcode project
2. Ensure you have added the PrototypeKit package to your project (instructions above -- coming soon)
3. Select your project file within the project navigator.
4. Ensure that your target is selected.
5. Select the Info tab.
6. Right-click within the "Custom iOS Target Properties" table, and select "Add Row".
7. Use `Privacy - Camera Usage Description` for the key. Type the reason your app will use the camera as the value.
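With that key in place, iOS shows the camera permission prompt automatically the first time the camera is used. If you'd like to check or request access explicitly before presenting a camera view, the standard AVFoundation calls look roughly like this (plain AVFoundation, not a PrototypeKit API; the function name is illustrative only):

```swift
import AVFoundation

// Checks the current camera permission and requests it if needed.
// The function name is illustrative only; PrototypeKit is not involved here.
func ensureCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Shows the system prompt, using your usage description as the message.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            completion(granted)
        }
    default:
        // .denied or .restricted: the user has to change this in Settings.
        completion(false)
    }
}
```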
### Live Camera View
Utilise `PKCameraView`
```swift
PKCameraView()
```

Full Example
```swift
import SwiftUI
import PrototypeKit

struct ContentView: View {
    var body: some View {
        VStack {
            PKCameraView()
        }
        .padding()
    }
}
```
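For a sense of what that single `PKCameraView()` call is standing in for, here is a rough sketch of the AVFoundation capture-session setup a live preview normally needs (generic AVFoundation code with an illustrative function name, not PrototypeKit's actual implementation):

```swift
import AVFoundation

// Roughly the boilerplate a live camera preview needs before you even get
// to hosting it in SwiftUI. The function name is illustrative only.
func makeCameraSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else {
        return nil
    }
    session.addInput(input)

    // In a real app you would call startRunning() off the main thread,
    // since it blocks while the session starts up.
    session.startRunning()
    return session
}
```

`PKCameraView` wraps this kind of setup, plus presenting the preview, behind one SwiftUI view.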
### Live Image Classification

1. **Required Step:** Drag in your Create ML / Core ML model into Xcode.
2. Change `FruitClassifier` below to the name of your Model.
3. You can use `latestPrediction` as you would any other state variable (i.e. refer to other views such as Slider).

Utilise `ImageClassifierView`
```swift
ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                    latestPrediction: $latestPrediction)
```

Full Example
```swift
import SwiftUI
import PrototypeKit

struct ImageClassifierViewSample: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
```
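For comparison, classifying a single image with Vision and Core ML directly looks roughly like this; `ImageClassifierView` presumably runs something similar over live camera frames. This is a sketch of standard Vision / Core ML APIs, not PrototypeKit's internals, reusing the `FruitClassifier` model name from the example above with an illustrative function name:

```swift
import CoreGraphics
import CoreML
import Vision

// Classifies one CGImage with the Create ML model and returns the top label.
// The function name is illustrative only.
func classify(_ image: CGImage) throws -> String? {
    let mlModel = try MLModel(contentsOf: FruitClassifier.urlOfModelInThisBundle)
    let request = VNCoreMLRequest(model: try VNCoreMLModel(for: mlModel))

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The first observation is the highest-confidence classification.
    return (request.results as? [VNClassificationObservation])?.first?.identifier
}
```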
### Live Text Recognition

Utilise `LiveTextRecognizerView`
```swift
LiveTextRecognizerView(detectedText: $detectedText)
```

Full Example
```swift
import SwiftUI
import PrototypeKit

struct TextRecognizerView: View {
    @State var detectedText: [String] = []

    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)
            ScrollView {
                ForEach(Array(detectedText.enumerated()), id: \.offset) { line, text in
                    Text(text)
                }
            }
        }
    }
}
```
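Under the hood, text recognition comes from the Vision framework; recognising the text in a single image directly looks roughly like this (a sketch of the standard `VNRecognizeTextRequest` API with an illustrative function name, not PrototypeKit's actual implementation). `LiveTextRecognizerView` presumably runs this kind of request over live camera frames and feeds the resulting strings into your binding:

```swift
import CoreGraphics
import Vision

// Returns one string per line of text detected in the image.
// The function name is illustrative only.
func recognizeText(in image: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Each observation is one detected line; take its best candidate string.
    let observations = request.results ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```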
### Live Hand Pose Classification

1. **Required Step:** Drag in your Create ML / Core ML model into Xcode.
2. Change `HandPoseClassifier` below to the name of your Model.
3. You can use `latestPrediction` as you would any other state variable (i.e. refer to other views such as Slider).

Utilise `HandPoseClassifierView`
```swift
HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                       latestPrediction: $latestPrediction)
```

Full Example
```swift
import SwiftUI
import PrototypeKit

struct HandPoseClassifierViewSample: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                                   latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
```

### Live Sound Classification (System Sound Classifier)
This model uses the system sound classifier, and does not currently support custom Sound Classifier models.

1. You can use `recognizedSound` as you would any other state variable (i.e. refer to other views such as Slider).
Utilise `recognizeSounds` modifier
```swift
.recognizeSounds(recognizedSound: $recognizedSound)
```

Full Example
```swift
import SwiftUI
import PrototypeKit
struct SoundAnalyzerSampleView: View {
    @State var recognizedSound: String?

    var body: some View {
        VStack {
            Text(recognizedSound ?? "No Sound")
        }
        .padding()
        .navigationTitle("Sound Recogniser Sample")
        .recognizeSounds(recognizedSound: $recognizedSound)
    }
}
```
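For context, the system sound classifier that `recognizeSounds` relies on is presumably the built-in classifier from Apple's SoundAnalysis framework. A minimal sketch of driving it directly from the microphone looks something like this (plain SoundAnalysis / AVFoundation code with illustrative names, not PrototypeKit's actual implementation; it assumes a microphone usage description has been added, just as the camera key was above):

```swift
import AVFoundation
import SoundAnalysis

// Forwards classification results from the analyzer to a callback.
final class SoundObserver: NSObject, SNResultsObserving {
    var onSound: ((String) -> Void)?

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        onSound?(top.identifier)
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Sound classification failed: \(error)")
    }
}

// Streams microphone audio into the built-in (system) sound classifier.
// Keep the returned engine and the observer alive while classifying.
func startSystemSoundClassification(observer: SoundObserver) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let format = engine.inputNode.outputFormat(forBus: 0)

    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let analyzer = SNAudioStreamAnalyzer(format: format)
    try analyzer.add(request, withObserver: observer)

    engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try engine.start()
    return engine
}
```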
## FAQs

Is this production ready?
no.