# PrototypeKit

[![Swift](https://github.com/FridayTechnologies/PrototypeKit/actions/workflows/swift.yml/badge.svg)](https://github.com/FridayTechnologies/PrototypeKit/actions/workflows/swift.yml)

(Ironically, a prototype itself...) 😅

**Status**: Work In Progress

## Goals 🥅
- Make it easier to prototype basic Machine Learning apps with SwiftUI
- Provide an easy interface for commonly built views to assist with prototyping and idea validation
- Effectively a wrapper around the more complex APIs, providing a simpler interface (perhaps not all the same functionality, but enough to get you started and inspired!)

# Examples

Here are a few basic examples you can use today.
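All of the examples assume the PrototypeKit package is already added to your project. Until the official installation instructions land, adding it via Swift Package Manager should work: in Xcode, choose File ▸ Add Package Dependencies… and enter the repository URL, or declare the dependency in a `Package.swift` manifest along these lines (the `main` branch name here is an assumption; pin a version or revision if one exists):

```swift
// Package.swift (sketch): depend on PrototypeKit from its repository.
// The branch name "main" is an assumption.
dependencies: [
    .package(url: "https://github.com/FridayTechnologies/PrototypeKit", branch: "main")
]
```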

## Camera Tasks

### Start Here

1. Ensure you have created your Xcode project
2. Ensure you have added the PrototypeKit package to your project (instructions above -- coming soon)
3. Select your project file within the project navigator.

4. Ensure that your target is selected

5. Select the info tab.
6. Right-click within the "Custom iOS Target Properties" table, and select "Add Row"

7. Use `Privacy - Camera Usage Description` for the key. Type the reason your app will use the camera as the value.
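If you prefer editing the `Info.plist` source directly (right-click the file and choose Open As ▸ Source Code), the same entry looks like this; the usage string is just a placeholder:

```xml
<!-- Raw-source equivalent of "Privacy - Camera Usage Description" -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for live machine learning prototypes.</string>
```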

### Live Camera View

Utilise `PKCameraView`

```swift
PKCameraView()
```

Full Example



```swift
import SwiftUI
import PrototypeKit

struct ContentView: View {
    var body: some View {
        VStack {
            PKCameraView()
        }
        .padding()
    }
}
```

### Live Image Classification

1. **Required Step:** Drag your Create ML / Core ML model into Xcode.
2. Change `FruitClassifier` below to the name of your model.
3. Use `latestPrediction` as you would any other state variable (e.g. bind it to other views, such as a `Text` or `Slider`).

Utilise `ImageClassifierView`

```swift
ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                    latestPrediction: $latestPrediction)
```

Full Example



```swift
import SwiftUI
import PrototypeKit

struct ImageClassifierViewSample: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
```

### Live Text Recognition

Utilise `LiveTextRecognizerView`

```swift
LiveTextRecognizerView(detectedText: $detectedText)
```

Full Example



```swift
import SwiftUI
import PrototypeKit

struct TextRecognizerView: View {
    @State var detectedText: [String] = []

    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)

            ScrollView {
                ForEach(Array(detectedText.enumerated()), id: \.offset) { _, text in
                    Text(text)
                }
            }
        }
    }
}
```

### Live Hand Pose Classification

1. **Required Step:** Drag your Create ML / Core ML model into Xcode.
2. Change `HandPoseClassifier` below to the name of your model.
3. Use `latestPrediction` as you would any other state variable (e.g. bind it to other views, such as a `Text` or `Slider`).

Utilise `HandPoseClassifierView`

```swift
HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                       latestPrediction: $latestPrediction)
```

Full Example



```swift
import SwiftUI
import PrototypeKit

struct HandPoseClassifierViewSample: View {
    @State var latestPrediction: String = ""

    var body: some View {
        VStack {
            HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                                   latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
```

### Live Sound Classification (System Sound Classifier)
This modifier uses the system sound classifier and does not currently support custom sound classifier models. Since it listens via the microphone, you will likely also need a `Privacy - Microphone Usage Description` entry, added the same way as the camera key above.

1. Use `recognizedSound` as you would any other state variable (e.g. bind it to other views, such as a `Text` or `Slider`).

Utilise `recognizeSounds` modifier

```swift
.recognizeSounds(recognizedSound: $recognizedSound)
```

Full Example



```swift
import SwiftUI
import PrototypeKit
struct SoundAnalyzerSampleView: View {
    @State var recognizedSound: String?

    var body: some View {
        VStack {
            Text(recognizedSound ?? "No Sound")
        }
        .padding()
        .navigationTitle("Sound Recogniser Sample")
        .recognizeSounds(recognizedSound: $recognizedSound)
    }
}
```

## FAQs

**Is this production ready?**

No.