# DEPRECATED Objective-C (Cocoa) SDK for api.ai

| Deprecated |
|-------|
| This Dialogflow client library and Dialogflow API V1 [have been deprecated and will be shut down on October 23rd, 2019](https://blog.dialogflow.com/post/migrate-to-dialogflow-api-v2/). Please [migrate to Dialogflow API V2](https://cloud.google.com/dialogflow-enterprise/docs/migrating). |

[![Build Status](https://travis-ci.org/api-ai/api-ai-ios-sdk.svg)](https://travis-ci.org/api-ai/api-ai-ios-sdk)
[![Version](https://img.shields.io/cocoapods/v/ApiAI.svg?style=flat)](http://cocoapods.org/pods/ApiAI)
[![License](https://img.shields.io/cocoapods/l/ApiAI.svg?style=flat)](http://cocoapods.org/pods/ApiAI)
[![Platform](https://img.shields.io/cocoapods/p/ApiAI.svg?style=flat)](http://cocoapods.org/pods/ApiAI)

* [Overview](#overview)
* [Prerequisites](#prerequisites)
* [Running the Demo app](#running-the-demo-app)
* [Integrating api.ai into your iOS app](#integrating-into-your-app)

---------------

## Overview
The API.AI Objective-C (Cocoa) SDK makes it easy to integrate speech recognition with the API.AI natural language processing API on Apple devices. API.AI lets an app use voice commands and the dialog scenarios defined for a particular agent in API.AI.

## Prerequisites
* Create an [API.AI account](http://api.ai)
* Install [CocoaPods](http://cocoapods.org/)

## Running the Demo app
* Run `pod update` in the ApiAiDemo project folder.
* Open **ApiAIDemo.xcworkspace** in Xcode.
* In **ViewController**'s `-viewDidLoad`, insert your API key (a fuller sketch appears after these steps):
```Objective-C
configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN";
```

Note: an agent must already exist in **api.ai**. Keys can be obtained on the agent's settings page.

* Define sample intents in the agent.
* Run the app in Xcode.
Input is possible via text and voice (experimental).
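
For context, here is a minimal sketch of how that token assignment might sit inside the demo's view controller; the surrounding method and the `apiAI` property are assumed scaffolding, not the demo's actual code:

```Objective-C
// Sketch only: everything except the clientAccessToken line is assumed.
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Configure the SDK with your agent's client access token.
    id <AIConfiguration> configuration = [[AIDefaultConfiguration alloc] init];
    configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN";
    self.apiAI.configuration = configuration; // `apiAI` property assumed
}
```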

## Integrating into your app
### 1. Initialize CocoaPods
* Run `pod init` in your project folder to create a **Podfile** (skip if one already exists).

* Update **Podfile** to include:
```Podfile
pod 'ApiAI'
```

* Run `pod update`
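
Put together, a complete **Podfile** might look like the sketch below; the platform version and target name are placeholder assumptions for your own project:

```Podfile
# Example values only: adjust the platform version and target name.
platform :ios, '8.0'

target 'YourApp' do
  pod 'ApiAI'
end
```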

### 2. Init the SDK.
In `AppDelegate.h`, add the ApiAI.h import and a property:
```Objective-C
#import <ApiAI/ApiAI.h>

@property(nonatomic, strong) ApiAI *apiAI;
```

In `AppDelegate.m`, add:
```Objective-C
self.apiAI = [[ApiAI alloc] init];

// Define API.AI configuration here.
id <AIConfiguration> configuration = [[AIDefaultConfiguration alloc] init];
configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN_HERE";

self.apiAI.configuration = configuration;
```
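
This initialization typically runs once at launch. A minimal sketch placing it in the standard app-delegate callback (the placement is an assumption, not something the SDK mandates):

```Objective-C
// Sketch: wiring the initialization above into the app's launch callback.
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.apiAI = [[ApiAI alloc] init];

    id <AIConfiguration> configuration = [[AIDefaultConfiguration alloc] init];
    configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN_HERE";
    self.apiAI.configuration = configuration;

    return YES;
}
```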

### 3. Perform request.
```Objective-C
...
// Request using text (assumes that speech recognition / ASR is done
// using a third-party library, e.g. AT&T).
AITextRequest *request = [self.apiAI textRequest];
request.query = @[@"hello"];
[request setCompletionBlockSuccess:^(AIRequest *request, id response) {
    // Handle success ...
} failure:^(AIRequest *request, NSError *error) {
    // Handle error ...
}];

[self.apiAI enqueue:request];
```
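
On success the block receives the deserialized API V1 JSON. A hedged sketch of pulling out the agent's text reply, assuming the documented V1 response shape (`result.fulfillment.speech`):

```Objective-C
// Sketch: extracting the agent's reply, assuming `response` is the
// deserialized API V1 JSON dictionary.
[request setCompletionBlockSuccess:^(AIRequest *request, id response) {
    NSDictionary *result = response[@"result"];
    NSString *speech = result[@"fulfillment"][@"speech"];
    NSLog(@"Agent reply: %@", speech);
} failure:^(AIRequest *request, NSError *error) {
    NSLog(@"Request failed: %@", error.localizedDescription);
}];
```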
## How to contribute
Please read and follow the steps in the [CONTRIBUTING.md](CONTRIBUTING.md).

## License
See [LICENSE](LICENSE).

## Terms
Your use of this sample is subject to, and by using or downloading the sample files you agree to comply with, the [Google APIs Terms of Service](https://developers.google.com/terms/).

This is not an official Google product.