Introducing Amplify for iOS and Android

The [Amplify Framework](https://aws-amplify.github.io/) is an open source project for building cloud-enabled mobile and web applications, consisting of libraries, UI components, and a CLI toolchain. Today we are releasing a preview of [Amplify iOS](https://aws-amplify.github.io/docs/ios/start) and [Amplify Android](https://aws-amplify.github.io/docs/android/start), open source libraries that enable mobile developers to build scalable, secure, cloud-powered serverless applications. Developers can easily add capabilities such as Analytics, AI/ML, API (GraphQL and REST), DataStore, and Storage to their mobile applications using these libraries, and support for escape hatches lets you drop down to the generated iOS and Android SDKs for additional use cases.

The Amplify iOS and Android libraries are use case-centric, in contrast to the service-centric AWS Mobile SDKs. They provide a declarative interface that applies best practices programmatically through abstractions, resulting in a faster development cycle and fewer lines of code. You can use these libraries with backends created using the Amplify CLI or with existing AWS backends, and they are our recommended way to build mobile applications powered by AWS services. Short illustrative sketches of the setup and of two Predictions calls appear at the end of this post.

This release also includes support for the [Predictions category](/about-aws/whats-new/2019/07/amplify-framework-adds-predictions-category/) in Amplify iOS, which lets developers add and configure AI/ML-based use cases in their iOS applications with a few lines of code and no machine learning experience. Using the Predictions category in the new Amplify iOS library together with the Amplify CLI, developers can accomplish use cases such as text translation, speech-to-text generation, image recognition, text-to-speech, and extracting insights from text. These use cases leverage services such as [Amazon Rekognition](/rekognition/), [Amazon Translate](/translate/), [Amazon Polly](/polly/), [Amazon Transcribe](/transcribe/), [Amazon Comprehend](/comprehend/), and [Amazon Textract](/textract/). The Predictions library for iOS combines Amazon AI/ML services with the [CoreML framework](https://developer.apple.com/documentation/coreml) to provide a union of result sets with exceptionally high accuracy. In addition, the iOS Predictions library detects internet connectivity and seamlessly switches between online and offline inference. Offline inference is supported for use cases such as detecting labels in an image, identifying language and syntax, and detecting entities and key phrases in text.

For more details on how to use Amplify iOS and Amplify Android, refer to our [blog post](https://aws.amazon.com/blogs/mobile/introducing-aws-amplify-for-ios-and-android/). To learn more about the Amplify Framework, please visit our [documentation](https://aws-amplify.github.io/).
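To give a sense of the declarative, use case-centric interface, here is a minimal Swift sketch of registering category plugins and configuring Amplify at app startup. It assumes a backend created with the Amplify CLI (which generates an `amplifyconfiguration.json` file); the module and plugin names follow the documentation and may differ slightly between the preview and later releases.

```swift
import Amplify
import AWSAPIPlugin        // API category plugin (GraphQL and REST)
import AWSS3StoragePlugin  // Storage category plugin backed by Amazon S3

// Typically called once at launch, e.g. from
// application(_:didFinishLaunchingWithOptions:).
func configureAmplify() {
    do {
        // Register a plugin for each category the app uses.
        try Amplify.add(plugin: AWSAPIPlugin())
        try Amplify.add(plugin: AWSS3StoragePlugin())
        // Reads the amplifyconfiguration.json generated by the Amplify CLI.
        try Amplify.configure()
        print("Amplify configured")
    } catch {
        print("Failed to configure Amplify: \(error)")
    }
}
```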
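The Predictions category follows the same pattern. The sketch below translates a string from English to Spanish, which uses Amazon Translate when the device is online; the method signature and listener shape are based on the Predictions documentation and are approximate for the preview release.

```swift
import Amplify
import AWSPredictionsPlugin

// Assumes AWSPredictionsPlugin was registered before Amplify.configure(),
// e.g. try Amplify.add(plugin: AWSPredictionsPlugin()).
func translateGreeting() {
    Amplify.Predictions.convert(textToTranslate: "I like to travel",
                                language: .english,
                                targetLanguage: .spanish) { event in
        switch event {
        case .completed(let result):
            print("Translated text: \(result.text)")
        case .failed(let error):
            print("Translation failed: \(error)")
        default:
            break
        }
    }
}
```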
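Finally, a sketch of label detection on an image, one of the use cases where the iOS library can fall back to on-device CoreML inference when no connectivity is available. The `identify` call shape and result types are again approximations based on the documentation.

```swift
import Amplify

// Detect labels in a local image. Online, this uses Amazon Rekognition;
// offline, the library can fall back to CoreML inference on device.
func detectLabels(in imageURL: URL) {
    Amplify.Predictions.identify(type: .detectLabels(.labels), image: imageURL) { event in
        switch event {
        case .completed(let result):
            // The concrete result type depends on the identify action requested.
            if let labels = (result as? IdentifyLabelsResult)?.labels {
                print("Labels: \(labels.map { $0.name })")
            }
        case .failed(let error):
            print("Label detection failed: \(error)")
        default:
            break
        }
    }
}
```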