Apple Watch to AWS

This is my journey through developing an Apple Watch application that transmits sensor data through an AWS data pipeline (Cognito for identity, S3 for storage, Kinesis for streaming).  Let's see what comes out the other end.


Initial Setup

  • Assume Xcode is installed (from app store)
  • Follow the AWS setup instructions here -- Note: don't try this with an "Empty" project. It will fail during pod install with a missing target
  • Reload the project in Xcode using the *.xcworkspace file that pod install just created
  • Do a test build to a target, say an iPhone 6 simulator -- Note: the build will fail unless you opened the .xcworkspace file rather than the original .xcodeproj
  • At this point it should compile and launch the simulator
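For reference, a minimal Podfile along these lines should pull in the AWS pods used later in this walkthrough; the pod names are the standard AWS iOS SDK v2 pods, and the target name is an assumption (use your own project's target):

```ruby
# Minimal Podfile sketch.  Pod names are the standard AWS iOS SDK v2 pods;
# the target name 'WatchTest' is an assumption -- substitute your project's target.
platform :ios, '8.0'
use_frameworks!

target 'WatchTest' do
  pod 'AWSCore'
  pod 'AWSCognito'
  pod 'AWSS3'
  pod 'AWSKinesis'
end
```

Run pod install, then open the generated .xcworkspace as noted above.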

Set up AWS Plumbing

  • Create an AWS account.  Get used to least privilege and multi-factor authentication (MFA) from the start.  A security-first philosophy means building prototypes as if they were secured, productized applications.  Long-lived credentials, if any, should authorize only the minimal functions the application needs (e.g. in IAM, think carefully before applying a '*' to a policy)
  • Create a privacy policy page.  This page should state how customer data will be used, how it can be removed, how consent is obtained, etc.  It is required by Cognito below
  • Set up Cognito Identity.  Cognito maps federated identities to AWS authorizations (IAM roles).  Details are here.  I set up an Identity Pool named "WatchTest" with unauthenticated identities disabled and Amazon as the sole authentication provider
  • Create an S3 bucket for use in this experiment
  • In IAM, extend the just-created Cognito_...AuthRole policy to grant access to the bucket (a bucket policy would also work, but this keeps roles and policies in one place).  The policy should look something like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1433614809000",
            "Effect": "Allow",
            "Action": [
                "s3:DeleteObject",
                "s3:DeleteObjectVersion",
                "s3:GetObject",
                "s3:GetObjectAcl",
                "s3:GetObjectTorrent",
                "s3:GetObjectVersion",
                "s3:GetObjectVersionAcl",
                "s3:GetObjectVersionTorrent",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:PutObjectVersionAcl"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKET/*"
            ]
        },
        {
            "Sid": "Stmt1433615018000",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListBucketVersions",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKET"
            ]
        }
    ]
}
  • Upload an object into the bucket for testing
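If you have the AWS CLI configured, a quick smoke test might look like the following (BUCKET is the placeholder bucket name from the policy above; note this runs under your own CLI credentials, not the Cognito role, so it only confirms the bucket is reachable):

```shell
# Upload a test object, then list the bucket to confirm it landed.
# BUCKET is a placeholder -- substitute your bucket's name.
echo "hello from the watch experiment" > test.txt
aws s3 cp test.txt s3://BUCKET/test.txt
aws s3 ls s3://BUCKET/
```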

Configure Federated Login

  • I'm trying to learn Swift at the same time.  This page has a good summary of setting up a Bridging Header and connecting to the AWS SDK.  Don't forget to point the Objective-C Bridging Header entry in the project's Swift Compiler build settings at this header
  • Extend the AppDelegate to include the Cognito credentials provider as shown in the example above.  These temporary credentials will be used to access AWS PaaS services directly
  • Create the Amazon login page.  This flow will ask user for credentials and return an auth token. Details are here
  • Once the above is done, the application should show a login button and should authenticate against Amazon
  • Once authentication completes, add a couple of delegates: one to fetch the user profile from the identity provider, and one to initialize AWS access tokens
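Pulling those last few bullets together, the wiring looks roughly like this under the v2 AWS Mobile SDK of the time.  Treat it as a sketch: the region and identity pool ID are placeholders, the function name is mine, and newer SDK releases have since replaced the logins dictionary with an identity-provider-manager protocol:

```swift
import AWSCore  // AWS Mobile SDK v2, via CocoaPods

// In the AppDelegate: Cognito hands out temporary AWS credentials
// scoped to the identity pool's authenticated role.
let credentialsProvider = AWSCognitoCredentialsProvider(
    regionType: .USEast1,  // placeholder region
    identityPoolId: "us-east-1:00000000-0000-0000-0000-000000000000")  // placeholder
let configuration = AWSServiceConfiguration(
    region: .USEast1, credentialsProvider: credentialsProvider)
AWSServiceManager.default().defaultServiceConfiguration = configuration

// In the Login with Amazon success delegate (function name is hypothetical):
// hand the auth token to Cognito so it can exchange it for AWS credentials.
func onAmazonLoginSucceeded(token: String) {
    credentialsProvider.logins = ["www.amazon.com": token]
    credentialsProvider.refresh()  // fetch fresh temporary AWS credentials
}
```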

Enable Location Services

  • This isn't too big a deal.  Just follow the details of using CLLocationManager
  • I've added a basic 'turn on the location updater' which posts the current location to a text field
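The 'turn on the location updater' piece is standard CoreLocation; a minimal sketch looks like this (the class and callback property names are mine, everything else is the CoreLocation API):

```swift
import CoreLocation

// Minimal location-updater sketch.  Requires an
// NSLocationWhenInUseUsageDescription entry in Info.plist.
final class LocationUpdater: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    var onUpdate: ((CLLocation) -> Void)?  // e.g. write the coordinate to a text field

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        if let latest = locations.last {
            onUpdate?(latest)
        }
    }
}
```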

Plumb Events to Kinesis Stream

  • Again, not too big a deal now that we have access tokens working
  • Create a Kinesis stream
  • Add kinesis:PutRecords authorization to the Cognito authenticated role from above
  • Then take the location updates and queue them up in the KinesisRecorder
  • I added a little extra sauce to flush to Kinesis only when the buffer exceeds a size limit or enough time has passed since the last flush
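That extra sauce can be sketched as a small policy object.  The names here are my own invention; in the app itself, a true result would trigger the KinesisRecorder's flush of queued records:

```swift
import Foundation

/// Decides when to flush buffered records to Kinesis: flush when the buffer
/// exceeds a byte limit OR enough time has elapsed since the last flush.
final class FlushPolicy {
    private let maxBytes: Int
    private let maxInterval: TimeInterval
    private var bufferedBytes = 0
    private var lastFlush: Date

    init(maxBytes: Int, maxInterval: TimeInterval, now: Date = Date()) {
        self.maxBytes = maxBytes
        self.maxInterval = maxInterval
        self.lastFlush = now
    }

    /// Account for `bytes` more queued data; returns true when a flush is due,
    /// resetting the byte count and the flush clock.
    func record(bytes: Int, now: Date = Date()) -> Bool {
        bufferedBytes += bytes
        if bufferedBytes >= maxBytes || now.timeIntervalSince(lastFlush) >= maxInterval {
            bufferedBytes = 0
            lastFlush = now
            return true
        }
        return false
    }
}
```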
