Tuesday, July 28, 2015

WatchOS 2.0 beta4: WCSession working a LOT better!

Some good news on this end. After upgrading both the Watch and iPhone to beta4, WCSession is working a lot better. With the same ttl source as before the upgrade, reachability and connectivity make a lot more sense. For example:
  • reachability seems to stay the same as long as the Watch and iPhone are in range.
  • it is also not a function of either application being live and in focus. This is huge: my iPhone background thread now continues to dequeue even though the Watch is displaying some other task (or is in idle mode)
  • killing the iPhone app doesn't change the Watch's reachability to the iPhone (but the process timer commands stop, as expected)
(updates)
  • the iPhone appears to suspend process timers when the application is not in focus.
  • BUT, these timers seem to wake up when the phone is awake at the lock screen with the app in focus under the lock screen!?!
  • (the above feels like some rickety support for the badge at the lower left corner of iPhone indicating when apps on the Watch are doing something interesting)
  • there is some pause or suspend of a session during a voice call too
Regardless, this does change the strategy a bit. It may now be possible to have the Watch initiate dequeue events to the iPhone (perhaps even by using the WCSession.transferUserInfo which is nicely queued for us). I will work on background threads on the Watch now, having it initiate transfers to the iPhone, instead of the other way around.
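The Watch-initiated transfer strategy might look roughly like this. This is a minimal sketch using current WatchConnectivity API names (the beta-era spellings differed slightly); the payload keys are made up for illustration:

```swift
import Foundation
import WatchConnectivity

// Watch-side sketch: activate the session once, then hand batches to
// transferUserInfo, which WatchConnectivity queues and delivers for us
// even when the iPhone app isn't in the foreground.
class BatchForwarder: NSObject, WCSessionDelegate {
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) { }

    func start() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    // Called from a background thread on the Watch.
    // "batchId" and "events" are hypothetical keys, not from the ttl repo.
    func forward(batchId: Int, events: [[String: Double]]) {
        WCSession.default.transferUserInfo([
            "batchId": batchId,
            "events": events
        ])
    }
}
```

The appeal of transferUserInfo here is exactly the queueing: the Watch can keep dequeuing sensor batches without caring whether the iPhone side is currently reachable.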

p.s. the upgrade was nominal this time; configuration, Xcode symbols, pairing, etc. It just worked.


Watching files copy: WatchOS 2.0 beta4 upgrade...

It appears that a key background process feature is available starting with beta4 (WKExtensionDelegate:didReceive*Notification). This is something I want to pursue, primarily for background dequeue of sensor recorder data. This means several hours of watching software install (everyone should know, computers are mostly good for copying data around).

So far:

  • Xcode 7.0 beta4 upgrade is looking fine (this is a lot of software to download, unpack for install, and then verify for execution and then unpack and boot up the simulators!)
  • iOS9 beta4 upgrade is looking good. Apps appear to mostly work as or better than before (TODO: test wifi stability for a while, it has been off during beta3 work)
  • WatchOS 2.0 is forthcoming. The iPhone watch app is loaded with the new profile and shows an upgrade to beta4 is available. Tomorrow will bring 2-3 hours more watching files copy (download, get it over to watch, get it loaded and restore from backup)
Tomorrow!

Monday, July 27, 2015

Distributing Watch sensor data using WCSession

I've taken a checkpoint on the code refactoring, mostly to demonstrate message flow between the Watch and the iPhone. Features of this checkpoint:

  • Watch and iPhone display reachability on the fly
  • Watch initiates a sensor record operation
  • Now, when a dequeue operation is enabled:
    • Watch sends the dequeue switch command to iPhone
    • iPhone starts a timer
    • iPhone replies to the Watch that it completed this operation (as part of a diagnostic)
  • When timer fires on iPhone:
    • iPhone sends message to Watch asking for next batch of sensor data
    • Watch fetches up to one 'batch' of data and returns it to iPhone
    • iPhone displays some basic data about the return packet
  • Watch keeps track of how many timer commands it receives
  • iPhone keeps track of how many timer attempts it tries
Here's a part of the Watch and iPhone diagnostic displays for reference.  You will note that both sides are reachable and data has been passed to the iPhone.
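The timer-driven dequeue flow above can be sketched on the iPhone side roughly as follows. This uses current API names (Timer's closure initializer postdates the 2015 betas); the message keys and the 5-second interval are assumptions for illustration:

```swift
import Foundation
import WatchConnectivity

// iPhone-side sketch of the dequeue loop: a repeating timer fires,
// sends a "next batch" request to the Watch, and displays the reply.
class DequeueDriver: NSObject {
    private var timer: Timer?
    private(set) var attempts = 0   // "how many timer attempts it tries"

    func startDequeue() {
        timer = Timer.scheduledTimer(withTimeInterval: 5.0, repeats: true) { [weak self] _ in
            self?.requestNextBatch()
        }
    }

    func stopDequeue() {
        timer?.invalidate()
        timer = nil
    }

    private func requestNextBatch() {
        attempts += 1
        guard WCSession.default.isReachable else { return }
        // "cmd" and "batch" are hypothetical keys.
        WCSession.default.sendMessage(["cmd": "nextBatch"],
            replyHandler: { reply in
                // Display some basic data about the return packet.
                print("got batch:", reply["batch"] ?? "none")
            },
            errorHandler: { error in
                print("dequeue failed:", error)
            })
    }
}
```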




This exercise primarily uses the sendMessage() method, which only works when the application is live on both sides. But with a bit of careful execution, data flow is illustrated. Features and problems with this approach:

  • Reachability (the reachable state) reflects all of:
    • whether the two devices can communicate
    • whether the iPhone application is in focus (from the Watch's side)
    • whether the Watch application is in focus (from the iPhone's side)
  • The Watch cannot launch the iPhone application with this approach. Similarly, fetch requests from the phone will not be delivered if the Watch isn't actively running the application
  • Also, there is a sequence where system state is incorrect:
    • start Watch app (do not start iPhone app)
    • note that the state is reachable (this is the problem)
    • enable the dequeue (notice that delivery is stuck in sending...)
  • In general, the reachability support works OK: the Watch can go in and out of range, either device can go into airplane mode, be rebooted, have the application killed and restarted, etc., and all continues to work as designed.
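One way to avoid the stuck "sending..." state described above is to guard on reachability and always pass an errorHandler, so a failed send surfaces as an error instead of hanging. A sketch, using current WatchConnectivity names (the "dequeueEnabled" key is made up):

```swift
import Foundation
import WatchConnectivity

// Watch-side sketch: only attempt sendMessage when the session reports
// it can actually deliver, and surface failures via the errorHandler.
func sendDequeueSwitch(_ enabled: Bool) {
    let session = WCSession.default
    guard session.activationState == .activated, session.isReachable else {
        print("counterpart not reachable; not sending")
        return
    }
    session.sendMessage(["dequeueEnabled": enabled],
        replyHandler: { reply in print("ack:", reply) },
        errorHandler: { error in print("send failed:", error) })
}
```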
Next up will be to work with some of the queued message types and then more importantly to figure out how to launch an iPhone background method.

Sunday, July 26, 2015

Another visit to the SF/Marin Food Bank today!

We spent another afternoon up at the SF/Marin Food Bank today. This time, oranges and packaged oats! Was nice to see so many families today -- just regular family outings. The kids really enjoyed doing the hard work, made the time go by quickly.

Here's a linked photo to give you a feel for what the work is like (thank you kqed):



Again, if you are looking for a nice afternoon with friends, family or just on your own; take a weekend morning or afternoon and go help out at the food bank. Your work does feel appreciated and given the magnitude of the operation, this is a good effort for a good cause.

You can easily schedule your time here, although some folks just show up on short notice. Depending on the day, this often just works fine.

Thursday, July 23, 2015

Forwarding CMSensorRecorder results to iPhone for display

My next step in building out a sensor data stream is to send recorded data to the iPhone. The ttl code on github is up to date. Initially this is just displayed on the phone, a bit of a hello world. However, this illustrates a few features and issues:
  • The phone and watch do not have to be connected during the call to transfer data
  • Data is split into batches
  • The WCSession callbacks (data sent, error sending)
  • The serialization bug in NSDate in beta3 (send it as a string...)
  • Need to figure out why NSString(format: "%s", dataFormatter.stringFromDate()) doesn't encode properly (%s expects a C string; %@ is the specifier for a string object...)
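The date-as-string workaround can be sketched like this. The wire format string is an assumption; the important part is pinning a fixed locale and time zone so both sides parse identically:

```swift
import Foundation

// Workaround sketch: serialize dates as strings for the WCSession
// payload instead of letting NSDate cross directly (the beta3 bug).
let wireFormatter: DateFormatter = {
    let f = DateFormatter()
    f.locale = Locale(identifier: "en_US_POSIX")  // fixed, unambiguous parsing
    f.timeZone = TimeZone(identifier: "UTC")
    f.dateFormat = "yyyy-MM-dd'T'HH:mm:ss"        // assumed wire format
    return f
}()

func encode(_ date: Date) -> String {
    // Note: with NSString(format:), "%@" is the specifier for a string
    // object; "%s" expects a C string, which is why it garbles.
    return wireFormatter.string(from: date)
}

func decode(_ string: String) -> Date? {
    return wireFormatter.date(from: string)
}
```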
Here's a screen shot of what the results look like.  We see the watch processed 4082 events.



And here's the first page of the events as shown on the iPhone.



Next up: get the reader on a background thread, get the serializer working more cleanly (perhaps using temporary file transfer), and then send data to Kinesis.  Then, adjust the UI to have an 'enable' recorder feature which will:
  • Kick off the recorder in small batches with a daemon thread to kick off another batch continuously if the enable switch is set
  • Have a periodic dequeue from the sensor IFF the iPhone is reachable (there is a notify for this), only sending the newly available batches.
  • Similarly, the iPhone will queue up this data (using the local Kinesis buffering) for delivery when its network is reachable
At that point there should be a durable delivery chain from watch through phone to AWS...

Tuesday, July 21, 2015

A fix for remote debugging iOS9 WatchOS2 beta3!

Many thanks to dhash. Looks like an upgrade issue, where Xcode 7 isn't quite working when state from a prior Xcode exists in local directories.  From the Apple Forum thread:
dhash (edited gm)
Jul 21, 2015 1:08 AM 
I have found the solution to get debug working all the time.
  1. Quit Xcode then go to /Users/<username>/Library/Developer/Xcode/
  2. There should be 2 folders, "Watch OS DeviceSupport" and "watchOS DeviceSupport"
  3. Delete "watchOS DeviceSupport"
  4. Delete all older versions of Xcode DeveloperPortal* files. Keep the DeveloperPortal 7.0.* files.
  5. Open Xcode
  6. Delete derived data in Windows->Projects
  7. Deploy and enjoy debugging 

Now, back to a more regular development pattern!  Productivity should improve now.

Sunday, July 19, 2015

Deeper dive into CMSensorRecorder

I have modified the ttl program to have a little more control over the sensor recorder, its performance, latency, etc. Several guesses:

  • It looks like the recordAccelerometerFor(seconds) method initiates or extends a global record queue. If a new recordAccelerometerFor action falls inside the range of a prior action, then effectively nothing changes. It isn't clear to me whether other tasks are always setting this -- maybe one of my earlier record functions (at 1 year or 1 day) is still running?!?  Also, there doesn't appear to be a stop function
  • Some documents say the maximum record ring is 3 days, others 12 hours. Perhaps it is 12 hours on the watch
  • The visibility of the data appears to be dependent on some background task that will not run when a program is active. If you keep hitting the "process" action, the data doesn't seem to change
  • The data and sensor rate make sense
The code here helps to dig into what is happening (when you don't have a debugger). Here is an operational display (in two screenshots). Note: there is a slider, scrolled off the top of the screen above 'durationMins', used to dial in the duration for the recordAccelerometerFor() method.
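The recorder controls under discussion can be sketched like this. I'm using the current CoreMotion spellings (recordAccelerometer(forDuration:) etc., which differ from the beta-era names), and the Sequence extension is a common workaround for CMSensorDataList only exposing NSFastEnumeration:

```swift
import Foundation
import CoreMotion

// Make CMSensorDataList iterable from Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        return NSFastEnumerationIterator(self)
    }
}

let recorder = CMSensorRecorder()

// Initiates (or extends) the global accelerometer record window.
func startRecording(minutes: Double) {
    recorder.recordAccelerometer(forDuration: minutes * 60)
}

// Roughly what a "Process Since" action does: walk the recorded
// samples between two dates and count them.
func eventCount(since start: Date) -> Int {
    var count = 0
    if let list = recorder.accelerometerData(from: start, to: Date()) {
        for element in list {
            if let sample = element as? CMRecordedAccelerometerData {
                _ = sample.acceleration   // x/y/z values for the display
                count += 1
            }
        }
    }
    return count
}
```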



Here we see the last time Start was run was 21:45. And 15 minutes later, a press of 'Process Since' shows 34k events: batch number 21868, the values for the last event, and the time range for the events in that block. I do need to figure out what is going on with the min date. Again, probably some fence value; -1 or minDate or something.

So, the sensorRecorder is working -- now need to figure out the best approach to using it.

You really have to want it: Developing on Apple pre-release software...

Given the current state of debugging WatchOS2 beta3, I've been reduced to printf debugging. Basically, try to figure out what might need to be shown and build big property panels that show a few values; from these, try to figure out what is really going on. This is even more fun as the only reliable means of distributing this code to the hardware is via an archive -> ad-hoc deployment through iTunes to the phone!

Recap:

  • no debugging, resort to printf style debugging
  • no simulator support of accelerometer so have to debug on the hardware
  • no quick means of distributing code to hardware
  • no reliable wifi on the phone -- so need to use cellular data
  • oftentimes this whole flow needs resets, restarts, etc. [I haven't had to do re-pairing lately]
I've gone through this a few times with Apple in the past; if you really want to get on a beta program, be prepared to get creative. I'm fairly certain the lead for the rollup of all these features has a thankless job; days of arguing with teams: "you missed your date, and the workaround sucks".

I know this is pretty much the norm inside Apple. After all, they are always working on the next version of something. I think, however, there are not too many folks inside Apple who are trying to use the whole product -- sort of like an external developer would. They are big and compartmented. And a product tends to look like the organization that built it...

(Back in the day, I was 'tech lead' for a couple Sun Solaris releases; running around trying to corral a bunch of teams to make their dates.) 


Saturday, July 18, 2015

Helped out at Sacred Heart Community Service today

Today, Stanford Alumni Association arranged a volunteer shift for several of us at Sacred Heart Community Service. Spent the morning putting together food bags that'll be distributed directly next week.

Sacred Heart is quite an operation. They distribute most of their items directly to those in need. Food, non-cook food, clothing, etc. A large operation that has been doing this for over 50 years.

Our assembly line crew of about 15 volunteers bagged 1000-1100 bags of food, maybe 20-25 pounds each. Somehow I wound up on the end of the line, meaning I was the one loading these bags into the pallet bins. I can skip the YMCA today, that is for sure.

A nice way to spend the morning with friends and family!

Friday, July 17, 2015

Raw sensor data on the Watch -- fairly limited use

Yes, the accelerometer is accessible to an in-focus application; however, since there is no control over keeping an application in focus, using the accelerometer directly is pretty much only for test. Here's a screen dump for your amusement (the code is in the ttl repo).


What you see here is a test where I moved the accelerometer event stream activation to a switch. Entering the application only starts the display update. The sensor is fine, but the total count shows the updates stop when focus leaves the application. And there appear to be cases where the app itself is terminated -- e.g. re-entering the application shows the state as reset. That the app is suspended, or sensor data isn't delivered, is pretty much a showstopper for an acquisition system.
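The switch-driven stream can be sketched roughly like this, using current CoreMotion names; the class shape and counter are made up for illustration:

```swift
import Foundation
import CoreMotion

// Sketch: a toggle starts/stops live accelerometer delivery.
// Delivery only happens while the app is live and in focus, which is
// exactly the limitation observed above.
class AccelController {
    let manager = CMMotionManager()
    var total = 0   // running event count; resets if the app is killed

    func setStreaming(_ on: Bool) {
        if on {
            manager.accelerometerUpdateInterval = 1.0 / 100.0  // 100 Hz max
            manager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
                guard let data = data else { return }
                self?.total += 1
                _ = data.acceleration  // x/y/z for the display
            }
        } else {
            manager.stopAccelerometerUpdates()
        }
    }
}
```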

A little reading and, oops, I should have been using CMSensorRecorder. This appears to be a way to start a background recorder capturing sensor data. The WWDC presentation on CoreMotion suggests up to 3 days of data can be recorded at up to 50 Hz.  And, Apple warns, this will have some impact on battery life, something I will measure next.


Wednesday, July 15, 2015

Limits of CoreMotion on Apple Watch

Too bad.  I have confirmed some comments from other folks testing this device.  The CoreMotion package is somewhat hardware-deficient on the Watch.  Here's a screen dump from the test program:


What this confirms is that only the 3-axis accelerometer is available in the Watch.  Since the gyroscope and magnetometer aren't available, the high-level motion library is also unavailable.

And, as shown in the code, the maximum delivery rate is 100 samples/second.  Incidentally, the accelerometer orientation is:
  • +X is in the right direction (toward the crown)
  • +Y is in the top direction (toward the 12 o'clock position of the watch)
  • +Z is coming out of the watch toward the viewer
Apparently the LIS3DH is, or is close to, the actual hardware used (thanks to Mr. Sparks for the hint).  It would be interesting to see if the sampling range and sensitivity are actually dynamic -- or if they are hard-coded to +/-4G, 2mG, etc. I'll put together a test to see if this can be determined (Apple doesn't yet provide this information, so any signal processing they do is a little harder to derive).

Next step will be to log the actual events to determine sample jitter -- really, delivery jitter or delivery dropout, as it appears that the samples are coming out of the MEMS at a constant rate. And of course to see what happens when the application isn't in the foreground. Right now my code turns off event delivery, so I'll make some changes to see if the host OS actually stops delivering events when the app is out of focus. I'll need to dig into HealthKit a bit too...

Wow! Finally found a workaround to run an app on the watch with beta3 software

I, like quite a few other folks, have been stuck trying to get anything to run on the hardware. Various combinations of code signing tweaks and so on were the wrong path to getting an app to run. Thankfully "BJamUT" posted a reliable workaround for getting code onto the phone/watch.  From the Apple forums:
Jul 15, 2015 2:29 PM
Update: I can run (not debug) every time on the watch by installing an ad hoc distribution through iTunes.
  1. Product -> Archive  [You may have some code signing issues to figure out here--especially if you have multiple people who sign the builds for your organization.]
  2. Export... -> for Ad Hoc Distribution
  3. Uninstall the app from your iPhone.
  4. Drop the ipa file onto iTunes.
  5. Select your iPhone in iTunes, go to the Apps tab, and find the app you just dropped in.
  6. Click install.
  7. Click apply.
  8. Make sure that "Show App on Apple Watch" is selected in the Apple Watch app, and wait for it to install.
It should run at least.  I wasn't able to attach the debugger to the processes, but I was able to see the performance of the changes that I had already tested on the simulator.

The good news is that performance of button and table responsiveness is probably 10x that of beta 2 if you have lots of buttons.  It was tough to get it to run, but I feel like there was a treasure at the end of the hunt.
Yes, the downside is I'll have to resort to some sort of printf debugging here -- perhaps to a property panel or glance or something.  This code pattern doesn't seem to work with the debugger either.

Anyway, I can now take the code from last Saturday and distribute it to my phone via local iTunes. And the app with accelerometer processing starts up and runs even when the phone is offline or out of range!  This is the demo I have been working toward: having an application run "natively" on the Watch!

Status:

  • beta3 is reasonably stable
  • battery life is good
  • phone communication via wifi is disabled -- it is unreliable across cellular link changes
  • basic apps are working roughly
  • (waiting for next beta dump to reset all the "this works, this doesn't"...)


Sunday, July 12, 2015

China Visit Pictures

I've posted a few photos from our visit to China this year.  This trip included:
  • Sichuan: visiting cities, temples, mountains, lakes, Tibetan villages, and especially spicy food!
  • Visiting mother-in-law's newly remodeled home in Wuxi along with some tasty dumplings and banquets
  • A business trip/banquet to Shanghai 
  • Visiting more of our family in Beijing with trips to Summer palace, great wall, old town districts and more banquets
  • Another business trip/banquet in Beijing

All in all a packed two+ week trip!

Saturday, July 11, 2015

Helped out at the San Francisco Food Bank this afternoon

The Stanford Alumni association has a pretty large volunteer organization.  Today was a nice session up at the San Francisco Food Bank.  Stanford had a pretty good turn out and we had fun with food preparation.

Today, I was part of a crew packaging up bundles of snack food.  The food bank gets large pallets of food and our job is to re-package it into classroom sized portions.  Later I wound up sorting plums and nectarines.  A bit more work, and a lot more lifting.  Still the 2.5 hours went by fast.

I do recommend this to anyone looking for a great way to spend time with friends and family.  Food banks feed a lot more people than you might think (this one distributes 107,000 meals a day).  Here's a direct link to the volunteer calendar.  Pick a time and show up!


CoreMotion accelerometer working

I've updated the ttl project to include basic accelerometer output on the display.  It looks like this may be the only sensor on the Watch.  I need to dig a bit more to make sure (e.g. no gyros or magnetometer).

(And no, the simulator doesn't have this support -- so testing this would require plumbing in a synthetic data stream or some sort of unit test data source.  Still a pretty primitive ecosystem.)

Anyway, this is what the screen looks like for the ttl application (actual screen dump from the watch itself):


Other notes:
  • Debugging is exceptionally difficult.  The screen dump above was from one of only two sessions that have actually worked on the real hardware.  The profile reset was only part of the problem.  Disabling "diagnostics to Apple", iMessage, Hey Siri, etc. all seem to help kick in one more session.  Perhaps the reliability of debugging is a function of the network between the watch and the phone.  Or we're still stuck in betaware.
  • The iPhone5 may be part of the problem -- not the speediest machine out there.
  • You will notice that the right columns above are in the SF font and are not constant-width.  I do have an outstanding project to understand the best way to get constant-width numbers working, as mentioned during the SF font overview.

I think I found the "Untrusted Developer" root cause!

Within a narrow window, I replaced the phone and upgraded to Xcode beta3.  This wound up leaving quite a few provisioning profiles associated with the phone known as "Greg's iPhone".  To fix:
  • Xcode
  • Devices menu
  • Select offending device
  • Right click Profiles
  • Delete all of the provisioning profiles
  • Then re-launch, re-install the applications
All the other attempts -- re-pairing the phone, full restores, and so on -- masked the root cause, and typically worked just one time.

Now back to the CoreMotion library.  I've got accelerometer output working -- too bad it isn't supported on the simulator.  Next up will be the CMDeviceMotion higher-level data.

beta3 release of Xcode7, iOS9, WatchOS2 is out!

Seems turning off "Hey Siri" didn't help much with Watch battery life.  It was still around 7 or 8 hours when running beta2 of WatchOS.  beta3 was released; now the battery burn issue is resolved!

Upgrade is quite a few long running steps (almost 6 hours):
  • Download new Xcode
  • Remove old Xcode (this seems to help with file collisions...)
  • Install new Xcode
  • Verify an app (e.g. ttl) still works in local simulator (takes a long time to boot initial images of iPhone6 and Watch in simulators)
  • Download iOS9 image
  • Turn off iMessage
  • Restore iPhone to this image (this is a LOT of steps and reboots to get apps and data back in there)
  • Re-pair watch
  • Download WatchOS2 beta3 profile
  • Send profile to email on iPhone
  • Click on that attachment from iPhone
  • Do the profile update, watch reboot cycle again
  • Now do the WatchOS upgrade to beta3
  • Turn iMessage back on
  • Retrust the developer on the phone/watch pair
  • Restart Xcode (to get Watch symbols loaded back into Xcode)
  • Resync the developer's profiles in Xcode (prefs->accounts->${user}->View Details->resync)
  • At this point, attempt to launch the program to the real devices

Anyway, I am able to get the ttl application loaded to the Watch -- and this application demonstrates the controller running in the watch.  Now I have an application that can launch and run without the paired iPhone being in contact.  Next step: understanding sensor data.

Tuesday, July 7, 2015

Betaware: Upgrading to WatchOS 2 and iOS9

beta2 of the new WatchOS and iOS has been a bit unstable for me.  The upgrade of Xcode, iOS and WatchOS worked ok.  A few existing apps are crashing on both the phone and the watch.  But otherwise, the phone operation remains about the same.  Xcode 7 appears to reliably get applications out to the phone.

However, the WatchOS 2 upgrade isn't quite as robust.  Battery burn is high.  It may be that the OS is set to some debug mode.  Or maybe there is some crash/loop program under the hood.  "Hey Siri" being on by default on the watch is a promising suspect; I've turned it off for now.  We'll see.

Debugging any application is a hit-or-miss effort.  Some folks have noticed a pattern where the iPhone5 as the host may be part of the issue, as it isn't as quick as newer phones.

The Swift2 porting seems straightforward.  I am trying to figure out how to get a pure standalone application running on the watch.  The goal is to demonstrate an application that continues to run in a controlled fashion when the phone is off or out of range.  I sort of have this working now (after the first launch of the application has completed while in range of the phone).

Sunday, July 5, 2015

Tried Flotation Therapy today!

I've been meaning to try flotation therapy for a while.  This is sometimes known as Sensory Deprivation, but I think flotation is more accurate.  Today was the day and I have to say, it met the hype!

Travelled up to Float Matrix in San Francisco today.  The appointment pretty much went as listed in their FAQ:

  • Watch the eating and drinking before the appointment
  • Arrive and take a deep exfoliating shower
  • The attendant guides you to your float pod -- you put in ear plugs
  • You get in and close the door and float on your back
  • An hour later they knock on the outside, you get out, take another shower and that's it
Anyway, the super dense Epsom Salt mix does several things:
  • Keeps you floating out of the water so your face is comfortably above water
  • Keeps your skin from getting wrinkles (salt balances)
  • Does a nice job of nourishing your skin (whatever epsom salts do)
  • Keeps the water fairly disinfected on its own
I liked it.  At the beginning, the ringing in my ears [even while in a basement, in an insulated box, with ear plugs in] was the major stimulus that remained.  Then a little vertigo; with no visual reference, a slight tumbling sensation.  A very slight sensation. After a while, the vertigo and ringing subsided and I definitely felt my muscles relax -- lower back, neck, lower legs -- all sore from a hike yesterday.  And then a little later my mind did indeed calm down; it stopped thinking about work and todo lists and such and really started to enjoy the buoyancy, temperature, blackness, quiet, slow heartbeat.  The water density made it easy to float, so I could move my arms a bit to stretch, legs, etc. -- all without worrying about getting the liquid in my eyes [it DOES sting, I found out].

They have a nifty ventilation trick for the relatively humid air in the pod -- there are high and low vent pipes to the outside that wind up allowing for a slight convection.  So when your head is at the far end of the pod, you get a soft flow of fresh air.  And with the humidity, scentless epsom salts, and temperature control, this was very comfortable -- no stuffed airways, no throat tickles.

Around the end I did almost doze off -- more like a meditative state -- brain stopped, no motion, no sound, no touch, etc.  This is the goal -- and probably why they have 90-minute sessions too.

I came out of the place nice and relaxed, joints feel good and so on.  I think I'll work to make this a regular visit.

Saturday, July 4, 2015

Build a countdown timer application

Well, here's another experiment just prior to flipping to WatchOS 2.  This application is a countdown timer that displays days/hours/minutes/seconds until a target time.

The application is mostly an exercise in date math, but it does highlight the limitations of the current OS -- that being that the application is pretty much just a remote display for the phone.  E.g., this application will not start, nor update, if the paired phone is not connected.

Some other observations:

  • Not quite sure how to ensure that the UI is updated with the exact countdown time prior to display.  These two behaviors seem a bit asynchronous.  The willActivate() method seems to be called a bit late, or the display update is latent.  I'm not going to worry too much until I flip over to WatchOS 2.
  • I need to do some work with resource bundles, property pages, etc.  Basically to make this application look a bit more production ready. 
  • Same thing with OS and hardware combinations.  I don't have a good feel how general these applications are; UI layout, properties, supported features, etc.  Some learning to do here.
Code is here.
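The date math above can be sketched like this; using Calendar's component difference keeps DST and month lengths out of your hands (the function name and UTC pinning are my choices for the sketch):

```swift
import Foundation

// Days/hours/minutes/seconds remaining until a target time.
func countdown(from now: Date, to target: Date) -> (days: Int, hours: Int, minutes: Int, seconds: Int) {
    var calendar = Calendar(identifier: .gregorian)
    calendar.timeZone = TimeZone(identifier: "UTC")!  // pinned for reproducibility
    let c = calendar.dateComponents([.day, .hour, .minute, .second], from: now, to: target)
    return (c.day ?? 0, c.hour ?? 0, c.minute ?? 0, c.second ?? 0)
}
```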