
Privacy Policy Update

An update to an earlier post - removing Crashlytics and Fabric from my apps

Last year I wrote about my Privacy Policy, outlining the approach I take on my mobile apps, and trying to justify the trade-offs I’m making.

This is a follow-up on what’s changed in the meantime.

Removing Crashlytics and Fabric

I’ve been trying to remove 3rd party code from my apps as much as possible.

This is really more of a practical concern - fewer dependencies make it easier to change the code, keep up to date with OS changes and stay agile - than a privacy worry (not really knowing what 3rd party libraries are actually doing).

Therefore I decided to remove Crashlytics and Fabric from my own apps.

Both Apple and Google offer built-in basic analytics and crash reporting via App Store Connect and the Google Play Console respectively. I realised that, for my own apps, this is perfectly sufficient for what I need.

All I really care about is finding out about crashes or issues, plus every so often knowing the breakdown of OS versions my users have so as to plan when I can drop support for older devices.

This isn’t an anti-Google stance, as I’m still using Firebase and Google Mobile Ads. It’s just a case of: why give my users’ data away any more than I need to?

Summary

Your mileage may vary, and I’m not sure I’d advise this approach for all my clients, but I’m happy this will work well for me and my users.

A Shortcut for sharing Live Photos

Replacing some average iOS apps for converting Live Photos with a simple Shortcut

Here's one I made earlier

I’ve been searching for an easy way to share iOS Live Photos on social media for a while, and have tried a whole bunch of frankly pretty average apps without being very happy with any of them.

Last week Jason Kottke shared a shortcut on Twitter that could convert a Live Photo into a video, which looked close to what I wanted. I’d never considered writing my own converter before!

Below is a screenshot of my version. This one will convert a Live Photo into a looping animated GIF. A little trick I also learnt was that if you use the “Quick Look” block at the end, you get a preview of how the GIF looks, as well as a share button from which you can share or save it as appropriate.

The image of Seljalandsfoss from last year’s trip to Iceland above was made using the script - and I’m very happy with the results!

Live Photos shortcut

Getting Daily Alerts for MLB Condensed Games

Keep up to date on the travails of the Mets without needing an MLB TV subscription

Citi Field

30 years ago I spent a couple of summers working in New Jersey, and got drawn into the crazy world of New York Mets baseball.

Back then, keeping in touch when back home meant scouring the International Herald Tribune for any baseball news, but for the last decade I’ve been a subscriber to MLB TV which has been fantastic.

When considering renewing this year’s subscription, I realised I hardly watch any live games any more. Most games are played in the evening US Eastern Time or later, so what I generally do is wait until the next morning and try to watch the condensed game - around 5 minutes of “Match Of The Day” style highlights - without knowing the score.

Accessing MLB Condensed Games for free

MLB are a lot less precious with their highlights packages than the Premier League (surprise, surprise!), and make a lot of content available for free both on YouTube and within the MLB TV app.

However, within the MLB TV app you can’t get notifications when new condensed games are available without a paid subscription (which is fair enough!), and it’s really hard to check whether the video is ready without finding out the score, or learning something about the game from the other videos already published.

Fortunately, the excellent Baseball Theater had figured out that the configuration for the mobile apps is held in the open, which gave me the idea of hacking together a similar solution to send an alert as soon as a new condensed game is ready for viewing.

My MLB Condensed Games alert system

It’s all a bit Heath-Robinson, but my system works as described below. Note the code is all available on GitHub, and the following is adapted from the readme.md of the repo:

There are 2 main entry points - designed to be run from AWS Lambda:

  1. lambdaCondensedGame.js - Checks whether a condensed game stream is available for a given team on a given day
  2. lambdaMonitor.js - Will see if a condensed game has been added, and if so sends a Slack message saying a new game is ready, plus a link to the stream

Condensed Game function

This function reads the POSTed JSON, formatted like:

{
	"gameDate": "2019-04-26",
	"team": "nym"
}

You must also send an HTTP header named MLBAPIRequest, with the value set in an environment variable of the same name.
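
As an illustration, calling the deployed function might look something like this - the endpoint URL is just a placeholder for wherever the Lambda is exposed (e.g. behind API Gateway):

    // Hypothetical caller of the condensed game function - the endpoint URL is a
    // placeholder, and this assumes an ES module (for top-level await)
    const response = await fetch("https://<your-api-endpoint>/condensedGame", {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "MLBAPIRequest": "<value of the MLBAPIRequest environment variable>",
        },
        body: JSON.stringify({ gameDate: "2019-04-26", team: "nym" }),
    });

    console.log(await response.json());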

This will then do the following (there’s a rough code sketch of these steps after the notes below):

  1. Build a URL like http://gd2.mlb.com/components/game/mlb/year_2018/month_06/day_26/master_scoreboard.xml (using the incoming date)
  2. Look for the <game> node where home_file_code="nym" or away_file_code="nym" (using the incoming team)
  3. Pick up the game_pk attribute to build the content URL, e.g. “https://statsapi.mlb.com/api/v1/game/530594/content?language=en”
  4. Find the “Extended Highlights” section in the media.epgAlternate nodes, and then find the item that is the condensed game video (if it exists)
  5. Pick the correct URL node - the video with a value that ends in .mp4

Assuming that a condensed game is found, the function then returns JSON like:

{
	"opponent": "pit",
	"date": "2019-04-26",
	"url": "http://mediadownloads.mlb.com/mlbam/mp4/2018/06/27/2202032583/1530076464641/asset_1200K.mp4",
	"mediaType": "Extended Highlights" 
}

Note:

  • The opponent attribute can be null if no game has been found
  • The url attribute can be null if no condensed game stream has been found for the game
  • The mediaType attribute can be null or “Extended Highlights” (I experimented with also getting the “Recap” for a while)
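
Putting those steps together, here’s a rough TypeScript sketch of the lookup - not the actual repo code; it assumes Node 18+ for the global fetch, the xml2js package for parsing the scoreboard XML, and my best guess at the shape of the (unofficial) content feed:

    import { parseStringPromise } from "xml2js";

    interface CondensedGameResult {
        opponent: string | null;
        date: string;
        url: string | null;
        mediaType: string | null;
    }

    export async function findCondensedGame(gameDate: string, team: string): Promise<CondensedGameResult> {
        const [year, month, day] = gameDate.split("-");
        const scoreboardUrl =
            `http://gd2.mlb.com/components/game/mlb/year_${year}/month_${month}/day_${day}/master_scoreboard.xml`;

        // Steps 1 & 2: fetch the day's scoreboard and find the <game> node for our team
        // (assumes a root <games> element containing <game> children, with attributes under "$")
        const xml = await (await fetch(scoreboardUrl)).text();
        const scoreboard = await parseStringPromise(xml);
        const games: any[] = scoreboard.games?.game ?? [];
        const game = games.find((g) => g.$.home_file_code === team || g.$.away_file_code === team);
        if (!game) {
            return { opponent: null, date: gameDate, url: null, mediaType: null };
        }
        const opponent = game.$.home_file_code === team ? game.$.away_file_code : game.$.home_file_code;

        // Step 3: use the game_pk attribute to fetch the game content feed
        const contentUrl = `https://statsapi.mlb.com/api/v1/game/${game.$.game_pk}/content?language=en`;
        const content = await (await fetch(contentUrl)).json();

        // Steps 4 & 5: look through media.epgAlternate for the "Extended Highlights" section,
        // then pick the item with a playback URL ending in .mp4 (the exact item/playback field
        // names are my guesses at the unofficial API)
        const extended = (content.media?.epgAlternate ?? []).find((s: any) => s.title === "Extended Highlights");
        const condensed = extended?.items?.find((item: any) =>
            (item.playbacks ?? []).some((p: any) => p.url?.endsWith(".mp4")));
        const mp4 = condensed?.playbacks?.find((p: any) => p.url?.endsWith(".mp4"))?.url ?? null;

        return {
            opponent,
            date: gameDate,
            url: mp4,
            mediaType: mp4 ? "Extended Highlights" : null,
        };
    }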

Configuration - Environment variables

  • MLBAPIRequest: The value to be sent in the header of any request

Monitoring function

This function saves the last-found game data in an S3 bucket, and is designed to run on a schedule to do the following:

  • Looks at the date in the saved game data JSON from the last successful run
  • If that date is today, we’re done (we don’t cope with double-headers yet!)
  • Otherwise, calls the condensed game function for either yesterday or today (today if the last game found was yesterday’s), and if the result has a url attribute:
    • Saves the latest game data JSON to S3
    • Calls Slack, sending the url attribute in a message
    • Calls an IFTTT web hook that sends an iOS notification, tapping which opens the video URL

You can set up a schedule in CloudWatch to run the function every N minutes (there’s a rough sketch of the monitoring logic after the environment variables below).

Configuration - Environment variables

  • S3ACCESSKEYID: Access Key for the S3 bucket to save game data
  • S3DATABUCKET: Name of the S3 bucket to save game data
  • S3DATAFILE: Name of the file to save game data in
  • S3SECRETACCESSKEY: Access Secret for the S3 bucket to save game data
  • SLACK_WEBHOOK_URL: URL of the webhook to send the Slack message to
  • TEAM: Team abbreviation to monitor e.g. nym
  • IFTTT_EVENT_NAME: Name of the IFTTT event to send the notification call to
  • IFTTT_MAKER_KEY: Name of the IFTTT maker key to enable IFTTT calls
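
Here’s a rough sketch of the monitoring logic - again not the actual repo code; it reuses the findCondensedGame function above, the aws-sdk S3 client and the environment variables just listed, and glosses over time zones and double-headers:

    import { S3 } from "aws-sdk";
    import { findCondensedGame } from "./condensedGame";

    const s3 = new S3({
        accessKeyId: process.env.S3ACCESSKEYID,
        secretAccessKey: process.env.S3SECRETACCESSKEY,
    });

    // Reads the last saved game data JSON from S3 (null if nothing has been saved yet)
    async function loadLastGame(): Promise<{ date: string } | null> {
        try {
            const obj = await s3
                .getObject({ Bucket: process.env.S3DATABUCKET!, Key: process.env.S3DATAFILE! })
                .promise();
            return JSON.parse(obj.Body!.toString());
        } catch {
            return null;
        }
    }

    // Saves the latest game data JSON back to S3
    async function saveLastGame(game: object): Promise<void> {
        await s3
            .putObject({
                Bucket: process.env.S3DATABUCKET!,
                Key: process.env.S3DATAFILE!,
                Body: JSON.stringify(game),
            })
            .promise();
    }

    export async function monitor(): Promise<void> {
        const team = process.env.TEAM ?? "nym";
        const today = new Date().toISOString().slice(0, 10);
        const yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString().slice(0, 10);

        const last = await loadLastGame();
        if (last?.date === today) {
            return; // already found today's game (double-headers not handled)
        }

        // If we already have yesterday's game, look for today's; otherwise catch up on yesterday
        const targetDate = last?.date === yesterday ? today : yesterday;
        const game = await findCondensedGame(targetDate, team);
        if (!game.url) {
            return; // no condensed game available yet
        }

        await saveLastGame(game);

        // Slack incoming webhook: just POST a { text: ... } payload
        await fetch(process.env.SLACK_WEBHOOK_URL!, {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ text: `New condensed game available: ${game.url}` }),
        });

        // IFTTT Webhooks service: event name and maker key go in the URL, values as value1..value3
        await fetch(
            `https://maker.ifttt.com/trigger/${process.env.IFTTT_EVENT_NAME}/with/key/${process.env.IFTTT_MAKER_KEY}`,
            {
                method: "POST",
                headers: { "Content-Type": "application/json" },
                body: JSON.stringify({ value1: game.url }),
            }
        );
    }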

Summary

Everything is working really well so far, and I’m getting exactly what I need. Very happy! I may still subscribe to MLB TV later in the season if I’m missing the weekend live games - or indeed if the Mets get into a playoff race - but for now I can keep watching the highlights without it.

The MLB “API” was a bit in flux for the first few weeks of the season. It’s not a public API so that’s to be expected, but hopefully it will be stable for a while now.

I also built an iOS Shortcut to call the API directly, in case the video URL changes for some reason, or I miss or accidentally delete the notification.

If you want to set something similar up for yourself and the GitHub instructions aren’t clear, let me know and I’ll try to help you out!

Using NFC and Launch Center Pro to launch Spotify

Probably saves me a couple of seconds every day :)

I’ve been thinking about playing with NFC tags for a while, and when Launch Center Pro released a new version that both supports NFC and lets you buy a sticker pack, I thought it was a good time to jump in.

When I’m coding, I’ve got an almost Pavlovian need to have music playing. I think after spending years working in open-plan offices wearing headphones to isolate myself from distractions, I can’t actually work without music on anymore.

Now that I spend most of my time working at home, I still have music on - and I’m a long-time Spotify subscriber (the quality of their recommendations is unsurpassed!)

Writing a Siri Shortcut to start Spotify playing

Note this section was updated from the original post, as the simple way of just opening the URL spotify-action://press-play stopped working just after I originally wrote this 😠

The shortcut to resume Spotify playback basically uses the Spotify Web API to control playback in the app.

This API is still in beta, so hopefully it won’t change or break in the future, but in essence the shortcut does the following (sketched in code after the list):

  1. Fetches an access token from https://accounts.spotify.com/api/token using a client ID and secret you’ve set up on the Spotify developer dashboard
  2. Opens the Spotify app - you need to do this first because of an existing bug in the Spotify API
  3. Calls https://api.spotify.com/v1/me/player/play?device_id=[Device ID] to resume playing, using a device ID you’ve found by previously calling the devices API at https://api.spotify.com/v1/me/player/devices
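
Roughly, the calls look like the sketch below - written in TypeScript rather than as Shortcut actions just to show the request shapes, with placeholder IDs and a stored refresh token (playback control needs a user-authorised token, so a plain client credentials grant isn’t enough):

    // Placeholders - in my shortcut these are hard-coded values
    const CLIENT_ID = "<your-client-id>";
    const CLIENT_SECRET = "<your-client-secret>";
    const REFRESH_TOKEN = "<your-refresh-token>";
    const DEVICE_ID = "<device id from https://api.spotify.com/v1/me/player/devices>";

    async function resumeSpotifyPlayback(): Promise<void> {
        // 1. Exchange the refresh token for a short-lived access token
        const tokenResponse = await fetch("https://accounts.spotify.com/api/token", {
            method: "POST",
            headers: {
                Authorization: "Basic " + Buffer.from(`${CLIENT_ID}:${CLIENT_SECRET}`).toString("base64"),
                "Content-Type": "application/x-www-form-urlencoded",
            },
            body: new URLSearchParams({
                grant_type: "refresh_token",
                refresh_token: REFRESH_TOKEN,
            }),
        });
        const { access_token } = await tokenResponse.json();

        // 2. (In the shortcut, the Spotify app is opened at this point to work around the API bug)

        // 3. Resume playback on the chosen device
        await fetch(`https://api.spotify.com/v1/me/player/play?device_id=${DEVICE_ID}`, {
            method: "PUT",
            headers: { Authorization: `Bearer ${access_token}` },
        });
    }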

If anyone is interested, let me know and I’ll make the effort to make my script more reusable and shareable, as at present I’ve hard-coded the various IDs into my shortcut 😱

Setting up an NFC Trigger in Launch Center Pro

The LCP NFC integration is really nice. To add a new sticker, you just press “+” in the “NFC Triggers” section of “Settings”, and it lets you scan one of your tags and give it a logical name.

Then it’s simple to associate an LCP action with that trigger - obviously here I set up an action to run my Shortcut, e.g. shortcuts://run-shortcut?name={{Play Spotify}} where “Play Spotify” is the name of the Shortcut.

Running the Shortcut

To run the Shortcut using the tag, I just move my iPhone XS Max over the NFC tag, which triggers a notification asking if I want to “Open in Launch”.

Tapping the notification then opens the LCP action, and after a couple of seconds Spotify starts playing.

It would be much nicer if I didn’t have to tap the notification to kick things off, and if you weren’t shown everything running in LCP and Shortcuts along the way.

The notification restriction is down to Apple’s security rules - so I’m not sure we’ll ever get rid of that - but hopefully at some point soon Apple will better integrate Shortcuts so we don’t have to see the app running in the foreground. iOS 13 perhaps?

If none of that made much sense, here’s it all in action …

Other NFC ideas …

I’ve added a tag to the dashboard of my car, and written a Shortcut that can do one of “Play Spotify”, “Play Overcast” or “Open Google Maps with directions home” - the 3 possible things I usually do when I get in the car.

My next idea is to start and stop Toggl timers from another tag on my desk, but I’m waiting for the almost mythical forthcoming Timery app before I do, as I can’t face tackling the Toggl API myself.

Playing with Siri Intents

Yeltzland makes it to yet another platform

I’ve been enjoying playing with the new Siri Intents in iOS 12, and obviously didn’t need much of an excuse to get my Yeltzland app on yet another platform!

Shortcuts from NSUserActivity

It was pretty easy to get some basic integrations with Siri Shortcuts working. I was already using NSUserActivity on each view in the app to support Handoff between devices, so it was quite simple to extend that code for Siri Shortcuts.

For example, on the fixture list page I can add the following:

    // activity is the current NSUserActivity object

    if #available(iOS 12.0, *) {
        activity.isEligibleForPrediction = true            
        activity.suggestedInvocationPhrase = "Fixture List"
        activity.persistentIdentifier = String(format: "%@.com.bravelocation.yeltzland.fixtures", Bundle.main.bundleIdentifier!)
    }

Making the activity eligible for prediction means it can be used as a Siri Shortcut, and the suggested invocation phrase is the hint offered when you add the shortcut in Settings, so you can open the app directly on the Fixture List view from Siri.

Building a full custom Siri Intent

Probably the most useful app feature to expose via a full Siri Intent is the latest Halesowen score. By that I mean an intent that will speak the latest score, as well as showing a custom UI to nicely format the information.

There are plenty of good guides on how to build a custom Siri Intent out there, so I won’t add much detail on how I did this here.

However, one strange issue I couldn’t find a workaround for was that, when trying to put a number as a custom property in the Siri response, I couldn’t get the response to be spoken.

As you can see from the setup below, I got around this by passing the game score as a string rather than a number, but I wasted a long time trying to debug that issue. Still no idea why it doesn’t work as expected.

Custom response properties

Building a custom UI to show alongside the spoken text was also pretty easy. I’m quite happy with the results - you can see it all working in the video below.

To make the shortcut discoverable, I added an “Add to Siri” button on the Latest Score view itself. This is really easy to hook up to the intent - you simply pass the intent in the tap handler of the button, like this:

    if let shortcut = INShortcut(intent: intent) {
        let viewController = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        viewController.modalPresentationStyle = .formSheet
        viewController.delegate = self
        self.present(viewController, animated: true, completion: nil)
    }

I’m sure you’ll agree the view itself looks pretty classy 🙂

Latest Score view

Summary

It was a lot of fun hooking everything up to Siri, and I’m really pleased with how it all turned out.

Overall I think opening up Siri to 3rd party apps could be game-changing for the platform. Previously Siri was so far behind both Google and Amazon that it was almost unusable except for the most basic of tasks. However, now that it can start working with the apps you use all the time, I can see it becoming a truly useful assistant.

Siri is still a way behind of course, but once custom parameterised queries are introduced - presumably in iOS 13 - and if the speech recognition can be improved, it is definitely going to be a contender in the voice assistant market.

I’m also looking forward to Google releasing their similar in-app Actions at some point soon.

Exciting times ahead!