Next TTC: Behind the scenes

Last week, I released Next TTC 2.0, a major update to my real-time arrival prediction utility app for Toronto’s streetcars (buses are to be added by the Toronto Transit Commission (TTC) soon).

Today, I’d like to discuss how I went about dealing with the data provided through the API, and which UI/UX choices I made based on that data and how it would be used.

Bit of background

Data for Toronto’s public streetcar system is provided through the nextbus.com API. Basically, vehicles are outfitted with GPS units. Based on their location and proximity to a given stop, the API provides arrival time predictions. So, for example, if a streetcar is 1.2 km away from stop XYZ, then based on its average speed and the time it takes a streetcar to reach XYZ, the API will tell you the streetcar is arriving in 4 minutes and 43 seconds. This can change, however, if the streetcar gets backed up in traffic, is suddenly able to travel faster, or makes fewer stops en route to stop XYZ.

The data

TTC provides up to 5 arrival time predictions for the requested stop and/or route.

By requesting data for a stop only, you’re given all routes and their arrival times for that particular stop. Depending on the stop, this may be one route or several at once. You can also request a stop and a specific route, in which case the API returns arrival time predictions for that route only.

Furthermore, some TTC streetcars make short-turns. For example, route 501 westbound has two end destinations: the streetcars share the same track for most of the way, but every few streetcars turn back at an earlier station. This means the API also returns predictions for each short-turn destination on a route.

The data may appear as follows:

  • Route 0
    • Destination 0
      • Prediction 0
      • Prediction 1
      • Prediction 2
      • Prediction 3
      • Prediction 4
  • Route 1
    • Destination 0
      • Prediction 0
      • Prediction 1
      • Prediction 2
      • Prediction 3
    • Destination 1
      • Prediction 0
      • Prediction 1
  • Route 2
    • Destination 0

In the above example we have requested data for a stop serviced by 3 streetcar routes. Route 0 is currently in service, and we get back 5 arrival time predictions. Route 1 is also in service and has two destinations (or short-turns), and we get results for both. Route 2 is currently not in service (some streetcar routes only run during rush hour), so we get no arrival time predictions for it.

So as you can see, we are potentially dealing with a great deal of data.
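To make the shape of that data a little more concrete, here’s a minimal sketch of how it could be modelled in code, flattened to one object per route/destination (short-turn) pair. The class and property names are my own and aren’t dictated by the API:

#import <Foundation/Foundation.h>

// Hypothetical model object: one instance per route/destination pair for a stop
@interface StopPrediction : NSObject {
	NSString *routeTitle;        // e.g. "501 Queen"
	NSString *destinationTitle;  // the short-turn end point, if the route has more than one
	NSArray  *arrivalTimes;      // up to 5 NSNumbers of seconds until arrival; empty if the route is not in service
}
@property (nonatomic, copy)   NSString *routeTitle;
@property (nonatomic, copy)   NSString *destinationTitle;
@property (nonatomic, retain) NSArray  *arrivalTimes;
@end

@implementation StopPrediction
@synthesize routeTitle, destinationTitle, arrivalTimes;

- (void)dealloc {
	[routeTitle release];
	[destinationTitle release];
	[arrivalTimes release];
	[super dealloc];
}
@end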

Scope

Version 1 of the API provided less data and detail, so with Next TTC 2.0 I wanted to upgrade to the new version of the API as well, while still retaining the basic user experience of the app.

The basic scope of the app is this: launch it and instantly get arrival time predictions for the stop nearest your current location. In essence this is pretty easy with Core Location. Once you have the user’s location, loop through all stops to find the nearest one and request the data for that stop.
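As a rough sketch of that last part (assuming a hypothetical Stop class that exposes a CLLocation property; my actual model isn’t necessarily laid out this way), the nearest-stop search is just a linear scan:

#import <CoreLocation/CoreLocation.h>

// Return the stop closest to the user's current location.
// 'stops' is assumed to be an NSArray of Stop objects, each with a CLLocation *location property.
- (Stop *)nearestStopToLocation:(CLLocation *)userLocation inStops:(NSArray *)stops {
	Stop *nearestStop = nil;
	CLLocationDistance shortestDistance = 0.0;
	for (Stop *stop in stops) {
		CLLocationDistance distance = [userLocation distanceFromLocation:stop.location];
		if (nearestStop == nil || distance < shortestDistance) {
			shortestDistance = distance;
			nearestStop = stop;
		}
	}
	return nearestStop; // request arrival time predictions for this stop
}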

But how do you retain the simple user experience required by that scope when you’re dealing with potentially a lot of varied data?

UI/UX decisions

In Next TTC 1.x, if a stop had more than one route, I prompted the user to select the desired route – or told them that Route X and Route Y service this stop but only Route Y is currently running. While it was the only app that let the user view other routes at the same stop, the prompt was intrusive and quite annoying.

Alert prompt in Next TTC 1.x to select route for stop at user's current location

With version 2 of the API returning multiple destinations (short-turns) for one route there was more data to present.

My goal was to present as much of the data as possible without any interruptions, while also making it extremely easy and quick to switch between the data you want. So in 2.x I added the trusty navigation bar (I also went from hiding the status bar to making it visible, so the user can see the current time), which allowed me to add a UISegmentedControl to quickly switch between the routes servicing the stop at the user’s current location.
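Getting the segmented control into the navigation bar is just a matter of making it the navigation item’s titleView. A quick sketch (the route titles and the action method are placeholders):

// One segment per route servicing the stop, shown in place of the navigation bar title
NSArray *routeTitles = [NSArray arrayWithObjects:@"501", @"301", @"508", nil]; // placeholder titles
UISegmentedControl *routeControl = [[UISegmentedControl alloc] initWithItems:routeTitles];
routeControl.segmentedControlStyle = UISegmentedControlStyleBar;
routeControl.selectedSegmentIndex = 0; // default to the first route that's in service
[routeControl addTarget:self action:@selector(routeChanged:) forControlEvents:UIControlEventValueChanged];
self.navigationItem.titleView = routeControl;
[routeControl release];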

Next TTC 2.0 provides an easy way to switch between routes servicing the same stop in the navigation bar, as well as switching between a route's short-turns

With a checkmark, I also indicate whether a route is running. While the returned data comes back in no particular order, I chose to sort the segments in ascending order. Furthermore, the selection defaults to the first route in the segmented control that is in service, so the user isn’t initially presented with an empty result for a route they most likely wouldn’t care about, since they want to know when the next vehicle is arriving at their current location.

In the screenshot, you can also see another row of segments: All, Long Branch and Humber. These are the short-turns. As we saw earlier, the API would have returned two destinations. I present these two selections above the route button, along with a combined set of predictions for both short-turns sorted in ascending order. In this case, 501 Queen is a long route, running from the east end of Toronto to the west end. Both destination short-turns are quite far out of downtown, so if you’re going to a stop well before them, you don’t care which of the upcoming streetcars you take, because you know you’re getting off before that. At the same time, if you’re going to Long Branch station, selecting Long Branch will give you upcoming arrival time predictions for that destination, which could save you a long wait or getting on the wrong streetcar. (Long Branch is further out than Humber. Unfortunately the API doesn’t sort the destination short-turns in order of proximity, so it’s not possible to present them in a sorted order at this time.)

Example of a stop serviced by up to 5 different streetcar routes

While the API documentation recommends not showing seconds, I felt it was appropriate, because “1 minute” may mean 1 minute and 20 seconds, or less than a minute. If you’re running to catch a streetcar, 30 seconds of precision matters a lot. Lots of users love this feature over other apps that cover the same basic use-case. I only show seconds for the upcoming streetcar, while the subsequent cars are displayed in minutes. The app auto-updates the predictions by counting down between refreshes, so the user is always shown the most accurate prediction at any given time. It feels kind of awesome standing at a stop, looking up from your iPhone running Next TTC, and seeing the streetcar arrive exactly as predicted in the app :)
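The countdown can be sketched roughly like this: when a prediction comes in, store the absolute arrival time, then recompute what’s left on a one-second timer. The property and label names here are just for illustration, not my actual implementation:

// nextArrivalDate is set when the API response comes in, e.g.
// self.nextArrivalDate = [NSDate dateWithTimeIntervalSinceNow:predictedSeconds];
- (void)startCountdown {
	// Keep a reference to this timer elsewhere so it can be invalidated when new predictions arrive
	[NSTimer scheduledTimerWithTimeInterval:1.0
									 target:self
								   selector:@selector(updateCountdown:)
								   userInfo:nil
									repeats:YES];
}

- (void)updateCountdown:(NSTimer *)timer {
	NSTimeInterval remaining = [self.nextArrivalDate timeIntervalSinceNow];
	if (remaining < 0) remaining = 0;
	int minutes = (int)remaining / 60;
	int seconds = (int)remaining % 60;
	// Only the next vehicle shows seconds; later ones are rounded to whole minutes
	self.nextArrivalLabel.text = [NSString stringWithFormat:@"%d min %02d sec", minutes, seconds];
}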

Core Location

Anyone who’s ever had to work with Core Location probably agrees with me when I say it’s a bitch. While the API is great, it was a challenge to find a balance between getting very accurate location results and still being able to locate the user when location accuracy is very low (no GPS, for example). With Next TTC, I wanted to locate the user quite precisely and quite quickly in order to find the nearest stop, not a stop 500 metres away or more.

All devices cache the user’s location, and when you initially ask for the user’s location, that cached fix is exactly what you get back. It might be an extremely accurate location, but it may also be very old and inaccurate. If you’ve ever opened the Maps app and seen the blue dot move over a large area before locating you, or seen the circle around it cover a very large area, this is exactly what I’m talking about.
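A common first line of defence is to check the fix’s timestamp and horizontal accuracy before accepting it. A rough sketch using the Core Location delegate callback (the thresholds are just examples, not the values I ended up with):

// Reject cached or overly coarse fixes before using a location
- (void)locationManager:(CLLocationManager *)manager
	didUpdateToLocation:(CLLocation *)newLocation
		   fromLocation:(CLLocation *)oldLocation {
	// A fix more than a few seconds old is usually the cached location
	NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
	if (age > 5.0) return;

	// A fix with invalid (negative) or poor accuracy can't reliably pick the right stop
	if (newLocation.horizontalAccuracy < 0 || newLocation.horizontalAccuracy > 100.0) return;

	[manager stopUpdatingLocation];
	// ...find the nearest stop and request predictions for it...
}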

In dealing with this challenge, I initially did a whole bunch of work in code to figure out whether the location was usable, whether to keep trying for a newer, better location, and so on. It turned out to be quite difficult, so in Next TTC 2.0 I went back to basics, but made it possible for the user to quickly re-locate themselves and request new arrival time predictions for their updated location. In Next TTC 1.x I had a GPS arrow button, but it turns out that even though Apple uses this icon, most users don’t understand it. Since I had worked so much on getting a good location initially (which sometimes took several seconds; way too long), I didn’t really make it possible to re-locate oneself, other than picking another stop manually and then hitting the GPS arrow button again. Poor UX, Rune!

Main screen in Next TTC 1.x using user's current location (GPS arrow)

So as discussed, in Next TTC 2.0 I got rid of a lot of code in my Core Location class and implemented a much easier way to get an updated (and, if available, more accurate) location when the initial one wasn’t satisfactory: the pull-to-refresh paradigm, which works wonderfully for my app.
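My version is custom-built around the app’s main view, but the basic mechanic can be sketched with a plain scroll view delegate: if the user has pulled down past a threshold when they let go, ask for a fresh location. The threshold and the locationManager property are placeholders:

// UIScrollViewDelegate: a release after pulling more than ~65 points past the top
// triggers a fresh location request
- (void)scrollViewDidEndDragging:(UIScrollView *)scrollView willDecelerate:(BOOL)decelerate {
	if (scrollView.contentOffset.y < -65.0f) {
		[self.locationManager startUpdatingLocation]; // re-locate and fetch new predictions
	}
}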

Example of my custom "pull-to-re-locate" implementation of Pull-to-refresh seen in lots of apps

Clutter or not?

When dragging the bottom view down, it reveals a button to add the stop to favourites. When this button is hidden, I added a small light that lets the user quickly see whether the stop is a favourite (light is on) or not (no light), without adding yet another button that would take up screen real estate and rarely, if ever, get used.

Extra hidden menu bar reveals "Add favourite" button

The great thing about this extra menu bar is that I can add features/buttons in the future without cluttering up the UI more, leaving the user to focus on the most important thing to them: “When’s the next streetcar arriving at this stop?”

I did add a button in the centre of the toolbar, to open/close the hidden menu. Based on beta tester feedback, it wasn’t obvious enough that dragging/pulling down on the picker view would reveal the extra menu bar.

Next TTC 2.0 has the same grey noise background as Next TTC 1.x, as well as light text. When I released the app back in February, it was winter and the sun wasn’t shining very much. As the months passed and spring arrived, I started receiving feedback from users that the UI was too dark and very hard to discern in bright sunlight. Sure enough, I tested it, and the only thing I saw on my iPhone was my own face reflected in the glass screen.

Based on this feedback, I added a “Dark on Light UI” setting in the app, which changes the background to a noisy off-white with dark text, making it much easier to read in direct summer sunlight.
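The setting itself is nothing fancy; roughly speaking it boils down to reading a user default and swapping colours when the main view loads. The defaults key, label and colours below are placeholders rather than my actual implementation:

// Apply the "Dark on Light UI" preference
BOOL darkOnLight = [[NSUserDefaults standardUserDefaults] boolForKey:@"DarkOnLightUI"];
if (darkOnLight) {
	self.view.backgroundColor = [UIColor colorWithWhite:0.95f alpha:1.0f]; // stand-in for the off-white noise texture
	self.predictionLabel.textColor = [UIColor darkTextColor];
} else {
	self.view.backgroundColor = [UIColor darkGrayColor]; // stand-in for the grey noise texture
	self.predictionLabel.textColor = [UIColor whiteColor];
}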

While not as pretty as the default setting, the "Dark on Light UI" setting makes it possible for Next TTC users to easily read the content on the screen in bright sunlight

Wrapping up

I hope you’ve enjoyed a bit of insight into the choices that went into the user experience and design of Next TTC 2.0. Here’s a few points to wrap up what I think you should take away from it:

  • While you may have a UI in mind early on, UI and UX design is closely related to the data you need to present. While considering your UI, really dig through your data and decide early on what’s important and what parts of the data you want to present to the user, or what takes precedence.
  • Create a scope early on. This will help you on your path to completing the app and provide clear guidelines for designing and coding your app, since you’ll have the data and the user in mind.
  • Revise, revise, revise. Don’t settle on a particular design just because it looks great. If it doesn’t work well in terms of usability, or if it doesn’t let you present the data or let the user interact with it in an appropriate manner, scrap it and figure out a better way of making this possible.
  • Don’t be afraid to make big changes. Next TTC 2.0 was a huge update for me. I completely rewrote the core of the app and redesigned 80% of it. I consider it a brand new app (which also made it a painful decision to make it a free update for the thousands of existing users, who’d get this awesome update for free, having only paid a dollar for the previous version).
  • Listen to tester and user feedback and make appropriate changes to your app. Something might appear important to you, or easy to use from a UX point of view, but if several users tell you it’s not, listen to them.

Random tips

I was expecting this blog post to be about something completely different, but I’m still out of the country and have come down with a bad cold, so it’ll be a bit of a short and simple one.

I’ve decided to just share a few coding tips I’ve come across and thought might be useful for other devs, especially if you haven’t worked much with UIKit before, or perhaps just not with the areas below. (At the bottom of this post is an Xcode project file which includes examples of the tips below.)

Tip #1: Check for URL scheme and open correct app from your own
You’ll probably run into having to link to another app, for example your company’s Twitter account. Instead of just linking to the Twitter website, you can do a quick check in code and open the Twitter app instead. You can do this with other Twitter client apps as well; however, not all of them have URL schemes set up (that I could find), and most users will probably be using the official one anyway. Regardless, the same check is good for other apps, and if Twitter.app doesn’t exist on the user’s device, you can try another Twitter client, or use a UIAlertView to ask the user what else they’d like to use (depending on which ones can be opened via a URL scheme).

Basically, all you have to do is ask whether UIApplication was able to open the URL:

// openURL: returns NO if nothing on the device handles the twitter:// scheme
BOOL twitter = [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"twitter://user?screen_name=runmad"]];
if (!twitter) {
	// No Twitter app installed: fall back to the website in Safari
	[[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://www.twitter.com/runmad"]];
}

List of URL Schemes: http://wiki.akosma.com/IPhone_URL_Schemes

Tip #2: Getting a free UINavigationController in your UIViewController
I have seen some apps use a UIToolbar as a navigation bar, and it just doesn’t work. If you compare the two, they’re actually a bit different, and this really shows when one is used in the wrong place.

If you’ve got a UIViewController, turning it into a UINavigationController is really easy:

// Wrap the view controller in a navigation controller, which brings its own UINavigationBar
RootViewController *rootViewController = [[RootViewController alloc] init];
UINavigationController *navigationController = [[UINavigationController alloc] initWithRootViewController:rootViewController];
// Present it modally from the current view controller
[self presentModalViewController:navigationController animated:YES];
[rootViewController release];
[navigationController release];

Tip #3: Getting a free UIToolbar in your UINavigationController
This one I only recently found out (thanks to @auibrian): UINavigationController actually comes with a UIToolbar. Its toolbarHidden property is set to YES by default, so all you have to do is override this when you load the view.

[self.navigationController setToolbarHidden:NO];

Note: I have seen some weird behaviour when pushing view controllers onto the screen (toolbar buttons not animating in with the push, just appearing in the toolbar in place). Also, you will need to hide the toolbar when popping or pushing to a view controller that should not show it (because with pushing you’re reusing the same navigation controller). In my opinion this isn’t a great design; I wish the view controllers were more separate in this regard.
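One way to keep that manageable is to let each view controller state what it wants in viewWillAppear:, roughly like this:

// In a view controller that should show the navigation controller's toolbar:
- (void)viewWillAppear:(BOOL)animated {
	[super viewWillAppear:animated];
	[self.navigationController setToolbarHidden:NO animated:animated];
}

// ...and in a view controller that shouldn't:
- (void)viewWillAppear:(BOOL)animated {
	[super viewWillAppear:animated];
	[self.navigationController setToolbarHidden:YES animated:animated];
}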

Tip #4: Use shadowColor and shadowOffset appropriately when faking text indentation
One thing that bugs me is the improper use of a UILabel’s shadowColor and shadowOffset. When using these, you have to pay attention to the colour of the background as well as the text colour. In the example below (where we’re going for the look of text carved into the display, used by Apple across the entire UI), we’ll see that when using darker text, you should use a lighter colour for the shadow and a positive value for the offset.

When displaying lighter-coloured text, use a darker shadow colour and a negative value for the offset. Both combinations create the proper indented or “carved” effect.

Using the wrong combination doesn’t create this effect and instead makes the text appear raised from the rest of your UI, which is most likely not what you’re going for in most cases ;)
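As a quick sketch of both cases (the colours and frames are just for illustration):

// Dark text on a light background: light shadow, offset down by 1pt (engraved look)
UILabel *darkText = [[UILabel alloc] initWithFrame:CGRectMake(20.0f, 20.0f, 280.0f, 30.0f)];
darkText.backgroundColor = [UIColor clearColor];
darkText.text = @"Dark text, light shadow";
darkText.textColor = [UIColor darkGrayColor];
darkText.shadowColor = [UIColor colorWithWhite:1.0f alpha:0.8f];
darkText.shadowOffset = CGSizeMake(0.0f, 1.0f);
[self.view addSubview:darkText];
[darkText release];

// Light text on a dark background: dark shadow, offset up by 1pt
UILabel *lightText = [[UILabel alloc] initWithFrame:CGRectMake(20.0f, 60.0f, 280.0f, 30.0f)];
lightText.backgroundColor = [UIColor clearColor];
lightText.text = @"Light text, dark shadow";
lightText.textColor = [UIColor whiteColor];
lightText.shadowColor = [UIColor colorWithWhite:0.0f alpha:0.6f];
lightText.shadowOffset = CGSizeMake(0.0f, -1.0f);
[self.view addSubview:lightText];
[lightText release];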

I’ve made a small Xcode project you can download here, which has the code examples from tips 1, 2 and 3, and I suppose #4 as well, since navigationItem.title uses text indentation.

Again, apologies for a bit of a short and simple post due to being under the weather and traveling.

Quick guide to updating your app’s UI for iPhone 4

iPhone 4 will make your UI look stunning. Everything in UIKit has been scaled up already, so it will require only a bit of work on your end to make the rest of your app look amazing on iPhone 4. If you only code and don’t touch Photoshop, you’re in luck. However, if you’re a UI designer or have the skills to do your own UI, hopefully you did your original artwork at a nice high resolution; if not, you have your work cut out for you updating all the UI elements to 2x the resolution.

Updating your app’s UI to be compatible with iPhone 4’s Retina Display is amazingly simple. Since layout works in points, not pixels, you will have to do very little work on the layout itself. Apple’s engineers have made it really simple to use new graphics for all your UI on iPhone 4, while staying compatible (and not using 2x the memory) on older, lower-resolution, lower-memory devices.

All you have to do is add the same image at 2x the pixel dimensions to your app’s resources and name it the same as its lower-resolution sibling, with the suffix “@2x”.

For example, in your Resources folder, you will now need to have two image files, one for older devices called myImage.png and one for iPhone 4 called myImage@2x.png, which is twice the resolution.

This way, when you call [UIImage imageNamed:@"myImage.png"] (or contentsOfFile:), iPhone 4 will automatically grab the file with the @2x suffix, and lower-resolution devices will grab the lower-resolution copy. You don’t have to check which device is loading the image or write any additional code to grab the correct one. Genius!
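So the loading code stays exactly the same on every device:

// Older devices load myImage.png; iPhone 4 automatically loads myImage@2x.png
UIImage *image = [UIImage imageNamed:@"myImage.png"];
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
[self.view addSubview:imageView];
[imageView release];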

If you have seen an iPhone app running on the iPad at 2x scale, that’s pretty much how your un-optimized app is going to look on the iPhone 4. Perhaps not as drastic, but there will be a noticeable difference between apps not optimized for iPhone 4 and “Retina Apps”, as Apple calls them.

Thoughts on @2x on iPad…
I didn’t hear anything at WWDC regarding this, but my guess is that they’ll integrate this into the iPad in the next major OS release. Basically, the iPad will be able to do the same thing and grab the higher-resolution image for iPhone apps running at 2x scale.

Happy Photoshopping to the UI designers, and happy relaxing to the programmers!

UPDATE JULY 7, 2010:

I have discovered a bug in Apple’s code that deals with grabbing the correct image on iPhone 4. If you are using imageWithContentsOfFile:, the code will in fact not automatically grab the @2x version when running on iPhone 4. I have submitted a bug report to Apple, and they’ve informed me that they are aware of the issue and that the engineers are currently working on a fix. So for now, stick with imageNamed: for all your images.

Some notes on UIView animation

UIView animation is a simple and nice way to add to your user experience. I just wanted to point out a few suggestions when using UIView animations.

Duration (speed):
If you choose to use animation to complement some of the stuff already happening in UIKit, either at the same time or before/after, it makes a big difference how fast your animations are. Pretty much all the UIKit animations I have come across have a duration of 0.3f, and so should yours. Of course, it doesn’t always work 100% of the time, but for the most part, 0.3f is what you should aim for. It’s quick enough that your users don’t wait for something to finish animating before continuing with the next input action, and not so fast that they don’t get a chance to see where an object came from or what happened.

If you have an animation happening while the keyboard animates up or down, use an animation duration of 0.3f. The same goes for pushing and popping view controllers on a navigation controller. Annotations in MKMapView also drop with a duration of 0.3f.

0.3f is the way to go.

A simple UIView animation can be added with the following code:

[UIView beginAnimations:nil context:nil];
[UIView setAnimationDuration:0.3f];
self.segmentControl.alpha = 1.0f;
[UIView commitAnimations];

The above example is from an app I’m working on, where the segmented control is enabled and I fade it into the toolbar at the same rate as a pin drops onto the map on the same screen.
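Another place where matching the 0.3f duration pays off is animating alongside the keyboard. Here’s a rough sketch that slides a (hypothetical) inputBar view up together with the keyboard, assuming the view spans the full screen:

// Registered elsewhere, e.g. in viewDidLoad:
// [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(keyboardWillShow:)
//                                              name:UIKeyboardWillShowNotification object:nil];
- (void)keyboardWillShow:(NSNotification *)notification {
	CGRect keyboardFrame = [[[notification userInfo] objectForKey:UIKeyboardFrameEndUserInfoKey] CGRectValue];
	[UIView beginAnimations:nil context:NULL];
	[UIView setAnimationDuration:0.3f]; // matches the keyboard's own animation
	CGRect frame = self.inputBar.frame;
	frame.origin.y -= keyboardFrame.size.height; // slide the bar up by the keyboard's height
	self.inputBar.frame = frame;
	[UIView commitAnimations];
}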

When to use animation:
A few objects come with free animation (also at a duration of 0.3f, of course). For example, when adding a UIBarButtonItem to your UINavigationBar, consider setting it with animation. If you replace one UIBarButtonItem with another, they’ll even animate in and out nicely during the change. And when adding a pin to a map, why not drop it onto the map with an animation, instead of having it suddenly appear out of nowhere?
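Both of these come nearly for free. A quick sketch (the action selector and reuse identifier are placeholders; the second method is an MKMapViewDelegate callback):

// Setting a bar button item with animated:YES makes the change animate
UIBarButtonItem *doneButton = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemDone
                                                                             target:self
                                                                             action:@selector(done:)];
[self.navigationItem setRightBarButtonItem:doneButton animated:YES];
[doneButton release];

// Pins drop onto the map instead of just appearing, via MKPinAnnotationView's animatesDrop
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation {
	MKPinAnnotationView *pin = [[[MKPinAnnotationView alloc] initWithAnnotation:annotation
	                                                             reuseIdentifier:@"Pin"] autorelease];
	pin.animatesDrop = YES;
	return pin;
}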

Another piece of good advice is to use animation (whether your own or via objects that include animations) to bring attention to an object. For example, if you have a pushed view, consider what you can “add” after the view has appeared by animating your objects in viewDidAppear:.

Create a better UX with animation
Consider all the ways you can use UIView animation blocks in your app to enhance the user experience. It’s a great way to create a more fluid and pleasant experience for your users: their inputs and actions will feel less rough and more smooth and soft to the touch. Don’t go overboard with animations, though. Too many become annoying, so it’s important to use animations only where appropriate.

The best advice is probably to have a look at many of the built-in apps designed by Apple, as well as the many free animations that are part of UIKit objects (how UIBarButtonItems animate in and out when you push onto a UIViewController stack, how a modal view appears from the bottom, etc.).