Here at Intrepid, we’re encouraged to spend about four hours a week on a side project. When I received an Apple tvOS dev kit, I knew I had to use that time to explore one of my ideas. I chose to do a Pictionary like game where you draw on the TV using the Siri remote’s trackpad. The designers I worked with and I settled on Doodaloo as the working title.
Building a tvOS app can be very similar to building an iOS app. In fact, it shouldn’t be too hard to port a lot of apps over to tvOS, because it uses the same Cocoa frameworks.
But Doodaloo could not be built using a TVML template, obviously. I got started with a simple drawing app I’d built during Intrepid’s Apprentice program for iOS. I could mostly just drop it in, because tvOS uses the same APIs for receiving touch events from the trackpad, and UIGestureRecognizers work very much the same as well.
Still, tvOS presents new challenges; it wasn’t as easy as declaring the old drawing app a tvOS app and calling it a day. I ran into a few interesting issues:
- With such a small trackpad area on the remote, it was a challenge to make the drawing experience pleasant and easy to understand.
- I had to use the focus engine to control which buttons, views, etc. are selected at any time -- and sometimes I ended up working around the focus engine.
- Using motion control from the remote doesn’t let you use the higher level APIs -- you have to tap into the accelerometer data.
Building a Good Drawing Experience
I opted to use UIPanGestureRecognizer for the drawing capabilities. It made it easier to differentiate the various touches I would be receiving. In the app, taps on the sides of the pad, where the arrow selectors are, would be used for quickly changing brushes and colors. Using the touchesBegan API would have made it more difficult to discern when a touch was drawing and when it was tapping.
By the way, you also receive those trackpad directional touches through gesture recognizers like so:
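Here is a sketch of that wiring, assuming it lives in a view controller. The press-type values are real tvOS API (written in modern Swift spelling); the class and handler names are my own:

```swift
import UIKit

class DrawingViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Only fire for taps on the directional edges of the trackpad.
        let directionalTap = UITapGestureRecognizer(target: self,
                                                    action: #selector(handleDirectionalTap(_:)))
        directionalTap.allowedPressTypes = [
            NSNumber(value: UIPressType.upArrow.rawValue),
            NSNumber(value: UIPressType.downArrow.rawValue),
            NSNumber(value: UIPressType.leftArrow.rawValue),
            NSNumber(value: UIPressType.rightArrow.rawValue)
        ]
        view.addGestureRecognizer(directionalTap)
    }

    @objc func handleDirectionalTap(_ recognizer: UITapGestureRecognizer) {
        // A tap on any directional edge opens the brush/color picker.
    }
}
```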
It’s also worth noting this is how you access other remote buttons like play/pause presses too:
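Play/pause presses come through the same mechanism -- only the press type differs. A minimal sketch (the handler name is hypothetical):

```swift
// Inside viewDidLoad, alongside the other recognizers.
let playPauseTap = UITapGestureRecognizer(target: self,
                                          action: #selector(handlePlayPause(_:)))
playPauseTap.allowedPressTypes = [NSNumber(value: UIPressType.playPause.rawValue)]
view.addGestureRecognizer(playPauseTap)
```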
Another factor making the drawing computation different from the iOS drawing app is the idea of a cursor. In iOS a touch could be anywhere, and we would draw the line to that point. But in tvOS we need a way to show users where they currently are on the page, and then they need to choose whether to draw or just move the cursor. I always think of this as having the pen on or off the page. I decided a press of the touchpad (aka UIPressType.Select) would toggle the pen on and off.
Each pan gesture on the trackpad now moves the cursor relative to its last position, which means we need some guards for the edges of the screen -- yet another thing we didn’t have to worry about in iOS, where all touches were inherently on the screen. So I check every point we add to the line and convert it to a point that is actually within the view.
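The edge guard boils down to clamping each new cursor point to the drawing view’s bounds. A minimal sketch (the function name is mine):

```swift
import CoreGraphics

// Clamp a proposed cursor point so the line never leaves the view.
func clampedPoint(_ point: CGPoint, to rect: CGRect) -> CGPoint {
    return CGPoint(
        x: min(max(point.x, rect.minX), rect.maxX),
        y: min(max(point.y, rect.minY), rect.maxY)
    )
}
```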
Managing UIFocus (or not managing it)
With the help of one of our designers, we came up with a novel way to change brush sizes and colors. The user would tap on the directional pad and a little scrollable view would appear over the cursor like so:
Our biggest challenge was the focus engine: it doesn’t let you manually choose which view has the focus. It became so much of a pain within my UICollectionView that I actually just worked around it. Airbnb’s team has a good blog post about using the focus engine, but it is still a little unclear how you could force one view to be focused without a lot of subclassing.
My solution was to use a UICollectionView and set cells as selected or not based on directional taps on the remote’s trackpad. This switches the focus for the user visually, but since my DrawingView never gave up the actual UIFocus, it’s a bit of an illusion. When you press to select a brush color or size within that collection view, the drawing controller is still receiving the press, and it manages the collection view accordingly. I chose to handle it within my draw mode toggling function:
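A sketch of that toggle, with hypothetical property and method names -- the point is that the drawing view keeps focus, so every Select press routes through here:

```swift
// Inside the drawing view controller. Select never reaches the
// collection view, because DrawingView still holds focus.
@objc func handleSelect(_ recognizer: UITapGestureRecognizer) {
    if !pickerCollectionView.isHidden {
        // The picker overlay is up: treat Select as confirming the
        // highlighted cell, even though the collection view never
        // actually had focus.
        applyPickerSelection()
        pickerCollectionView.isHidden = true
    } else {
        // No picker showing: Select toggles the pen on/off the page.
        isPenDown.toggle()
    }
}
```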
Motion control the hard way
We wanted to use a shake of the remote as a way to erase everything, Etch A Sketch-style.
In iOS there are some convenient methods for determining whether a shake of the device happened, via motionBegan:withEvent:. If you’re unfamiliar, you can read more on that via Apple. I was expecting the same to apply to the Siri remote, but those abstractions do not apply, because the remote is technically a GCController object, and you have to access the acceleration data directly through its motion property.
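Getting at that data looks roughly like this -- the remote surfaces as the first connected controller, and its motion profile hands you accelerometer readings through a closure:

```swift
import GameController

// Observe raw motion data from the Siri remote.
if let motion = GCController.controllers().first?.motion {
    motion.valueChangedHandler = { (motion: GCMotion) in
        let a = motion.userAcceleration
        // a.x, a.y, a.z update continuously while the remote moves.
    }
}
```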
Thankfully Apple made it very convenient to access. You pass in a closure that takes a GCMotion object, where you can get lots of info like gravity and userAcceleration. As Apple’s documentation notes, total acceleration is the sum of both. But doing something with that data requires a bit of math and guesswork.
See if you can follow this:
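Here is a sketch of that logic, assuming it lives in the drawing view controller. The property and method names are hypothetical; the 3.0 threshold and the 0.25s/1.5s windows are the trial-and-error values discussed below:

```swift
import Foundation
import GameController

// Dates tracked as properties so they survive between handler calls.
var shakeStartDate: Date?
var lastStrongShakeDate: Date?

func observeShakes(on motion: GCMotion) {
    motion.valueChangedHandler = { [weak self] motion in
        guard let strongSelf = self else { return }
        let a = motion.userAcceleration
        let totalAcceleration = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        let now = Date()

        if totalAcceleration > 3.0 {
            // A vigorous-enough reading: note it, and start the shake
            // timer if one isn't already running.
            strongSelf.lastStrongShakeDate = now
            if strongSelf.shakeStartDate == nil {
                strongSelf.shakeStartDate = now
            }
        }

        guard let start = strongSelf.shakeStartDate,
              let lastStrong = strongSelf.lastStrongShakeDate else { return }

        if now.timeIntervalSince(lastStrong) > 0.25 {
            // Eased off for too long -- the shake is over, start fresh.
            strongSelf.shakeStartDate = nil
        } else if now.timeIntervalSince(start) > 1.5 {
            // A continuous 1.5 seconds of shaking: wipe the canvas.
            strongSelf.clearDrawing()
            strongSelf.shakeStartDate = nil
        }
    }
}
```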
To translate: if you shake for longer than 1.5 seconds, we clear the drawing view. But the acceleration of a shake can fluctuate, and if at some point during that shaking you were a little less vigorous than the “3” value we are looking for, we want to forgive you. So as long as your acceleration goes back above 3 within 0.25 seconds, we keep the 1.5-second timer going. Otherwise we conclude you stopped shaking.
We have to track those dates outside the closure’s scope to maintain them between each call of the handler. And I chose 3 as the acceleration threshold through trial and error -- watching the readouts as I shook the remote.
What’s your tvOS idea?
Those are some highlights from my tvOS project. Does this help you with any project you’re working on? Let us know!
If you’re interested in hearing more about Doodaloo, tweet at me (@thepaulrolfe) or email me (firstname.lastname@example.org). I can always go into more depth on things we encountered using tvOS.