iOS6 Maps: It’s the Siri, Stupid.

Commentators lauding and panning iOS6 maps are focusing on the core maps experience – for good reason – but it’s the Siri integration that’s worth paying the most attention to.

The core maps experience scales with money and time – and Apple has plenty of both. More planes will fly, more servers will crunch data, updates will be deployed faster to the web service. Nothing a billion dollars and a couple of years can’t fix.

Siri as your Navigator

It’s the Siri experience that I’m most psyched about. It is, after all, something I predicted in November 2011 – in a post aptly titled ‘Siri-based map navigation is coming soon’. It only took a year!

It’s a verbose post, but it builds the case for Siri as navigator from a simple premise:

Typical car travel is a two-person activity – one to steer, and another to navigate. Sure, you can make do with just one person fumbling with a GPS device while trying to drive. But imagine a future where Siri becomes that navigator – skillful, omniscient, helpful and entirely hands-off.

The future is, of course, here. It’s a testament to Apple’s genius that when the future arrives, it hits you with a ‘well, duh’ realization.

My personal experience with Siri as Navigator

This morning, when I left home, I asked Siri: “Take me to work”. And she did. I did not look at the iPhone even once (well, OK, I did, but not to peek at the directions), and Siri even re-routed me when I hit traffic. I also asked her “Are there any gas stations near the route?” and she found some.

Funnily enough, while providing directions to the nearest gas station, one of the options she offered was “Find the next one”. She understands you might just have missed the exit to the first one. That’s smart!

Then I asked her “Is there a Starbucks near my destination?”. Siri couldn’t answer this – but it wasn’t so bad! She said “Sorry, I can’t find places near a business”. In other words, she understood what I was asking, but just didn’t know how to answer yet. Of course, this will change in the future.

And this is where the data comes in

When Apple says they’ll get better with usage – don’t be fooled. The usage won’t actually improve the visual map experience that much – they probably already knew that the Brooklyn Bridge doesn’t look so good. And they’ve already had traffic data pumping in from the previous incarnation for years.

No, this is about Siri. As Apple gets real-world usage of a completely voice-driven application – because it needs to be, and it practically begs to be – they’re developing the deepest, most comprehensive understanding of how travel gets done, not just how maps look and feel.

So this is classic Apple – focus on learning how to make things better, not just copying what already exists.

Roadmap for Siri

Siri, as navigator, of course needs a roadmap. Based on my previous blog post, here’s a suggested roadmap for Siri actions – my humble suggestions for the product manager of this feature at Apple (or for the corresponding person at Google + Google Now, natch!). A toy code sketch of the intents these commands imply follows the list.

  • [done] Spoken directions with Siri’s voice – “Next turn in 300 feet”
  • [done] Routing and rerouting commands, using Contacts information – “Take me home”
  • [done] Gas nearby – “Are there any gas stations near the route”
  • [untested] Adding gas stations as waypoints – “Siri: Added Shell gas station as waypoint”
  • Send message about current route to someone – “Let Abha know when I’ll reach home”
  • Notice unexpected slowdown and proactively suggest rerouting – “Siri: We seem to be stuck in traffic, but I have an alternate routing suggestion for you”
  • Find parking spot at destination – “What are the parking options at the destination?”
  • Nirvana – “Siri, let’s go home, but stop by Pete’s laundry and the Safeway near our home on the way”
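For concreteness, here is that toy sketch in Swift. Every type and function below is hypothetical – nothing here is a real Apple API – and the string matching merely stands in for a trained language model:

```swift
// Hypothetical intent vocabulary implied by the roadmap above.
enum NavigationIntent {
    case route(destination: String)
    case findAlongRoute(category: String)
    case addWaypoint(name: String)
    case notifyContactOnArrival(name: String)
    case findParking
}

// Toy keyword matcher standing in for real natural-language parsing.
func parse(_ utterance: String) -> NavigationIntent? {
    let text = utterance.lowercased()
    if text.hasPrefix("take me to ") {
        return .route(destination: String(text.dropFirst("take me to ".count)))
    }
    if text.contains("gas stations near the route") {
        return .findAlongRoute(category: "gas station")
    }
    if text.contains("parking options") {
        return .findParking
    }
    if text.hasPrefix("let ") && text.contains("know when") {
        let name = text.dropFirst("let ".count).prefix(while: { $0 != " " })
        return .notifyContactOnArrival(name: String(name))
    }
    return nil
}
```

The interesting part is the shape of the intents, not the parsing – a real assistant would replace the prefix checks with a proper language model.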

Now, if only Siri could understand my accent well…

Amit

Coming soon: Siri and Spotlight search for iOS Apps

(Side-note: I’ve been meaning to write this post for about two years now. Every few months I feel I should write it before Apple actually ships such a feature. Finally the post’s done!)

How does iOS reduce its dependence on web search from Google or Bing? Simple: syndicate Spotlight searches to installed apps.

Introducing Spotlight for Apps

Imagine if every iOS app could implement a ‘search’ capability that would be exposed to the iOS search subsystem. This search could be over local data (e.g. Address Book-like applications), over remote data (e.g. Wikipedia), or over both (e.g. Mail).
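To make the idea concrete, here’s a minimal Swift sketch of what such an app-facing search protocol might look like. All the names here are invented for illustration – this is not a real iOS API:

```swift
import Foundation

// One result an app hands back to the system-wide search UI.
struct AppSearchResult {
    let title: String
    let snippet: String
    let deepLink: URL   // tapping the result opens the app right here
}

// The capability an app would adopt to be queried by Spotlight.
// Local-data apps can answer from their own store; remote-data apps
// fetch first and then call the completion handler.
protocol SpotlightSearchable {
    func search(query: String,
                completion: @escaping ([AppSearchResult]) -> Void)
}

// Example: an address-book-like app searching purely local data.
struct ContactsSearcher: SpotlightSearchable {
    let contacts: [String: URL]   // contact name -> deep link into the app

    func search(query: String,
                completion: @escaping ([AppSearchResult]) -> Void) {
        let hits = contacts
            .filter { $0.key.localizedCaseInsensitiveContains(query) }
            .map { AppSearchResult(title: $0.key,
                                   snippet: "Contact",
                                   deepLink: $0.value) }
        completion(hits)
    }
}
```

Spotlight would then fan a query out to every installed app that adopts the protocol and merge the ranked results into one list.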

Your favorite content sources (like Yelp), content aggregators (like Flipboard or the New York Times) and even crowdsourced information aggregators (like, hey, Twitter and Facebook) already reside as apps on your iOS device. Any search you’re interested in doing would hit these apps first – and would probably show the information you care about right at the top.


Siri-based map navigation is coming soon

Typical car travel is a two-person activity – one to steer, and another to navigate. Sure, you can make do with just one person fumbling with a GPS device while trying to drive. But imagine a future where Siri becomes that navigator – skillful, omniscient, helpful and entirely hands-off.


Siri causes rare Apple regression in functionality, but it’s all good, people.

With Siri, Apple has moved the on-device voice recognition engine, called Voice Control, firmly to the cloud. This means that the universal ability Apple used to offer – calling friends and family by talking to the phone, whether you had data connectivity or not – is no longer there.

Is this a bad regression?

On the face of it, this is a tremendous regression – you now have to ‘pay’ to use a capability that was previously ‘free’, either through cellular data usage or through a WiFi connection of some sort.

In addition, even with fast 3G connections, the latency to call someone in the address book is very noticeable now: a round-trip conversation with a cloud-hosted Siri is required before a simple call can be made.

Or a masterstroke to get free ‘Gold Set’ data?


2 quick observations on using Siri with an Indian accent

Understanding non-American accents

Why the focus on ‘Indian’? Well, by far the biggest issue for me was getting Siri to understand my accent. People who know me claim that I don’t have that thick an accent, but that’s not what Siri ‘tells’ me.

On the plus side, Siri is subtly training me to have a more ‘American’ voice. She highlights words she has trouble understanding in blue, so you can quickly figure out which words you’re consistently mispronouncing, or pronouncing with a thick accent. ‘Paragon’ was totally off, for instance!

Your wish is my command – but be QUICK

It’s almost impossible to dictate text messages through Siri – at least for me. I tend to pause just a little too long between sentences, and Siri needs you to know exactly what you want to say before you start dictating.

In fact, I ended up using Voice Memos to record quick notes for myself, when I’d have preferred to dictate them through Siri. On the plus side, you can use the earphone controls to record/pause Voice Memos, for some quick eyes-on-the-road note taking.

Ethnic language models

The language models need more work, and I won’t be surprised if there are people hard at work creating ethnicity-oriented speech models. Guessing the ethnicity shouldn’t be that hard to do, given all the clues Apple has – the name of the person, their entire address book, and recorded voice samples. These samples can be compared to aggregate training data collected from virtually every country on the planet.

The holy grail, of course, would be a model trained on a per-person basis, but I wonder if they have the compute resources and data volume, per person, to make that happen effectively.
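As a toy illustration of the first step, here’s how routing a user to a regional acoustic model might look in code. Every model name and signal below is invented, purely to show the shape of the idea:

```swift
// Hypothetical regional acoustic models; none of these identifiers
// correspond to anything Apple has published.
enum AcousticModel: String {
    case americanEnglish = "en-US"
    case indianEnglish   = "en-IN"
    case britishEnglish  = "en-GB"
}

struct UserSignals {
    let carrierCountry: String?                      // e.g. "IN"
    let recentSampleScores: [AcousticModel: Double]  // how well each model fits recent audio
}

// Prefer direct acoustic evidence; fall back to the carrier country
// when the user hasn't produced enough audio yet.
func selectModel(for user: UserSignals) -> AcousticModel {
    if let best = user.recentSampleScores.max(by: { $0.value < $1.value }) {
        return best.key
    }
    switch user.carrierCountry {
    case "IN": return .indianEnglish
    case "GB": return .britishEnglish
    default:   return .americanEnglish
    }
}
```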

To the future!
Amit

Are you Siri-ous? Siri is just a toy, and that’s OK.

Chris Dixon wrote a great post in Jan '10 called 'The next big thing will start off looking like a toy' (http://cdixon.org/2010/01/03/the…). Siri is a great example of this.

The leading indicator that Siri will be amazingly successful is that people are having fun with it – and it's not breaking.

Let's take the simple use case of 'Hey Siri, tell me a joke.' Consistently answering this with a humorous response requires three things (sketched in code below):

  • Speech recognition (or at least noise-free recording)
  • Cloud-based interpretation and response
  • Believable text-to-speech

(The last one is very important, by the way. Apple has spent decades getting text-to-speech right, complete with stresses, tones and nuances (har har). This is paying rich dividends now.)
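For concreteness, here's a minimal Swift sketch of that three-stage pipeline. The recognizer and cloud-interpretation steps are hypothetical stubs; only the text-to-speech stage uses a real framework class (AVSpeechSynthesizer, from today's AVFoundation):

```swift
import AVFoundation

// Toy model of the three stages behind "tell me a joke".
struct ToyAssistant {
    // Held as a property so the synthesizer isn't deallocated mid-utterance.
    let synthesizer = AVSpeechSynthesizer()

    let recognize: (Data) -> String    // stage 1: audio -> transcript (stub)
    let interpret: (String) -> String  // stage 2: transcript -> witty reply (stub)

    func handle(audio: Data) {
        let transcript = recognize(audio)            // speech recognition
        let response = interpret(transcript)         // cloud interpretation
        synthesizer.speak(AVSpeechUtterance(string: response))  // stage 3: TTS
    }
}
```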

Each time someone demos the 5 funny Siri use-cases to their friends, the system becomes smarter, but more importantly, the technology gets humanized. More people will use it because it's like a toy.

In fact, I don't have an iPhone 4S right now. Guess what – I've started using the clunky Voice Control on my iPhone 3GS these days, because I somehow trust it'll work now. And it does!

Also in Jan '10, Fred Wilson wrote a dictated blog post using an Android phone (http://www.avc.com/a_vc/2010/01/…). I'm sure he, and other long-time Android users, are looking at iPhone users quizzically and wondering what the fuss is. After all, isn't this merely Voice Control plus some Text-to-Speech thrown in?

The difference is this: Android's voice capabilities were a feature; Siri is a toy. And that makes all the difference.

Amit
