iOS6 Maps: It’s the Siri, Stupid.

Commentators lauding and panning iOS6 maps are focussing on the core maps experience—for good reason—but it’s the Siri integration that’s worth paying the most attention to.

The core maps scale with money and time – both of which Apple has ample amounts of. More planes will fly, more servers will crunch data, updates will be deployed faster to the web service. Nothing a billion dollars and a couple of years can’t fix.

Siri as your Navigator

It’s the Siri experience that I’m most psyched about. It is, after all, something I predicted in November 2011 – in a post aptly titled ‘Siri-based navigation is coming soon‘. It only took a year!

It’s a verbose post, but it makes the case for Siri as navigator with a simple premise:

Typical car travel is a two-person activity – one to steer, and another to navigate. Sure, you can make do with just one person fumbling with a GPS device while trying to drive. But imagine a future where Siri becomes that navigator – skillful, omniscient, helpful and entirely hands-off.

The future is, of course, here. It’s a testament to Apple’s genius that when the future arrives, it hits you with a ‘well, duh‘ realization.

My personal experience with Siri as Navigator

This morning, when I left home, I asked Siri: “Take me to work“. And she did. I did not look at the iPhone even once (well OK, I did, but not to peek at the directions), and Siri even re-routed me when I hit traffic. I also asked her “Are there any gas stations near the route” and she found some.

Funnily enough, when providing directions to the next gas station, one of the options she offered was “Find the next one“. She understands you might just have missed the exit to the first one. That’s smart!

Then I asked her “Is there a Starbucks near my destination?“. Siri couldn’t answer this – but it wasn’t so bad! She said “Sorry, I can’t find places near a business“. In other words, she understood what I was asking, but just didn’t know how to answer just yet. Of course, this will change in the future.

And this is where the data comes in

When Apple says they’ll get better with usage – don’t be fooled. The usage won’t actually improve the visual map experience that much – they probably already knew that the Brooklyn Bridge doesn’t look so good. And they’ve already had traffic data pumping in from the previous incarnation for years.

No, this is about Siri. As Apple gets real-world usage of a completely voice-driven application – because navigation needs to be, and practically begs to be, voice-driven – they’re developing the deepest, most comprehensive understanding of how travel gets done, not just how maps look and feel.

So this is classic Apple – focus on learning how to make things better, not just copying what already exists.

Roadmap for Siri

Siri, as navigator, of course needs a roadmap. Based on my previous blog post, here’s a suggested roadmap for Siri actions – my humble suggestions for the product manager of this feature at Apple (or for the corresponding person at Google + Google Now, natch!).

  • [done] Spoken directions with Siri’s voice – “Next turn in 300 feet”
  • [done] Routing and rerouting commands, using Contacts information – “Take me home”
  • [done] Gas nearby – “Are there any gas stations near the route”
  • [untested] Adding gas stations as waypoints – “Siri: Added Shell gas station as waypoint”
  • Send message about current route to someone – “Let Abha know when I’ll reach home”
  • Notice unexpected slowdown and proactively suggest rerouting – “Siri: We seem to be stuck in traffic, but I have an alternate routing suggestion for you”
  • Find parking spot at destination – “What are the parking options at the destination?”
  • Nirvana – “Siri, let’s go home, but stop by Pete’s laundry and the Safeway near our home on the way”
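As a thought experiment, here’s how the spoken commands above might map to structured navigation intents. This is a hypothetical Python sketch – the intent names and the naive keyword matching are my own invention, not anything Apple has shipped:

```python
# Hypothetical sketch: mapping spoken commands to navigation intents.
# All intent names and the keyword heuristics below are invented for
# illustration only.
def parse_nav_command(utterance):
    text = utterance.lower().strip()
    if text.startswith("take me "):
        # "Take me home" / "Take me to work"
        dest = text[len("take me "):]
        if dest.startswith("to "):
            dest = dest[len("to "):]
        return ("route", dest)
    if "gas station" in text:
        # "Are there any gas stations near the route"
        return ("find_near_route", "gas station")
    if text.startswith("let ") and "know" in text:
        # "Let Abha know when I'll reach home"
        recipient = text.split()[1]
        return ("share_eta", recipient)
    return ("unknown", text)
```

The interesting work, of course, is everything this sketch skips: real language understanding, the routing engine, and knowing when to proactively speak up.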

Now, if only Siri could understand my accent well…



32-year-old video of Steve Jobs, underscoring that Apple’s priorities lie in software

Even 3 decades ago, Steve Jobs had a clear vision of software‘s role in making computing personal. In this clip, dated a mere 4 years from the founding of Apple, he describes how he wants to use all this new hardware computing power to make the 1-on-1 interaction with a computer go smoother.

(From a rare clip contributed to the Computer History Museum by Regis McKenna)

Skip to 12:17 where he talks about this, specifically saying:

…we’re gonna start chewing up power specifically to help that 1-on-1 interaction go smoother – and specifically not, to actually do the number crunching and database management and word processing, we’re gonna actually start applying a lot of that power specifically to help us remove that barrier…

Of course, as with all great founders, he was quite optimistic about how soon this would happen:

…it looks like the timing is just right for that to occur. So hopefully, when we have our international Applecore meeting, the 3rd or 4th one from now, we’ll all be able to talk about how we’ve solved that problem, because I really think it’s gonna happen…

It’s incredible how well the products Apple produced in the last 3 decades, under Steve Jobs, hold up against this articulation of Apple’s software strategy – and it explains the inordinate level of effort expended in getting the user experience right. Apple products have a better user experience than those of other companies building similar products – not because Apple has incredible designers, but because those designers are building to solve a different, more aspirational, more human goal.

John Gruber’s recent post on Isaacson completely missing this centrality of software in the Steve Jobs narrative reminded me of this clip. People who marvel at how iPhone is truly a software-first device shouldn’t be surprised – Steve’s Apple has always been a software company first; they don’t make hardware because the margins are high, but because they need a certain kind of hardware to make the software vision a reality.


iPad + Bluetooth keyboard aggravations – feature requests for the Apple team

Yet another flight to India (we have a new team in Bangalore), yet another chance to try the iPad/Keyboard solution I’ve written about.  So far, it’s working out great!

This is quickly going from hypothesis to prototype to full-blown implementation, but there are some rough edges that Apple could help smoothen out.

Navigation aggravation

Most of these annoyances relate to navigating the iPad via the bluetooth keyboard. The Cmd key doesn’t really exist on the iPad, and neither do most of the shortcuts tied to it. Specifically, if the following keystrokes existed in the Mail app, I’d be a happy panda:

Continue reading

Samsung to announce SwipeIt, an Apple AirPlay competitor, at CES 2012

On January 9, 2012, Samsung will announce their Apple AirPlay competitor – the Samsung SwipeIt – at CES. SwipeIt works exactly like Apple’s AirPlay. Watching a video on your smartphone and want to see it on the big screen? Just click a button and the video starts playing on your TV.

Samsung started pushing this application out to their internet-connected TVs on Jan 1, and increased the intensity this weekend, presumably to reach full coverage by the time CES hits Monday.

Samsung's SwipeIt application is being pushed to all 2011 Samsung Smart TVs

Continue reading

Coming soon: Siri and Spotlight search for iOS Apps

(Side-note: I’ve been interested in writing this post for about two years now. Every few months I feel I should write it before Apple actually ships such a feature. Finally the post’s done!)

How does iOS reduce its dependence on web search from Google or Bing? Simple: syndicate Spotlight searches to installed apps.

Introducing Spotlight for Apps

Imagine if every iOS app could implement a ‘search’ capability that would be exposed to the iOS search subsystem. This search could be over local data, e.g. Address Book-like applications; over remote data, e.g. Wikipedia; or over either, e.g. Mail.

Your favorite content sources (like Yelp), content aggregators (like Flipboard or New York Times) and even crowdsourced information aggregators (like, hey, Twitter and Facebook) already reside as apps on your iOS device. Any search you’re interested in doing would hit these apps first – and would probably show the information you care for right at the top.
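To make the idea concrete, here’s a minimal Python sketch of what such a federated search might look like. Every name in it (AppSearchSource, federated_search, and so on) is hypothetical – this is not a real iOS API, just an illustration of the shape of the idea:

```python
# Hypothetical sketch of "Spotlight for Apps": the OS fans a query
# out to every installed app and merges the results.
from dataclasses import dataclass

@dataclass
class SearchResult:
    app: str
    title: str
    relevance: float

class AppSearchSource:
    """Each installed app answers queries over its own data,
    whether that data is local, remote, or both."""
    app = "unknown"
    def search(self, query):
        raise NotImplementedError

class ContactsSource(AppSearchSource):
    """Toy source over local data, standing in for an Address
    Book-like app."""
    app = "Contacts"
    entries = ["Abha", "Amit", "Steve"]
    def search(self, query):
        q = query.lower()
        return [SearchResult(self.app, e, 1.0)
                for e in self.entries if q in e.lower()]

def federated_search(query, sources):
    """The OS-level search subsystem: query every registered app,
    then merge results by relevance."""
    hits = [r for s in sources for r in s.search(query)]
    return sorted(hits, key=lambda r: r.relevance, reverse=True)
```

The hard problems live in the merge step – ranking results from Yelp against results from Mail – but the app-facing interface could really be this small.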

Continue reading

Siri-based map navigation is coming soon

Typical car travel is a two-person activity – one to steer, and another to navigate. Sure, you can make do with just one person fumbling with a GPS device while trying to drive. But imagine a future where Siri becomes that navigator – skillful, omniscient, helpful and entirely hands-off.

Continue reading

Siri causes rare Apple regression in functionality, but it’s all good, people.

With Siri, Apple has moved the on-device voice recognition engine, called Voice Control, firmly to the cloud. This means that the universal ability Apple used to have – to call friends and family by talking to the phone, whether you had data connectivity or not – is no longer there.

Is this a bad regression?

On the face of it, this is a tremendous regression – you now have to ‘pay’ to use a capability that was previously ‘free’ – either through the use of data minutes, or through a WiFi connection of sorts.

In addition, even with fast 3G connections, the latency to call someone in the addressbook is very noticeable now: a round-trip conversation with a cloud-hosted Siri is required before a simple call can be made.

Or a masterstroke to get free ‘Gold Set’ data?

Continue reading

2 quick observations on using Siri with an Indian accent

Understanding non-American accents

Why the focus on ‘Indian’? Well, by far the biggest issue for me was in getting Siri to understand my accent. People who know me claim that I don’t have that thick an accent, but that’s not what Siri ‘tells’ me.

On the plus side, Siri is subtly training me to have a more ‘American’ voice. She highlights words she has trouble understanding in blue, so you can quickly figure out which words you’re consistently mispronouncing, or pronouncing with a thick accent. ‘Paragon’ was totally off, for instance!

Your wish is my command – but be QUICK

It’s almost impossible to dictate text messages through Siri – at least for me. I tend to pause just a little too long between sentences, and Siri needs you to know exactly what you want to say before you start dictating.

In fact, I ended up using Voice Notes to record quick notes for myself, when I’d have preferred to dictate them through Siri. On the plus side, you can use earphone controls to record/pause Voice Notes, for some quick eyes-on-the-road note taking.

Ethnic language models

The language models need more work, and I won’t be surprised if there are people hard at work creating ethnicity-oriented speech models. Guessing the ethnicity shouldn’t be that hard to do, given all the clues Apple has – the name of the person, their entire addressbook, and recorded voice samples. These samples can be compared to aggregate training data collected from virtually every country on the planet.

The holy grail, of course, would be a trained model on a per-person basis, but I wonder if they have the compute resources and data volume, on a per-person basis, to make that happen effectively.

To the future!

Are you Siri-ous? Siri is just a toy, and that’s OK.

Chris Dixon wrote a great post in Jan '10 called 'The next big thing will start off looking like a toy' (…). Siri is a great example of this.

The leading indicator that Siri will be amazingly successful is that people are having fun with it – and it's not breaking.

Let's take the simple use case of 'Hey Siri, tell me a joke.' Consistently answering this with a humorous response requires:

  • Speech recognition (or at least noise-free recording)
  • Cloud-based interpretation and response
  • Believable text-to-speech

(The last one is very important, by the way. Apple has spent decades getting text-to-speech right, complete with stresses, tones and nuances (har har). This is paying rich dividends now)
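Chained together, the three stages look something like this Python sketch, where every function is a trivial stand-in for the real (and vastly harder) component:

```python
# Hypothetical sketch of the three-stage Siri pipeline. Each stage
# here is a toy stand-in: real speech recognition, cloud
# interpretation and text-to-speech are obviously far more involved.
def recognize(audio):
    """Stand-in for speech recognition: pretend the captured audio
    is already a clean transcript."""
    return audio

def interpret(transcript):
    """Stand-in for the cloud-hosted interpretation and response."""
    if "joke" in transcript.lower():
        return "Two iPhones walk into a bar..."
    return "I don't understand."

def speak(response):
    """Stand-in for text-to-speech: tag the text rather than
    synthesize audio."""
    return "[spoken] " + response

def siri(audio):
    # The whole experience is just these three stages composed,
    # which is why each one has to work consistently.
    return speak(interpret(recognize(audio)))
```

The point of the composition is the one the post makes: the toy use case only feels magical when all three stages work, every time.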

Each time someone demoes the 5 funny Siri use-cases to their friends, the system becomes smarter, but more importantly, the technology gets humanized. More people will use it because it's like a toy.

In fact, I don't have an iPhone 4S right now. Guess what – I've started using the clunky Voice Control on my iPhone 3GS these days, because I somehow trust it'll work now. And it does!

Also in Jan '10, Fred Wilson wrote a dictated blog post using an Android phone (…). I'm sure he, and other long-time Android users all over, are looking at iPhone users quizzically, wondering what the fuss is. After all, isn't this merely Voice Control plus some text-to-speech thrown in?

The difference is – Android's voice capabilities were a feature, Siri is a toy. And that makes all the difference.


Post on Quora

S is for ‘Second Wave’. or, Early adopters don’t buy S-series iPhones

There are two types of people. Those that buy Model Number iPhones, and those that buy S-series iPhones.

This being Apple, it's easy to attribute to stratagem what's probably just a coincidence – but it's quite convenient how this has worked out in recent years. A new iPhone model gets launched, and the early adopters switch to it, come hell or high water (read: angry wife or contract breakage fees).

The late majority, however, choose to wait and see. Next year, the S model gets launched, and it's time for them to move.

Conveniently, typical cellphone contracts last 2 years – a perfect cycle for both the tame early adopters and the late majority.

It couldn't have been designed better if it had been Designed by Apple in California. Oh wait…

Thanks to Kent Brewster for making this idle thought post-worthy.


Post on Quora

Steve Jobs and me – a shared birthday and then some.

This isn’t a unique distinction – roughly 0.25% of the world’s population shares a birthday with Steve Jobs – but one of them is me.


Where I grew up in India, no Apple products were to be found – at all. And yet, I was an avid reader of MacWorld. The heroes of the industry were IBM and Microsoft – and yet, I was an Apple fanatic. Or a NeXT fanatic – frankly, whatever Steve was backing at any given time.

It’s hard to believe that the first time I touched a real Apple computer was in 1998, when I came to USC to study for my Masters. It was the feeling you get when you discover the love you knew existed – you just hadn’t met them yet. So it was for me with Apple, and with Abha, both relationships I developed while at USC.

I still remember going to the Palo Alto Apple Store in 2001 to check out the Titanium PowerBook G4 that had been released then. I asked how I could upgrade the memory – and the Geniuses just lifted the tabs on the keyboard to expose the SDRAM slots. I opened up a Terminal and tried out a few Unix commands – and it was like I was hacking on the Linux box at home. What a fantastic marriage of technology and design! I left that day with my first Apple product under my arm.

2001 was still early days for the new breed of the Mac faithful. I had one of the only two Apple laptops at Inktomi, in a sea of thousands of Windows machines. It’s hard to believe that today – when it seems in Silicon Valley, only people from other parts of the world carry Windows laptops.

In fact, when I was returning from India recently, with stopovers at multiple airports, I could estimate how far I was from Silicon Valley by measuring the density of Apple laptops at each airport.

The nearest I came to Steve was in 2005, when I was at the Apple campus, on the floor where all the executives sat. I was being interviewed by Bertrand Serlet for a position managing the Mac OS X productivity apps, but I chose to join Yahoo! instead. I have often wondered if I made the right decision – today more than ever.

Like so many others, I count Steve Jobs as the inspiration for my professional journey. It’s only now settling in that I will never, ever be in the same room as my childhood hero.

On the personal front, my mother passed away a few weeks back – she was 57. My brother wrote a touching eulogy for her – better than I could ever have. Now Steve – my professional inspiration – has also moved on, at the age of 56. These events put one’s life in perspective – I could have fewer years left on the planet than I’ve lived so far – it’s time to make them count.



Prediction for the Apple iPhone event on October 4: an iPhone 5 on Sprint

The technorati is abuzz with the (lowered) expectation that there will not be an iPhone 5 released tomorrow – and that only an iPhone 4S will be released.

This is hogwash. Every release cycle, without fail, Apple resets the expectations to be lower than what they’ll announce, and the press eats it up readily. In fact, that’s so well-established these days, that the mere fact that the press seems to be lowering expectations makes me even more confident that we’ll see an iPhone 5 tomorrow.

Given the way T-Mobile and Sprint are behaving, it’s also clear that Sprint is getting this new model.

Can’t wait!


The User Experience of the upcoming Apple iCloudBook

Technology and trends are falling into place for Apple’s upcoming iCloudBook – a revolutionary laptop where the primary data storage is all in iCloud – i.e. in the iCloudDrive – with the local hard drive just being a ‘working copy’.

In a previous post, I described new technologies Apple has introduced in Mac OS X Lion that portend the Apple Cloud Laptop – or what Om dubbed the iCloud Laptop.

In this post, I’d like to describe what the user experience for what I now call the iCloudBook might look like.
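To illustrate the ‘working copy’ idea, here’s a hypothetical Python sketch in which the cloud copy is authoritative and the local disk is just a write-through cache that can be dropped at any time. All the names are invented for illustration – this says nothing about how Apple would actually build it:

```python
# Hypothetical sketch of the iCloudBook storage model: the cloud
# store is the source of truth, the local disk is a disposable
# write-through cache. Names invented for illustration.
class CloudDrive:
    def __init__(self):
        self._cloud = {}   # authoritative store (path -> contents)
        self._local = {}   # local 'working copy' cache

    def write(self, path, contents):
        """Writes land in the working copy, then sync up. Here the
        upload is immediate rather than a background task."""
        self._local[path] = contents
        self._cloud[path] = contents

    def read(self, path):
        """Reads hit the working copy; on a miss, fault the file in
        from the cloud."""
        if path in self._local:
            return self._local[path]
        if path in self._cloud:
            self._local[path] = self._cloud[path]
            return self._local[path]
        return None

    def evict_local(self):
        """Dropping the local cache loses nothing - the cloud copy
        survives."""
        self._local.clear()
```

The interesting user-experience consequence is the eviction step: if the local drive is only ever a cache, losing the laptop (or swapping its disk) stops being a data-loss event.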

Continue reading

The Blueprint for the Apple Cloud Laptop (aka @siracusa is smarter than you)

So you think John Siracusa is brilliant. You read him assiduously (well, at least once every 18 months anyway). You don’t know the half of it! In his brilliant Ars Technica review of Lion, in one little throwaway link labeled ‘hmmm’ (here), John masterfully hinted at an Apple Cloud Laptop in the offing.

I think he’s on the money. Lion has introduced a set of OS-level technologies and changes that can finally make the ‘Apple Cloud Drive’ a reality – and with that, the Apple Cloud Laptop.

Here’s my analysis of his analysis. Let’s go.

It starts with File Revisioning

Continue reading

Quick hits on the Apple iPad – in bite-sized morsels, even!

Much has been said already, so here are a few non-obvious things from yesterday’s Apple iPad announcement.

Underappreciated and revolutionary:
  • OS/AppKit APIs, Controls and Design guidance on building touch Apps – even ones as complex as iWork. (e.g. reimagined iTunes volume slider, manipulation of the pie chart)
  • Related to that, reducing the number of on-screen toolbar/menu controls even further (compared to the Mac version of an app, say iWork)
  • Retiring the mouse as an input device, even as an option. Firmly embracing touch as a primary input mechanism.
  • Transitioning the keyboard from a primary input device to an accessory.
  • Treating 3G as an option, a different approach than any other device in the market (including the iPhone or iPod Touch).
  • Making the file system disappear – embracing the app-centric iPhone model, versus the Finder-centric mac.
  • Transitioning even core apps like iWork to the cloud, in terms of pricing, delivery, and presumably file storage.
  • Related to that, rethinking the bundling strategy (iWork) and focussing on individual apps.
  • Acknowledging it’s not sufficient to control the hardware packaging – core components like the CPU/GPU need to be owned too.
Unnoticed but remarkable:
  • Jonathan Ive now wants to be called Jony Ive
  • Reinterpretation of TV viewing – TV feeds as underlays (viz MLB) vs internet as overlay (viz Yahoo! Widgets)
  • New Keynote transitions! The ‘force land’, ‘letters falling off’ and ‘shatter’ effects seem new!

– Amit