Reimagining the Shopping Filter UX

Ashwin Dinesh
YML Innovation Lab
2 min read · Aug 25, 2016

Applying filters to refine your mobile shopping experience can be a painful task. You face multiple drop-downs, checkboxes, sliders, and other confusing UI elements crammed into a small screen. Even the most simplified filter interface can intimidate an average user, and there is a learning curve to it.

We at Y Media Labs Innovation thought about a concept to improve the filter UX for mobile shopping apps. Although simplifying filters is not easy, we conceptualized a different approach to the problem: use speech and AI.

In this concept, users simply tap the mic button and talk to the app in natural language; the app identifies the filter keywords and applies them to the search results, removing many unwanted interaction steps from the flow. For example, the voice input “slim fit jeans.. blue color, 36 inches and from gap” is processed, and keywords like “fit”, “size”, “color”, and “brand” are identified by AI to filter the search results. Users can clear one set of filters and apply a completely different one with a single tap; in the conventional flow, they would have to reset the current filters and tap through the entire set again.
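To make the idea concrete, here is a minimal sketch of how a transcript could be mapped to filter key/value pairs. The vocabularies and the `extractFilters` function are hypothetical stand-ins for illustration; a production system would use a real NLP model rather than keyword lists.

```swift
import Foundation

// Hypothetical sketch: map a voice transcript to filter key/value pairs
// using simple keyword matching (a real system would use an NLP model).
func extractFilters(from transcript: String) -> [String: String] {
    let words = transcript.lowercased()
    var filters: [String: String] = [:]

    // Illustrative vocabularies, not an actual product catalog.
    let fits = ["slim fit", "regular fit", "skinny fit"]
    let colors = ["blue", "black", "red", "white"]
    let brands = ["gap", "levi's", "wrangler"]

    for fit in fits where words.contains(fit) { filters["fit"] = fit }
    for color in colors where words.contains(color) { filters["color"] = color }
    for brand in brands where words.contains(brand) { filters["brand"] = brand }

    // Size: grab the first number followed by "inches" (a rough heuristic).
    if let range = words.range(of: #"\d+\s*inches"#, options: .regularExpression) {
        let match = String(words[range])
        filters["size"] = match
            .components(separatedBy: CharacterSet.decimalDigits.inverted)
            .joined()
    }
    return filters
}
```

With the transcript from the example above, this sketch would yield `fit: slim fit`, `color: blue`, `size: 36`, and `brand: gap`, which the app can then apply to the search results in one step.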

We built the concept using the Speech Framework in iOS 10; you can check it out here.
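For readers curious about the wiring, below is a rough sketch of how the iOS 10 Speech Framework can stream microphone audio into a live transcript. The class names (`SFSpeechRecognizer`, `SFSpeechAudioBufferRecognitionRequest`, `AVAudioEngine`) are Apple's APIs; the surrounding `VoiceFilterInput` class and callback shape are illustrative, not our actual implementation.

```swift
import Speech
import AVFoundation

// Illustrative sketch: stream microphone audio into SFSpeechRecognizer and
// hand the live transcript to a caller-supplied closure.
final class VoiceFilterInput {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onTranscript: @escaping (String) -> Void) {
        SFSpeechRecognizer.requestAuthorization { status in
            guard status == .authorized else { return }
            self.beginRecognition(onTranscript: onTranscript)
        }
    }

    private func beginRecognition(onTranscript: @escaping (String) -> Void) {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true  // live, word-by-word updates
        self.request = request

        // Tap the microphone and feed buffers to the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try? audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                // e.g. "slim fit jeans blue color 36 inches from gap"
                onTranscript(result.bestTranscription.formattedString)
            }
        }
    }
}
```

Each partial transcript can then be run through the keyword-extraction step, so filters appear on screen as the user speaks.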

Developed at Innovation Labs @ Y Media Labs
