Published in MLearning.ai
The importance of UX in AI

[Photo: a black and white photo of a robot’s back, by Jesse Chan on Unsplash]
  • Your smartphone OS automatically adjusting your schedule when a timezone change happens
  • Siri giving you the answer to 13 * 476
  • Amazon factory robots
  • …and everything in between that can learn, or even “seem to learn,” and respond based on that learning.

The current state of AI

AI has been all the rage these past couple of years, and for good reason. The technologies behind AI have progressed exponentially (as with most technologies), and its applications have grown not just vertically but across the board. It is now commonplace to see or hear AI in its many forms being used in the financial and health industries. And watching Boston Dynamics’ robots work through obstacles in popular media gives us the feeling of “we’re almost there.” AI is becoming as ubiquitous as the internet itself, constantly intertwined with the apps and products we use, whether we notice it or not. Most of us even carry an AI in our pockets at all times.

However, that is not the whole state of AI. How do we, the end-users, perceive our experiences with it? Personally, I can say that yes, I am impressed with some of my experiences with AI. But if I compare those experiences to what the technology can actually do, given its exponential growth, why does the experience seem to lag behind? Worse, at times it not only fails to serve its purpose, it degrades the usefulness of the product (e.g. badly implemented chatbots).

UX can help

It should be expected that the companies and teams working on these products are at the top of their game, so to speak: using proven methodologies and processes to build their tech, with substantial resources put into usability research. What I believe would help more is making smaller, leaner iterations when releasing the tech, and proving there is actual value for its intended users at each step. This is where the Lean UX methodology comes in.

Going back to my Siri example: what if Apple had released it in a leaner form, with a scoped-down set of features accompanied by a limited press release describing what it can do? Say Apple had simply announced a voice assistant that can run a web search by voice and read back the top three results. That’s it. That is a small scope, with a small set of intended user scenarios, accompanied by grounded expectations of what it can do. It is small enough to invest in, big enough to demonstrate value and work out the kinks, and useful enough to a set of people. The result is a version the team can build on top of, pointed in the right direction.
