Iterative Interactions: Conducting Tests with Manufacturing Stakeholders

Justine Chou
Team ARM Institute || MHCI Capstone
4 min read · Jun 12, 2024

This sprint, we’ve been iterating on our prototypes, testing with manufacturing stakeholders, and working together to map out complete experiences that make our ecosystem feel more cohesive and connected.

Prototype Updates

Physical Learning Experience — Cobot

For our cobot, we built a low-fidelity user flow to flesh out the interactions and the level of information we wanted to present to the user. This covered the different parts of the robot arm, applications in which the arm could be used, and a task/demonstration of how the robot moves and acts. Our goal is to demystify what robots are and get users more comfortable with using one at a high level. To aid us in building out our prototype, we gained access to the AI Maker Space at CMU’s Tepper School of Business to use their Kinova arm and learn its interactions and controls.

ARM also gave us access to their Yaskawa robot arm at Mill 19, which shares many similarities with the Kinova arm but has a larger stature and a tablet interface instead of a video game controller. Our client walked us through its controls, and going forward we’ll be figuring out how to include this robot in our experience while understanding its technical capabilities and limitations.

Portable Conversation Facilitator — Cards:

For our cards prototype, we are determining how much information to give and how to present it. Options range from short tags to longer paragraphs, as well as QR codes linking to visualizations of how the robots are used.

For our testing, we gave users scenarios in which they acted as facility managers deciding what types of robots were needed to complete a task. This way, we could see how they digested the information, whether they scanned the QR codes, and how they preferred the information to be presented. Going forward, we want to design additional tasks users can complete with these cards and find other ways the cards can be used.

On Demand Learning — Chatbot:

For the chatbot, we’re using a mix of Voiceflow and Figma to determine the interactions, how much information is given in the answers, and what types of questions users would potentially ask. We plan for this chatbot to answer not only questions about robotics, but also questions about ARM’s services, their current and previous projects and members, and how to get in touch with ARM to discuss robotics integration projects. Going forward, we are looking into ways this chatbot can be trained on ARM’s datasets to provide accurate and useful information.
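One common way to ground a chatbot in an organization's own documents is to retrieve the most relevant snippet for each question and answer from it. The sketch below illustrates that retrieval step with a toy keyword-overlap scorer; the snippets, function names, and scoring are illustrative stand-ins, not ARM's actual data or our implementation.

```python
import re

def tokenize(text):
    # Lowercase and split on non-alphanumerics so punctuation doesn't block matches.
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question, snippets):
    """Return the snippet sharing the most words with the question."""
    q = tokenize(question)
    return max(snippets, key=lambda s: len(q & tokenize(s)))

# Hypothetical knowledge snippets standing in for an organization's dataset.
snippets = [
    "ARM offers robotics integration consulting for manufacturers.",
    "A cobot is a collaborative robot designed to work safely near people.",
    "Mill 19 houses demonstration facilities in Pittsburgh.",
]

best = retrieve("What is a cobot?", snippets)
print(best)  # the cobot definition snippet wins on word overlap
```

A production system would replace the keyword overlap with semantic embeddings, but the shape of the pipeline, retrieve first, then answer from the retrieved text, stays the same.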

ARM and the capstone team at SME

Testing at SME

The team went to the Smart Manufacturing Experience (SME) conference in Downtown Pittsburgh, which featured a grand expo hall of companies providing manufacturing solutions, along with talks on the future of manufacturing as it moves toward Industry 4.0. Aside from the bags of free swag, we were able to conduct several intercept usability tests with attendees, including manufacturing engineers, operations managers, and analytics and business development directors, each with varying levels of interaction with our target users of facility managers and operators. We derived several insights from their comments and interactions, and quantitatively measured their level of robotics knowledge before and after the test sessions.

In the next sprint, we will be using conference feedback, as well as critique from our faculty and classmates, to iterate on our prototypes. Our team will be attending ARM’s upcoming open house, where we will have a chance to test higher-fidelity versions of our prototypes with more of our target users.
