User Interface/ Interaction Design Simplified (1/n)

Ronit Malhotra
25 min read · Feb 14, 2024


What is Interaction design?

- Designing interactive products to support people in their everyday and working lives.
- The design of spaces for human communication and interaction.

Goals of interaction design

- Develop usable products.
- Involve users in the design process.

Good & Bad design

User Interface (UI) Design and User Experience (UX) Design are crucial aspects of creating software, websites, and mobile applications that are both functional and enjoyable for users. While UI focuses on the visual design elements (colors, typography, and layout), UX emphasizes the overall feel and experience of using a product. Let’s break down what constitutes good and bad design in both areas, with simple examples to illustrate these concepts.

Good UI Design

**Clarity:** A well-designed interface is clear and intuitive. For example, a website with a clean layout, easily readable fonts, and straightforward navigation helps users find what they need without confusion.

**Consistency:** Consistency in design elements such as buttons, fonts, and color schemes makes the interface easier to use. For example, if all buttons that perform similar actions are the same color and shape, users will quickly learn how to interact with the site or app.

**Feedback:** Good UI provides immediate feedback. For example, when you fill out a form and hit submit, a message should appear telling you whether the action was successful or if there were errors.
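
To make this concrete, here is a minimal TypeScript sketch of a submit handler that always reports the outcome back to the user. The `/api/subscribe` endpoint and the `status` element are hypothetical stand-ins for whatever your form actually talks to.

```typescript
// Minimal sketch of submit feedback; the endpoint and element id are hypothetical.
async function handleSubmit(form: HTMLFormElement): Promise<void> {
  const status = document.getElementById("status"); // element that displays feedback
  if (!status) return;

  status.textContent = "Submitting…"; // immediate acknowledgement of the click
  try {
    const response = await fetch("/api/subscribe", {
      method: "POST",
      body: new FormData(form),
    });
    // Tell the user explicitly whether the action succeeded or failed.
    status.textContent = response.ok
      ? "Thanks! Your form was submitted."
      : `Something went wrong (HTTP ${response.status}). Please try again.`;
  } catch {
    status.textContent = "Network error. Please check your connection and retry.";
  }
}
```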

Bad UI Design

**Overload:** A cluttered interface with too much information, too many colors, or an excess of buttons can overwhelm users. For instance, a webpage crammed with ads, text, and images might make it hard to focus on the main content.

**Inconsistency:** Inconsistent design leads to confusion. If every page of an app uses different colors, fonts, and button styles, users will struggle to understand how to navigate and interact with the product effectively.

**Lack of feedback:** When users take an action and receive no indication of whether it was successful, it leads to frustration. For example, clicking a button that appears to do nothing can make users wonder if the button worked at all.

Good UX Design

**Efficiency:** Good UX design allows users to achieve their goals quickly and easily. For example, an e-commerce app that lets users filter products effectively and check out with just a few taps provides a satisfying experience.

**Accessibility:** A product designed with all users in mind, including those with disabilities, ensures a wider audience can use it. For instance, adding alternative text for images helps visually impaired users understand the content.

**Engagement:** Engaging UX design keeps users interested. An app that uses gamification to encourage learning or completing tasks can make the experience fun and rewarding.

Bad UX Design

**Complexity:** If using a product feels like a chore due to complex navigation or obscure features, it’s a sign of bad UX. For instance, a mobile app that requires too many steps to perform a simple action, like booking a ticket, can deter users.

**Ignoring User Feedback:** Not incorporating user feedback into design improvements can lead to a product that doesn’t meet needs or solve problems effectively. An app with a high number of user complaints about a specific feature that remains unchanged is an example of this.

**Unnecessary Features:** Overloading a product with features that most users don’t need or want can complicate the experience. For example, a writing app with too many distracting tools can hinder the main task of writing.

In summary, good UI/UX design is about creating an aesthetically pleasing interface that is easy and enjoyable to use, while bad UI/UX design complicates interaction and detracts from the user experience. The goal is always to keep the user’s needs at the forefront, ensuring that the design is accessible, intuitive, and engaging.

Different Usability goals

Usability goals are benchmarks that guide the design of user interfaces (UI) and user experiences (UX) to ensure that a product is not only functional but also user-friendly. These goals help in creating products that people can use efficiently, effectively, and with satisfaction. Let’s explore each of these goals with simple examples:

1. Effective to Use

**Goal:** Users can achieve their desired outcomes by using the product.

**Example:** A navigation app that quickly provides accurate, easy-to-follow directions from point A to point B. Users can reach their destinations without getting lost, demonstrating the app’s effectiveness.

2. Efficient to Use

**Goal:** Users can achieve their goals with minimal effort and time.

**Example:** An e-commerce website allows users to reorder previously purchased items from their order history, saving time and effort compared to searching for the items again.

3. Safe to Use

**Goal:** Protects users from making dangerous errors and minimizes the consequences of mistakes.

**Example:** A word processing software that automatically saves documents at regular intervals ensures that users don’t lose their work if they forget to save manually or if the program closes unexpectedly.
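
Here is a rough TypeScript sketch of the auto-save idea. The 30-second interval and the in-memory `drafts` array are illustrative assumptions, not how any particular word processor actually stores documents.

```typescript
// Hypothetical sketch: periodically persist the user's work so a crash loses very little.
type Draft = { content: string; savedAt: Date };

let currentText = "";       // whatever the user has typed so far
const drafts: Draft[] = []; // stand-in for real persistent storage

function saveDraft(): void {
  drafts.push({ content: currentText, savedAt: new Date() });
  console.log(`Auto-saved at ${new Date().toISOString()}`);
}

// Save every 30 seconds; the interval length is an assumption for illustration.
const AUTO_SAVE_MS = 30_000;
setInterval(saveDraft, AUTO_SAVE_MS);
```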

4. Have Good Utility

**Goal:** Provides the features and functions that users need.

**Example:** A photo editing app with a comprehensive set of editing tools, ranging from simple cropping and filtering to advanced color correction and layering, caters to a wide range of user needs, from casual to professional.

5. Easy to Learn

**Goal:** New users can quickly become familiar with the product and start using it effectively.

**Example:** A fitness app that guides new users through an introductory tutorial showing how to track workouts, set goals, and find exercises. This helps users quickly understand how to use the app to its full potential.

6. Easy to Remember How to Use

**Goal:** Users can return to the product after a period of not using it and remember how to use it without having to relearn everything.

**Example:** A digital camera with a simple and intuitive control layout allows users to easily remember how to change settings (like ISO, aperture, and shutter speed) even if they haven’t used the camera in a while.

In summary, these usability goals help ensure that products are designed in a user-centric manner, focusing on providing a satisfying and efficient experience. By aiming to meet these goals, designers and developers can create products that not only meet the immediate needs of their users but also foster a positive, long-term relationship with their user base.

User Experience Goals

Mnemonic: SEE HAM REFS (Satisfying, Enjoyable, Entertaining, Helpful, Aesthetically pleasing, Motivating, Rewarding, Emotionally fulfilling, Fun, Supporting creativity)

User experience (UX) goals focus on the emotional and psychological aspects of how users interact with products, aiming to create positive and meaningful experiences. These goals go beyond the functional to address how users feel about using a product. Let’s break down each of these goals with simple examples to illustrate their importance in UX design:

1. Satisfying

**Goal:** Users feel content and fulfilled with their interaction.

**Example:** Completing a challenging level in a puzzle game leaves players feeling satisfied, as the challenge matches their skills, making the experience rewarding.

2. Fun

**Goal:** The experience is enjoyable and engaging.

**Example:** A social media app with interactive filters and games that users can play with friends makes the interaction not just about sharing moments but also about having fun together.

3. Enjoyable

**Goal:** The overall experience is pleasant and pleasing.

**Example:** A music streaming app that easily lets users discover new music tailored to their tastes can make the process of finding new songs and artists enjoyable.

4. Entertaining

**Goal:** The product provides amusement or enjoyment.

**Example:** An e-book reader app that offers interactive storytelling, including visuals and sound effects, can turn reading into an entertaining activity beyond just text on a screen.

5. Helpful

**Goal:** The product provides useful assistance or makes tasks easier.

**Example:** A fitness tracker that suggests workouts based on your activity level and goals, offering guidance and information to help you stay healthy.

6. Motivating

**Goal:** Encourages users to take action or continue using the product.

**Example:** A language learning app that sets daily goals and provides feedback on progress, motivating users to keep practicing and improving their skills.

7. Aesthetically Pleasing

**Goal:** The product is visually appealing.

**Example:** A minimalist note-taking app with a clean, attractive design that makes the process of jotting down thoughts visually pleasing and less cluttered.

8. Rewarding

**Goal:** Users receive positive reinforcement that encourages continued use.

**Example:** A mobile game that rewards players with in-game currency or items after completing tasks or achievements, making the effort feel worthwhile.

9. Supporting Creativity

**Goal:** Enables users to express themselves or create something new.

**Example:** A digital art app that offers a variety of tools and brushes for creating digital paintings, encouraging users to explore their artistic talents.

10. Emotionally Fulfilling

**Goal:** The experience resonates on an emotional level, creating a sense of connection.

**Example:** A social networking app that helps people with similar interests to connect and share experiences can fulfill a user’s need for belonging and community.

User experience goals like these are essential for designing products that not only meet the practical needs of users but also connect with them on an emotional and psychological level, making the overall interaction something memorable and valuable. By focusing on these aspects, designers can create products that truly enhance users’ lives.

Constraints, their types elaborated

Constraints in design refer to limitations or conditions that influence the outcome of a design process. These constraints can be categorized into logical, cultural, and physical, each playing a crucial role in shaping the user experience. Understanding these constraints helps designers create more thoughtful, accessible, and effective designs. Let’s explore each type with examples:

Logical Constraints

**Definition:** Logical constraints are related to the functionality, usability, and logic of a design. They ensure that a product operates in a way that makes sense to the user and fulfills its intended purpose efficiently.

**Example:** In a web form for signing up to a newsletter, a logical constraint might require the user to enter a valid email address. The system checks if the input matches the format of an email address (like `name@domain.com`). If it doesn’t, the form won’t submit and might show an error message. This constraint ensures that the user provides the necessary information in a format that the system can recognize and use.
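
A minimal sketch of this kind of logical constraint in TypeScript, assuming a simple pattern check rather than full RFC-compliant email validation:

```typescript
// Sketch of a logical constraint: only accept input shaped like name@domain.tld.
// This simple pattern is an illustrative assumption, not complete email validation.
const EMAIL_PATTERN = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function validateEmail(input: string): { valid: boolean; message: string } {
  if (EMAIL_PATTERN.test(input.trim())) {
    return { valid: true, message: "Looks good." };
  }
  // Refuse to submit and explain why, so the constraint is visible to the user.
  return { valid: false, message: "Please enter an email like name@domain.com." };
}

console.log(validateEmail("name@domain.com")); // { valid: true, ... }
console.log(validateEmail("not-an-email"));    // { valid: false, ... }
```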

Cultural Constraints

**Definition:** Cultural constraints take into account the cultural, social, and demographic factors that influence how different users perceive and interact with a design. These constraints are essential for creating products that are accessible and appealing to diverse user groups.

**Example:** Consider the use of colors in an app’s design. In some cultures, white is associated with purity and weddings, while in others, it might be related to mourning and funerals. A culturally sensitive design would consider these differences when choosing a color scheme for a global audience to ensure it communicates the intended message without offending or confusing users.

Physical Constraints

**Definition:** Physical constraints are related to the physical capabilities of the users and the devices they use. These constraints ensure that a product can be used comfortably and effectively by as many people as possible, regardless of their physical abilities or the device’s specifications.

**Example:** Designing a mobile app with large, easily tappable buttons is a way to address physical constraints. This design choice considers users with limited dexterity or those using devices with small screens, making the app more accessible and easier to use for everyone.

In summary, considering logical, cultural, and physical constraints in design leads to products that are not only functional but also inclusive and respectful of the diverse needs and contexts of their users. By acknowledging and addressing these constraints, designers can create more user-friendly and effective designs.

Let’s simplify the concepts of logical, cultural, and physical constraints in design with more straightforward examples and explanations:

Logical Constraints

**What They Are:** Logical constraints are rules or conditions based on reason and logic that guide how something should work or be used. They help ensure that a design makes sense and works correctly.

**Simple Example:** Think of a light switch. The logical constraint is that flipping the switch up turns the light on, and flipping it down turns it off. This rule is logical because it follows a consistent and predictable pattern that people can easily understand and remember.

Cultural Constraints

**What They Are:** Cultural constraints involve the norms, values, and expectations of different societies and groups. These constraints influence what is considered appropriate, offensive, or meaningful in a design, varying from one culture to another.

**Simple Example:** Imagine the color of a wedding dress. In many Western cultures, white is traditionally worn by brides, symbolizing purity. However, in some Eastern cultures, red is preferred because it represents good luck and happiness. Designing wedding attire or related products would require considering these cultural preferences to meet the expectations of different cultural groups.

Physical Constraints

**What They Are:** Physical constraints are related to the physical abilities of users and the characteristics of the devices or environments they use. These constraints ensure that products can be used comfortably and effectively by people with a wide range of physical capabilities and in various settings.

**Simple Example:** Consider a door handle. A round doorknob requires grasping and turning, which can be difficult for people with arthritis or limited hand strength. A lever handle, on the other hand, can be pushed down with an elbow or forearm, making it easier to use for everyone, regardless of their physical abilities. This design choice addresses physical constraints by accommodating more users.

By taking into account logical, cultural, and physical constraints, designers can create products and experiences that are sensible, respectful of cultural differences, and accessible to a broad audience, including those with varying physical abilities.

Conceptual Models, and their classification based on activities

A conceptual model is a high-level description of the proposed system in terms of a set of integrated ideas and concepts about what it should do, how it should behave, and what it should look like, expressed in a way that users will understand as intended.

A conceptual model, in the context of user interface and user experience design, is essentially a blueprint or map of how a system is organized and operates. It’s like an abstract representation of how a system works, including its components and the relationships between them, all geared towards making it understandable for the users. This model helps users form expectations about how to interact with the system. Let’s break it down with a simple analogy and then explore how conceptual models are classified based on activities.

Conceptual Model Explained with an Example

Imagine you’re using a new digital library app for the first time. Your understanding of how this app works forms your conceptual model of it. This might include:

  • Books as digital files that can be borrowed or read.
  • A search function that helps you find books by title, author, or genre.
  • A section for your borrowed books, similar to a physical bookshelf.
  • A way to return books after reading, making them available for others.

This conceptual model helps you navigate and use the app effectively because it matches your expectations based on your understanding of physical libraries, but in a digital context.

Let’s simplify conceptual models based on activities with straightforward examples, focusing on giving instructions, conversing, manipulating, navigating, exploring, and browsing. These models help us understand how users interact with systems or applications through different activities:

1. Giving Instructions

**What It Is:** Users provide commands or instructions to the system to perform specific tasks.

**Example:** A voice-controlled smart home system. You say, “Turn on the living room lights,” and the system understands and executes the command. The conceptual model here involves users giving verbal instructions to control devices within their home.

2. Conversing

**What It Is:** Interaction with the system resembles a conversation, with inputs and responses going back and forth.

**Example:** Chatbots on customer service websites. You type in a question like, “What are your store hours?” and the chatbot replies as if you’re conversing with a human. This model is built on the idea of simulating a natural, human-like dialogue.

3. Manipulating

**What It Is:** Users interact directly with objects within the system, often through touch or drag-and-drop actions. This is exemplified by (i) what-you-see-is-what-you-get (WYSIWYG) editing and (ii) the direct manipulation (DM) approach.

**Example:** A photo editing app where you use your fingers to pinch to zoom, swipe to scroll through filters, or drag to adjust the placement of text. This model is about physically interacting with digital elements as if they were tangible objects.
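
As a rough idea of what manipulation looks like in code, the TypeScript sketch below lets a user drag an element around with pointer events. The `photo-label` element id is hypothetical, and the element is assumed to be absolutely positioned.

```typescript
// Sketch of direct manipulation: drag a caption around with a finger or mouse.
const label = document.getElementById("photo-label") as HTMLElement;
let dragging = false;
let offsetX = 0;
let offsetY = 0;

label.addEventListener("pointerdown", (e: PointerEvent) => {
  dragging = true;
  offsetX = e.clientX - label.offsetLeft; // where inside the element it was grabbed
  offsetY = e.clientY - label.offsetTop;
  label.setPointerCapture(e.pointerId);   // keep receiving moves even if the pointer drifts off
});

label.addEventListener("pointermove", (e: PointerEvent) => {
  if (!dragging) return;
  // The object follows the pointer continuously: immediate, visible feedback.
  label.style.left = `${e.clientX - offsetX}px`;
  label.style.top = `${e.clientY - offsetY}px`;
});

label.addEventListener("pointerup", () => {
  dragging = false;
});
```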

4. Navigating

**What It Is:** Moving through information or spaces within the system to reach a destination or find information.

**Example:** Using a GPS navigation app to get driving directions. You enter your destination and follow the app’s route guidance. The conceptual model is based on the idea of a map that guides you from your current location to your intended destination.

5. Exploring

**What It Is:** Freely examining the contents of the system without a specific goal, often to discover new content or features.

**Example:** Browsing a streaming service without knowing what to watch. You scroll through categories and suggestions, watching trailers and reading descriptions, until something catches your interest. The model here encourages curiosity and discovery within a digital environment.

6. Browsing

**What It Is:** Looking through information or options in a more structured way, often with a vague goal or to see what’s available.

**Example:** Shopping online for a new jacket without a specific brand or style in mind. You might filter by size, color, or price range, skimming through the results until you find a few options that appeal to you. This model is about systematically searching through a set of items to narrow down choices.

Each of these conceptual models is based on different user activities, guiding the design of systems to ensure they meet user needs and preferences in a way that feels natural and intuitive. By understanding and applying these models, designers can create more effective and user-friendly experiences.

What is Direct Manipulation, what are its core principles, what are its advantages and disadvantages?

Direct manipulation is a user interface design concept where users interact directly with objects on the screen rather than using abstract commands. The idea is to make digital objects behave as physical objects would, making interactions more intuitive. For example, when you drag a file icon to a folder on your computer’s desktop, you’re using direct manipulation. You “grab” the file and “move” it to a new location, just as you might move a physical document from one place to another.

Core Principles of Direct Manipulation

1. **Visibility of Objects:** Users can see the objects they are manipulating, which makes the interaction more intuitive.
2. **Incremental Action:** Users receive immediate feedback from their actions, making it clear what effect they’ve had.
3. **Reversible Actions:** Mistakes can be easily undone, reducing the risk of using the software and encouraging exploration.
4. **Physical Actions or Button Presses:** Users interact with objects through physical actions (like dragging, dropping, or resizing) or labeled button presses rather than through complex command sequences.

Advantages of Direct Manipulation

- **Intuitiveness:** It’s easy for users to understand and start using without much instruction, as it mimics the physical world.
- **Satisfaction:** Users often find it more satisfying to interact with objects directly, as it gives a sense of control.
- **Error Reduction:** The visibility of objects and actions, along with immediate feedback, helps reduce errors.
- **Learnability:** New users can learn systems more quickly when they can directly manipulate objects and see the outcomes of their actions immediately.

Disadvantages of Direct Manipulation

- **Complexity in Implementation:** Creating a direct manipulation interface can be technically challenging and resource-intensive.
- **Limited to Simple Tasks:** It’s best suited for simple, concrete tasks. Complex abstract tasks might not be easily represented with direct manipulation.
- **Screen Space Requirements:** It requires significant screen space to display objects, which can be a limitation on small screens.
- **Over-Simplification:** There’s a risk of oversimplifying complex tasks, which might lead to a loss of functionality or power for advanced users.

Indirect Manipulation

Indirect manipulation is when users interact with the system through abstract commands or representations rather than directly manipulating objects. For example, using a command line interface (CLI) where you type commands to perform actions (like `copy file1.txt destinationFolder`) is a form of indirect manipulation. You’re not directly “moving” the file in a visual or physical sense; instead, you’re instructing the system to do it for you based on a set of commands.

Examples and Comparison

- **Direct Manipulation:** On a smartphone, enlarging a photo by spreading two fingers apart on the screen. You’re directly interacting with the photo in a way that mimics physical actions.

- **Indirect Manipulation:** Using keyboard shortcuts or text commands to enlarge a photo. You’re not interacting with the photo directly, but rather using an abstract method to achieve the same result.

Direct manipulation interfaces tend to be more user-friendly, especially for novices, because they rely on intuitive actions rather than memorized commands. However, indirect manipulation can be more efficient for expert users performing complex tasks, as it often allows for quicker interactions once the commands are learned. Both approaches have their place, depending on the user’s needs, the task’s complexity, and the context of use.

Conceptual models based on Objects

Conceptual models based on objects focus on how systems, applications, or environments are understood in terms of the objects within them and how those objects relate to each other and to the user’s actions. This approach is grounded in the idea that users interact with software through a mental model of identifiable objects that resemble or represent tangible items.

Conceptual Models Based on Objects: An Overview

An object-based conceptual model in interactive design simplifies complex systems into more manageable components by representing information and functionality as “objects” that users can interact with. These objects often mimic real-world counterparts or abstract entities designed to be easily recognizable and understandable by users.

Core Elements

- **Objects:** The primary elements users interact with, which can be visual (icons, buttons, avatars) or conceptual (files, messages, settings).
- **Actions:** What users can do with the objects, such as create, modify, delete, or share.
- **Attributes:** Characteristics of the objects, such as size, color, status, or location.
- **Relationships:** How objects relate to one another, which could be hierarchical (folders containing files) or associative (hyperlinks connecting web pages).
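
To make these four elements concrete, here is a hedged TypeScript sketch of an object-based model for a simple file manager. The names (`FileObject`, `FolderObject`, `createFile`, `deleteFile`) are illustrative assumptions, not a real API.

```typescript
// Hypothetical sketch of an object-based conceptual model for a file manager.

// Objects and their attributes.
interface FileObject {
  name: string;   // attribute
  sizeKb: number; // attribute
  kind: "document" | "image" | "other";
}

// Relationships: a folder contains files and other folders (a hierarchy).
interface FolderObject {
  name: string;
  files: FileObject[];
  subfolders: FolderObject[];
}

// Actions the user can perform on the objects.
function createFile(folder: FolderObject, file: FileObject): void {
  folder.files.push(file);
}

function deleteFile(folder: FolderObject, name: string): void {
  folder.files = folder.files.filter((f) => f.name !== name);
}

// Usage: objects, attributes, actions, and relationships working together.
const documents: FolderObject = { name: "Documents", files: [], subfolders: [] };
createFile(documents, { name: "notes.txt", sizeKb: 4, kind: "document" });
deleteFile(documents, "notes.txt");
```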

Examples

1. **Desktop Interface:** The desktop of a computer operating system is a classic example of an object-based conceptual model. Files and folders are represented as icons (objects) that users can open (action) by double-clicking. These icons have properties (attributes) like name, size, and type, and can be organized into hierarchies (relationships) where folders contain files and other folders.

2. **Social Media Platform:** In a social media app, the primary objects might be posts, comments, and profiles. Users can create posts, comment on others’ posts, and view profiles. Each post can have attributes like the author, timestamp, and content type (text, photo, video). Relationships are evident in how comments are attached to posts and how profiles link to the posts and comments made by the user.

3. **E-commerce Website:** On an e-commerce site, products are the objects. Users can search for products, add them to a cart, and make purchases. Products have attributes such as price, description, and customer reviews. Relationships exist between products (such as those that are frequently bought together) and between products and categories (hierarchical organization).

Advantages

- **Intuitiveness:** Mimicking real-world interactions makes the digital environment more intuitive and easier to navigate.
- **Learnability:** New users can quickly understand how to interact with the system based on their prior knowledge of similar objects.
- **Efficiency:** Experienced users can perform tasks quickly by directly manipulating familiar objects.

Challenges

- **Complexity for Complex Systems:** Simplifying very complex systems into object-based models can be challenging and might oversimplify or obscure important functionalities.
- **Design Limitations:** Designers must carefully balance the simplicity of object representations with the need to convey sufficient information, which can limit design options.

Object-based conceptual models are a powerful way to design interactive systems that are user-friendly and align with human cognitive processes. By grounding digital interactions in familiar concepts, these models facilitate both initial learning and ongoing efficiency, making technology more accessible and engaging for a broad range of users.

Let’s simplify the concept of conceptual models based on objects with a brief summary and simple examples:

What Are Conceptual Models Based on Objects?

These models are a way of designing computer systems, apps, or websites so that everything you interact with on the screen is represented as an object. These objects are like things you’re familiar with from the real world or simple items you can easily understand and use.

Key Points:

- **Objects:** The stuff you see and interact with, like icons, buttons, or images.
- **Actions:** What you can do with these objects, such as clicking, dragging, or opening them.
- **Attributes:** Details about the objects, like their color, size, or name.
- **Relationships:** How objects are connected or related to each other. For example, a folder might contain files.

Examples:

1. **Desktop Interface:** Imagine your computer’s desktop. It has folders (objects) that you can open or move around (actions). Each folder can be named (an attribute) and can contain files or other folders (relationships).

2. **Social Media Platform:** On a platform like Instagram, posts are objects. You can like or comment on them (actions), they have details like the number of likes or the caption (attributes), and comments are connected to posts (relationships).

3. **E-commerce Website:** On Amazon, products are the objects. You can add them to your cart (action), they have prices and descriptions (attributes), and they might be related to other products as recommendations (relationships).

Why It Matters:

Using objects makes digital environments easier to understand and navigate because they work in a way that’s similar to how things work in the real world. It’s about making tech products user-friendly so that anyone can quickly learn how to use them without getting confused.

In summary, conceptual models based on objects help make digital interactions feel more natural by organizing information and functionalities into familiar “objects” that we can easily interact with, understand, and relate to, just like we do with everyday items.

What is understanding the user? What is cognition? What are different cognitive aspects?

Understanding users in the context of design involves grasping how they think, feel, and behave when interacting with products or systems. This understanding is crucial for creating user-friendly designs that meet users’ needs and preferences. A key part of understanding users is cognition, which refers to the mental processes involved in gaining knowledge and comprehension. These processes include thinking, knowing, remembering, judging, and problem-solving.

Let’s break down the different cognitive aspects with simple examples:

1. Attention

**What It Is:** The process of focusing on specific information while ignoring other stimuli.

**Example:** When using an app, a notification badge (like a small red circle) on an icon grabs your attention, making you focus on it and potentially click to see the new message or alert.

2. Perception and Recognition

**What It Is:** Perception is how we interpret sensory information. Recognition is identifying what we perceive based on past experiences.

**Example:** Seeing an envelope icon on a website and recognizing it as the symbol for email or messages. The shape and design of the icon help you quickly understand its function without needing text.

3. Memory

**What It Is:** The ability to store and retrieve information over time. There are different types, such as short-term and long-term memory.

**Example:** Remembering your password when logging into a site. Short-term memory is used when you temporarily memorize a phone number, while long-term memory is involved in remembering your login details over time.

4. Reading

**What It Is:** The process of decoding symbols (like letters and words) to understand their meaning.

**Example:** Reading user reviews on a product before making a purchase decision. The layout, font size, and readability of the text all influence how easily you can read and understand the information.

5. Speaking and Listening

**What It Is:** Speaking involves conveying information verbally, while listening is the process of receiving and interpreting auditory messages.

**Example:** Using voice commands to interact with a smart speaker. You speak to give commands (“Play some music”), and the device listens to execute the command, showcasing interaction through speech.

6. Problem Solving

**What It Is:** The process of finding solutions to difficult or complex issues.

**Example:** Figuring out how to use an unfamiliar coffee machine. You use cues from the machine’s design and your past experiences with similar devices to solve the “problem” of making a coffee.

7. Planning

**What It Is:** The process of thinking about and organizing the activities required to achieve a desired goal.

**Example:** Planning the steps you need to take to book a holiday using an online travel service, from choosing the destination to making reservations and paying.

8. Reasoning and Decision-Making

**What It Is:** Reasoning is the process of forming conclusions, judgments, or inferences from facts or premises. Decision-making involves choosing between different options or actions based on reasoning.

**Example:** Deciding which smartphone to buy after comparing features, prices, and reviews. You reason through the information to make the best decision for your needs and budget.

9. Learning

**What It Is:** Acquiring knowledge or skills through experience, study, or being taught.

**Example:** Learning to use a new app by exploring its features and using tutorials. The design of the app can facilitate learning by being intuitive and providing helpful guidance.

Understanding these cognitive aspects helps designers create products that are more aligned with how users think and behave. By considering attention, perception, memory, and other cognitive processes, designers can enhance the usability, accessibility, and overall user experience of their products.

Mental Models, What are they? Some Examples

Mental models are the internal representations that people form of how things work in the real world, based on their expectations and past experiences. In interactive design, understanding users’ mental models is crucial because it influences how they interact with technology and digital interfaces. When the design of a system aligns with the users’ mental models, it becomes more intuitive and easier to use.

Example 1: Thermostat Control

**User’s Mental Model:** Most users expect that turning a dial or moving a slider up will increase something (like temperature or volume) and moving it down will decrease it. This expectation comes from real-world experiences, such as using volume knobs or light dimmers.

**Design Implication:** A digital thermostat interface that uses a vertical slider to control temperature should follow this mental model. Sliding up to increase the temperature and sliding down to decrease it matches the users’ mental model, making the interface intuitive to use.

Example 2: File Management on Computers

**User’s Mental Model:** People often think of digital files and folders in the same way they think of physical files and filing cabinets. This includes concepts like moving a document into a folder or throwing unwanted files into a trash bin.

**Design Implication:** Operating systems like Windows or macOS use desktop icons such as folders and trash/recycle bins that visually represent these physical objects. Dragging a file icon into a folder icon on the screen aligns with the mental model of organizing physical documents, making the digital environment familiar and easy to navigate.

Example 3: E-commerce Shopping Cart

**User’s Mental Model:** Shoppers are accustomed to the concept of a shopping cart from the physical world, where they add items as they browse through a store and then proceed to checkout to pay for them.

**Design Implication:** E-commerce websites use shopping cart icons and functionality to mirror this physical shopping experience. Users can add items to a digital cart and proceed to checkout when ready. This design leverages the mental model of shopping in a physical store, making the online shopping process intuitive.

Example 4: Undoing Actions

**User’s Mental Model:** In the real world, people often expect to have the opportunity to undo an action, like erasing pencil marks or returning items to a shelf.

**Design Implication:** Providing an “Undo” feature in software applications aligns with this expectation. Whether it’s editing a document, sending an email, or deleting a photo, users anticipate the ability to reverse actions. Implementing an easy-to-find and use “Undo” function matches users’ mental models of being able to correct mistakes.
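
One common way to support this expectation is a stack of reversible commands. The TypeScript sketch below is illustrative only; the `appendText` command and the in-memory stack are hypothetical.

```typescript
// Sketch of an undo stack: each action knows how to apply and reverse itself.
interface Command {
  apply(): void;
  undo(): void;
}

const undoStack: Command[] = [];
let text = "";

function run(command: Command): void {
  command.apply();
  undoStack.push(command); // remember it so it can be reversed later
}

function undoLast(): void {
  const command = undoStack.pop();
  command?.undo(); // no-op if there is nothing to undo
}

// Hypothetical command: append text to a document.
function appendText(addition: string): Command {
  return {
    apply: () => { text += addition; },
    undo: () => { text = text.slice(0, text.length - addition.length); },
  };
}

run(appendText("Hello"));
run(appendText(", world"));
undoLast();        // text is back to "Hello"
console.log(text);
```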

Example 5: Bookmarking Web Pages

**User’s Mental Model:** Just as readers might bookmark pages in a book to easily return to them, internet users expect to be able to mark web pages for quick access later.

**Design Implication:** Web browsers offer a bookmarking feature that allows users to save links to web pages. This digital bookmarking concept is directly related to the mental model of using a physical bookmark, making it a familiar and useful feature for users.

In summary, when interactive designs are guided by an understanding of users’ mental models — how they think and expect things to work — they become more user-friendly. Designers aim to create interfaces that feel natural and intuitive by aligning digital interactions with these mental models, thereby reducing the learning curve and enhancing the overall user experience.

What is Memory load, how to reduce it?

Memory load refers to the amount of information a user needs to remember while interacting with a system or interface. In the context of interactive design and user interface design (UID) within engineering, reducing memory load is crucial because it makes systems easier to use and learn. When users have to remember less, they can navigate and interact with technology more efficiently and with less frustration.

How to Reduce Memory Load

Reducing memory load involves designing interfaces that minimize the need for users to recall information from memory, providing cues and information at points where decisions or actions are required. Here are strategies to reduce memory load, explained from scratch with examples:

1. Use Visual Cues and Icons

**Strategy:** Incorporate intuitive visuals or icons that users can recognize at a glance, reducing the need for text-based instructions.

**Example:** A trash bin icon for deleting files or a magnifying glass icon for search functions. These universally recognized symbols help users understand functionality without remembering what each option does.

2. Simplify Tasks

**Strategy:** Break down complex tasks into smaller, manageable steps, guiding users through each process.

**Example:** In an e-commerce checkout process, splitting the task into discrete steps (e.g., shipping address, payment information, and order review) helps users focus on one set of details at a time, reducing the cognitive load of remembering all information at once.

3. Provide Defaults

**Strategy:** Offer default settings or options that are commonly used, so users don’t have to remember their preferences each time.

**Example:** Defaulting to the most common shipping option or filling in a user’s billing address based on previously entered information. This reduces the need for users to recall and re-enter details.
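
A small TypeScript sketch of prefilling with defaults; the `savedProfile` object stands in for previously entered information and is purely hypothetical.

```typescript
// Sketch: prefill a form with sensible defaults so users don't re-enter known details.
interface CheckoutForm {
  shippingMethod: "standard" | "express";
  address: string;
}

// Hypothetical saved profile standing in for previously entered information.
const savedProfile = {
  lastAddress: "221B Baker Street",
  preferredShipping: "standard" as const,
};

function defaultCheckoutForm(): CheckoutForm {
  return {
    shippingMethod: savedProfile.preferredShipping, // last used / most common choice
    address: savedProfile.lastAddress,              // the user can still change it
  };
}

console.log(defaultCheckoutForm());
```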

4. Utilize Recognition Over Recall

**Strategy:** Design interfaces where users can recognize options rather than having to recall them from memory.

**Example:** In a software application, using dropdown menus with options for formatting text (like font size, color, and style) allows users to select from available choices instead of remembering specific formatting codes or commands.

5. Offer Feedback and Error Messages

**Strategy:** Provide immediate feedback or informative error messages that guide users on what to do next without having to remember complex rules.

**Example:** If a user enters an incorrect password, displaying a message that specifies the password requirements (e.g., “Password must be at least 8 characters long and include a number”) helps users correct their input without trying to remember the criteria.
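
A minimal TypeScript sketch of such an error message, assuming just the two rules quoted above (at least 8 characters and at least one number):

```typescript
// Sketch of an informative error message: state the rules, don't make users recall them.
function checkPassword(password: string): string[] {
  const problems: string[] = [];
  if (password.length < 8) {
    problems.push("Password must be at least 8 characters long.");
  }
  if (!/\d/.test(password)) {
    problems.push("Password must include at least one number.");
  }
  return problems; // an empty array means the password is acceptable
}

console.log(checkPassword("short"));       // both rules violated, both messages shown
console.log(checkPassword("longenough1")); // [] means nothing to correct
```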

6. Consistency Across the Interface

**Strategy:** Maintain consistency in the design, layout, and terminology throughout the interface to reduce the cognitive effort required to learn new sections of the application.

**Example:** If a symbol or color indicates an action (like red for delete or a plus sign for add), using those indicators consistently across the application helps users understand their meaning without memorization.

7. Externalize Information

**Strategy:** Design systems that allow users to offload memory tasks onto the system itself.

**Example:** Auto-saving feature in document editing software or history logs in search engines. Users don’t need to remember to save their work manually or recall their previous searches, as the system keeps track of these actions.
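
As a hedged sketch of externalizing information, here is a tiny TypeScript search-history log that remembers recent queries so the user doesn't have to. The in-memory array and the 10-item limit are illustrative assumptions.

```typescript
// Sketch: the system remembers recent searches so the user does not have to.
const searchHistory: string[] = []; // stand-in for real persistent storage
const MAX_HISTORY = 10;             // illustrative limit

function recordSearch(query: string): void {
  searchHistory.unshift(query); // most recent first
  if (searchHistory.length > MAX_HISTORY) searchHistory.pop();
}

function recentSearches(): string[] {
  return [...searchHistory]; // shown to the user instead of asking them to recall
}

recordSearch("usability goals");
recordSearch("mental models");
console.log(recentSearches()); // ["mental models", "usability goals"]
```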

By implementing these strategies, designers can significantly reduce the memory load on users, making interfaces more intuitive and user-friendly. This approach not only enhances user satisfaction but also improves the overall effectiveness of the technology, as users can focus more on their tasks rather than on how to navigate the system.

Reducing memory load in interactive design makes systems easier to use by not overburdening users with too much to remember. Here’s a simplified summary of strategies to achieve this, with straightforward examples:

1. **Visual Cues and Icons**
- **What to Do:** Use easy-to-recognize symbols.
- **Example:** A house icon for the “Home” page.

2. **Simplify Tasks**
- **What to Do:** Break tasks into smaller steps.
- **Example:** A multi-step form that guides users through registration.

3. **Provide Defaults**
- **What to Do:** Offer default settings for common choices.
- **Example:** Shipping information auto-filled with the most commonly used address.

4. **Recognition Over Recall**
- **What to Do:** Let users choose from options instead of remembering them.
- **Example:** Dropdown menus for selecting your birth year.

5. **Feedback and Error Messages**
- **What to Do:** Give clear feedback or help when errors are made.
- **Example:** A message saying “Your password needs at least one number” when creating a password.

6. **Consistency Across the Interface**
- **What to Do:** Keep design and terminology the same throughout.
- **Example:** Using the same color to highlight all actionable buttons.

7. **Externalize Information**
- **What to Do:** Let the system remember information for the user.
- **Example:** Auto-saving progress in a form so users don’t have to start over if they leave the page.

By following these simplified strategies, designers can create interfaces that are easier and more intuitive for users, reducing the effort needed to use and navigate digital products effectively.

This is all for Blog 1 (out of n) on User Interface/Interaction Design. I hope you liked reading it; I have simplified everything to its smallest form so you can easily absorb the information and carry it forward. Do clap if you liked the blog and follow for more such interesting ones. Also, if you have suggestions on which topics I should take up next, do leave them below in the comments section. With that said,
Let Wisdom Go Viral!

