Instagram Redesign

Researching Features for Power Users

Project Team

During the Fall 2015 semester, I worked with three other UMass B.S. Computer Science students to research a problem area within the Instagram application and design a solution. Each team member contributed to every step of the research and design process.

Project Overview

We aim to tackle a significant functionality gap in Instagram’s mobile user experience. The current version of Instagram offers users limited search functionality for finding photos based on specific criteria. Our user group includes active Instagram users, primarily users who have posted numerous (50+) photos over a long (1+ year) period of time. We break the functionality of searching for photos into three categories: Search By Tag, Search By Location, and Search By Date Posted. We analyze these functionalities at a user and global scope (i.e., in the Explore tab, where photos from all users are displayed, and in a given user’s Profile, where photos that user has posted are displayed). Locating older photos a user posted, at either the personal profile level or the global level, is exceptionally laborious. We aim to design an intuitive solution that fits seamlessly with Instagram’s current structure.

The first functionality category, Search By Tag, is currently available only at a global scope in Explore and supports searching by only one hashtag at a time. No Search By Tag functionality currently exists in Profile.

The second functionality category, Search By Location, is currently available only at a single-user scope in a given user’s Profile. One of the tabs on a Profile is the Map view, where a given user’s photos are displayed pinned on a map based on where they were taken, and are grouped based on how far in or out the map has been zoomed. Zooming out groups photos into a “stack,” or collection of photos for a given area, such that only the most recent photo is visible for that area. Zooming in breaks these stacks down into more specific photo stacks, until each stack consists only of photos taken at the exact location. Users can browse photos taken in each area and, by tapping on photo stacks, view all photos taken in that area at the current zoom level. In Map view, users can only Search By Location and have no other search or filter functionality.

The third functionality, Search By Date Posted, has no existing support aside from manually scrolling through photos in a given user’s Profile or globally in the Explore view. This is the most significant functionality currently missing from Instagram. The current means of searching for a given user’s older photos exists in two tabs: the Grid view, which displays roughly nine photos at a time but without any additional info (e.g., posted date, tags), and the List view, which displays one photo at a time, taking up the entire screen, but shows all associated info about the photo. Both views require scrolling to navigate through the photos, and neither offers an efficient means of searching for and navigating to older photos.

The key insight that will guide our design is that search functionality should be expanded and improved to allow users to easily and intuitively sort through photos on Instagram. We plan to initially survey self-identified active Instagram users about their experience, recruiting via social media on Facebook. For future testing, we plan to use social media similarly for surveys, but will additionally recruit users to meet for 5–10 minutes to offer feedback on prototypes we have created at different stages in the design process. Depending on how successful we are at recruiting participants for in-person testing via Facebook, we may adopt more spontaneous testing methods, such as stopping passersby and asking if they would take a moment to offer some feedback. We plan to revisit willing early testers for feedback throughout the process in addition to recruiting new participants.

Knowledge Elicitation

We decided to frame the purpose of the interviews as learning more about Instagram’s user experience generally, without alluding to our specific focus. We were interested in seeing whether frustrations emerged naturally and confirmed or disproved our theories about search and filter functionality. We planted several “dummy” tasks that asked users to perform operations that Instagram currently does not have the functionality for. We were interested in seeing how users responded to these tasks, and where they would expect to find these functionalities and whether or not they questioned their existence.

All three users were in their early twenties and were UMass Amherst students. One participant was female, and the other two were male. All participants were able-bodied and had no significant visual impairments or needs that would require accessibility accommodation. The user tests all took place in the Computer Science USPACE, which we chose because of its convenience for all team members and participants. Unfortunately, the USPACE presented several limitations, chief among them being a public setting rather than a more private space. Participants may have felt more comfortable talking through their process in a different environment; we will take this into consideration for future user tests.

We recruited users of three different levels of usage to assess the experience of an advanced power user, an average user, and a new user:

Advanced User: User A was an active user of Instagram. She had posted 500 photos over the past 5 years. She primarily used Instagram to browse photos by content creators she found through Pinterest or friends of hers. She did not use the location search functionality, but did understand how it worked. She expected she could filter by tag within a user’s profile, but when she could not immediately find a way to do so in the User Profile view, she looked in the Explore view and tried inputting the hashtag and username into the search bar. She was surprised and frustrated to discover this did not function how she expected, and looked for other tools in the search area of Explore to try to refine her query, but did not find anything helpful.

Average User: User M was a more passive user of Instagram. He had posted 20 photos but used Instagram to find and follow content producers he was interested in. User M noted that he would often use outside sources to find Instagram accounts he was interested in, citing how difficult the built-in search was for finding content producers of a specific type. User M was indifferent to deeper search functionality for his own photos, since he did not post many of them.

New User: User B had never seriously used Instagram. His last experience with it was five years ago, and he had posted fewer than five photos. He offered a completely fresh perspective on the current search function in Instagram. For the most part, User B felt that the search functionality was lacking. It was easy for him to find the better-designed search features, such as geotag search, but when he tried to find pictures using filters and description tags, he felt confused and frustrated by the lack of functionality.


We conducted an in-person attitudinal and behavioral usability test using a concurrent think-aloud protocol.

We used the “concurrent think aloud” moderation technique to encourage participants to communicate their thoughts as they occurred while completing assigned tasks. We chose this moderation method because it encouraged a stream of consciousness that helped us to understand participants’ decisions as they interacted with the application. It also encouraged communication of each user’s emotions because it was easy to understand how certain aspects of the application provided delight or frustrated them when they expressed it verbally.

We wanted the participants to feel comfortable and respected in order to make sure any unintentional awkwardness would not affect their performance. To assure this, we took several measures before and during our interviews. At the start of each interview, we read a disclaimer to participants that explained the purpose of the interview and clarified that we were testing the software, not them, in order to discourage self-consciousness. We also stated that no personal information would be shared. During the interviews, we had participants use one of our Instagram accounts instead of their own, to prevent the distraction of their own photos. We made sure our questions did not lead or influence participants toward a biased response, and we strived to remain neutral in our communication. When participants asked the moderator how something worked, the moderator responded with questions such as “What do you think?”, “I’m interested in what you would do,” or “What would you want to have happen?” If participants frequently dodged questions with “I don’t know,” we responded with “If I had to force you to give me an answer…” to encourage natural brainstorming without losing valuable data.

Three Main Findings

One main thing we learned was that all our participants believed advanced search functionality existed within the application. We realized this when we asked users to perform several advanced search tasks that, to the best of our knowledge, cannot be completed. We did this to see whether this was functionality users would expect of the application. One task asked the user to find all of their own photos on Instagram with the hashtag “kitten” (see interview script and notes below; part B). Currently, there is no way to sort through your own photos using hashtags, so this task was impossible.

All users thought this task was possible. We observed how they handled this task to influence where we should place this functionality. It gave us good insight into where users expected these advanced search features should be. All users went to the Explore tab (the tab where all global searching occurs) thinking they could enter in multiple parameters to get back more specific results. They quickly realized this wasn’t the case.

We also found that users do not predominantly use Instagram to search for relevant or interesting content. Instead, they use other tools, such as Pinterest or blogs, to find interesting content creators to follow, and then use Instagram to browse through the content those creators have compiled. This is because the current search functionality does not allow users to easily find profiles that post things they may be interested in. One study participant commented that they usually find Instagram accounts to follow on other sites, specifically design blogs, then go back to Instagram to follow them and see their posted content in their social feed.

This also emerged in section B of the interview (see interview script and notes below; part B), when we asked users to find any pictures with the hashtag “kitten”. All users noted that they have a hard time finding content that interests them; the hashtag system does not support users in finding interesting, relevant content. Two users expected to be able to type multiple hashtags into the search bar, which is not currently supported. Additionally, the same users expected to be able to use the search bar to filter by both tags and user profiles. This will influence our design, prompting us to integrate additional search functionality into the places where users instinctively looked for it, specifically the search options in the Explore tab. One user expected to be able to filter photos in the User Profile view and looked for a button or hidden advanced functionality there. This will influence where we place this functionality in the User Profile so that it is intuitive for power users but not overwhelming or cluttered for new users.

Users believed (correctly) that the only way to locate older photos was by scrolling downwards until reaching the end of photos, either in the grid view (~9 photos at a time with no contextual information displayed) or in the list view (1 photo at a time but with all contextual information displayed).

When we asked users to find a photo from a year ago, all of them immediately went to the list view and started scrolling (see interview script and notes below; part A). This validates our belief that there should be more in-depth search tools for finding older photos.

All of this feedback will guide our design by influencing where we place the date range search functionality. Since our sample users gravitated toward the photo list view to find an older photo, it makes sense to put this functionality there so it is clearly accessible when a user needs it. A user shouldn’t have to spend a long time accessing their older photos; streamlining this process makes Instagram more intuitive and useful for power users. When asked to explain why she went to the list view, User A explained that she did not want to have to keep clicking into photos to see when they were posted in order to gauge her progress toward the goal.

Need & Main Idea

Instagram caters to a wide range of users. For some, it is a place to share the occasional nature photo or baby picture. But for others, it is an entire photograph database. Power users can have thousands of selfies, breakfast photos, and candid shots in their collection. And for these JPEG hoarders, effective search is more important than ever. What if they need to find photos from that one trip to Paris five years ago? Or if they want to show off the fact that they have 5,402 pictures of their cat? Or even if they want to see how many people in their town had a hot dog for lunch? Right now, Instagram lacks a robust system for describing and searching for photos. Power users can’t search their own photos by keywords, location, or date range. Similarly, no user can search other people’s photos by multiple keywords or location. We aim to design a better system for describing, tagging, and searching for pictures in large Instagram collections.

Existing approaches

Currently, the best way to search through your own photos on Instagram is to scroll through them and find the ones you want. To find all your photos posted in a certain location, your only option is to select the map interface on your profile and zoom in to the desired location. These methods may have sufficed in Instagram’s early years, but as users have amassed photos over several years, they have become extremely inefficient. We address these issues by adding more search options to users’ individual profiles, so users are able to search photos by date range, hashtags, and location in more efficient and timely ways.

Users currently find new content in Instagram’s Explore tab in two ways: first, by looking at trending hashtags, and second, by searching a single hashtag over all the public photos on Instagram, hoping to find something relevant. Searching by multiple hashtags or by location is not supported, so there is no efficient way to narrow down the results to find something you might be interested in. Our solution will connect users with more of the content they are interested in so they do not have to rely on trends or a single tag alone. In our view, the Explore tab should have more robust search functionality so users can sort the plethora of photos being taken every second. This means adding the ability to filter results by location and by multiple hashtags.

Our Approach

We target the three main search filtering gaps in current functionality, searching by multiple tags, searching all photos by location, and searching by date, in our redesign of Instagram’s interface. We believe in giving the user more tools to search through all their photos and others, while maintaining the simplicity and elegant user experience Instagram currently offers. Our redesign implements unobtrusive search tools into the Explore tab and User Profile tab to fill in the functionality gaps.

The first of these tools will be a search filtering feature, a search bar, within the user’s profile that allows them to search their own photos by their associated hashtags. For example, a user who posted a photo and tagged it with “kitten” would be able to search “kitten” and this photo and any other photos tagged with “kitten” would be displayed. The search bar will support searching multiple hashtags within the user’s profile. We propose that the search bar in the Explore tab should support multiple hashtags as well, although it will require no visual redesign.

The search bar for searching tags functionality will also be implemented consistently in the map tab of the User Profile tab and the Explore tab. The map tab of the User Profile will have a search bar added to the top of the screen that allows users to reduce the photos displayed on the map to only those matching the specified tag.

The second tool will be adding a location search feature to the Explore tab similar to the map view tab found within the User Profile tab. Users will be able to filter their searches by location and tag without changing the current search flow significantly. When users were asked to search for photos of kittens posted in Boston, they noticed the “By Place” search option that appears in the Explore tab when you tap on the search bar. They thought they would be able to select a location from under this menu and then add additional search tag criteria. We integrated this user feedback by changing the flow of searching by location such that selecting a location will bring users to a map view of all photos in that area, mirroring the User Profile map tab. The search bar will appear as it does in the User Profile map tab.

The third tool allows users to search through their own photos based on when they were posted. A dropdown menu will be added to the User Profile tab that allows users to select which year to display photos from. The users we tested indicated that they did not mind the mechanism of scrolling through photos to find specific photos, but did mind the inefficiency of scrolling through their entire photo post history. We propose a yearly search filter as a solution that reduces the number of photos users have to scroll through without dramatically altering the aspects of the user experience they enjoy.
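The yearly filter described above amounts to a simple predicate over posting dates applied before the scrolling view is rendered. A minimal sketch, assuming a hypothetical photo record shape (Instagram's actual data model is not public here):

```python
from datetime import date

# Hypothetical photo records; field names are illustrative assumptions.
photos = [
    {"id": 1, "posted": date(2014, 3, 5)},
    {"id": 2, "posted": date(2015, 7, 1)},
    {"id": 3, "posted": date(2014, 11, 20)},
]

def photos_for_year(photos, year):
    """Restrict the user's scrollable photo list to a single selected year."""
    return [p for p in photos if p["posted"].year == year]

# Selecting 2014 from the dropdown leaves only photos 1 and 3 to scroll through.
```

The point of the design is visible in the sketch: the scrolling interaction users liked is untouched; only the list being scrolled shrinks.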

All of the above modifications will be prototyped and tested with a diverse pool of new users as well as with the users we previously tested. We will look to see whether they easily recognize the new search features and whether our implementations are intuitive and uncluttered.

Representative Tasks

Easy task: Find all photos with the tags “kittens” and “puppies”

  • This uses multiple user-specified keywords to filter the entire photo database and display these results to the user.

Medium task: Find your photos posted in 2014

  • This is simple with our proposed changes, as it only applies one search filter, a date range filter, to the user’s photos.

Difficult task: Find all photos with the tag “kittens” posted in the Boston area

  • This task combines a hashtag filter and a location filter to narrow down all public photos and display the results.

Storyboards

Introduction

We wanted to pick tasks that would clearly showcase all the new functionality users could expect and how it would fit into the existing UI. There is still more to show in what we plan on redesigning but what we demonstrate below are the key interactions we believe users will be making.

Easy Task Description

This task illustrates our fundamental goal of better search using photos’ descriptors. The main way we do this is with tags, the most basic of which is a tag by keyword (colloquially known as a “hashtag”). Currently, users cannot search through their own photos using even basic descriptions such as “kittens” or “puppies”. With our design, users are able to search and filter by these kinds of keywords. Entering a keyword in the search bar will return all photos with a matching tag. Entering multiple keywords will return only the photos tagged with every entered keyword (i.e., the intersection of all keywords).
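The intersection semantics described above can be sketched as a set-subset test. The photo records and field names below are illustrative assumptions, not Instagram's actual data model:

```python
def filter_by_tags(photos, keywords):
    """Return only photos whose tag set contains every entered keyword."""
    wanted = {k.lstrip("#").lower() for k in keywords}
    # A photo matches only if the wanted set is a subset of its tags.
    return [p for p in photos if wanted <= {t.lower() for t in p["tags"]}]

# Hypothetical photo records for illustration.
photos = [
    {"id": 1, "tags": ["kittens", "puppies"]},
    {"id": 2, "tags": ["kittens"]},
    {"id": 3, "tags": ["puppies", "beach"]},
]

matches = filter_by_tags(photos, ["#kittens", "#puppies"])
# Only photo 1 carries both tags, so only it is returned.
```

A single keyword degenerates to the familiar one-hashtag search, so this behavior is a strict superset of what Instagram offered at the time.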

Medium Task Description

This task shows how easy it will now be for users to sort their own photos by date. Before, users would have to scroll until they reached the photo they wanted. Now, users can select which year to load pictures from and then scroll from there. This preserves what users enjoyed, a light amount of scrolling, while making the task faster. Additionally, all searches and chosen dates persist across the different tabs, whether the user is in grid view, list view, map view, or your photos view.

Hard Task Description

This task incorporates multiple functionality improvements we proposed, including searching all photos by location and searching all photos by multiple tags. This task involves several steps and is considerably more complicated than the other tasks we chose. It requires users to first navigate to the Explore view of Instagram and then search photos by location. The user will then select the location, Boston in this task, and be brought to a view displaying a map and photos for that location. The new search bar and date selection icon will appear above the photos, consistent with how they were incorporated into the User Profile view. The search bar will support multiple tags, allowing users to search for “puppies” and “kittens.” The results of this query will show all photos posted in Boston that have been tagged with both the hashtag puppies and the hashtag kittens.
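The hard task's query composes the two filters above: restrict by location first, then intersect on tags. A sketch under simplifying assumptions (location matching is a plain string compare on a hypothetical city field, where the real app would resolve geotags):

```python
def search_location_and_tags(photos, city, tags):
    """All public photos posted in `city` that carry every tag in `tags`."""
    wanted = {t.lower() for t in tags}
    return [
        p for p in photos
        if p["city"] == city and wanted <= {t.lower() for t in p["tags"]}
    ]

# Hypothetical records standing in for the global photo pool.
photos = [
    {"id": 1, "city": "Boston", "tags": ["puppies", "kittens"]},
    {"id": 2, "city": "Boston", "tags": ["kittens"]},
    {"id": 3, "city": "NYC", "tags": ["puppies", "kittens"]},
]

# Boston photos tagged with both puppies and kittens -> photo 1 only.
results = search_location_and_tags(photos, "Boston", ["puppies", "kittens"])
```

Because each filter only shrinks the result set, the order users preferred in testing (tags first, then location) produces the same results as the reverse; the UI flow, not the query semantics, is what the redesign changes.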

Hand Drawn Wireframes

Profile View

In the profile view, users will have the ability to perform in-depth searching of a specific user’s photos. This can be seen below the tabs, where there is now a row dedicated to deeper searching. In this row there is a small calendar icon on the left and, to the right of it, a search box that takes up the rest of the row. These two input options extend the user’s search ability by allowing them to sort photos by a year, multiple years, or multiple hashtags all at the same time.

To sort by one or more years, all a user has to do is tap the calendar icon, which activates a dropdown menu of years from which the user can select a single year or multiple years. The user’s selections are then placed into the search bar as search terms surrounded by yellow to signify that they are special time-search hashtags. Each of these hashtags can be removed from the search bar by tapping the “x” icon inside its yellow box. Photos will then be filtered by these criteria, lessening the time a user has to spend searching for photos from specific years.

For a user to search by multiple hashtags, all they have to do is enter the hashtags into the search bar. Each tag will be surrounded by a box with an “x” to its right so it can be easily removed. With every additional tag, more filtering is applied to the user’s images. When there are multiple hashtags, the images displayed first are the ones that have all of those tags; below those are the images that have any of the tags.
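The ordering described above, exact matches first and then partial matches, can be sketched as a two-pass ranking. Again, the photo records are illustrative assumptions:

```python
def rank_by_tags(photos, keywords):
    """Photos matching ALL entered hashtags first, then photos matching ANY."""
    wanted = {k.lower() for k in keywords}
    all_matches = [p for p in photos if wanted <= set(p["tags"])]
    # Partial matches: at least one tag in common, but not already listed.
    partial = [
        p for p in photos
        if p not in all_matches and wanted & set(p["tags"])
    ]
    return all_matches + partial

# Hypothetical records for illustration.
photos = [
    {"id": 1, "tags": ["kittens"]},
    {"id": 2, "tags": ["kittens", "puppies"]},
    {"id": 3, "tags": ["beach"]},
]

# Searching "kittens puppies" lists photo 2 first, then photo 1; photo 3 is excluded.
ranked = rank_by_tags(photos, ["kittens", "puppies"])
```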

Sorting by year and sorting by hashtag on your own photos can be combined, giving the user flexibility in how they filter their photos. All searches will persist throughout the four tabs in the profile view. This means a search can be made in the grid view, and the user can then switch to the map view to see the locations of the photos from that search.

Explore View

In the explore view, key functionality is being added to both the search bar and the main view itself. In the search bar, multiple hashtags will be supported, as described above in the profile view section. Another new feature is the ability to sort posts by location. This is activated by tapping the search bar, which brings up a set of tabs that allow searching in more specific categories. One of these categories will be “Places”. Tapping this category brings up a map where a user can enter locations and hashtags to narrow down which photos appear in the map view.

Another change to Explore is the way users will be able to see content relevant to them. Currently, Instagram shows users posts relevant to what they search. We believe this is an acceptable approach, but it leaves users open to being shown things they aren’t actually interested in. Our solution is to add another screen, via a Customize button in the Explore view, where a user can add and remove tags they are interested in, which will influence what this photo feed shows. This view will still populate automatically from the user’s searches, but will allow the user to add and remove tags dynamically in a separate view.

Storyboard Testing

Participants

Observations

0 — Not a usability problem

1 — Cosmetic problem

2 — Minor usability problem

3 — Major problem: should fix

4 — Usability catastrophe: must fix

Multiple Hashtags: Display and Explanation (Rank 2)

Different users had different expectations about how hashtags would work, especially in our easy task. When entering “kittens” and “puppies” into a search bar, it wasn’t clear to them whether the search results would be the union of the two sets or the intersection. We asked them which one they would expect; most said the intersection, but they said being able to see the union would be helpful too. Some suggested we have separate tabs, one for union and one for intersection. Others suggested we do it through conjunctive words (e.g., “kittens and puppies”, “kittens or puppies”).

It is clear to us that our current way of displaying search results is not as intuitive as it could be. Our goal is to find a clear way to tell users exactly what they’re looking at when they search by hashtags, without frustrating them by making decisions for them.

Date Selection: Calendar Button and Clarity of Use (Rank 3)

In our medium task, we asked users to try to filter photos by date. Our expectation was that they would use our Calendar button to do this. However, this proved to be very unintuitive. For most of the users we tested, their first instinct was to type a date into the search bar like a hashtag (e.g., “March 2013”). Many thought this would be easier than using the calendar button.

When we prompted the users to tap the calendar button, many of them didn’t know what it would do. Some expected the drop-down menu that we had; others expected the display pane to change into “mini-collages” of photos organized by year (much like the iPhone Photos app). One user also observed that the photos in the search results didn’t indicate which year and month they were posted. Adding this would be good feedback for the user, indicating that their date search worked.

Global Location Search: Places Tab and Location Specification (Rank 4)

Our difficult task had users choosing a location in the Explore tab. We expected them to do this by using the “Places” tab on our advanced search menu. This tab was on the same row as other tabs such as “Top”, “Tags” and “People”. Users were more inclined to click on these tabs, and didn’t notice the Places tab until prompted.

After clicking the Places tab, we expected users to use the top search bar to type in the location they wanted. Unfortunately, most of them wanted to click on the “Near Current Location” box, thinking this was a new search bar made specifically for locations.

The third issue occurred after they reached the map view. Getting there meant using the search bar at the top of the screen and then the search bar above the displayed photos. At this point, it was unclear to our users that the photos being displayed were filtered by both location and hashtag.

Prototype Changes

Explore View

All users looking to find photos of kittens posted in Boston searched for the hashtag kittens first, and then wanted to specify location. We plan to incorporate this search hierarchy and additional feedback about the process in two ways. First, by modifying the view that appears when users tap the search bar in the Explore View by removing the “Places” option. Typing the location into the search bar that otherwise is used for hashtags and usernames was confusing to users, as they expected to be able to interact with the “Near Current Location” text instead to specify the location, and search for hashtags in the main search bar. This leads to the second modification, which will be a new text field “All Locations” next to a pin icon (consistent with indications of location elsewhere in the app) that will appear on the main Explore View page, below the Trending section and within the Explore section of the Explore View. Users will be able to tap this and type in a new location (e.g. “Boston”) and if they do so on the main Explore View page this will display all photos posted in Boston. If they first search a hashtag in the search bar, the “All Locations” field will appear above the photo results and can be interacted with in the same way to specify which area to display results from (e.g. photos of kittens in Boston).

User Profile View

In the profile view we will be fixing the way dates are selected across the views and also the way the grid view displays them. Currently the grid view just shows photos with no way to discern their date information. We will adjust this so there is a year banner at the start of every year as you scroll through photos, and also month banners above sections of photos within that year. Instead of having a calendar icon, which users found confusing, we will instead show the current year the user is looking at where the calendar icon used to be. When the user clicks on this they will be able to select a year, as they could before, which will take them to the top of that year’s banner in the grid view.

Users all expressed confusion regarding whether searching two hashtags would display results that matched both criteria or either criterion. After collecting user feedback, it still wasn’t clear whether union or intersection should be the default setting. As a result, we decided to clarify what users were looking at when results were displayed. For example, when a user enters “kittens”[space]”puppies” into the search bar and hits enter, a text box saying “Displaying photos with both #kittens and #puppies” will appear above the resulting photos. The word “both” in this text can be tapped and changed to “either,” and the text will then read “Displaying photos with either #kittens or #puppies.”
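The both/either toggle described above maps directly onto intersection versus union of tag matches; the visible text changes and the underlying query mode changes with it. A minimal sketch, with illustrative record shapes:

```python
def search(photos, tags, mode="both"):
    """mode='both' -> intersection of tag matches; mode='either' -> union."""
    wanted = {t.lower() for t in tags}
    if mode == "both":
        # Photo must carry every entered tag.
        return [p for p in photos if wanted <= set(p["tags"])]
    # 'either': photo must carry at least one entered tag.
    return [p for p in photos if wanted & set(p["tags"])]

# Hypothetical records for illustration.
photos = [
    {"id": 1, "tags": ["kittens", "puppies"]},
    {"id": 2, "tags": ["puppies"]},
]

# "Displaying photos with both ..."   -> photo 1 only
# "Displaying photos with either ..." -> photos 1 and 2
```

Keeping the default at "both" matches what most tested users said they expected, while the tappable word exposes the union without a separate tab.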

High-Fidelity Prototype

Prototype: https://invis.io/W45ANYPS7

Changes from low fidelity prototype to high fidelity prototype:

  • Changed flow of searching by location
  • Removed the “Places” tab in the view following the user tapping the search bar based on user feedback.
  • This is part of our hard task. All users that we tested indicated that when searching for both locations and tags, it was more natural for them to search for tags first, then location. By removing the “Places” tab, users will naturally type in their tags first. They can then filter by location by using the new “All Locations” tab that appears on the top of the results screen. This is something most of our users wanted to do during testing; they wanted to interact with the location pin, so we implemented that functionality.
  • Added “All Locations” location search on main Explore view below “Explore Posts” to allow users to see all photos in a specified location without having to search by tag first.
  • This is to maintain consistency between the Explore view home screen and the search results screen.
  • Added customizable Explore view
    ◦ Added a Customize option to the Explore view so users can change which photos they browse through.
    ◦ Power users indicated that while they knew the Explore view photos were catered to their interests, they wished they could customize the selection rather than have it handled behind the scenes.
    ◦ Novice users will not be overwhelmed with too many search options, because the customization screen is hidden and can only be accessed through the Customize button.
  • Changed date icon to a drop-down menu and added date information to photos in Profile view
    ◦ Filtering by date in the Profile view is now done through a drop-down menu instead of a calendar icon. Users select which year to filter by from the menu.
    ◦ In our testing we realized that users expected a full calendar to pop up when tapping the calendar icon, so we simplified the control to avoid confusion. We also marked the first photo posted each month to offer a clear yet non-intrusive time indicator that users can reference while scrolling through the grid.
  • Added an “Add Tag” label to the space bar in Explore view search
    ◦ When typing in the Explore view search bar, the space bar is now labeled “Add Tag.”
    ◦ This makes users aware that they can add multiple tags and that the search will return the intersection. Previously, users in our tests expressed confusion about what would separate each tag when searching. The “Add Tag” label now makes it clear that pressing space adds a tag.

Prototyping tools used:

Sketch — Sketch has many benefits. It hooks up well with InVision for user testing, is cheaper than the Adobe suite while still offering powerful vector editing tools, and has a large community supplying assets. It also offers far more flexibility for customization than Balsamiq.

InVision — InVision makes it easy to prototype without actually building out functionality in code. We used InVision to hook up each screen that we designed in Sketch. This makes the prototype act like a fully functional application. InVision allows testing without downloading external software.

Final Prototype

Link to High Fidelity Prototype: https://projects.invisionapp.com/share/W45ANYPS7#/screens

Changes made to high fidelity prototype after final user test:

  • Changed the Customize button to just a gear icon
    ◦ During high fidelity testing, the user was drawn to the Customize button when trying to search for things in the Explore view. Because the user was looking to do a customized search, they saw the large Customize button label and decided to tap it, never thinking to use the search bar at the top. We made this button much more subtle so that it will not distract casual users, while the feature remains available for power users. The gear icon is widely accepted as representing settings and more advanced options, so we felt it was an appropriate choice.
  • Moved the “All Locations” button from the Explore view home page to the Customize page
    ◦ The blue “All Locations” button further distracted users from the search bar at the top. Reflecting on the 80–20 rule, we realized that filtering photos by location in the Explore view is an advanced feature, not part of the 20% of functions that users will use 80% of the time. Therefore, we moved the “All Locations” button to the Customize page.

Previous changes that worked during final user test:

  • Search by location workflow
    ◦ Removing the “Places” tab made the workflow much more in tune with what users would naturally do. Users now search by tags first, then by location (using the “All Locations” button on the results screen), and this was validated by our user test.
  • Drop-down menu to filter by year instead of calendar icon
    ◦ Changing the calendar icon to a year drop-down menu was validated by our user test. When asked to filter photos by date posted, the user immediately went to the Profile view and then to the year drop-down menu.
  • “Add Tag” label on space bar in Explore view search
    ◦ When searching in the Explore view, the user naturally pressed the space bar when trying to add a new tag. Seeing that the space bar was labeled “Add Tag” confirmed to them that this was the right action.

Conclusion

As a group we were extremely invested in our project. We all had ideas about how Instagram could work better for the end user, both power and novice, and we wanted to see them become a reality. This benefitted our group since we were all committed to making the best product we could, and we believe it showed at every step. We invested countless hours in productive conversations over minute details to ensure the highest-quality integration of our functionality with an existing system.

Our first challenge was task delegation within the group, which we recognize could have gone better. It was hard to coordinate who was doing what at different points in the semester, and unexpected life obligations frequently changed our plans. This meant that at certain times one or two members were left with a disproportionate amount of work. All team members were well-intentioned and eager to contribute, but we could have done a better job balancing the workload and communicating about outside obligations.

Another challenge was that incorporating new ideas into an existing product, without disrupting too much or completely revamping the design, proved harder than we anticipated. We opted to make our changes as seamless as possible while still accomplishing our functionality goals, rather than trying to redesign all or some components of Instagram. We wanted to preserve Instagram’s aesthetic and important features, so that if a user opened our version tomorrow, the transition would be smooth. This made it hard to implement drastic changes, as we had to balance our ideas with Instagram’s, and it meant a lot of iterative work and discussion on how to keep the look and feel consistent across our changes.

As an extension of this, we wish we had fleshed out our idea more at the beginning of the project. We found ourselves coming up with new features toward the end, which had us scrambling to add them to our final product. It ended up working out, but this could have been avoided with better planning from the start.

Our group, however, was more than well equipped to handle these challenges. Not only were we driven by our passion for the project, but we had a strong sense of teamwork. We each played to our strengths and covered each other’s weaknesses. We also each had a lot of experience working in groups and did not experience much friction or conflict.

All in all, our group was successful in what we aimed to do, and we came out with a design we were all proud of.

AUTHORS: Nick Delfino, Steve Jones, Catherine Feldman, Richard Yan