Improving a Banking App’s Direct Deposit Experience Through Data
Usability testing and 1:1 interviews are often heralded as the ultimate source of insights, but powerful information can be drawn from analytics as well.
During my time at BMTx, a white-label BaaS fintech company, I oversaw the UX design of the banking platform offered by a major telecoms company. One of the features offered was an automated direct deposit solution, powered by a 3rd party processing service. In addition to employment paychecks, the service was supposed to support federal benefits such as Social Security and unemployment, as well as IRS refunds.
This design process began as an attempt to improve the usability of the service and increase the number of account holders with direct deposit enabled. Along the way, we also identified and rectified a key issue with the messaging the 3rd party partner provided around federal deposits: it failed to communicate a critical final action users needed to take, effectively preventing them from setting up direct deposit for their benefits, unemployment, and stimulus checks during this time.
The Task
Direct deposit is seen as one of the most important user actions from the perspective of the bank. According to a survey by NACHA, more than 80% of US workers receive their wages via direct deposit. Getting customers to set up direct deposit can also increase retention: customers with direct deposit are more likely to view their bank as their primary financial institution, which leads to a higher likelihood of additional product uptake and greater overall satisfaction.
We had already added the perk of accessing direct deposit funds two days early, but the initial boost in adoption had quickly plateaued, and we now needed to find a way to increase the conversion rate of the overall process.
Research
With no budget allocated to the project for moderated usability testing, we decided to turn to data instead, in the hopes of generating some behavior-based hypotheses.
Qualitative Data
The next best thing to a customer survey was to see what customers had already told us. Upon reaching out to customer care, we found that they received a huge volume of requests related to direct deposit.
I spent a few days manually combing through these questions and sorting them by the general theme of the issue. Once color-coded and turned into a scatter plot correlating each issue with the volume of questions about it, the data clearly identified some of the main problems our users faced.
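As a rough illustration of the tagging and counting involved, here is a minimal sketch; the theme names and keywords are hypothetical, and in practice the sorting was done by hand rather than by script.

```python
from collections import Counter

# Hypothetical themes and trigger keywords, for illustration only.
THEME_KEYWORDS = {
    "setup_confusion": ["how do i set up", "where do i find", "routing"],
    "status_unknown": ["did it go through", "pending", "status"],
    "missing_deposit": ["never arrived", "missing", "didn't receive"],
    "splitting": ["split", "two accounts", "percentage"],
}

def tag_theme(ticket_text: str) -> str:
    """Assign a support request to the first theme whose keywords it mentions."""
    text = ticket_text.lower()
    for theme, keywords in THEME_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return theme
    return "other"

def theme_volumes(tickets: list[str]) -> Counter:
    """Count how many support requests fall under each theme."""
    return Counter(tag_theme(t) for t in tickets)
```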
Quantitative Data
In addition to evaluating the care communications, we pulled the available success metrics. While it's not possible to tie every direct deposit a user receives back to the direct deposit request it may have come from, we decided to define success as any submitted direct deposit request that was followed by a direct deposit appearing in the user's account at any point during the following month.
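As a sketch of how that definition translates into a metric, assuming two exports with hypothetical column names (submitted requests with request_id, user_id, and submitted_at; posted deposits with user_id and posted_at), and approximating "the following month" as a 30-day window:

```python
import pandas as pd

def request_success_rate(requests: pd.DataFrame, deposits: pd.DataFrame) -> float:
    """Share of submitted requests followed by a posted deposit within ~30 days."""
    merged = requests.merge(deposits, on="user_id", how="left")
    in_window = (merged["posted_at"] > merged["submitted_at"]) & (
        merged["posted_at"] <= merged["submitted_at"] + pd.Timedelta(days=30)
    )
    merged["succeeded"] = in_window
    # A request counts as successful if any of the user's deposits fell in the window.
    per_request = merged.groupby("request_id")["succeeded"].any()
    return float(per_request.mean())
```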
It was here we had the biggest shock: the share of users who submitted a direct deposit request was much higher than the share who ever saw a direct deposit successfully hit their bank account. This failure rate far exceeded what we would expect from normal user or system error and indicated a major flaw somewhere in the process.
Planning
Using the data to identify key problem areas and support our assumptions, we created a list of weighted user stories to serve as the foundation of our design planning. An important and often overlooked aspect of this step is giving attention to both the priorities of our users and the objectives of the business.
Including both perspectives in user stories helps identify any conflicts. While we weighted these user stories based on our own initial perspective, we reviewed them with stakeholders and adjusted the weighting based on the business and technology teams' perspectives. The stories were weighted on a scale of 0–3, with 0 being the highest priority and 3 the lowest. The user stories are presented below alongside the corresponding design solutions we introduced.
Design
In order to turn our user stories into actionable solutions, I started by marking up the flow of the current experience, calling out pain points as they corresponded to user stories and documenting possible improvements. During this phase, I worked closely with the technical team to understand the constraints of the 3rd party API: what order information had to be collected and submitted in, and when and how error statuses were returned.
Armed with that knowledge, I began to create a new flow, wireframing the possible solutions and design improvements we could make. Once all stakeholders and tech leads felt confident in the approach, I moved into high-fidelity designs.
Improvement: Re-imagining the Direct Deposit Dashboard
The existing entry point into the flow launched the user straight into creating a new direct deposit. There was no way to view the history of their submissions or see statuses. We decided to introduce a direct deposit dashboard that would serve as a historical overview of past requests and give users a place to return to when there was an issue or an additional action they needed to complete.
From this screen we created two paths: one to take users through the in-app request process, and another to give them access to their account information, the downloadable paper form for their employers, and relevant FAQs to answer straightforward functionality questions.
Clarifying the Account Selection & Splitting Process
Another key area we sought to improve was helping users understand what splitting a direct deposit meant. To do this, we first added a speed-bump prompting users to choose how they wanted to split their deposit, and then took them to the respective screen, which included a relevant help modal to provide more context.
Adding Error Validation & Alert Statuses
Finally, to address the confusion around the status of submitted direct deposits, we added inline error validation to catch problems with the entered information, along with an API call to confirm the validity of the form before fully submitting.
As a final step, we added a summary screen that provided the final status, letting users know whether their direct deposit request was being processed, fully submitted, or, in some cases, required an additional action on their part.
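Below is a minimal sketch of the kind of inline checks and status handling described above; the validation endpoint, payload shape, and status values are hypothetical stand-ins, since the real behavior was dictated by the 3rd party processor's API.

```python
import re
import requests  # assumed HTTP client

def routing_number_is_valid(routing: str) -> bool:
    """Basic ABA routing number check: 9 digits plus the standard checksum."""
    if not re.fullmatch(r"\d{9}", routing):
        return False
    d = [int(c) for c in routing]
    checksum = 3 * (d[0] + d[3] + d[6]) + 7 * (d[1] + d[4] + d[7]) + (d[2] + d[5] + d[8])
    return checksum % 10 == 0

def precheck_request(payload: dict) -> dict:
    """Ask a (hypothetical) validation endpoint to confirm the form before submitting."""
    resp = requests.post("https://api.example.com/direct-deposit/validate", json=payload)
    resp.raise_for_status()
    return resp.json()  # e.g. {"status": "processing" | "submitted" | "action_required"}

# Hypothetical mapping from returned status to the summary screen's messaging.
STATUS_MESSAGES = {
    "processing": "Your direct deposit request is being processed.",
    "submitted": "Your direct deposit request has been fully submitted.",
    "action_required": "One more step is needed to finish setting up your direct deposit.",
}
```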
Evaluation & Outcome
Post-launch, we evaluated the effectiveness of the final solution by tracking the decrease in direct deposit support requests and the correlated increase in successful direct deposit submissions. In total, the redesign led to a significant improvement in the success rate of submitted switches and a decrease in customer support requests.