Product Management Playbook

A Beginner’s Guide to Digital Product Development, and the Role of a Product Manager

Jonathan Moed
22 min read · May 22, 2018


This guide covers the digital product development process from the time a product/feature idea is conceived until after that feature is launched. It breaks down each product development phase and includes helpful tips and suggested deliverables. It also covers the many roles of a product manager throughout this process, including owner, champion, coordinator, communicator, documenter, and more.

In addition to drawing on my own experiences as a former product manager at retail e-commerce startup Jet.com, I also draw from learnings and publicly available materials shared by other members of the Product team at Jet.

If you come across any unfamiliar terms, you can consult the glossary in the appendix, or do an ol’ fashioned Google search.

The Process


To start, see below for the general structure guiding the product development and management process. This process is NOT necessarily linear, and will not always follow the same order of phases/events. That said, each product development process should at some point cover all of the below phases.

  1. Definition
  2. Socialization
  3. Discovery/Research
  4. User Requirements
  5. Design
  6. Delivery/Build
  7. Test
  8. Deployment
  9. Measurement
  10. Optimization

Before I dive into each phase in more detail, here are a couple of general product management tips upfront:

PM Tip: Get under the hood. As a product manager, it is crucial to proactively familiarize yourself with the systems and roles of your counterparts on the business, design, and technology sides. It builds relationships between you and the other teams, and will come in handy as you are thinking through what your feature is and how it should/will work. Practically, this means sitting with a designer or with a developer, watching, and asking questions to understand their tools and thought processes.

PM Tip: Always be learning. Web Product Management is a relatively new field, and has not yet been standardized. There are many different right ways to do product management, and you’re likely to learn countless strategies depending on the people you work with and the organizations you work for. All this to say…there’s always room to learn. The more you can interact with other PMs (within and outside of your company), and learn about the PM discipline at Meetups and conferences, the better.

And now…The product development cycle phases, and the role of the product manager in each phase:

1. Definition: outline an opportunity assessment — what the feature is (verbally and visually), why it makes sense to build, and who will be involved in the development process. Size the product opportunity at a high level.

The starting point for product development will probably be a vision or product strategy. Once these high-level descriptions are set, dictating what’s important to the future direction of your team/organization, it’s time to think about individual features that ladder up to and help make progress toward achieving this vision and strategy. Each feature should address a SINGLE primary problem with a SINGLE primary solution. To identify what this problem and solution are, it is often helpful to map out all of the current problems/issues and see whether they can be condensed to a common theme. This will allow you to focus your thinking. Also, it’s helpful to pull together people in your company who know about the focus area to ensure you have a solid understanding of the customer problem, and the user it affects.

In most cases, coming out of the Definition phase, not all of your questions will (or should) be completely answered, as details will change and be tweaked throughout the development process. Further definition and answering will occur in the “Discovery/Research” phase (keep reading ;)). The purpose of this exercise is to organize thoughts and provide stakeholders a high-level idea of the feature. This upfront communication is critical in establishing an efficient and productive dynamic between the product manager and the stakeholders.

As you are imagining a feature solving for a single problem with a solution that is impactful, usable, and feasible, the below questions will help formalize and document your thoughts, and articulate why this feature should be prioritized over other features. The more of these questions that can be answered, the easier it will be to validate/prioritize the feature later on in the process.

QUESTIONS:

  • Name (feature)
  • Name (business owner, team)
  • Stakeholders: Who are the key stakeholders (leadership) for this feature (including sign-off/advocacy and delivery)? Which teams do you anticipate providing input/support in the development and management of the feature?
  • Overview of the Feature (1–2 sentence description): Will this be improving upon an existing feature/experience or addressing an entirely unmet need (i.e. what is the foundation/baseline the feature will hopefully build upon)?
  • Which company objectives/goals does this feature align with?
  • What problem will this solve for the user and for the business?
  • Who’s the target market, and what’s the size of that market?
  • Why should WE invest in this feature, and why NOW? Are there any timing dependencies / moments associated with this feature (i.e. does it need to be built to support a specific experience)?
  • Do competitors offer similar features?
  • Is this a new feature using existing data or new data? Where is the data coming from?
  • What is the target impact of this feature (ROI), including all supporting calculations? Consider site performance (traffic), financial performance (sales, margin), SEO performance, and other business outcomes (e.g., partnership opportunities, PR). A simple sizing example follows this list.
  • What metrics are we trying to move (key KPIs)?
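
As a simple illustration of sizing (hypothetical numbers only; use your own traffic and conversion data): if a feature is expected to reach 500,000 monthly visitors, be used by 5% of them (25,000 users), and lift conversion for those users from 3.0% to 3.3% on a $50 average order value, the incremental impact is roughly 25,000 × 0.3% × $50 ≈ $3,750 per month, or about $45,000 per year, to be weighed against the estimated development and maintenance cost.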

PM Tip: Tread lightly. As you define, you’ll be tempted to reach out to tech teams to get a better understanding of how things work today, and the potential technical requirements and timelines for the feature. It’s alright if there is an informal back-and-forth with your tech counterparts, but it’s always better to provide something for them to react to rather than posing open-ended questions early on. When you do engage, be sure to clarify that your research is purely exploratory at this point. Each team/function has its territory, and while everyone is working together and collaborating, your success as a PM depends on maintaining close and informal relationships, so that no one you speak with feels like their territory is being encroached on.

Deliverable: a “Product Brief” detailing the product, the opportunity/impact, KPIs/metrics to track, stakeholders, any risks/dependencies etc.

2. Socialization: engage the feature stakeholders, and discuss the feature with them.

The reason this step is necessary is both to ensure that a feature is as fleshed out as possible and has input from across teams, and to allow the teams to raise any concerns/issues.

If there is a system through which new/in-progress features are tracked, open a ticket for your feature (e.g., Jira, TFS). This ticket will hold all progress and content created throughout the process, and will ultimately turn into the sprint ticket for build (i.e. the ticket that the developers will refer to). A ticket template used to communicate feature details between the product manager and technology stakeholders might, for example, include fields like the following:
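
  • Feature name and one-line summary
  • Problem statement and target user
  • Proposed solution / expected behavior
  • Links to the Product Brief, designs, and user stories
  • Acceptance criteria
  • Analytics/tracking requirements
  • Dependencies, risks, and open questions
  • Priority and target sprint/release

(Illustrative only; adapt the exact fields to whatever your team’s tracker already uses.)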

After creating a ticket or creating a destination for documentation related to your feature, the next step is to share the Product Brief and documentation with your core PM team + manager, as well as the following stakeholders (assuming these stakeholders/teams exist in your company. If not, it is up to you as the PM to be the jack of all trades):

  • Product group mailing list: inform them of your work
  • UX: begin discussion around the design/UX resourcing needed
  • Analytics: inform and make sure the feature is on their radar in case any further analysis is needed and to ensure effective tracking is in place if/when the feature launches
  • Build product manager/your tech counterpart: begin discussion around scope and tech resourcing needed
  • User Research: begin discussion around user research needed to give the Research team enough time to source and enroll eligible research candidates
  • Native (app) team: inform if the feature will be launched on native as well as desktop/mobile web

PM Tip: KYA (know your audience). When you engage these stakeholders/teams, distinguish between ‘inform’ stakeholders and ‘responsible’ stakeholders. Inform stakeholders should be notified and kept in the loop about the feature, while responsible stakeholders will potentially play a role in the design and build process. An email to the inform stakeholders outlining your work should suffice, and a wiki page is even better (a web-based product called Confluence is a great tool for this). Responsible stakeholders should be engaged in person — try to consolidate into as few meetings as you can while still allowing for direct communication.

PM Tip: Pre-work is key. In advance of each meeting, come up with the set of questions you need answers to, and which teams you think should provide those answers. That way, the meetings are not a one-way presentation in which you simply give an overview of the feature, AND you’ll keep the momentum going after the meeting.

PM Tip: Get the word out. One of the many hats you wear as a PM is that of a salesman. It’s up to you to get other teams/stakeholders excited about your feature, because they have many other features they are building or could be building. Oftentimes, features will be built or not built based on the wishes of the senior leadership team, or other senior stakeholders. As the product champion, it is your responsibility to convince others to buy in.

Coming out of these stakeholder meetings, and any follow-up discussions, you should have a good sense of whether it makes sense to continue pursuing the feature. The stakeholder meetings should result in 1 of 2 outcomes:

  • Go: the feature is slotted into the Engineering team’s roadmap (and development sprints are dedicated to the feature). The PM’s role moving forward is to conduct additional research, refine product details, and develop requirements and designs before the product feature is ready to be built
  • No-Go: the product feature will not be built at this point based on the initial proposal

If No-Go, understand why and work to improve your idea. If Go, proceed to Phase #3 — Discovery/Research. In most cases, the feature requires more research and definition; however, in rare cases, the feature is fully fleshed out, and can be slotted into the Delivery roadmap assuming requirements and designs are created (see Phases #4–6).

Note: many companies plan their feature prioritization and release schedule on a quarterly basis, or some other regular interval across several teams. This can be separate from, but informed by, your stakeholder conversations. Make sure to have your Product Brief ready and to have these stakeholder conversations in advance of any broader prioritization discussions, so that you can advocate for your feature, and that others in the room will already be aware of the initial scope.

PM Tip: Don’t take it personally. When you spend a significant amount of time and effort thinking through and investing in a feature, the natural inclination (consciously or sub-consciously) is to become attached to that product. This could bring with it biases and could mean you are hesitant to ‘let go’ of the product, even if the support/value is not there. Try to maintain an objective point of view surrounding the value/impact of your feature to your company and users, and the quality of evidence you have to speak to this value/impact. Putting work into a feature that is not prioritized is NOT a failure, as the learnings will contribute to further exploration until the feature is optimized and ready to be built.

3. Discovery/Research: answer and confirm 3 questions (from the perspective of the user and the business): is the feature usable, impactful, and feasible?

  1. Is the feature usable? (usability: does it solve a user problem and provide them a solution that they can easily access)
  2. Is the feature impactful? (value/impact: does it add value to the business and to users)
  3. Is the feature feasible? (feasibility: based on current resources, can this feature be built and delivered on-time and with a level of quality ensuring it is impactful and usable)

Key stakeholders specifically for Discovery work: UX, User Research, Engineering, Analytics, Content/Copy, SEO, and, depending on the type of project, support functions (e.g., Legal)

To answer the above 3 questions, it’s best to consult directly with the end-user through user research as much as possible (see appendix for user research tools/resources), as well as looking at competitors and industry leaders. Upon evaluating the results of your research, you and the other feature stakeholders will again conclude whether the feature is ready to be slotted into the roadmap, the feature requires more Discovery/Research, or the feature is ultimately not worth further investment.

PM Tip: Inform always. Throughout the Discovery process (and as a good general rule), maintain consistent communication with your stakeholders, updating them on any progress and keeping them informed of any developments. This could take the form of a weekly status update email, or a dedicated Slack channel.

PM Tip: Cut through the noise. When it comes to communication, maintain a single source of truth or master feature document. Given how many people are typically involved in the feature development process from beginning to end, it’s critical to consider how you communicate, and what you share. Try to maintain a single channel of communication containing the key feature insights (splintered email chains to different stakeholders or numerous Slack channels are a recipe for disaster).

Deliverable: detailed “Product Brief” or feature overview document including further developed problem statement and opportunity/solution.

4. User Requirements: document user stories.

These stories describe different cases or scenarios, including the context, an action taken by the user, and a resulting reaction. They depict how your feature should respond to the behavior of the user. Here’s an example of a story (for a hypothetical “save for later” feature), which follows a structure known as BDD (behavior-driven development):
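
  Scenario: A signed-in user saves a cart item for later
  Given a signed-in user with an item in their cart
  When the user clicks “Save for later” on that item
  Then the item is removed from the cart and appears in the user’s “Saved for later” list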

The case is the “scenario” statement, the context is the “given” statement, the user action is structured as the “when” statement, and the feature reaction or response is structured as the “then” statement.

The purpose of this exercise is to think through as many cases and variations as possible. Ideally, for any and every action performed by the user, there should be a decided-on corresponding response. This prevents a broken user experience. While your goal should be to cover as many relevant cases as possible, there is NOT an expectation that you will be able to think of everything. This is an iterative process where both you and the Engineering team will tweak and improve the cases.

There are several variations on the structure and syntax of these stories, so you should consult your tech counterparts to align on their preferred format. This will be one of the key documents used by the developers to build your feature, so it is important to write it in a style they are familiar with. As mentioned above, find time to sit with the Engineering team to review and iterate on these cases.

PM Tip: Organize your work based on priority. In crafting your cases/stories, separate popular cases from rare/edge cases (i.e. create a separate section for edge cases). This creates a natural organization of which cases are more common/important to address.

Deliverable: user requirements document outlining user stories for your feature.

5. Design: advise and work with the UX Design team to develop high-fidelity design mock-ups (simulating the actual look and feel as opposed to sketches or wires).

You will likely already have some form of initial wireframes or sketches that were used in user research, and your user requirements will serve as the guide to building designs that reflect an exceptional user experience.

The UX Design team will typically own the development of the designs. As the PM, your role is to provide guidance, feedback, and to make sure your research and input is reflected in the designs. The designers should and will guide the visual layout, and you will ensure this matches the desired experience. You can and should also loop in relevant stakeholders and share the designs as needed.

Design hand-off tools such as Zeplin are useful here: they let the team review high-fidelity screens with notes and feedback captured alongside each screen.

PM Tip: Look forward. As you go through the Design (and Delivery/Build) processes, brainstorm ways the feature can be used for different applications, or additional future uses beyond what it’s being built for today. Rather than thinking of the feature as a distinct enhancement, think of it as version 1.0, both because you can continue to iterate on the feature once it’s launched (see phase #10), and because in all likelihood this feature can have multiple uses. Your UX Design and Tech counterparts will be much more invested in the feature development process if they understand that it is not a one-off development, but will inform and serve as the foundation for much more development work in the future.

Deliverable: high-fidelity design mock-ups

6. Delivery/Build: propel the feature from an idea and some documents into a finished product.

While this phase is called the Delivery/Build phase, the role of the PM in this phase can be summed up in one word: MEETINGS. It is your job to schedule and manage several meetings intended to set the stage for and guide the actual build, which is led by the Engineering team (i.e. developers).

The first step is to package up 3 main documents: your Product Brief, your user requirements, and your designs. These will be the materials used to build your feature, and may be referred to as the feature “requirements.” Once these materials are organized, it’s time to guide the translation and explanation of these documents through separate meetings with Build/Engineering stakeholders to scope and organize the work, UX stakeholders to review flows, and Analytics stakeholders to align on how best to track the impact of the feature once deployed.

Following these meetings, the Engineering team, equipped with the information it needs, begins to code/build the feature. Depending on the feature, this build could span either or both front-end and back-end development. Assuming the product development process is on-track, the build will take place in the sprints in which it was originally slotted.

Throughout the build process, you maintain contact with the lead designer and lead developers to ensure the product is what it is meant to be. This could take the form of weekly meetings, or less formal communications. The developer will likely have different versions of the feature (some less finished than others) to show you and hear your feedback. You should also inform supporting stakeholders such as the Marketing team about the feature and its timeline, so that any efforts to support the feature launch can be prepared.

Additionally, while the build is underway, you should take the time to begin developing a reporting dashboard, using an analytics platform such as Adobe Omniture or Looker. Getting started on this dashboard early on will save you time down the road, and will ensure that once the feature launches, you are not scrambling to measure the performance properly. If your feature is improving upon or adding to an existing process, make sure to include current performance — the status quo performance. This is incredibly important as it serves as a performance baseline against which the new feature will be measured and evaluated. Feel free to meet with and enlist the help of the Analytics team if relevant — they’re the experts on building analytics dashboards and can confirm that you’re building the dashboard correctly.
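
To make the baseline idea concrete, here is a minimal sketch of the comparison a dashboard should support, written in Python with hypothetical numbers (in practice these figures would come straight from your analytics platform):

  # Hypothetical pre-launch (baseline) and post-launch figures for the same metric
  baseline = {"visits": 120_000, "orders": 3_600}      # e.g., the 4 weeks before launch
  post_launch = {"visits": 118_000, "orders": 3_894}   # e.g., the 4 weeks after launch

  baseline_cr = baseline["orders"] / baseline["visits"]      # baseline conversion rate
  post_cr = post_launch["orders"] / post_launch["visits"]    # post-launch conversion rate
  lift = (post_cr - baseline_cr) / baseline_cr               # relative change vs. baseline

  print(f"Baseline conversion: {baseline_cr:.2%}")
  print(f"Post-launch conversion: {post_cr:.2%}")
  print(f"Relative lift: {lift:+.1%}")

Whatever tool you use, the point is the same: capture the “before” numbers now so the “after” numbers have something to be compared against.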

PM Tip: Stand up. In many cases, the Engineering team will hold daily stand-up meetings to talk through the day’s progress and surface any issues/concerns. This is a great and frequent forum for the PM and the designer to stay informed and ask questions as they come up.

Deliverable: compilation of requirements: brief, user stories, designs

7. Test: test pre-release and make sure the feature works as it should. The user stories come in handy in this phase, as they guide how you test the feature.

After the build work is ‘complete,’ the feature must be tested before being deployed. Typically, the feature will be in a QA or staging environment for pre-launch testing (i.e. a site that mimics what the end user sees, but is accessible only to those with permission). As stated above, use the user stories as a guide; put yourself in the user’s shoes and walk through the steps of the user story. Does the feature behave like it should? If not, document or ‘log’ any bugs you find and communicate them to the developer. Make sure to test on different devices (desktop, mobile, app, different phones/operating systems). If possible, make this testing a group activity — get the developers, designers, and you together (and anyone else with intimate knowledge of the feature) and divide and conquer. This saves time and you can have fun with it — a win-win!

There are 2 types of testing you can leverage (1 internal and 1 external):

  1. UAT (User Acceptance Testing): Cross-platform and device testing done as a group to allow for conversation and to consolidate feedback.
  2. Crowdsourced QA: Testing outsourced to additional users through a platform such as Applause to gather additional feedback and log bugs.

PM Tip: Plan backwards. It’s critical to leave enough time for testing before a feature is released. Oftentimes the timeline is tight as the feature approaches its scheduled launch date, and it’s tempting to compromise on testing time to stay on schedule. Do not fall into this trap! Make sure to leave several days for testing, and more importantly for fixing any bugs that come out of testing. Work backwards from your launch date to set aside this window.

PM Tip: Stay organized. There will be many people involved in QA testing, each testing different use cases across different platforms. It’s crucial to maintain a SINGLE record of what has been tested by whom, and any bugs or issues that require follow up work. This minimizes duplication, and makes sure everyone is on the same page. If possible, divide up the testing work upfront so that each person knows his or her scope and there is no overlap.

Deliverable: testing log outlining testing use cases, QA owners, results, identified bugs, and resolution statuses.

8. Deployment/Launch: review and double-check that all steps have been followed, teams notified, and precautions taken. Give the green light to launch.

It’s the homestretch — launch is in sight. As the PM, it’s your job to pilot the launch, making sure each person involved knows their role, and that there’s clear communication about when the feature will launch. This communication is not only with the feature developers and designers, but also supporting functions like Marketing and the Business teams. These functions must not be forgotten, because the site placement and marketing of the feature is just as important as the feature itself. These other teams are instrumental in getting word about the feature to users, and making sure the feature is discoverable so that users can take advantage of it. The first step to users using the feature is users discovering the feature.

In advance of the launch, take the time to craft an email to be sent upon launch, recapping the work that went into the feature development, detailing why the launch of the feature is important for the business — why it’s exciting — and most importantly, thanking all of the people who helped along the way. Be sure to include these people on the email, offering them credit, as well as any senior leadership who have been/are invested in the feature development. It’s great to be able to promote the feature within the business, and to make a splash when it launches. Oftentimes, because you’ve been working heads down on the feature for so long, you lose sight of the fact that many people within the company probably don’t know about it, but would love to hear more. Take the opportunity to spread the word.

On launch day, it’s best to gather the core team together, so that if there are any pressing issues once the feature is launched, they can be dealt with swiftly. Assuming all of these precautions and preparations have been completed, it’s time to flip the switch!

9. Measurement: measure feature performance to validate/update assumptions about feature functionality, and end user reception & adoption.

You will measure several different types of performance, at several different intervals. The business stakeholders will want to see any immediate results, and while it’s tempting to show these results, it’s also important to make sure you have enough of a data sample to feel confident in reporting on the results. The types of results you’ll be measuring include, but are not limited to:

  • Site performance (traffic)
  • Financial performance (sales, margin)
  • SEO performance
  • Other business outcomes (e.g., on-boarding new partners, vendor funding)

The target impact across these performance types — the ROI — will have already been defined in the Definition phase (Phase #1), as well as the metrics used to measure these results. In Phase #6, hopefully you started building a reporting dashboard. Now is when that dashboard will come in handy, although this time with live data. Taking the time to schedule an Analytics touch point in that phase means that hopefully anything that needed to be in place in order to properly track the feature is in place. This covers various triggers or events related to the feature (in other words, if the feature leads users to take certain actions, the point and frequency at which these actions occur will be captured).

Measure the performance of the feature within the first week to get an initial sense of performance, and keep a constant eye on performance going forward. You can take screenshots to illustrate key metrics and results, and communicate performance to stakeholders.


Importantly, continue to document findings related to the feature. Regardless of whether the feature is hitting the targets that were originally estimated, try to tease out key learnings about what went right, and what could be improved. These post-launch findings can be added to the Confluence/wiki page that’s already set up. The reason documentation is so important is not only to retrospectively evaluate the feature development process (the good, the bad, and the ugly), but also to inform future feature builds. It’s vital for other PMs and developers to understand what has already been done, and what they can work from. There’s no point in reinventing the wheel each time a feature is being developed!

PM Tip: Tell a story with the data. Showing stakeholders a series of numbers is important, but make sure those numbers are accompanied by an explanation or story — how is the feature performing, and WHY is it performing as such.

PM Tip: Conduct a post-mortem. It’s best practice to gather everyone involved in the feature from inception to launch after the feature is deployed to participate in a “post-mortem” or “retrospective.” This in-person meeting is an open forum for anyone to talk about any suggestions for future improvement (whether process oriented or technology oriented). The more participation, the better. You can hand out sticky notes so people can write down and present their thoughts and suggestions.

Deliverable: feature performance dashboard, displaying key metrics and live data, organized in a digestible format.

10. Optimization: tweak and improve. Do everything you can to make sure the feature is optimized, and is doing what it is meant to do. This involves quick decision-making and execution.

Once the feature is launched, that doesn’t mean it can’t be improved and tweaked. View this final phase as the beginning of another test-and-learn cycle. Given the real data you now have about the performance of the feature, and hopefully hypotheses for why users are interacting with the feature as they are (or aren’t), it’s time to course correct. The more data you have, the easier it will be for you, along with your cross-functional feature collaborators, to make tweaks to improve the feature and its performance.

Some of these tweaks will be obvious — bugs or technical issues. Others will be more nuanced, and will be rooted in user feedback. Whatever the case may be, don’t hesitate to continue iterating on the feature. This is your chance to see how the changes you and the team make to the feature impact its performance in a quick feedback cycle with real users in a real production environment.

It’s extremely hard to completely optimize a feature, but attempting to do so marks the final phase of the product development process. Going forward, your ‘baby’ has grown up, and will require less attention. It’s time to turn your attention to a new project — the next great feature.

Appendix

Glossary:

  • Agile: a software development methodology encouraging rapid and iterative development cycles (as opposed to a single large, polished cycle and release known as ‘Waterfall’)
  • Back-end: back-end developers would typically say they focus more on the plumbing, infrastructure, and algorithms that drive the flow of data in a system.
  • Delivery: the build process through which the feature is developed and deployed
  • Discovery: the research process through which a feature is researched and initially validated and selected for development
  • Feature flag: A feature flag should be used when a new design or behavior is to be tested with a small amount of traffic before going live. The use of a feature flag allows the old code to still work as it did before, so that if the test is a flop the flag can be disabled and the new code removed before the next release. The opposite is also true: if the test is successful, the flag can be enabled for 100% of traffic and the old code removed before the next release (see the simple sketch after this glossary)
  • Front-end: front-end developers would typically say they focus on the last leg of the user experience — the UI (user interface), the browser, HTML, forms, etc.
  • Full stack: both front-end and back-end
  • MVP: minimum viable product. This is the most basic version of a feature that will accomplish the intended results. It’s a starting point, but one that is usable, and a stepping stone toward product-market fit.
  • Product backlog/roadmap: a list of product features to be explored and/or built all fitting within certain themes based on an overall product vision and strategy
  • Product/feature: in this context — an enhancement or capability whose goal is to make a digital experience better or easier (either for a backend user or end-user/customer)
  • Sprint: a time-boxed period (typically 2 weeks) in which a piece of, or the entirety of, your feature is built
  • User story: an Agile development tool describing a feature from an end-user perspective. The user story describes the type of user, what they want, and why
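
To illustrate the feature flag idea above, here is a minimal sketch in Python (hypothetical flag name and rollout logic; real products usually rely on a flag service or configuration system rather than hand-rolled code):

  import hashlib

  ROLLOUT_PERCENT = 10  # expose the new behavior to roughly 10% of users

  def is_flag_enabled(user_id: str, flag_name: str = "new_checkout_banner") -> bool:
      """Deterministically bucket a user so they always see the same experience."""
      bucket = int(hashlib.md5(f"{flag_name}:{user_id}".encode()).hexdigest(), 16) % 100
      return bucket < ROLLOUT_PERCENT

  def render_checkout(user_id: str) -> str:
      if is_flag_enabled(user_id):
          return "new checkout banner"   # new code path under test
      return "existing checkout"         # old code path keeps working as before

If the test flops, set ROLLOUT_PERCENT to 0 (and later delete the new path); if it succeeds, set it to 100 and retire the old path.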

Resources/Tools:

Market/Competitive research:

  • Sales data: Nielsen
  • Market share, key players, and trends: IBISWorld, Euromonitor, Forrester, Gartner
  • Statistics: Statista

Company performance:

  • Traffic/site performance: Adobe Omniture, Google Analytics, Amplitude, Mixpanel, Heap
  • Sales: SQL/Looker
  • Vision/Priorities: hear from leadership; speak to business teams

User research:

  • Onsite user surveys (specific to your site): Hotjar, Sprig
  • Market user surveys: Google Consumer Survey, SurveyMonkey
  • User purchase behaviors: Google Consumer Barometer (free)
  • Prompted user testing: in-person lab testing, usertesting.com, usabilityhub.com, Qualtrics
  • NPS: Promoter.io
  • Tree-testing (decision trees): Optimal Workshop

Documentation:

  • Confluence (wiki)
  • Notion
  • Slab
  • Sharepoint
  • Dropbox/Paper
  • Google Docs
  • Dropmark (visual layout)

Project Management (with tech teams):

  • Jira
  • Airtable
  • ClickUp
  • TFS
  • Trello
  • Asana
  • Wrike

Prototyping:

  • InVision
  • Figma

QA Testing:

  • Applause

Design:

  • Figma (design, prototyping)
  • Zeplin (sharing and discussing designs)
  • Sketch (creating designs)
  • Axure (creating designs, interactivity)
  • Balsamiq (creating designs — basic)
  • Pencil (creating designs- basic)

Learning & Development:

  • General Assembly
  • Codecademy
  • Udemy


