As with many startups, we’ve gone through a great deal of change at TaskRabbit. In the two-ish years I’ve been here, we’ve worked on a diverse array of products while also trying various organizational structures. Since the question of how product teams are organized and how they work is a common discussion topic, I figured I’d throw our hat in the ring to describe how we operate.
While most people are familiar with our core consumer product where Clients hire Taskers to get their tasks done, we actually have quite a few products and codebases that complement and support the core experience:
- Web - the core hiring experience, the Tasker onboarding experience, and our marketing site
- Client iOS - the core hiring experience on iOS
- Client Android - the core hiring experience on Android
- Tasker iOS - a separate app built expressly for our Taskers: it allows them to find work, get hired, chat with Clients, and report hours worked.
- Tasker Android - same as above, but on Android
- Admin web - the admin site is used by everyone internal at TaskRabbit. It allows our member services team to support both Taskers and Clients through the entire task lifecycle should they ever need help. Our marketing team uses it to manage content and our trust and safety team uses it to review any shady tasks that come in. These are just a small handful of the tools we’ve built to support our internal business.
- Platform/API - the API that all of our client applications consume, including our web product.
There are currently two Product Managers in the company supporting all of these products, and we split our responsibilities by client or platform. Melissa currently owns the iOS and Android products while I own the web, API, and admin products. We are each responsible for, and have autonomy over, our respective backlogs. Despite the wealth of products, the teams are split into three groups: Android, iOS, and web/API. Each of these teams follows the fairly standard agile rituals of sprint planning, scrum, retrospective, and grooming.
We have a joint roadmap that we maintain in Trello as well as on a physical wall.
Trello functions as both an active and passive form of communication. High-level initiatives and projects are represented in our Trello board and any interested party is able to subscribe to said projects. As the project takes shape and decisions are made, it’s the PM’s responsibility to update the Trello card so all relevant parties are notified. Projects move through a few phases:
Icebox -> Backlog -> On Deck -> Discovery -> Dev/Design -> Done
- Projects in the Icebox aren’t currently prioritized.
- Projects in the Backlog are prioritized and slated to be started in the next 90 days (roughly).
- Projects that are On Deck are to be started in the next couple of weeks. The purpose of On Deck is to communicate to any potential stakeholders that, if they are interested in this initiative, they can get any relevant data or thoughts ready for the project to be kicked off.
- Projects in Discovery have been kicked off and we’re now nailing down the exact problem we’re solving and doing research to further validate or invalidate our assumptions. The goal is to have narrowed down our options and to have a solution defined. Often at this point we have wireframes and rough UX prototypes such that clear user stories can be written.
- Projects in Dev/Design are exactly as described— based on the user stories that have been written, engineers and designers pair together to bring the wireframes and rough UX prototypes to production-level functionality.
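The phases above form a simple one-way pipeline, which can be sketched as a small state machine. This is purely illustrative; the `Project` class and its methods are hypothetical, not part of our actual tooling (the real board lives in Trello).

```python
# Illustrative sketch of the roadmap phases as an ordered pipeline.
# Phase names come from the post; the Project class is hypothetical.

PHASES = ["Icebox", "Backlog", "On Deck", "Discovery", "Dev/Design", "Done"]

class Project:
    def __init__(self, name, phase="Icebox"):
        assert phase in PHASES
        self.name = name
        self.phase = phase

    def advance(self):
        """Move the project to the next phase; 'Done' is terminal."""
        i = PHASES.index(self.phase)
        if i < len(PHASES) - 1:
            self.phase = PHASES[i + 1]
        return self.phase

p = Project("Tasker chat improvements")
p.advance()  # Icebox -> Backlog
```

Notice projects only ever move forward; a de-prioritized project would simply be moved back to the Icebox by hand on the board.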
Once the project is deployed, we measure, wherever possible, the impact of any given change or feature, using some combination of MixPanel, Google Analytics, and good ol' MySQL.
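To make the kind of measurement we do concrete, here is a minimal sketch of a before/after comparison of a conversion metric around a feature launch. The launch date, session data, and metric definition are all made up for illustration; the real analysis runs against MixPanel, Google Analytics, or MySQL rather than an in-memory list.

```python
# Hypothetical sketch: comparing a hire-conversion metric before and
# after a feature launch. All data here is invented for illustration.
from datetime import date

LAUNCH = date(2015, 6, 1)  # hypothetical launch date

# (event_date, visited, hired) per Client session -- sample data
sessions = [
    (date(2015, 5, 20), True, False),
    (date(2015, 5, 25), True, True),
    (date(2015, 6, 3),  True, True),
    (date(2015, 6, 10), True, True),
]

def conversion(rows):
    """Fraction of visiting sessions that ended in a hire."""
    visits = [r for r in rows if r[1]]
    hires = [r for r in visits if r[2]]
    return len(hires) / len(visits) if visits else 0.0

before = conversion([r for r in sessions if r[0] < LAUNCH])
after = conversion([r for r in sessions if r[0] >= LAUNCH])
# For this sample data: before = 0.5, after = 1.0
```

In practice the same comparison is usually a GROUP BY over an events table, but the shape of the question (metric before vs. after, same definition on both sides) is what matters.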
Once a week, Melissa and I create a Product Review presentation that is presented to the executive team and shared with the entire company. In this Product Review, we review how the various products are performing against our business goals, present the Product Roadmap, and then dig into how individual features performed. We try to use the Product Review both to communicate how the products are performing and to initiate discussions about where we believe the product and business should go. As often as possible, we go into the Product Review session with clear answers as to why our various metrics are performing the way they are, clear next steps for projects, and opinions and asks for Marketing and Operations based on how our features have performed.
This post is just a high-level snapshot of how we work, but it’s worth noting that our process is very fluid. As our team grows and changes, we’ll almost certainly be changing our process in order to be (hopefully) more efficient and effective.