Sam Ostrom's Work Portfolio
Product Feedback and Data Analysis
Summary: I implemented a product feedback process and analysis for both trial and current customers so we could better understand which features we needed to build. I also helped train our Sales and Customer Success departments on follow-up questions to ask when we received direct or indirect feedback.
Why did we undertake this initiative?
When I first got to Autify, our product feedback was a mess. Most feedback was never submitted, and what did come in was dumped into a single bucket. Product was implementing features based on whoever was speaking the loudest internally or whatever Engineering wanted to build. There was very little analysis being done on the feedback, and no qualification of whether a feature we built would actually have any impact or how much risk it carried. Sales and Customer Success were under heavy fire from customers to deliver new features, and our feature production was extremely slow.
What did I do to complete this initiative?
This was another job that spanned my entire time at Autify. There's so much I did that I can't list it all, but here are the major highlights:
1) I went through EVERY single piece of feedback from every one of our channels (email, support tickets, Slack, demo videos, etc.) and categorized them into buckets: which product it pertained to, whether it needed more clarification, which part of the product it referred to, importance of the customer, and a variety of other factors. It took me months to get caught up, as we had thousands of pieces of feedback that had either never been submitted or never been categorized at all.
2) During this collection and categorization work, I introduced a tool called Productboard to the company to help with feedback collection and analysis. I implemented the process for feedback submission and worked with Product to guide their feedback analysis.
3) Once the feedback process was in place, I analyzed which pieces of feedback came up most often, connected them to the customers who submitted them, and scored them based on revenue impact: feedback from higher-revenue customers received a higher impact score. I did the same for potential revenue gains from trial customers. (A minimal sketch of this scoring idea follows this list.)
4) I took these scores and my feedback analysis to leadership to show them where our product was severely lacking. After hearing this, leadership asked me to work with Engineering to analyze which features we could build, how long each would take, and the risk each one carried.
5) From there, I worked continuously with Engineering to weigh which features we would build and put them on our roadmap.
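To give a concrete sense of how the revenue-weighted scoring in step 3 worked, here is a minimal sketch. The field names, weights, and example numbers are hypothetical illustrations rather than the exact model we used at Autify; in practice this lived in Productboard and spreadsheets rather than code.

```python
# Hypothetical sketch of revenue-weighted feedback scoring (illustrative only).
from dataclasses import dataclass

@dataclass
class FeedbackItem:
    theme: str                  # e.g. "cross-browser support"
    customer_arr: float         # annual recurring revenue of the submitting customer
    is_trial: bool = False      # trial customers count as potential revenue
    potential_arr: float = 0.0  # estimated deal size for trial customers

def impact_score(items: list[FeedbackItem], trial_weight: float = 0.5) -> dict[str, float]:
    """Aggregate feedback by theme, weighting each mention by customer revenue.

    Current customers contribute their ARR directly; trial customers contribute
    their potential ARR discounted by `trial_weight` (an assumed discount).
    """
    scores: dict[str, float] = {}
    for item in items:
        weight = item.potential_arr * trial_weight if item.is_trial else item.customer_arr
        scores[item.theme] = scores.get(item.theme, 0.0) + weight
    return scores

# Example: two paying customers and one trial prospect ask for the same feature.
feedback = [
    FeedbackItem("cross-browser support", customer_arr=120_000),
    FeedbackItem("cross-browser support", customer_arr=40_000),
    FeedbackItem("cross-browser support", customer_arr=0, is_trial=True, potential_arr=60_000),
    FeedbackItem("test scheduling", customer_arr=25_000),
]
print(sorted(impact_score(feedback).items(), key=lambda kv: kv[1], reverse=True))
```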
What were the results of this initiative?
I'm very proud of the results of this initiative. There are a lot of things that came from it, some of them unforeseen. Below is a list of major results.
1) Features actually began shipping faster and with more intention behind them, delivering real impact to revenue. We saw an increase in sales conversion and a decrease in customer churn rate.
2) Sales and Customer Success had a lot more clarity around our roadmap and could tell clients which features were coming to production.
3) Product and Engineering were actually happier with the process after it was implemented.
4) Cross functional departments felt more connected with the product. Internal surveys showed that employee satisfaction with feature production and roadmap transparency improved.
5) Educational documentation was requested so often that it spurred me to build Autify University, our help center. I go into this in a separate section.