Aaron Smith is an executive product leader, most recently Chief Product Officer at Pattern, an ecommerce acceleration platform. He has also held product leadership roles at Amazon, Grubhub, and TrueCar, served on the Rutgers University CX Advisory Board, and taught as a Product School instructor. He now runs his own consulting firm, Rebel Diamond Consulting, helping companies improve tech investment ROI, build an innovation culture, and scale at low cost using offshore teams.
In our conversation, Aaron talks about how to identify (and steer away from) vanity metrics and measure actual business impact instead. He shares how eliminating vanity metrics builds trust around success throughout the organization. Aaron also discusses the importance of understanding the root of customer problems rather than focusing on prescriptive feedback.
Product management at the center of tech
Can you tell us a little bit about how you got your start in product management?
I actually started in marketing and was running what felt like a small business within our company. This was in the early days of the internet and ecommerce, and I spent a lot of my time working with the tech team. A lot of our opportunity was in improving the shopping experience so that our marketing could pay off. I managed the warehouse, the logistics of shipping, ordering, and more. It was a great experience. Without knowing it at the time, all of that work with the tech team played a role in my becoming a product leader.
Why did you end up moving out of marketing into product management, and what would you say is the aspect of product management that you’re most passionate about?
Tech is right at the center of everything when you’re in ecommerce. That’s really what attracted me to working in product. I also love the end-to-end nature of the role and feeling the impact you can have on the company. In ecommerce, the tech is the front door to the entire experience. There are so many important teams, such as customer care, logistics, and marketing, and I love how product connects them all.
Understanding problems themselves
You’ve worked at Ancestry, Grubhub, and Amazon. What did you learn from those roles in the early days of product management?
When I worked for Ancestry, I realized that we get so much valuable data from our customers. That data not only tells us how customers are using the apps, but also includes direct feedback from surveys and customer support. This is super valuable information, so I created a monthly email called “voice of the customer.” It ranked the biggest things that customers cared about. It was eye-opening for people because not everybody had that type of perspective.
Data-driven analysis that tells us what customers care about and their biggest pain points is vital. When I joined Amazon, the only way I could describe the experience is that I was a fish who’d been out of water for a long time and got back in. The whole company is so customer-centric. It’s hard to be disconnected from the customer when you work at Amazon because everybody’s laser focused on the customer. Amazon has a number of leadership principles, but they all roll up under customer obsession. Working there emphasized always being aware of the customer, what they care about, and how we can make an awesome experience.
How would you say that the mindset of trying to focus on the customer in ecommerce also overlaps with your role in product management?
It’s all wound together. A lot of product managers get stuck thinking they should listen to customers and do whatever they say, or ask them every single question they can think of about the product. Customers have a never-ending supply of ideas that may or may not be useful, but the important thing is to understand the problem itself. What pain points are they experiencing that are stopping them from doing the things they want to do? That’s critical. That’s where you find what I call diamonds, the big opportunities. Listening to customers’ ideas is worthwhile because sometimes you get good things out of it, but that prescriptive feedback can often be way off.
It’s important to step back and look at pain points first. Then, you can weigh the value of making a step-level improvement versus taking an innovative new approach. That’s what product should be about: gleaning those opportunities, taking that step back, and innovating. If you listen too closely to customers’ prescriptive feedback, you end up making small incremental improvements or no improvements at all.
Can you point to an example of how you identify those pain points, or how you reframe the customer’s point of frustration into a pain point that you can actually resolve?
One experience that sticks in my mind is around Amazon’s scheduled delivery. It used to be a nightmare. I took over that group and within the first two weeks there, I had two “Jeff Bezos escalations.” This is where Jeff Bezos would take a huge customer complaint and forward it to one of his SVPs with a question mark on it. That set off a bomb, because it meant the complaint would trickle down to the person who ran that group.
As that group’s leader, you then not only had to address the issues and understand them inside and out, but also come up with a plan of action and share that with the executives in a super tight timeframe. The pressure was on and there were so many things broken with it.
In one instance, someone ordered a TV off of Amazon and had to stay home from work three days in a row because the delivery kept falling through. When it finally showed up, they watched the delivery person throw their TV from the truck onto the ground, slide it through the snow and the mud, then drag it into their house and across the carpet, leaving mud and snow tracks. They then set the TV on a table and removed the packaging, only to reveal that the screen was totally shattered. The customer was frustrated and wanted the driver to return it, but the driver said, “I can’t take it back. You’re going to have to call Amazon for that.”
The points of failure were in every little interaction, and that’s the tip of the iceberg. They had interactions with customer care and other things that all just kind of went sideways. To me, that was just a wealth of opportunity. I found that we were able to make some really great progress by starting with things that were just broken.
Avoiding vanity metrics
How do you encourage your product managers to take these pain points, look past the initial panic, and develop an innovative solution?
Too many product teams focus on just delivery. They get a lot of requests and become a feature factory, and it’s really easy to get sucked into that mode of working. But the most important thing is getting business results. How do we know that we’ve won? Instead of looking at vanity metrics that look cool but don’t really move the needle, what will have a direct impact on customer experience, which improves stickiness, which improves company health?
That’s energizing because when people know what to focus on, they become laser-focused. If a feature is halfway through development and the person realizes it isn’t going to move the needle, they’re not going to waste their time just to get the points for delivering it. They’re going to iterate on whatever will best get the result. It becomes a gamification of work as you track the results you get, and it’s exciting.
Why is it important to go back and look at those results after the product launches to consider what needs more iteration?
To me, tech and business are the same thing. If you don’t measure results, how do you know you’re doing the right thing at all? You must be leaning on vanity metrics if you think you’re doing a good job but never measure the results. Measuring results, to me, is table stakes.
It’s amazing how many teams don’t do that and how many companies struggle as a result. I see a lot of OKR lists that are chock-full of deliverables and not focused on results. OKRs are all about defining your objective and key results, and knowing how you’ll tell whether you achieved them.
If the tech isn’t aligned with getting results, I’d like to review the hypotheses and knock ideas around to make sure we’re doing the right things. I teach my teams that if they find something that’s going to get a better result, just do that. Don’t try to go through a bunch of administrative processes to get it approved, because at the end of the day, what I want to hold you accountable for is not every little thing you promised at the start of the year, but whether or not you achieved your results.
Instilling trust across the organization
Could you give an example of how you’ve either refined the process of setting OKRs or talked through what a key result should look like with your team?
I have a really good example from Grubhub. We’d see product people sending out victory lap emails saying, “We released this feature and 10,000 people used it today. They must love it!” Or, “We increased conversion by 55 percent. This is a huge success.” I would talk with finance and they’d say, “Well, if conversion improved as much as the product team said it would, our company would be 10X what it is.” In reality, it was a false success. We could not connect the dots between what the product team reported and our actual business results.
Along the lines of OKRs, the question was, “What metric do you look at that tells us if we’re winning?” One of the key ones the finance team brought up was Daily Average Grubs (DAGs), the number of orders we got per day. If we could build something that improved that number, it showed us customers loved it. So, we decided to create A/B tests that focused on DAGs. We’d measure how many orders per day the legacy experience got with 50 percent of the traffic, and how many the new experience got with the other 50 percent.
I had weekly meetings with the CEO and the CFO to go through our hypotheses, the tests we were running, and the results we achieved. We’d been doing these meetings for a few weeks when we brought in our first big win. The CEO turned to the CFO and his right-hand woman and said, “Is this real or not?” They said, “It’s real. We are the ones measuring it. We can see it directly impacted our sales. We know it’s coming from this.” It changed the whole game for trust between the product and finance teams, and really, the whole company.
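As a rough sketch of what that kind of measurement can look like, here is a minimal Python example that compares average daily orders between a legacy experience and a new one in a 50/50 split. The data, names, and numbers below are invented for illustration and aren’t Grubhub’s actual tooling or figures:

```python
from statistics import mean

# Hypothetical daily order counts per experiment arm, one value per day,
# pulled from the agreed-upon source of truth (all numbers invented).
daily_orders = {
    "legacy":  [10120, 10340, 9980, 10410, 10250, 10190, 10300],
    "variant": [10510, 10620, 10480, 10750, 10590, 10660, 10700],
}

def daily_average(orders_per_day):
    """Average daily orders for one arm -- the DAG-style metric."""
    return mean(orders_per_day)

legacy_dag = daily_average(daily_orders["legacy"])
variant_dag = daily_average(daily_orders["variant"])

# With a 50/50 traffic split, the two averages are directly comparable.
lift = (variant_dag - legacy_dag) / legacy_dag

print(f"Legacy DAG:  {legacy_dag:,.0f}")
print(f"Variant DAG: {variant_dag:,.0f}")
print(f"Lift:        {lift:+.1%}")
```

In practice you’d also run the test long enough to smooth out day-of-week effects and check statistical significance, but the point from the interview stands: both arms are measured from the same data source with the same calculation, so everyone sees the same result.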
You alluded to how vanity metrics can damage trust within an organization. How can folks avoid vanity metrics and make sure they’re measuring and celebrating genuine success?
I’d say to have multiple groups engaged and on the same page about what counts as a win. If product says something is a win, but it’s not bending the metrics and the business can’t attribute what we’re doing to what’s happening, then it’s almost like it didn’t happen. Getting a shared view of what the world looks like is the first place to start to avoid vanity metrics.
When I first joined Grubhub, every group had a different way of measuring results, which is not unusual for companies. Not just different metrics, but different data sources. That was debate number one. We had to pull in multiple groups to agree on a single source of truth, the data source everyone accepts as the right place to pull from. Then we had to agree on the right calculation, because everybody calculated things differently.
If you can get on the same page about the data source and the calculations, you’ve already won the battle because everybody sees the same thing. If you can build that into your test results, or wherever you’re measuring success, so that everybody sees the same result, that’s powerful.
When you are in that setting and you do have that shared understanding, how do you move through the challenges that arise?
When you arrange things from a leadership perspective, you can get on the same page with all the different groups. But it’s really interesting when you go down a few levels and ask, “What’s the most important thing you should be caring about? How do you measure success?” I think those are great questions to ask.
It’s almost scary sometimes when you go down a few levels because these are the people that are on the front lines doing the work. As head of product, once you’ve gotten alignment on what’s important, that has to be distilled to the team.
For example, at Grubhub, we bought some cheap thermometer posters to go up on the wall. Every time we had a win that got put into production permanently, we would draw a line on it and color it in. We’d put the name of the feature or test with its corresponding number of DAGs and watch it go up. We had set a goal with the executive team of hitting a certain number of DAGs. Once everyone was on board and understood that this was what we cared about, suddenly we heard all these different people across the company talking about DAGs.
That was exciting because it was clearly permeating through the culture. Everyone realized we need to build things that people love so they order more. And this exact metric tells us if we’re doing that. We set that goal for the year and hit it within six months. We ended up doubling the goal and then we beat that too. There’s magic when people have alignment, like finding a rhythm in a three-legged race. That kind of focus from everyone is really a game changer for achieving company results.