#362 – Bridging the Gap: AI’s Impact on Market Research with Justin Chen
Podcast

Summary

In this episode, Justin Chen reveals how split testing can revolutionize your e-commerce business. We dive into the power of AI in optimizing Amazon product images and explore PickFu's journey in democratizing consumer research. Discover how AI-generated insights can refine your product line and the future of split testing. Join us for this tran...

Transcript

Speaker 1: Welcome to episode 362 of the AM-PM Podcast. This week, my guest is Justin Chen, one of the co-founders of PickFu. We're going to be talking about the importance of split testing your images, your videos, everything about your product, and a whole lot more in this episode. It's going to be a lot of good information. Hope you enjoy it.
Unknown Speaker: Welcome to the AM-PM Podcast, where we explore opportunities in e-commerce. We dream big and we discover what's working right now. Plus, this is the podcast where money never sleeps, working around the clock in the AM and the PM. Are you ready for today's episode? I said, are you ready? Let's do this. Here's your host, Kevin King.
Speaker 1: Welcome to the AM-PM Podcast, Mr. PickFu himself, Justin. Justin, how are you doing, man?
Speaker 2: Good. Finally on the podcast. Super happy to be here.
Speaker 1: Yeah, this is awesome. For those of you that don't know what PickFu is, PickFu is actually, I think, the original split-testing service in the space. There's a few people that have kind of followed in your footsteps since then, but you're the OGs, the guys who basically started it all, right?
Speaker 2: Yeah, that's right. And I think you're the one who put us on the map for e-commerce.
Speaker 1: Me? What did I do?
Speaker 2: You mentioned us, what was it? Was it Global Sources back in 2018?
Speaker 1: Yeah, I might have. That's when you started, right?
Speaker 2: Well, we started way before that, but we weren't really on the map for Amazon and e-commerce until I think you mentioned it. And then I think Manny mentioned it on this podcast, on the AM-PM Podcast. And then we definitely saw a lot more Amazon sellers coming in. But before that, we weren't really targeted towards Amazon or e-commerce.
Speaker 1: Oh, really? Was it called PickFu then as well?
Speaker 2: Yeah, it was still called PickFu. So my partner John and I, we built it, oh my God, probably over a dozen years ago as a side project. We were building a completely different business, a website, something we were doing before this. And we were working on a redesign for that website, actually. We were trying to figure out which way to go, and we couldn't decide between the two of us, so we wanted to get actual consumer feedback. Being engineers, we built this as a side project, kind of threw it up on the internet, as things happen, put a paywall on it, a PayPal button back in the day. And yeah, it kind of lived there, took on a life of its own for a while, and meandered through a few different industries.
Speaker 1: So you guys were doing all the coding. Was it just the two of you, or did you have a team?
Speaker 2: Yeah, back then it was just the two of us, and we built the initial version of it. We thought entrepreneurs would be our target use case, being entrepreneurs ourselves. But we found a lot of authors using it for book titles and book covers. Tim Ferriss, of The 4-Hour Workweek, was famously talking about using Google Ads to test his book titles, and so people were trying to get more data-driven about testing book titles. So we had a lot of self-publishing authors using it to test that kind of stuff. That was really cool. We have a lot of best-selling authors that use PickFu for that kind of stuff still today. In fact, James Clear was an early user of PickFu as well.
Speaker 1: Oh, really? Atomic Habits, yeah.
Speaker 2: Yeah, that's right. Really cool to see that kind of content coming through. And then it kind of meandered through, you know, mobile gaming is an interesting space for us. A lot of software startups using it. DuckDuckGo, if you know, the privacy search engine.
Speaker 1: Yeah. Uh-huh.
Speaker 2: Yeah, Gabriel, their founder, was one of our biggest supporters in the very beginning, loves using it. And so they continue to use it on almost every creative decision in their company, whether it's ads or UX or anything like that.
Speaker 1: And when it first started, you were kind of jobbing out the back side, because now you've got an army of people, we'll talk about that here in a minute, that actually participate and help do this. But in the beginning, it was almost like jobbing it out to Mechanical Turk or something, and then you grew big enough to where you could actually start cultivating your own audience.
Speaker 2: Yeah, we did start with that. And now we're stitching together different panels and trying to bring the best quality to our consumers without them having to know where that quality comes from.
Speaker 1: So for those of you that don't understand, if you've never heard of PickFu, PickFu is basically a split-testing service. It started out pretty basic, where you could compare two images. Like he said with authors, they could say, which of these book covers should I go with? Which one looks more appealing? Or you can ask a question, whatever question you want. But you're trying to isolate opinions: rather than just asking your friends and family, who are close to you and might not give you their honest opinion, or are a little bit more jaded, these are total strangers that don't know you from Adam, don't know anything about what you do, don't know your personality, don't know anything, and you get their opinion. And it's not just voting for which one you like best. Maybe that's how it started, but you guys now have it where they actually have to say why, and we'll talk about some of that in a second. There's different variations of that, but they don't just say, I like A or B best.
They say, I like A best because, and they write a couple of sentences, or a paragraph in some cases, of why. And it's really great, because then you get an insight into not only which one they chose, but why they chose it. And sometimes they'll say stuff that you didn't think of. You're like, oh, I didn't realize everybody thought yellow makes you want to throw up, or whatever they might say. And you see a pattern of that and you're like, holy cow. So that's where it started. Basically, the way it works is, there's a number of different tests, and we'll talk about some of those in a minute, but at its core it's image A against image B. And you can do more than one image, you can do four images, whatever, but it's image A or image B at its core. And then there's a group of people. You select how many people you want to actually give you opinions, and the most common number is probably 50. I mean, you can go higher, to 200, to whatever you want. But with 50 you get a statistically valid response, because technically, in statistics, 30 or more is the critical number. And these people, well, it depends on the time of day you do it. If it's during the middle of the day on a weekday, usually you have your answers within an hour. It might take a little bit longer if it's at night or on a weekend or something like that. The audience that gives this feedback, where do they actually come from? I mean, how big is that audience, and where do those people come from?
Speaker 2: Yeah, so they're all people who like to be paid for doing surveys. And so now we tap into all the same enterprise market research panels that, say, Procter and Gamble would use when they're running their longer-form market research studies.
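As a rough illustration of the sample sizes discussed here (15, 30, 50, 200 responses), the following sketch computes the textbook worst-case 95% margin of error for an estimated preference proportion. This is standard statistics under a normal approximation, not a PickFu feature:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a proportion estimated
    from n responses (normal approximation; p=0.5 maximizes it)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (15, 30, 50, 200):
    print(f"n={n:>3}: +/- {margin_of_error(n) * 100:.0f} points")
```

With 50 responses the band is roughly ±14 points, so a split around 65/35 or wider is distinguishable from a coin flip; with only 15 responses, just very lopsided results are, which matches the "directional feedback" framing used later in the episode.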
And so in the US, we're able to tap into over 10 million different consumers who are coming from different avenues and just want to get paid or rewarded for doing something. The challenge with all these panels is maintaining a high level of quality, and then, like you said, being able to target the audience by different demographic attributes. And so that's the layer that we're bringing to it: being able to target by, say, Amazon Prime members who like to drink tea and have cats, or something like that, and making sure that the quality of the responses, the explanations of why they chose something, is high, and that they're genuinely trying to do a good job.
Speaker 1: So these people, they get paid a nominal amount for every survey they answer. Are they just doing yours, or are they part of some big panel where they're also doing other companies'? They're sitting at home for a couple of hours one afternoon. It could be a soccer mom. It could be someone that's just at their office and bored during their lunch hour, and they just like to make a little bit of extra money. They just bang through different surveys. They get one from you, they get one from somebody else, and in an hour they make 10, 15, 20 bucks on the side or something. Is that kind of how it works?
Speaker 2: Yeah, pretty much. They're definitely doing other people's surveys as well. Like you said, they're typically just bored, or they want to do something while they're watching TV or whatever it is. So it's kind of side hustle money.
Speaker 1: And how many, you said it's 10 million that you have access to?
Speaker 2: Yeah, yeah. Obviously, not everyone's on at a given time, and finding certain attributes can be a little difficult.
So while most of our polls do finish in under an hour, if you were to do a lot of targeting, like for a very niche audience, it could take a little bit longer.
Speaker 1: And what Justin means by that is, you can go in there and just let it be open. So you can say, here's my main product image number one, here's my main product image number two, which one do you like? At a minimum, you should probably say that you want Prime members, but that's still most of the US; that's not going to limit you too much. And then there's a whole range of demographic and other filters where you can choose: I only want women over 50, or I only want people who work out three to four times a week, or whatever it may be. And you can filter that down. So that's what he's saying, that that can slow it down, because they have to fit those criteria. Which is better for you in a lot of cases, if that's your target, because you don't want some random dude that has never worn makeup in his life voting on your makeup box, for example. You'd rather have someone that's actually in the market to buy that stuff. Do you have the ability to get people that have actually bought, versus it just being their interest? Do you have access to databases, like, we know this person bought from Mary Kay Cosmetics or something? You can't say exactly where, but can you overlay that, so we actually know they're spending money on those things?
Speaker 2: We're not able to verify that. It is self-reported. But we do ask questions like, what are the types of products you've bought in the past 6 to 12 months? And so we do ask specifically, are you buying things in the electronics category, or pets, or toys, and things like that. But we're not able to validate it against their actual Amazon shopping history or anything like that.
Speaker 1: And so most of them, you said, even if you do like 200, you get the answer within an hour in most cases?
Speaker 2: Yeah, I think if you do a general response poll, which is basically first come, first served, it would probably be done in under an hour.
Speaker 1: And now you've expanded it beyond just single images: you can test titles, you can test bullet points, you can test a number of things. Talk about some of the different options that are in PickFu and what's most commonly used.
Speaker 2: Yeah, like you said, when we first started, it was for comparing two things, A versus B. A lot of times it was an image. It can be text, it could be video. We see animated GIFs a lot, or even audio. Basically any kind of media format; for example, we've used it to test voiceover actors for videos, or theme music, that kind of stuff. So the head-to-head was the original one, and then we opened it up to as many as eight options. So when you're comparing, let's say, eight variations of your logo, what actually happens is we ask the respondents to rank every single one. They're not just choosing, oh, I like the seventh option out of eight, because if they just chose one, you may not get any kind of statistically interesting answer; you're just going to get like 12 and a half percent or something like that. So by asking them to rank every single one, we're actually able to run an instant runoff, kind of like what they do in a lot of elections now, where you actually take into consideration their second-place vote, their third-place vote, and come up with a winner. It's our most-preferred winner based off their ranking. So that's what we do when it's more than two options.
Speaker 1: So if they rate it number one, it gets eight points; if they rate it two, it gets seven points, and so on down.
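The ranked-ballot counting described here can be sketched in a few lines. This is an illustrative instant-runoff tally of the general election method Justin names, not PickFu's actual scoring code, and it assumes every respondent ranks every option; the points scheme Kevin describes is a Borda-count-style variant of the same idea:

```python
from collections import Counter

def instant_runoff(ballots):
    """Pick a winner from ranked ballots by instant runoff:
    repeatedly eliminate the option with the fewest first-place
    votes, transferring those ballots to their next choice.
    Assumes each ballot ranks every option."""
    remaining = {opt for b in ballots for opt in b}
    while True:
        # Count each ballot toward its highest-ranked surviving option.
        tally = Counter(next(o for o in b if o in remaining)
                        for b in ballots)
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > len(ballots) or len(remaining) == 1:
            return leader  # majority reached, or last option standing
        # Eliminate the weakest option and redistribute its ballots.
        weakest = min(remaining, key=lambda o: tally.get(o, 0))
        remaining.discard(weakest)

# Example: 5 respondents rank options A, B, C.
ballots = [
    ["A", "B", "C"],
    ["A", "C", "B"],
    ["B", "C", "A"],
    ["C", "B", "A"],
    ["C", "B", "A"],
]
print(instant_runoff(ballots))  # prints "C"
```

In the example, A and C tie on first-place votes, but once B is eliminated, its ballot transfers to C, giving C a majority — the "second-place votes matter" behavior described in the conversation.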
So you could have something that's rated number two by a lot of people that ends up actually being the best choice, not necessarily the one that got the most first-place votes. Because the number one choice might have had 30 people like it, but a lot of people gave it a seven, and then all those people, or the vast majority of them, rated the other one number two. That actually means number two is probably statistically better, right?
Speaker 2: Yeah, yeah. That's why we call it the most preferred, right? It's not horribly offensive, but most generally liked by the audience. And it's worked out pretty well. The other way we do it, if you want, and this takes a little bit longer and does cost a little bit more, is a full round-robin tournament, where we'll actually do A versus B, A versus C, A versus D, A versus E, and all the permutations, and essentially do kind of like a tennis tournament, have them match up against each other. And the winner of the tournament is actually the winner. That's a ton of responses, and we'll actually see a lot of our larger enterprise companies or gaming companies, who just want all the data, running that kind of test.
Speaker 1: And for Amazon sellers, you have an option where you can import a listing from Amazon. Or is it the listing? No, it's the search results, right?
Speaker 2: Search results, yeah.
Speaker 1: And then you can normalize it. So if you've got 20 people selling a product, your dog bowl, you could bring in the 19 other dog bowls from the Amazon page, and then randomly put yours, your image, in one of those spots. And then you can go in and normalize the reviews, so actually make everybody have four-star reviews.
So everybody has the same price, and everybody is Prime, for example, and then you're going to get some really good insight on where the eye goes and what people are clicking on, right?
Speaker 2: Yeah, that's right. So we call it a mock-up generator tool. What happens is you can put in your ASIN, and maybe you're not even selling it. Maybe it's a category you want to go into, like you were talking about, and you want to see if you're going to be able to compete against all the other dog bowls or whatever it is. So you can put in your product concept, maybe the title you're thinking about using, maybe the price you're thinking about going in at, and see whether you're going to be able to steal some of the clicks from the top category leaders. Or maybe you're already in the category and you just want to test a new main image and see if that's going to affect how people perceive you against your competition. So it's a really important way to gauge, in a hypothetical safe sandbox, how you're going to do with changes to your listing, or before even investing in a new product line.
Speaker 1: That's cool. And then recently, earlier this year, you added some AI to it, where it does the sentiment, right? Where it can actually read what the people write when they're voting and come up with a general sentiment, correct?
Speaker 2: Yeah, so since every one of our polls includes written feedback, like you were saying, that's a lot of text to analyze. If you're doing 50, 100, 200, even 500 responses, that's a lot of text for any person to read through and pull out all the themes and the broader likes and dislikes. So what we've done is we've integrated with AI to feed all those results in and generate a nice three-paragraph executive summary that highlights the pros and cons and even possible next steps that you may want to take with your poll.
And then we also break out likes and dislikes for every single option using that AI as well.
Speaker 1: Now, what's the advantage of using something like PickFu versus just using Amazon split testing, where you can split test the price and people are voting with their wallets rather than their mouse; or testing images by running Facebook ads to see which one gets clicked on the most? That's what a lot of people did before they knew much about PickFu: run 10 different Facebook ads, see which one gets the most clicks. Why should we use your service versus something like that? Or should you do both?
Speaker 2: Yeah, I mean, we definitely recommend doing both PickFu and Amazon's Manage Your Experiments. I think they're just different parts of the process. The way we recommend doing it is using PickFu earlier in the process. So as you're trying out different creative directions, maybe you want to try out layouts, or product designs, or color variations that may never see the light of day. Obviously, that's what you would use PickFu for. Once you come down to maybe your top two main images, or you've centered in on what the prices are going to be, then test it live with Manage Your Experiments. The benefit is that PickFu is just going to be so much faster. Obviously, Manage Your Experiments is going to take a number of weeks to gather that data. Even Facebook ads may take a while. And with both of those approaches, you're not going to get any written explanations of why. So that's why live testing is not great for experimentation, but fine for tuning what you already know are good options, when you're just trying to eke out the best performance.
Speaker 1: I remember earlier this year, my trainer had made a journal, and he's like, Kevin, I don't know anything about Amazon. Can you help me out? And so I offered to help him out. I was like, I'm not going to do the work for you, but I'll help you get going and explain it.
And so he's like, well, what do I need to do? I said, you need to create a bunch of images, and gave him all the steps. So he went on Fiverr and found somebody and created a main image. And I said, one's not enough; we need like four or five of them. So he had four or five done. Some of it was just slight color changes, some of it was wording changes or layout changes, a way of showing it. And then we did kind of what you described. We didn't use the round-robin option; we just did one after another. We put number one against number two, and then number three against number four, and then the winners of those two against each other, and came out with a winner out of the four. And I told him, and he was happy, though a little bit surprised reading some of the comments, like, wow, they didn't like this. But then we took it further. I don't think I used the actual tool where you have the 20 or whatever it is; I just grabbed four thumbnails of competitors, the top competitors, the ones that are doing the best, that are kind of similar to him. We mocked it up, put those four against his, and said, which of these would you like the best? And he came out with, I forget the exact numbers, like 6% of the vote. Which was a shot to the gut for him. You know, he was proud of this thing. And it's a nice thing, I've held it in my hand, but people were just like, no, we like these others better. So we read the comments, like you said, and they said, we like that we can see this page, or we like this example or this color. So he went back and made a bunch of changes, and we ran the same thing again against the exact same competition.
And this time I think he went up to, I forget the exact number, 16 or maybe 20%, somewhere in that ballpark. He would like to have gotten 70%, but getting that percentage still means there's enough depth in that field that even if he's the fourth or fifth best seller, he's still going to have some decent sales.
Speaker 2: Yeah, I think you bring up an important point there: you don't always have to win the test when you're testing against the competition, right? You want to make improvements, and going from 6% to 20% shows that you're probably going to get some clicks and probably going to get some sales. And then secondly, an interesting way that a lot of people end up using us is that they come to test their main image because they think that's the issue with their listing, right? Like, oh, I'm not getting the clicks, I need to test my main image. And they'll start testing either their main image against their competition, or just some variations they came up with. And the written feedback actually comes back as people saying they don't like the product, they don't like the design, they don't like the branding, they don't like the packaging, kind of like the feedback you were reading about the journal. And then they realize, oh man, I should have been getting validation much earlier, before I spent $10,000 on this inventory and had it sit in Amazon. Instead of just trying to optimize the main image, I should have been testing my concepts, because these hypothetical tests, if you do them early enough, will cost you almost nothing, right? You could just have your designer mock something up, you could use 3D renders. The whole point is to do these hypothetical tests before you invest a ton of time or money into a product line that maybe is not going to do that well.
And maybe you still would have done it at 6% or 20% taken from the competition, but maybe there are five other products you also had ideas for that could actually do better. Maybe it's worth investing your time there.
Speaker 1: I know someone using AI to do that. They're actually analyzing products on Amazon. They'll say, hey, I want to do a new, I don't know, a new slow-feed dog bowl, for example. And they'll go into AI and do prompting and actually engineer the image.
Speaker 2: That's cool.
Speaker 1: And then come up with a cool design there, refine it a little bit in Midjourney or whatever it is, and then load that up onto PickFu and compare it against the competition, do one of these competition comparisons with a number of competitors, and see what people say. This is before they've ever contacted a manufacturer or anything. And I know one person, I think, told me they have to get at least, what was the number, 50 or 60%? And they'll refine it. If they don't get that, they'll read what people say, go back and have the AI re-render it based on the feedback, and then they'll come back, and if they can get 50, 60, 70% of the people to vote for them, they're like, okay, I think we can launch this product. And then they go and actually do the sourcing and everything. So that's a great use of something like this. What does it cost to do 50 people? I guess it depends on the poll, but 50 bucks or something like that. So, I mean, you might spend a few hundred bucks, or maybe even 500 or a thousand bucks if you're doing a lot of different testing, but that's a lot better than spending 10 grand or 20 grand on molds and making a new product and bringing it over, just to find out that you can't compete.
Speaker 2: Yeah, I mean, I think the pairing of AI generation with actual human feedback is the perfect combination now, especially because our humans are giving you that written feedback, and you're actually going to have that actionable insight into what to feed back into the AI. Okay, people don't like this color, or this design, just tweak it this way, like they said. And it's just going to hone in on something you're going to be so confident about going to market with.
Speaker 1: Now, you guys just do images, right? I can't compare videos, right?
Speaker 2: You can do videos. Yeah. So you can do a video.
Speaker 1: Is there a length maximum or anything? How does that work?
Speaker 2: You just upload it directly. I think 30 seconds is included with any basic poll, and then it will just prorate if you go over that. So if it went to two minutes, it would probably cost 4x as much. So yeah, you could do a comparison with video, or you could just upload a video by itself and ask people, what do you think about this product concept? Maybe it's a 3D render spinning, or whatever it is.
Speaker 1: Can I compare two videos to each other? And how do I know they actually watched the whole video?
Speaker 2: That's a little more difficult, yeah. So we're working on doing some tracking around that. But we do make sure that they have enough time to watch the whole video. And then you can probably tell by the responses as well. Depending on how you phrase your question, you can probably get them to elaborate on some of the specific things they like or dislike about each video.
Speaker 1: And speaking of video, you also have an option where you can now kind of eavesdrop on someone's thought process.
So they actually do a screen-capture video as they're walking through the product, scrolling up and down and commenting, so you can listen in to what they're saying.
Speaker 2: Right. That's right. Yeah, so we do have a screen recording capability, where you might just want to get people to talk through looking at an image or a listing or whatever it is, because there are definitely different thoughts that surface when you get people to freeform, as opposed to just writing their final answer. So that's a really interesting way to get into people's minds. And then, of course, at the end, we'll continue to do that AI summary to synthesize all those random thoughts that people had.
Speaker 1: So I don't have to go listen to a bunch of grandmas and people with accents and whatever talking. The AI will take transcripts and summarize it for me.
Speaker 2: Yeah, exactly. You can just go look at the whole summary and gather all their thoughts in one place.
Speaker 1: That's cool. So why do you think a lot of people aren't using this? Is it because they just don't want to spend the money? I mean, okay, do a few tests, it's maybe a few hundred bucks by the time you do a series of tests. But why do you think it is? What's the barrier to more people actually using something like this to really hone in on what they're doing?
Speaker 2: I mean, I think the first thing is that I don't think people realize they can do something like this. I think even when they hear about it, they might still have hesitation around it being too complex. And we still hear that even from people who have heard of PickFu. It's like, oh, I don't want to figure out how to use it. And I don't think they're giving it a fair shake to just log in and try it. We try to make it as simple as possible. But we've also heard some hesitation around asking the right question.
And so some people, especially when they're paying money, they're not sure: is this the right question? Am I phrasing it right? So what we've tried to do recently is have a lot more pre-built templates and questions, so that you're not having to come up with the phrasing. If you're trying to test an Amazon main image, for example, we've got a preset test with a pretty good question. Obviously, you can still tweak it if you want to add more context, but that reduces the friction. And then I think there's, like you said, the cost. Any entrepreneur or Amazon seller is watching every single dollar they spend, and I think there's hesitation about adding yet another thing they might potentially have to pay for. We've tried to reduce some of that friction as well by lowering the minimum number of responses. I know you mentioned 30 and 50 responses as being more significant, but in our minds, something is better than nothing, at least getting some directional feedback. So you can go as low as 15 responses, which might be only $15 to you. And we were experimenting with some $5 combos, I know that you saw, where it's only five responses. And while you wouldn't want to make a huge decision based on five or 15 responses, it's still interesting to hear five people give you feedback, right? If you were going to go into a coffee shop, you'd probably only talk to five to 15 people anyway. And getting the thoughts of those kinds of strangers is still going to be helpful, right? It still might surface some objections, and maybe that's enough to give you the confidence to spend 50 to get a little bit more significance.
Speaker 1: You said that some people are worried about asking the right question. What are some tips you could give to make sure you do ask the right question?
What are some phrases you shouldn't use, or that you should use, to really make sure you're getting solid data?
Speaker 2: Yeah, definitely try to be as plainly worded, I guess, and unbiased as possible. And simple. So obviously, don't ask a ton of questions; ask one question. One of the most common ones is, which one would you buy? Obviously, that's very generic. Or, which one would you click on? You could add a little bit of context: if you were shopping on Amazon for a teapot, which one would you buy? That might give the appropriate level of context without skewing it too much. What you don't want to do is tip your hand and let them know which one is yours, like, why wouldn't you buy the red one, or something like that. It's too leading, right? So just try to be as plainly worded as possible. And then I guess the other tip would be the targeting. It's great, but it can also be a downfall for some people. I think some people target too granularly, and they have such a narrow view of who the end user of their product is, but not necessarily who the buyer is. I would keep that in mind, right? For example, men's products: maybe you have something that's strictly for men, but if you look at your brand analytics, maybe it's actually women who are buying it. And so you need to know who's actually buying it. Maybe there are different times of year when the men are buying it, and other times of year when the women are buying it as gifts. Or you might have too narrow an idea of who cooks in the kitchen or something like that. Maybe you think that only women cook in the kitchen, but maybe a lot of men are actually buying this product, or whatever it is. So I would be careful about being too targeted.
Obviously, like you said, if it's makeup or something like that, and it's makeup for women, then you can be a little bit safer about that. But use as much data as possible, if you have analytics, to verify the right targeting. Speaker 1: Is there a way to actually test that? Do you have an option where you can segment the responses by demographics as well? I think in some of the PDF reports you can download, it'll show you if a certain age group liked it more than another age group, or a certain gender liked it more than another gender, something like that, right? Speaker 2: Yeah, that's right, actually. So there are kind of two aspects to demographics for our polls. The first one is targeting who actually answers it, so restricting who can actually see it and respond. If you only want men to answer, that's the first option. The second aspect is that we actually ask a bunch of demographic questions, to allow you to segment the responses and also get to know your respondents. Those would include age, gender, income, or any of the other traits that you can target by. But say you just want to know if they use makeup, or if they like to cook in the kitchen, or if they have pets. What will happen is, they'll not only give the written explanation, but they'll give all that demographic information. When you hover over every response, it'll tell you the demographic profile of that person. So you'll know that it was a male who has, you know, dogs and loves to drink tea. And then we have demographic charts where you can filter and segment: oh, okay, so all the men like option A, all the women like option B, all the tea drinkers like option C. So that might be an interesting way if you don't know who your product is going to resonate with.
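Two quantitative ideas in this exchange, that more responses shrink the noise around a result, and that responses can be tallied within demographic segments, can be sketched in a few lines of Python. This is a rough illustration, not PickFu's methodology: the margin-of-error formula is the standard normal approximation for a two-option poll (z = 1.96 for 95% confidence), and the response data below is entirely made up.

```python
import math
from collections import Counter, defaultdict

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a two-option poll.
    p=0.5 is the worst case (an even split between options)."""
    return z * math.sqrt(p * (1 - p) / n)

# More responses shrink the noise: roughly ±25% at 15, ±14% at 50.
for n in (5, 15, 50):
    print(f"{n:>2} responses: about ±{margin_of_error(n):.0%}")

# Hypothetical poll data: (choice, demographic traits) per respondent.
responses = [
    ("A", {"gender": "male", "drinks_tea": True}),
    ("A", {"gender": "male", "drinks_tea": False}),
    ("B", {"gender": "female", "drinks_tea": True}),
    ("B", {"gender": "female", "drinks_tea": True}),
    ("A", {"gender": "male", "drinks_tea": True}),
    ("B", {"gender": "female", "drinks_tea": False}),
]

def segment(responses, trait):
    """Tally choices within each value of one demographic trait."""
    buckets = defaultdict(Counter)
    for choice, demo in responses:
        buckets[demo[trait]][choice] += 1
    return dict(buckets)

print(segment(responses, "gender"))
print(segment(responses, "drinks_tea"))
```

The takeaway matches the conversation: five responses are directional at best (around ±44%), 50 tighten the picture considerably, and grouping by a trait shows at a glance which option each segment favors.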
We've actually seen some customers run larger polls, you know, 200 or 500 responses, general audience, just to see: okay, I'm really not sure who we should target, I just want to see who this is resonating with. Maybe it's a product concept at a really early stage, and they just want to see who they can start targeting afterwards, right? So just going very broad first. Speaker 1: What's one of the weirdest things you've seen someone test? Speaker 2: Weirdest things? Speaker 1: Something like, what the heck is this? Speaker 2: We see a lot of adult things. Speaker 1: Oh yeah? Speaker 2: Yeah, we see a lot of adult products. We see a lot of steamy romance novels, which are always fun, a lot of salacious covers and things like that. So yeah, I don't know if there's anything super weird. I think there are a lot of personal use cases that we've seen. Testing kids' names, you know: oh, I'm about to have a boy, and testing different names. Those are always fun. So a lot of interesting personal uses, even LinkedIn or dating profile pictures. Test your Tinder pictures, or which outfit should I wear? We see people doing that kind of stuff. Oh, really? Yeah, yeah. Speaker 1: What about a case study or an example you can give us of someone who just radically changed their business after doing a test? They were at X level, and then they came in, split tested, and all of a sudden their sales went up 10,000% or some crazy thing. Is there some example you can give us around that? Speaker 2: Yeah. So Thrasio, who's, you know, one of the bigger Amazon aggregators, the main product they got their notoriety for was that Angry Orange pet deodorizer, the cleaner. When they first had that product, they wanted to do a rebrand of it, and so they actually used PickFu for that.
They were trying to justify a big repackaging, a big new packaging change, and I think it was going to cost them $50,000. So before they did it, they wanted to make sure it was going to be validated, and so they ran a bunch of designs through PickFu and eventually came up with their new orange bottle packaging. They launched with that and immediately saw sales improve. From there, they continued to expand the product line, all using that same packaging. So that was a tremendous success for them, obviously a much bigger project than a lot of people are working on. But we've had other really simple ones. We had a case study with Yes Bar, where they were just changing their main image. They agreed to isolate everything else; they didn't change anything else about the listing. They ran a $65 PickFu test, just testing a few different images, made the tweak to the winning one, and immediately saw their click-through rate increase by 12%. That netted thousands of dollars in new sales the next week, so the ROI was super easy for them. Speaker 1: So it's not just smaller Amazon-type sellers using this. You said you have a lot of Fortune 500 companies and Madison Avenue agencies using your service, right? Speaker 2: Yeah. So we actually have quite a few traditional CPG companies using us now. Our goal is actually to get enterprise-grade consumer research into the hands of the people doing the work. That includes individual Amazon sellers that might be mom-and-pop shops, but it also includes the e-commerce and Amazon managers at larger CPG companies, the people creating new products there. It's just a much faster way to gather consumer research than what's previously been accessible. Before, large companies had to work with their consumer insights teams, or with a market research consultant, and run these really long-form studies.
And it wasn't something where you could rapidly iterate on a design or a concept very quickly. So we're definitely seeing much more adoption in the enterprise space. We recently got our SOC 2 certification, which is a security certification for software that a lot of corporate buyers look for as a checkbox, showing that we take security very seriously as a company. We're trying to build the best tech platform for self-service consumer research out there. Speaker 1: Is PickFu just for the US market, or do you have international markets as well? Speaker 2: Yeah, we recently expanded internationally with Australia, the UK, Canada, and Germany, and we have a couple more countries coming out later this year. What's actually cool about the German audience, and any other non-English-speaking country we add in the future, is that we'll automatically translate the question into the native language. So the question gets translated into German. We'll show you what it is if you want to verify it, or if you speak German, you can just write the question in German. The response forms are all in German; we collect the responses in German and translate them back into English. And of course, we'll also show you the actual German responses as well, so that you can validate them. So we're trying to enable this anyone-to-anyone consumer research. You know, you could be a Chinese seller getting feedback from a German audience, and you could use our app completely in Chinese. We've localized the app into Chinese as well, so you could use our poll builder in Chinese and write your question in Chinese, and it'll get translated into German, and everything comes back. That's the vision we're working toward. Speaker 1: That's cool. I remember before I knew about you guys, this must have been like 2015, I was coming out with a line of makeup mirrors.
And I got five or six samples from China, from different factories. And I'm a dude; I don't know what the heck women want in a little pocket makeup mirror, but I saw there was an opportunity with the keywords and the research on Amazon. So it's like, all right, this is a good extension of a brand I was doing in the beauty space. And I didn't have PickFu, so I actually walked into a bar and just sat there, and any woman that came into the bar, it's like, do you mind giving me your opinion? I'll buy you a drink or something. You know, like you said, I got 10 or 15 people maybe over the course of four or five hours. And I mean, it's cool that they could actually hold it and touch it, versus online. But in even older days, companies would spend thousands of dollars on focus groups, bringing 20 people into a conference room, just to get feedback that now you can get for 60 bucks or less, in an hour or less. It's crazy how far the technology has come. Speaker 2: Yeah, I mean, I think what's been amazing is just watching this democratization of everything, right? As you know, anyone can sell online, anyone can sell in a marketplace, or they can spin up a Shopify site. And so we're just trying to add that other piece of being able to validate with consumer insights. Because now that you can sell online, you're competing against every other large company out there, right? There's no reason any Amazon seller can't outsell the larger brands, but those brands are using consumer insights. They're using their very expensive market research consultants and all that kind of stuff. So, you know, you should be validating as well, right? And this is the tool you can use to do that. Speaker 1: So I can test, you said earlier, video. I'm just brainstorming here.
So if I've got a spokesperson, and I have a male spokesperson, a female spokesperson, a country hillbilly, a proper nerd, or whatever, and I create the exact same video with the exact same script, but all I do is change out the person, and this could even be an avatar using AI or something, I can run a test to see which one resonates best with the audience, whoever they find the most believable. I could do something like that as well, right? Speaker 2: Yeah, that's a great test. So voiceovers, actors, actual hosts of videos, you know, theme music. When we advise people who are testing video, we try to get them to test different components of the video before they compose that final thing. And then, you know, it's just like a product, right? You want to get feedback as you're developing it. We've even seen people test storyboards, actually. It'll be literally hand-sketched storyboards: here's the shot-by-shot of what I'm going to be doing, what do you think? And it's really cool to see things at that idea stage. Even text ideas, you know. It doesn't have to be anything that's actually 3D rendered or Photoshopped; it can be pretty raw. Speaker 1: Is there an NDA that the people who are participating, the voters, sign? Some people are going to worry: if I show this prototype of what I'm doing, what if they just copy and steal it from me? Speaker 2: Yeah. So they agree to a non-disclosure agreement before they're even able to see the test. That's kind of the best we can do. Obviously, people can breach it, but we've never had that issue in the past, and we've worked with a lot of pretty sensitive IP at this point. And I know that every entrepreneur has that worry.
Obviously, as entrepreneurs, we have that worry too. But I think after you've been in the game long enough, you realize that everyone wants to execute on their own idea rather than go after someone else's idea, in most cases. But, you know, I can't really say it's never going to happen; we do take as many precautions as we can. Speaker 1: Is there something like AI where, since you've done thousands if not tens of thousands or hundreds of thousands of tests, you could actually make recommendations yourself? If I ran a test and 50 people gave me their feedback, that's great. But could you then say: okay, Kevin, here's a dog bowl; we've run tests for 37 other dog bowls in the past; analyze that, combine it with what my people said, and then make suggestions based on that, like some sort of AI suggestions? Is that something that could be possible? Speaker 2: It's definitely possible. Yeah. I won't make any promises, but it's definitely possible. It's a good idea. Speaker 1: Yeah. I mean, there's a lot of cool stuff you could do, even where you could jump on a call, like the focus groups of old where you sit in a conference room. I'm just brainstorming here, but something I would find cool is: let's get 10 people on a 15-minute Zoom call, so that I can hold the product up and show it to them three-dimensionally. Or I could even send it to them: hey, you want a free product? Here's a code to go get it on Amazon or somewhere. Or I'll ship it to them; they can pass along their address or whatever, and then give it some time to be delivered. And then we say, on Tuesday at eight o'clock we're going to meet, and I'd be willing to pay them better money for that, so they have it in their hand and it's a live group thing.
And I think, depending on the product, you might not want to do that for every product, but something like that could be really, really cool too. Speaker 2: Yeah, for sure. There might even be ways to approximate that with different AI models and interaction mechanisms. So a lot of interesting opportunities now. Speaker 1: So the important thing here is: what's the most important message to get across? Whether you use PickFu or something else, why should you be testing? Speaker 2: The thing that I want to emphasize the most is definitely that people should be testing. It's a way to de-risk your investment. We've had so many stories of people avoiding catastrophe by not going into a product line, or not overstocking a certain color variation. That's a super common one, right? Maybe you're stocking up on different colors of umbrellas, or different patterns of a puzzle, and you need to decide how much to buy of each, or even whether you should diversify so much. Those are our favorite stories, where people say: hey, I used PickFu, I chose the color variations I did because of the PickFu feedback, and I didn't get left holding the bag on a bunch of ugly colors. It's such a risk minimizer that it's kind of a no-brainer. And of course, it's going to improve your sales, right? By minimizing risk, it's going to improve sales, both through better product selection and product development, but also through things like your main image and A-plus content, all that other stuff you can improve. So whether you use PickFu or something else, or actually go to the bar or the coffee shop, it doesn't really matter. You just need to get some kind of feedback from your target consumers. Because as any entrepreneur knows, your friends and family stop giving you useful feedback after a few times of asking them.
And they're honestly probably not even your target audience. So you do need to get some actual strangers to give you feedback on things. Speaker 1: Awesome, Justin. This has been great. If people want to reach out to you or find out more about PickFu, what's the best way for them to do that? Speaker 2: Yeah, definitely check out the website, P-I-C-K-F-U dot com. We have a very helpful customer success team there as well, so if you need any help, just ask on the live chat. You can also hop on a strategy call with them for free, so that's always a good way to get started. You can reach out to me on LinkedIn; I'm pretty easy to find under Justin Chen, and you can connect with me there. Speaker 1: Awesome. Thanks, Justin. Appreciate it, man. Speaker 2: Yeah, thanks for having me on, Kevin. Speaker 1: Great stuff from Justin. The guys over at PickFu are awesome, really good people, and it's a great tool to use. I use it personally in my business all the time for all kinds of testing. So if you haven't tried out PickFu, go to pickfu.com. It can make a big difference in your business, your conversion rates, and how much money you make. Give it a shot and let me know what you think. We'll be back again next week with another incredible episode here on the AM/PM Podcast. Make sure you hit that subscribe button, and go back and download the episodes that you've missed; there are a lot of really good ones over the past year. And before we leave, I just want to leave you with some words of wisdom. Just remember: a confused mind always says no. A confused mind always says no. See you again next week.

