Unpacking Programmatic Sampling with JD Deitch: Challenges, Innovation, and the Future of Research

by Karen Lynch

Head of Content

Join JD Deitch as we explore the challenges in programmatic sampling, AI’s role in research, and innovative strategies for better data quality.


In this episode, JD Deitch, veteran market researcher and advisor, joins us to dive deep into the complexities of programmatic sampling, data quality, and the evolving role of AI in research. We discuss how the "enshittification" of sampling has impacted the participant experience and what innovative solutions are emerging to address these challenges.

JD shares his thoughts on the engagement crisis, polling representivity, and the critical importance of collective action in the research industry. Whether you're a seasoned researcher or exploring new methodologies, this conversation provides actionable insights to navigate the future of market research with confidence.

In this episode, we’ll explore:

  • The Enshittification of Sampling: JD explains the term coined by Cory Doctorow and its relevance to the state of programmatic sampling.
  • Polling Challenges: Why representivity and engagement issues in surveys can lead to significant errors in public polling and beyond.
  • The Role of AI in Research: How technology is reshaping qualitative and quantitative methods while exposing new risks.
  • Client Expectations and Data Quality: The delicate balance of price, speed, and quality in client demands—and why that needs to change.
  • Future Outlook: Why scalable qualitative methods, synthetic data, and innovative panels are reshaping research—and what’s next for the industry.

Resources/Links:

You can reach out to JD on LinkedIn

Many thanks to JD Deitch for joining the show. Thanks also to our production team and our editor at Big Bad Audio.

Transcript

Lenny: Hello everybody. It’s Lenny Murphy, with another edition of the Greenbook Podcast. Thank you for taking time out of your busy day to spend it with myself and my guest. Because, as usual, I actually do have a real guest. It’s not my multiple personality, so—although that would probably be kind of fun if we did that. So, I’m going to let him introduce himself because you need to hear JD’s voice and his background in his own words. So JD, welcome.

JD: Lenny, my man, thank you for having me on your podcast. I’m really glad to be here. So, I’m Jonathan Deitch, JD, as most people know me. I’ve been in the industry for a pretty long time. I stopped counting at 20 years. I like to say I’m a 20-year veteran; it’s a little bit longer than that now. By training, I am a researcher. I started out client side, moved supply side, and then had a really interesting career that took me a bunch of different places, responsible for big research programs, and then responsible for big panels, and then sampling, and all sorts of things. And I have recently started doing some advisory work for startup and scale-up firms in the industry. But as I think we’re probably going to talk about today, I am still quite active in talking about what’s going on in the industry as well. I hope that does the job, Lenny, for a biography. Is that good enough?

Lenny: Yeah, good enough. I mean, you should mention where you live, and your hobby as well, just to give—because it’s just a cool— 

JD: Okay, yeah. Yeah, this is what Lenny and I actually really want to talk about, not really sample quality today. But I was very lucky to have had an expat assignment, probably, oh gosh, it’s now 12 years ago, when NPD brought me to Europe, into France, and I had a chance to live and work with the NPD team here in what’s now Circana, I guess, in Europe. I stayed for three years, loved it, didn’t want to go back. Had to go back. Then I got very lucky in a number of different ways, which includes marrying a person with a European passport, and so we live in France now. We left the US, sold everything in 2014, been here for ten-and-a-half years in our second stint. And my hobby, as Lenny knows, is that I’m a massive cheese fan, to the point where I had a cheese blog. I have an Instagram account with a thousand followers on it, and I actually do judging for cheese shows once in a while. And it tickles me as much as it tickles you, Lenny [laugh].

Lenny: I just think that’s so awesome. As a cheese head, in many ways, that’s just a great thing. But we probably—we may squeeze that into the conversation at some point, but I think folks probably want to hear more about the other, more researchy aspects of your experience. And particularly the eBook that you just published, which has the best title of anything that I have seen in our industry, ever: The Enshittification of Programmatic Sampling. And I’ve been waiting to say that all day. I woke up excited.

JD: It’s a great word, isn’t it.

Lenny: It is. It’s just, “I get to say, Enshittification.” What a fantastic word.

JD: And Lenny, I have to say, there are some people who know me—and yes, I will occasionally use words of that nature—but this isn’t just gratuitous obscenities here. It’s actually a term which, if you read the ebook, comes from a writer by the name of Cory Doctorow, who wrote about what happens in platforms when they start to seek yield and profitability, and growth. And I read that article, and it was what sort of inspired me to write the ebook. Which, Lenny, as you know because you were there, sort of when I first started talking about this ten years ago now, is the lamentable state of the participant experience in online surveys. And so, yeah, it tickled me. I have to say, nobody forced me to use the little poo emojis for the page numbers, but I did. I figured, like, I needed to be slightly decorative, right? Anyway.

Lenny: We have to make it fun, especially for a serious topic like that. And by the way, listeners, if you have not checked out anything by Cory Doctorow, you should. Actually, Down and Out in the Magic Kingdom, I think, is one of his books. Just a fantastic writer.

JD: Very good writer. Very astute writer.

Lenny: Yeah. And he coins terms like enshittification.

JD: Enshittification. Yeah.

Lenny: Yeah, so [laugh]. All right, so I actually was talking to somebody before this, and telling them about our conversation, and I said, “You know, this is kind of like the Upton Sinclair, you know, The Jungle type of scenario where, you know, there’s a whistleblower who’s talking about how the sausage is made. And do we really want to eat that sausage?”

JD: Right, exactly. The meatpacking industry in Chicago. You actually read that?

Lenny: Yeah. Yeah, I read it in high school. And there are some parallels with the meatpacking industry [laugh]—

JD: Yeah, about how the sausage is made, indeed. Yeah, mm-hm.

Lenny: Yeah. Now, we’re not going to name names, right? That’s not what this conversation is about, but I think you bring up some points that I’ve been making for quite some time—as have many others—which is, what have we traded away in the drive for efficiency and scale? And I think that trade-off is fundamentally quality. And quality has multiple dimensions. I agree with you, it really comes down to engagement. I think fundamentally we have an engagement problem, we have an incentive problem, we just don’t make this a very attractive option for consumers to spend time with us. However, the demand has increased to such an extent, and because of programmatic, it lends itself towards fraud. And what was once kind of a tolerable level is no longer tolerable. We can’t hide it anymore. And I don’t mean hide it in a malicious way. I don’t think that’s what anybody ever attempted to do. Well, most people didn’t attempt to do—

JD: But you bring up a good point, Lenny. I mean, it’s not like we started out to make it a crappy experience, right? I mean, and of any industry that should be conscious of this, like, we should be really conscious of it. And everybody was really conscious of knowing that we needed to give our participants a decent experience so that they gave us reasonable data. So, it’s not like it started out that way, but it’s that series of events that ultimately led us down the road to a place where, both in absolute terms and relative terms, the experience is dreadful. And, you know, the consequences of that, that’s where you sort of wonder if people really understand the consequences of it because they’re not entirely obvious, right?

Lenny: Yeah. So, we’re recording this the week before Thanksgiving, on the 21st, and we’re just on the other side of the election. I don’t want to get into the politics, but one thing I was always paying attention to—and I’m betting you were too—was the polls. And my take going all the way through was we are missing a huge chunk of the population who are simply not engaged. They’re just not engaging. And that’s—I believe the polls were off, and I think I was proven correct that the polls were off, solely because of the engagement issue. That was it. Just observing, kind of, macro-dynamic things happening in the world, the splintering and fragmentation of media, the political polarization, blah, blah, blah. There’s a hundred and one reasons why that would be the case, but fundamentally, it comes down to we were not giving a chunk of the population in the United States, who weren’t real inclined to talk to us to begin with, a reason to change their opinion. And to your point, that was a very public example of when we screw up, it can take people by surprise.

JD: Yeah, I think it’s a good observation. I mean, one of the things—because I posted on this on LinkedIn, and it quickly became the most, sort of, engaged-with post I’ve ever made, right, the polls, if you look at them sort of on a state-by-state basis, were not unreasonably off, right? I mean, you know, when we think about—and we have to acknowledge, I think, a really important part about this, which is that the people who are the serious pollsters know what it means to do sampling, and they’re not just, like, chucking something in a router someplace and trying to, you know, get gen pop sample at, you know, 20 cents a complete or something like that. And these are people who are also using probability-based sampling techniques and, like, they understand the nature of the problem, and yet, at the same time, whether the participant experience is relatively bad or bad in absolute terms, they are missing a chunk of the population. What we missed in the United States—what the industry missed—was, even if the polls were pretty close, a couple points in either direction changes radically the story that got reported, right? I mean, this is not a landslide election by any stretch of the imagination. Like, if you go back and look at the Reagan landslides, I mean, you know, those were landslides where, like, you know, he got—Clinton got close to 400 votes in the Electoral College, right? I mean, Trump will end up with 300 and change or something like that. But that’s way different than the, oh, it’s going to be a nail-biter, right? I mean, that’s what it looks like from an Electoral College point of view, even if the popular vote was really, really close.

Lenny: Yeah. And I think that we could go on about that specific topic, right? The reason I brought it up as an example is that, sure, from an electoral standpoint, two to three percent—or two or three points—okay, that’s not a particularly wide range overall. But if you’re Procter & Gamble, we know that we have demonstrated a problem with engagement with the population, and those are prospective customers. So, my concern, right, why I brought it up as an example, is that we often don’t see those challenges with representivity in samples in commercial business. It’s private, they’re discrete projects, you know, nobody’s really talking about that. The polling gives us a way to see one of the systemic challenges that exists within the industry in a very visible way. And that’s why I wanted to drill into that: okay, so we have this example that was clear. We were missing chunks of the population. To me, that speaks to the engagement problem. It speaks to there being a systemic issue in how to reach these populations, and I don’t think it’s going to get any better. I think it is only going to get worse because of the ongoing fragmentation of media, and, you know, all the competing noise for engagement within the research process. And especially if a big chunk of that population is, let’s call it Millennials and Gen Z, for lack of a better term—I think there’s more nuance to it than that—but it is a gift for them to give us their time, and we don’t give them a very good reason to do that on a regular basis, is my perspective. And that has consequences when we’re dealing with a kind of mass-scale, industrial-level industry for the bulk of quantitative research that doesn’t deal with nuance; it just deals with, kind of, brute-force programmatic. And the emperor may not be fully naked yet, but, you know, it’s stripped down pretty far at this point, and it’s becoming pretty inescapable, which is what your book—your ebook—I think, helped demonstrate.

JD: I mean, look, I think there are two issues, right? One of them—and it’s an inescapable one—is that there is an engagement problem. And whether—you know, initially, this was a relative problem, right? Because when the internet became interactive, and wasn’t just a series of online brochures anymore, and all of a sudden, you know, our online forms—which, I mean, let’s face it, research has been done the same way since we were in paper and pencil, right? It hasn’t changed fundamentally online—then we started to look a little bit, sort of, long in the tooth there, right? And then it just got to be even worse over time. It’s not like the industry hasn’t been signaling this, but there’s this question as well: if the emperor is naked or even close to naked, is anybody looking? Does anybody care, right?

Lenny: [laugh].

JD: And I think that—I mean, one of the arguments that I think is equally inescapable is that, if this were a problem that the industry or its clients were really willing to solve, we might have solved it already, right? And so, you know, there’s this question of whether clients care. Because the signal that supply is poor—I mean, like, if you want better supply, then, you know, somebody at some point in time is going to say, “I don’t want to pay for garbage anymore. If you’re telling me I need to pay more for better sample and I’ll get better data, then I’ll do that.” But there’s no demand signal in the market right now to do that, and that’s something that is easy to prove. I mean, Lenny, I wrote on your website about the way that programmatic would create this sort of transparency around pricing, and in doing so, would push prices down, right? It would sort of make margins a lot smaller, and create price compression as well. And that has borne itself out in a way that’s just blindingly obvious to anyone who’s ever looked at pricing in an exchange, or any sort of environment where you can see multiple suppliers, right? But then with the clients, you’d sort of expect that, if somebody’s ringing alarm bells—and the industry, over the past two years, has rung some pretty loud alarm bells—then why hasn’t it really gotten better, right? I mean, I don’t see any signals coming from clients. And one of two things is happening, right? Either they are unaware that the situation is as it is, or they are aware but they don’t care. And when I say they don’t care, what I mean to say is that the way that they’re using online surveys is such that they’re making an iron triangle trade-off, right? We all know about the iron triangle. It’s about price, quality, and speed, right? And so, they’re saying, “What we’re going to do is we’re going to optimize for price and speed. We know we’re not going to get the best quality. That’s okay. It’s good enough for us.” Right?

Lenny: Yep. And I think that’s exactly what’s happening. 

JD: If we’re therefore going to see a change in the quality of the participant experience or in fraud—and of the two of them, I think most people are focused on fraud right now because it’s the obvious problem to solve, but if you get rid of fraud, then you’re left with the real problem to solve—I think being able to solve those problems requires there to be a signal from the clients. And short of that, then we’re just going to continue talking and saying, “You know, we know what the problem is. Trust us. We’re working on it.” And it just becomes another paper at a conference, or we parade a bunch of respondents up on stage, as Pete Cape did some dozen years ago or something like that, and they all talk about how they trick us, and we all clutch our pearls, and then, you know, we all go away and go back to the same thing we’ve been doing.

Lenny: So, I hear you. I’ll push back a little bit. There is signal that is overt. I make a point of ensuring that there are conversations on this topic at IIEX on stage with brands. P&G continues to lead one. 

JD: Look, I agree with that. The industry is talking about this as a problem. The industry has been talking about this for a dozen years, though. Actually maybe even longer—

Lenny: Longer. 20 years. 

JD: Right? If you go back to the Grey Matter Report, which was published in, like, 2008, 2009, I mean, when he did the mystery shop—I can’t remember his name off the top of my head. Oh… anyway, he mystery shopped these panels and talked about the number of invites that were going out on a regular basis, and all the rest of that, right, and it was shocking to everybody. I mean, I first wrote about participant experience in February of 2014, and I said the same thing, right? “It’s impossible to overstate the damage that the industry is doing itself through its cavalier approach to participant experience.” It’s almost a direct quote.

Lenny: Well, so the other signal, though, is that they are speaking with their pocketbooks. And it gets cloudy because of the way that we track those things, both GRIT and SMR. But the shift, the spending shift, is moving away—

JD: Away from surveys, yeah.

Lenny: —away from surveys. Right. And the survey is still being used for the things where the challenges can be absorbed to an extent, right, tracking and testing, so—particularly testing. And then along came AI, right? And we are seeing that shift occur right now into the scaling of qualitative. And that’s happening super fast across the board. And almost by default, qualitative addresses some of the fraud issues because it’s just harder to fake that it’s a real person in qualitative. And we’re seeing the scalability of AI, and now we’re also seeing rapid expansion of the synthetic sample models which, as long as you’ve got good data to train the personas on, don’t take that much, and can answer a lot of the questions that, traditionally, we’ve used research for as well. So, I think the signal is very loud: you guys didn’t fix this shit. Technology is now allowing us to do things differently. We’re just going to spend differently now.

JD: Yeah. I mean, look, there’s a couple of points that you brought up there. I mean, the shifting of the spend and, sort of, the exploration of AI, I think that’s right, and I see that, at least on a small scale, in some of my advisory work, when I think about, sort of, the companies that I’ve been working with that are scaling up. I feel like we should say that there are actually decent panels out there, and there are people who understand that, actually, if they keep their panels to themselves, and they nurture them well enough, they can get good things out of them, right? I mean, you know, there’s a client that I work with that does a whole lot of passive measurement, and they have a very healthy panel. I’ve seen the metrics. I mean, it’s working for them, and it’s, you know, helping them create a great business. I wanted to speak to, you know, when you talk about it being more difficult to fake with AI, sort of, the conversational survey model, right: I think it’s only a matter of time before the AI of the researcher is talking to the AI of the hacker.

Lenny: Yeah, yes [laugh].

JD: I know hacker is not the right word, either. Of the fraudster. Let’s just call it that. You know, there are a lot of people who we see in social media who are like, “Oh, I check my open ends, and you know, I can validate that.” Like, I’m sorry, you know, a year from now, it’s not going to work anymore, right? I’ve seen a lot of instances of AI, and people trying to experiment with synthetic data and synthetic respondents, and my view on that is really simple, right? You know, you mentioned the training data is really important, and these models get trained on, you know, data sets that have some sort of subject matter that’s associated with them, right, and they end up being pretty decent when you ask them questions about the subject matter upon which they’re based. And they end up being less decent when, you know, they have to kind of go off script and talk about other things. I mean, we’re in a place now in the industry where the thing that I think people need to remember is that with all the technology that we have, a lot of which is sort of based on continuing to try to solve the labor problem of who’s actually going to do the study—you know, is it going to be a person, is it going to be an AI, something like that—research still fundamentally comes down to two things: one of them is people, or some sort of subjects that you need to study in order to represent a population, and some sort of data collection mechanism that allows you to observe data in some way that’s related to the subjects, right? And, you know, the technology has kind of allowed us to almost abstract that part away because it’s baked into the system. Except there are some really fundamental assumptions about who you’re representing, the types of questions you ask them, and things like that, which we need to not lose sight of, right? I mean, the example I always give to people is, like, you know, you can look at these recommendation engines, and they do great work, and AI does great work, but a lot of it is based on the idea that it is optimizing. It’s trying to find something that looks just like what it was trained on, right? And so, if you’re in the world of optimization, then these are good things to have. But if you’re in new worlds where you’re trying to understand early-stage innovation, or you’re, you know, you’re talking about new product development or packaging or something like that, where you’re trying to think completely out-of-the-box, then, you know, there’s still this need to just sort of talk to real people who have something to say, right?

Lenny: Yeah. Agreed, a hundred percent. And so, here’s what I see happening with the brands that I work with from an advisory standpoint, as well as the suppliers that I work with. So, what I see happening right now—I have observed this through the business that is occurring—is a rapidly emerging system of saying, we have years’ worth of data that exists either within the brand or within the supplier, for panels—real panels, right? Real validated panels—and we can utilize AI to answer a whole subset of questions based on those things, so we don’t need to go ask those questions again in a survey. Now, that leaves us with a whole series of questions, to your point, on, you know, breakthrough innovation, et cetera, et cetera, and we’re going to engage with people in a qualitative way because AI now allows for the analysis of that to happen at scale. And there are platforms that are not just AI-moderated, or human-moderated, or a hybrid, but just make it easier to have in-depth conversations with real humans. And the sample that is coming for those is not coming from the programmatic ecosystem; it is coming from a variety of smaller custom panels that have the populations that they need, and that’s where this spend is shifting. And the things where we can afford to have, you know, the 80% or 70% that is close enough, that’s still happening in the existing ecosystem, and that’s what’s keeping it alive. But the shift is occurring, and this is happening from major brands right now, working with both existing major companies in the industry, as well as a variety of startups, and they are building that model and bringing it to market today. And now it’s just going to be a question of how [unintelligible 00:25:17] everybody else gets there. And based on the speed of adoption of AI, I don’t think we’re talking about the type of transforma—the slow transformations you and I have observed over so many years. I mean, hell, it could be next week [laugh] how things are going. And my point is, the industry shift, that’s the signal, right? Speaking with their wallet. You know, once the technology emerged to allow this, then they shifted. They just kind of got fed up with trying to fix a broken system, and when an alternative emerged, they’re utilizing that alternative.

JD: Yeah. Look, I mean, it makes perfect sense, and I have no doubt that that’s happening, right? I mean, it’s your experience, and I can see a little bit of it from my point of view as well. What’s interesting to me then is, sort of, what’s left in the ecosystem, and what happens at that point, right? One of the things that we saw over the past several years is sample companies especially trying to figure out how to operate in a world where they didn’t just make money on sample; they were doing other things, right? And so, some of them went more down the road towards looking like a market research agency, but a lot of them went down the road of trying to do first-party data in the broader advertising ecosystem. And I think one of the interesting things about that is that for those broader ecosystem plays, and certainly for the MRA plays, but especially for the broader ecosystem plays where you’re talking about advertising effectiveness and all those things, the quality of your ad effectiveness business is going to be a direct function of the quality of the underlying sample, right? You can’t separate those businesses, right? Because, okay, if you can do advertising effectiveness, that’s great, right? But it’s a methodology, right? The thing that you’re going to need is to make those observations somewhere in the broader ad space, which means you need access to people whose eyeballs you have seen, you know, who have seen those things. Buying them from another party means you’re paying margin on top of that in a space that’s actually quite competitive, right? And so, you know, you think about the assets that people need. And sample, for a lot of things, is still one of those assets. And accessing it at scale, you know, it’s sort of—at some point in time, if you’re relying on the programmatic ecosystem, you need to figure out a way really quickly to make sure you’re getting decent sample in that, otherwise you’re going to have a smaller business. And I think that’s part of the reason why we don’t see the industry tackling some of these things. Because let’s imagine that you, you know, try to strip out the fraudulent panelists, right? I mean, I’ve been working with a company that does facial verification, and biometrics, and stuff like that. I mean, the amount of fraud that’s still in a panel after a company does its own, sort of, fraud detection checks is staggering, right? I mean, it is a staggering amount. But what happens if we take that fraud out and we just say, “Okay, we’re not going to let that in. We’re going to take a stricter approach to that.” And all of a sudden you realize your feasibility goes completely through the floor, and there’s way less that you can do, and your revenue suffers. I mean, like, there are some really difficult choices the industry has to make at this point in time in order to sustain businesses of that scale that aspire to do these types of things, right? And acting as a single individual is not a winning strategy because you end up just sort of taking it on the chin while everybody else is continuing to grow, right? I mean, that’s the nature of the market failure here. It’s sort of the prisoner’s dilemma, where, in order for this industry to kind of get back on its feet in terms of sample quality, at least in the programmatic space, it requires collective action. And I’ve been there at the table when we started having these discussions, and it’s not happening, right? I mean, it’s just, it’s not.

Lenny: Well, you were there when I tilted at this windmill with Veriglif, right? The idea of Veriglif—it failed; I mean, let me be clear—was, okay, we can create an ecosystem of connected data that validates identity, and, you know, we’ll all sing Kumbaya together, right? We’ll deal with the fraud issue, and we’ll increase scale, and it’ll be a better experience for consumers, and yay, everybody wins. And look, we built it. It’s fantastic. And during that process of building it—and I don’t think that you actually engaged in one of those sessions, but we were doing brainstorming sessions in New York with brands and suppliers, all the big folks, right, with IBM—we built the solution. And everybody said, “Yeah, you built it, and yeah, this will all work. This would be great.” Okay. So, we raised the capital, and we built it. The brands said, “Sure, it’s going to cost another two to three cents more, or whatever the case may be. Yeah, we’ll pay that all day long.” The sticking point was the suppliers, when push came to shove, and when they realized the implications of needing to engage in a network like that: it would expose that the emperor had no clothes; not only that the emperor had no clothes, but that the emperor was emaciated and dying. And secondly, it would have also exposed some of the challenges. We [laugh] had a conversation with one very large—one of the biggest companies in the world in our space, saying, “We could never do this because we make so much off of breakage on incentives, and this system would take away our breakage.”

JD: Oh, yeah. Yeah, yeah, yeah. Yeah, a hundred percent. I mean, like, and you know, there are companies that treat their panels as assets as well, so all of a sudden you’re having a conversation about a balance sheet.

Lenny: Absolutely. Absolutely. And the vision was, look, if we do this, and we do it right, we’ll make that up because we’ll connect into the advertising ecosystem. I mean, that’s why IBM was supporting us, because they wanted to connect it into their Blue Ocean Media. Is that right? Anyway, you know, they saw this vision of, you know, yes, we’ll have data-driven, right message to the right person at the right time. It’ll be the perfect solution. And yay, everybody’s going to win off of that. And I will still argue that the research industry, the panel industry specifically, is a great foundation to build that off of, right? We have data. We have lots of information.

JD: Yeah, there’s no doubt that there’s a need. 

Lenny: Yes, but nobody—the system was—they were so entrenched, the inertia to change despite, like, [laugh] we listened. We built exactly what you said you would do if we did this, and then they got cold feet. And I eventually realized, crap, we should have just built our own panel. That would have been the right way to do it, right? But by that point, we were out of runway, and, you know, another brilliant idea in the dustbin of history. So, I’ve been there with you. My whole point in going through that history lesson was, you know, I’ve exerted all of my influence in this industry, which I think is fairly considerable, to get all the key stakeholders together to design a solution, to which they all said yes, but when push came to shove, the economics and the exposure of the existing challenges in the system were the barrier to entry, and no one would do it. And I don’t think that’s changed.

JD: I can’t argue with you. I mean, it’s the point that I make in the book as well. And this is not just unfounded pessimism, right? I mean, like, both of us have had a front-row seat at this, and both of us have tried to, sort of, fight the good fight. I mean, it is an industry where, for many different reasons, the incentives do not exist to make quality sort of a primary feature. And I’ll tell you, we shouldn’t say that it’s just the supply side’s fault in this—

Lenny: Sure.

JD: —as well, right? I mean, you know, in one of the discussions we’ve been having online, in fact just recently, somebody made a comparison of our industry to, sort of, the world of credit ratings in consumer finance, right? Because this is an industry where, you know, you can’t really tell whether the consumer that you’re lending to is any good or not. And so, you know, there are these sorts of data clearing houses, the credit bureaus, that ultimately became the source of that truth.

Lenny: Which is what Bob’s trying to do, right? I mean that’s—

JD: Yeah, that’s exactly right. And look, I mean, the idea of building a data clearing house is absolutely the right thing to do, and it’s the same thing you were doing in principle, right? I mean, you were trying to create transparency and build sort of a two-sided market for data quality. Except that this sort of metaphor of the credit bureau is really, really strained, ultimately, right? It just doesn’t stand to reason because, first of all, it’s not like you can immediately sense the money you’re going to lose if you make a bad lending decision, right? Those things aren’t there. But then the bigger question is, okay, so let’s imagine we have a really high-quality data source now. Are we going to go to market and say, okay, well, we have this high-quality data source, or you can buy data from the low-quality data source? What does it mean to be buying data from a low-quality data source? The way that the consumer finance people look at it is like, okay, well, if I’ve got somebody who’s a bad credit risk, I’m just going to make those people pay more money for loans because I know that somehow it’ll come out in the wash, right? I’ve got my models, and I can tell you. This metaphor does not exist for bad data. Like, what is the market for bad data? Like, bad data is shit, right? Like, you don’t learn anything from bad data, so how can we say that, then, right? I mean, so the thing sort of falls apart. And I think it just kind of comes back to this question. And part of my reason for being very overt about this is because I’ve said that I don’t think that clients care, and it’s a very, sort of, provocative way of saying that. But one of the things I want to make sure happens is that they’re not under any illusions about how ‘good’, in inverted quotes—or inverted commas—the data are, right? I think people know that there’s a quality problem, but they think that it’s not that bad. And so, part of trying to create a stronger signal from the demand side of the industry is getting people to understand how bad it really is, right?

Lenny: I think they know that, though. I don’t think that’s the issue, JD, I truly don’t. I think that, as we discussed, they accept that there’s a trade-off they’re willing to make on quality for low-impact decisions; necessary, but low impact. And whether it’s the best example or not, I always go to an ad test, right? What is the difference between 70 and 80 on an ad? Not much, right? It’s still the winning ad. So, if you get two ads that are within that range, [laugh] then that’s a quality problem to have. Generally, it’s pretty clear on a variety of metrics what that is. So, you can absorb the quality issue there because it’s really not a consequential decision, in the big scheme of things, especially now when you can generate five ads in five minutes, and deploy on five different channels, and do a live A/B test. So, I think those things, they’ve accepted that. Now—

JD: Can I interrupt you just for a second?

Lenny: Sure. Yeah, please.

JD: Because I’m not convinced of that. I’m not sort of denying that you’ve had that experience. My experience with end clients, having been one of the largest sample suppliers in the industry—and also actually having been one of the largest buyers in the industry as well—is that corporate researchers especially know staggeringly little about the modern sample ecosystem, right? 

Lenny: Well, sure. And they think that it’s the suppliers’ problem to fix. The expectation is that it’s table stakes. So, what, you’re willingly selling me an inferior product? So, it’s a dirty little secret, back to Joel’s series back in the day, right? The Grey Matter Research. So, I get that. I agree some don’t realize it because they expect quality to be table stakes, and it is probably the agency’s—in all of its permutations—problem to fix.

JD: But I think it’s more than just the table stakes, though. I think it’s the magnitude of the problem, right? Because I think when people say it’s sort of like five or ten or twenty percent, maybe, maximum fraud, people are like, yeah, that’s kind of big, but I could probably live with that. And it’s because—

Lenny: Right. But when it’s 40, 50, 60— 

JD: —bigger—

Lenny: —right.

JD: Then it’s a problem. And if it’s people who are pounding their way through a survey, right? I mean, there’s all sorts of data that we can bring to bear that suggests that people who are taking surveys, unless they’re sort of, like, really tiny, small surveys, people just sort of, you know, satisfice their way through surveys. And, you know, but quality is not going to—quality isn’t—I agree with you, fundamentally, though, that quality is not going to move the needle as a driver for—what’s the right way to say this?—quality is not going to be item number one on the list for a client to make sure that they—

Lenny: But it is now—

JD: —they look after.

Lenny: Because along came AI, right? And so, the contamination risk—in a world where now it is—we talked about big data, right, through all of the last 10, 15 years of our career, and yay, that’s great. It was really freaking hard to do. Now, it’s not. Now, it is the era of synthesizing information, and being able to unlock more value from that is just a default, right? It’s just not a challenge any longer. So, garbage in, garbage out, if we are contaminating. I’ll pick on P&G; I don’t know that they are doing this, but they’re always a leader, so they’re probably thinking about this, right? We know that P&G invested with Microsoft to build their own LLMs. We know that P&G is a very sophisticated buyer of research, and that they have a massive repository of information. They’ve publicly stated they want to unlock value from the historical data and streamline that process. So, they have to be looking at that and saying, “Wow, if we’re going to do that, but the data is flawed, everything that comes after, this billion dollars that we’ve just invested to create our own internal LLM, is now flawed as well, and that’s intolerable.” At that level, when it goes from the bespoke project to fundamental infrastructure for the organization, then it’s intolerable.

JD: I completely—look, I completely agree. I think it’s the opposite side of the same coin, though, right? Because I think the issue here is, yes, when all of a sudden the CFO is involved, and you’re spending a billion dollars—or whatever, even if it’s not a billion, even if it’s only 10, 15 million, right, whatever it is—then you have way more attention on these matters, right? But you know, it’s still sort of the case where, if you’re just making a series of, sort of, bad decisions over time because you’re using, you know—it’s low impact, in a sense, but at some point in time, these things kind of accumulate, right? I mean, if you’re making low-impact—it’s a series of low-impact decisions that are wrong, systematically, about where you should be advertising, right? And that’s part of, sort of, the hidden nature of the problem, right?

Lenny: Right? You’re missing—

JD: It requires—

Lenny: You’re missing populations, you’re not able to engage with them, to sell more toilet paper, right?

JD: Exactly. And so, what ends up happening—but see, to me, this is where, sort of, you have two different audiences in an end client, right? You have the researchers who are just simply trying to do their job with the budget that they have, and then there are people at more senior levels who are stepping back and trying to answer the fundamental questions, who are then, you know, shifting pieces around on the board and making different decisions. Like, maybe we’re just not going to spend this type of money on this sort of methodology. We’re going to look for ways to leverage information that we have. I mean, I would be very surprised if all the major market research companies out there right now were not trying to figure out how to take the tens of thousands of studies that they have done previously and synthesize the data that they have on them to figure out what research products they could sell, right? I mean, you look at the Kantars, the Dynatas, the Ipsos, I mean, there’s this staggering amount of data—that’s the third time I’ve used that word; I need to come up with a different word—there’s a ton of data out there, right, for people to do that. And I fundamentally agree that that is, sort of, where the industry has moved in terms of, sort of, the arms race at this point in time.

Lenny: Right. But you know, the problem with that—and I agree with you, right? We’re in radical agreement on many points.

JD: Violent agreement.

Lenny: Violent.

JD: Violent agreement.

Lenny: Oh, like our conversation about WWE, or back in the day, WWF.

JD: That’s going back [laugh] a bit. 

Lenny: [laugh]. I remember it; it was a fun conversation. Anyway, the point is, this is also one of the fundamental issues: research companies—most research companies, and most panel companies—are built on a transactional process where they don’t own the product. They own the supply; they do not own the final product.

JD: Yeah, yeah. Yeah.

Lenny: So, that’s very challenging for them to do those things. Overall, there’s only a few that could do it. And there are some, and those companies are succeeding. Those companies have treated their data as an internal asset, either through their direct engagement with respondents, building high-quality panels, where they’re engaging, they’re always asking questions, and that’s their data, or, obviously, companies that, you know, sell syndicated products; that’s very easy. But it is challenging for the others to do that, and that’s why brands, again, are investing in saying, “Well, this is our work product. We have paid for this. We are going to unlock the value here, and then we’re just not going to spend as much with you.” And the warning signs are there. If you look at the public results, watch the public companies, right? Well, there are not many public companies in our space, but of those there are, most have not had really great years so far. Some have had really terrible years so far. And you read between the lines, and it’s because you’re getting your ass kicked because you’re losing—the spend is shifting to things you don’t do, that you don’t offer, right? The new research spend is going to other places now because it is not necessary to ask the question the way we’ve been asking it through you. So, that’s my take. And the market will speak, right? These companies will adapt, or they won’t, and we’ll have Blockbusters or we’ll have Netflixes. So—

JD: Yeah. I think the interesting thing about that, I was just thinking with some of my advisory work, is that the people who will end up sticking around and doing work in ways that we would sort of recognize now, right, are those who have figured out how to really get themselves entrenched in how the client operates and who become part of the broader business workflow, rather than being this sort of thing where, oh, wait, let’s stop, let’s do research, and then let’s kind of come back in, right? Because I see people doing really interesting things with a combination of technology, AI, and a bit of service, who are developing businesses that are thriving right now in places where you started to sort of wonder, like, why aren’t the MRAs able to do this, right? And I think that’s one of the interesting things about where the industry is evolving right now, and that is that you’re starting to see the new research company evolve where the value proposition, sort of, the thing that drives scale, is not the labor of the researcher, the project work of the researcher; it’s the product work of the researcher. And so, you see these people working with technologists, developers, to build systems that embody the principles of [unintelligible 00:47:44]—I mean, I think that’s one of the really interesting things about where the industry seems to be heading at this point in time: there’s the possibility of actually making pretty significant improvements, both in terms of the quality of the outputs as well as the ease of use—because, let’s face it, nobody really wants to do DIY—but where you have researchers actually contributing to product development, and by virtue of the fact that there’s AI in it, it becomes this thing that’s way more usable, with proper guardrails on it, by ordinary business people, right? And it still doesn’t mean you can ignore the garbage-in, garbage-out phenomenon, but it sure does give you this ability to do things that, you know, a traditional researcher would have taken a very long time to do, and you would have needed multiple ones to do it, right?

Lenny: Yeah, yeah. I want to be conscious of time, but I want to make one point. What’s interesting watching this is there are, to your point, lots of different companies—and types of companies—that are doing really well, but one of the perennial favorites of the research space, the field services companies, they’re kicking ass and doing just fine. And to your point, they’re adapting technology to create scale, and efficiency, and cost efficiencies, but they’re still delivering on the need of both suppliers and buyers to have someone do some very specific things that they just don’t want to have to deal with. So—

JD: This industry, fundamentally, Lenny, I mean, people with great technology backgrounds try to come into this industry and think that they can create a software-as-a-service business, right?

 Lenny: Right [laugh]. And they always have to have service. 

JD: And their investors are super excited about this, and talk about ARR, and, you know, evergreen revenue, and subscription contracts, and all the rest of that, but that’s not how the industry works, right?

Lenny: No. No.

JD: In the best of circumstances, we are a tech-enabled service, right? 

Lenny: Yes. There has never been a pure SaaS company in this industry that reached scale, ever. Ever.

JD: But I think that, you know—to me, and maybe we’ll end here—I mean, I’m working with companies like that right now, and one of the things I’ve obviously spent a lot of my career doing is building service organizations, right, and more recently, they were tech-enabled service organizations. And so, you get companies who are now trying to figure out, okay, well, how do I penetrate this industry? How do I get where I need to be? And they’re realizing, oh, I need to build sort of a lightweight service function, right? And I think, you know, if you look at sort of where AI and automation are going right now, there are really interesting possibilities—if you’re leveraging technology appropriately—to provide service, to do it at scale, and to do it in a way that delivers a quality outcome, right? I mean, it’s sort of like, let the machine do the job that the machine is good at, let the human do the job that the human is good at. And if you’ve kind of plugged all the pieces in together, you get a good outcome, as long as, you know, the foundational elements, the data that you have there, are good data.

Lenny: Yeah. Absolutely, absolutely. And we could play off of that much more as well, but we started this off saying how much I was looking forward to this conversation. It did not disappoint. I hope that you have enjoyed the conversation. I hope our listeners have enjoyed the conversation. I think we covered a lot of topics that are vital and important, and that’s why I wanted to have you on the show, JD, to do that. And I think we’ll do it again because we didn’t reach resolution on this, right, and at some point we’re going to have to. So, thank you. Thank you.

JD: My pleasure. Thanks for having me, Lenny. 

Lenny: Where can people find you, buddy?

JD: Website is probably the easiest: JD Deitch, jddeitch.com, or on LinkedIn. Lenny will be liking my posts, so if you follow Lenny, he’ll like one of my posts at some point in time, and then you can track me down. But I’m there. And again, just thanks for having me, Lenny. It’s been a good discussion today.

Lenny: It’s great to have you, buddy. I really appreciate it. I want to give a shout-out to our producer, Brigette. Brigette has come on and replaced Natalie, who’s gone on to do other things. Brigette rocks. Thank you, Brigette. To our audio team, Big Bad Audio, to our sponsors, and most of all, to you, our listeners, because without you, JD and I would have probably found a time to talk, but we wouldn’t have prioritized it; we wanted to make this happen for you. So, thank you, as always, for your time, and for your trust in engaging with us. So, that’s it for this edition of the Greenbook Podcast, probably the longest one yet. Let me know if you want to hear kind of a Lex Fridman type of approach. If you guys are down for a three- or four-hour podcast, JD and I can probably make that happen at some point [laugh] so.

JD: Yeah, the subject will turn to cheese, at some point.

Lenny: [laugh]. Yes. Yes, oh, we didn’t even get to the cheese.

JD: We’ll do, like, a tasting in the middle. It’ll be the intermission.

Lenny: [laugh]. Then we’ll definitely have to do video. So, it’s kind of like Hot Ones, but we can do—but instead it’s Cheesy Ones. Anyway. All right, we digress. That’s it. We’re cutting this off. Thanks, everybody. Have a great rest of your day.
