Discover why John Dick, Founder and CEO of CivicScience, believes that paying survey respondents introduces bias and why shorter surveys offer better data. In this episode of Insights and Innovators by MRII, host Jon Last explores CivicScience’s innovative micro-survey method that scales to over a million responses daily without financial incentives. Learn about the evolution of syndicated data, the impact of AI, and the importance of storytelling in market research. This session is a must-listen for anyone interested in the future of market research and insights.
[00:00:00] John Dick: As the market starts to settle itself, where you're gonna have four or five, maybe fewer, big companies that own the models, the differentiators across those models are gonna be what data sets they have in their models.
[00:00:13] MRII Announcer: Welcome to MRII’s Insights and Innovators podcast, where we talk to top market research professionals to get their inside stories about innovative and enduring best practices.
[00:00:24] MRII Announcer: Now here’s your host for today’s episode.
[00:00:26] Jon Last: Welcome to today's episode of Insights and Innovators: The Case Against Paying Respondents, with John Dick of CivicScience. I'm your host and former MRII president, Jon Last. What if the biggest problem with survey and market research is the way that we've been doing it for decades?
[00:00:43] Jon Last: This may or may not be true, but today's guest, John Dick, founder and CEO of CivicScience, believes that paying respondents introduces bias, and he also subscribes to the long-held belief that longer surveys degrade the quality of response. His solution? Micro-surveys with just five [00:01:00] questions, delivered at massive scale, over a million responses a day, and without a single financial incentive.
[00:01:06] Jon Last: The result is continuously evolving, syndicated data that aims to be faster, broader, and more representative than traditional panels, especially among populations that don't usually take surveys. In this episode, we'll explore why John believes that the conventional model is broken, how CivicScience validates its approach, the kinds of insights that this model enables, and how new technologies like AI may play a role going forward.
[00:01:31] Jon Last: John, it’s great to have you on Insights and Innovators. Let’s dive in.
[00:01:35] John Dick: Great to be here, Jon. Thanks. Love the work you all do.
[00:01:39] Jon Last: Let's start by really thinking about the fact that, you know, for years, researchers have relied on incentives to recruit respondents. We live this reality in our firm every day.
[00:01:50] Jon Last: And while there's debate about the best length for a survey, they generally last about eight to 10 minutes, if not longer. But you've gone the complete opposite direction. [00:02:00] Why do you believe that shorter and unpaid surveys actually deliver better data? Let's start with surveys without paying respondents.
[00:02:07] Jon Last: How do you achieve this, and why do you think it's better?
[00:02:09] John Dick: How do we achieve it? I'll get there in a second, but there are really two underlying observations here that I think are fairly logical and intuitive. The first is that people who have the time and the inclination in their day to spend eight to 10 or more minutes answering surveys
[00:02:30] John Dick: are naturally different from the people who don't have the time and the inclination to do that, whether they're being compensated or not. I don't think that claim is intellectually offensive to anyone. The second claim is that people have demonstrated, in
[00:02:54] John Dick: lots of different ways, a willingness and even an eagerness to share their opinions about things, [00:03:00] absent any extrinsic incentive, whether it's comments they're posting on social media or elsewhere. There is a natural predilection in most people to want to be heard, to want to have their opinions registered. So I don't think either of those two things is intellectually offensive, and they're really just the two underlying premises of the approach that we took to this.
[00:03:16] John Dick: Right. And so, we're a company based in Pittsburgh, started out of an incubator at Carnegie Mellon, based on some research we did on these subjects. And we started to think about, okay, what are ways that we could engage people in question-and-answer exercises without that burden, both the financial burden of the incentive and the methodological burden of that extrinsic motive for respondents?
[00:03:44] John Dick: And the answer was right in front of us. Our hometown Pittsburgh Post-Gazette website had a poll of the day on its homepage. Thousands of people answered it every day. And at the time we started the company, quizzes were really popular on BuzzFeed. What kind of [00:04:00] Simpsons character are you?
[00:04:00] John Dick: What kind of wine are you? It demonstrated that people could be motivated to do these things. So we did a lot of research and testing on why, on what the underlying reasons were that people did it. And really the number one reason was that we're all kind of narcissists and voyeurs to some extent.
[00:04:19] John Dick: We'll share our opinion to see how we fit among other people, or to see what the rest of the world thinks. Secondly, there are certainly a lot of people out there who feel, whether or not they understand the seriousness or methodology of the survey they're participating in, that maybe somebody's listening, maybe somebody will hear their opinion, so they'll weigh in.
[00:04:38] John Dick: So we thought about how to optimize that experience for people. How do we get not only more people to want to engage in those types of poll and quiz experiences, but also really improve that intrinsic payoff for them? And so we did that.
[00:04:56] John Dick: We partnered with hundreds of media properties, [00:05:00] newspaper sites, mobile games, lots of lifestyle publishers, news publishers, and the like, who embed our polling widgets, as we call them, inside their content. We use what we call an engagement question, which is generally germane to the topic of, say, an article someone is reading, to bring them into the survey.
[00:05:19] John Dick: Then we pose four additional research questions to them. They get to see the results at the end. We have a couple of different cool ways we show them how they compare to other people, or how people like them answered other questions they hadn't been asked, and so on. And it really works, and it's allowed us to really scale the level of data we collect.
[00:05:35] Jon Last: It sounds like a great way to create a conversation as much as a research study. And yet, capping it at five questions, gosh, I'm admiring that, thinking about survey creep and how big of an issue that's been throughout my career. What happens if a survey goes longer?
[00:05:55] John Dick: Well, we see degradation in response.
[00:05:57] John Dick: One of our [00:06:00] little sayings here is that every time you add a question to a survey, you increase the bias in the people who will answer it. Even between, say, one or two questions and five, we see a difference. We've tested six, we've tested eight, we've tested all different lengths, and five turns out to be kind of the perfect number. We do see a little bit of linear degradation in response rate,
[00:06:22] John Dick: and with that, some mild changes in the composition of the types of people who make it to five questions. But that linear break becomes more severe at six, for some reason. We've now done over 6 billion of these, which is how we've been able to test that and know that five is kind of the magic number.
[00:06:40] John Dick: We get the maximum amount of information gained from the session without creating that degradation. Now, one thing I will add is that at the end of those five-question polls or surveys, the respondent is given the opportunity to click to answer more, and roughly 15 to 20% of people do.
[00:06:59] John Dick: But [00:07:00] we've also now learned that those people who will maybe spend upwards of eight to 10 minutes answering questions in a given session look very different from the people who won't. We call them super responders. And when we do our analytics and manage our database, we have to account for that over-indexing of that type of respondent, because they are psychographically different from, say, that five-question-answering segment.
[00:07:24] Jon Last: Which is interesting, when you talk about the variation in respondent type. You've achieved this impressive scale of over a million people responding daily. What gives you the confidence in the representativeness of the data, and how are you reaching some kinds of folks that other firms might miss?
[00:07:42] John Dick: A few things. One is that we're gathering self-reported demographic information from our respondents. It's part of the poll experience. And so we can very simply look at the composition of a response sample on a given day relative to what we know the census norms of [00:08:00] demographics are.
[00:08:00] John Dick: That's the easy part. Two is that most of the data we collect, the questions we ask, are kind of quota sampled. So for every, say, thousand people we're trying to ask about a given brand, we're gonna get 510 women and 490 men. I'm being obtuse with these numbers, but you get the basic idea, so we can manage for that.
[00:08:21] John Dick: Generally, where we may have a problem is an overabundance of a given demographic group, but, I don't wanna say we throw those people away, we put them in a different bucket. So that's one. Number two is that we are really thoughtful about where our polls are being derived:
[00:08:39] John Dick: the publishers that we partner with and the platforms that we partner with. If we are looking at the day-to-day flow of the profile of our respondents, and we see that maybe we're starting to get thin in a group, say Hispanics or wealthier people, we go out and target partnerships with publishers that reach that audience
[00:08:58] John Dick: to fill those [00:09:00] gaps, those buckets. But I would say the real answer, the most important answer in terms of why we have so much confidence in our data, is that we've done a tremendous amount of work to compare our findings to known outcomes, where there's empirical data about a given thing.
[00:09:18] John Dick: One of the case studies or academic papers you'll find on our website was something we did with a guy by the name of Joel Rubinson, who you may know as a longtime research person. This was years ago. We collaborated with the then NPD Group, now Circana, which had empirical point-of-sale data about given brands.
[00:09:40] John Dick: And we were really calibrating questions that we asked about brand purchase behavior to what NPD knew about brand market share. The first pass of that wasn't perfect, so we tried to understand why we would have been off a little bit here and there, and really calibrated the way that we
[00:09:56] John Dick: build our sample and ask our questions to ensure that it [00:10:00] was reflective of ground truth. So that's what we try to instill in our clients: that confidence that, methodology and all those other things aside, which we do take very seriously, the data only matters if it's representative of some future thing or some ground truth.
[00:10:17] John Dick: And so we've done a lot of work to prove that as well.
[00:10:20] Jon Last: And of course, what you do is roll this up into a syndicated database. Are there certain types of questions or issues that the data is particularly well suited to address?
[00:10:30] John Dick: Look, I think no, not necessarily more so than most kinds of survey questions.
[00:10:38] John Dick: We always say we have the worst survey methodology in the world, except for all the other ones. There's always going to be that self-reporting error. People will overstate answers to questions like, how likely are you to cancel your Netflix subscription if they raise their prices? People will always overstate things like that.
[00:10:55] John Dick: But now, having done over 800,000 questions in our system and 6 billion [00:11:00] answers, we can adjust for that error because we've seen it so many times. We can look at, say, Netflix cancellation rates and adjust for that sort of reporting error.
[00:11:12] John Dick: We do tend to ask a lot more questions about sentiment and attitudes and intention, as opposed to questions about past behavior, and this is really an increasing evolution for us, maybe within the last five years. Frankly, there's ample data out there about past behavior because of companies like Circana and comScore and others.
[00:11:37] John Dick: So we're not really solving a big industry problem by asking people about what they did yesterday. We tend to ask them how they feel about things, what they're planning to do next, things about their lifestyle that might be a little bit more behavioral, but where there isn't as much empirical data out there.
[00:11:52] John Dick: So that's really the niche that we've carved out for ourselves. Our tagline is that [00:12:00] attitudes change before behaviors do. And so we are constantly, even partnering with some of those companies that I mentioned, looking for attitudinal, sentiment-level shifts in our data that foretell future shifts in behavior.
[00:12:13] John Dick: So we really try to play in that space as much as we can.
[00:12:15] Jon Last: Yeah, I share that perspective. It's one thing to look behind you, it's another thing to look ahead, and perceptions and attitudes, and measuring and trending those, have been much more effective for us in our daily work with a lot of the clients that we work with. Which kind of leads me to where I wanted to go next.
[00:12:35] Jon Last: And it's kind of the big elephant in the room. We all face this in running research companies, and that's convincing clients or potential clients that a particular approach can be trusted. You've obviously created something that you've really embraced as vastly different.
[00:12:52] Jon Last: And I'm sure, you know, we've all met lots of research buyers of [00:13:00] every particular stripe. Share with us some of the challenges that you've had in terms of those who've been harder to convince, and how you've overcome those objections.
[00:13:08] John Dick: There are really two categories of resistance I would say we get.
[00:13:13] John Dick: One is a lot of kind of legacy-type research people at the client organizations we work with, who've embraced panel-based research for a really long time. And not only embraced it, but advocated for it to the leadership of their companies. When we come along,
[00:13:35] John Dick: we have to be very delicate in how we talk about it. We're not gonna come and say, look, everything you've been doing is flawed and all the information you've been putting in front of your leadership is flawed. They don't wanna hear that, even if it's true, because they've been trusted with that.
[00:13:49] John Dick: So it's more about, we can make it better. We can allow you to do different things. There are certainly roles for [00:14:00] panel-based research and financially compensated research, things like internal customer panels, for example, where you really wanna understand your most loyal customer.
[00:14:10] John Dick: There's definitely a role that panel types of surveys can serve in that. And if you do need to do really lengthy, qualitative types of research, there's gonna need to be some kind of compensation involved in that. So we have to kind of delicately manage that. We found a better way to do a lot of the things
[00:14:27] John Dick: they used to do a different way, but sometimes that's a hill we have to climb with people. On the decision-maker side, so maybe the executive, the CMO, or whomever we talk to a lot, when we bring up this issue of survey panelist bias, they immediately get it.
[00:14:47] John Dick: They're like, I knew it. I knew it. I have always known it. I wondered who these people were, sitting home and answering these, who these guinea pigs were. So we connect with them on that level. But it still gets back to: I have a lot of other data. I have this empirical data at [00:15:00] my fingertips.
[00:15:01] John Dick: I know what people bought yesterday and what they watched yesterday. Why do I need survey data? Survey data has come in and out of vogue over the last couple of decades. And so, in that case, we're more or less educating those people on the additive benefits of the attitudinal signals that we capture.
[00:15:22] John Dick: And we're very careful to say that our data's not better than yours, but our data can make your data better. And that's really true in both of those audiences that I mentioned.
[00:15:31] Jon Last: Is there an efficiency argument that works to your advantage too? I'd envision a scenario, perhaps, where somebody said, let's run parallel surveys, one with the older traditional method and one with your methodology, that you've used as proof cases.
[00:15:46] John Dick: Oh, for sure. We've done more of those bakeoffs than I can mention or even recall. Part of it is, I would say, efficiency, but that manifests itself in two ways. Maybe this is what you [00:16:00] meant, but, quite frankly, the fundamental benefit of our business is that it's very profitable, because we don't carry the cost of respondent incentives or respondent recruitment.
[00:16:12] John Dick: We would estimate the average survey business probably has somewhere from 25 to upwards of 40% of its P&L wrapped up in respondent recruitment or incentive costs. We don't have any of those costs. Now, that's what really allows us to be a syndicated business, but we also do quite a bit of custom research, so we have a cost structure that allows us to be
[00:16:35] John Dick: competitive, to put it lightly. But the other, I would say more important, benefit is the speed that it affords us, and the always-on-ness of our data. I'm asking, you know, 5,000 people every single day about Nike, but Nike's not a customer of ours.
[00:16:54] John Dick: The benefit there is that I can share that data with Nike's competitors, the hedge funds that trade in their stock, and the retailers that sell [00:17:00] their apparel. And I can do that in a very profitable way. So, being able to constantly track these things every minute of every day, oftentimes a client will come to us with a question and we don't need to ask anything; we already have it.
[00:17:16] John Dick: And that's a major benefit to everybody involved.
[00:17:19] Jon Last: So I wanna talk about, and I say this sarcastically, with quotations around it, my two favorite letters. If we had $10,000 for every seminar or article that talked about how it was going to destroy everything we all knew in the world, we would be able to retire.
[00:17:35] Jon Last: And that's AI. How's it changing what CivicScience does? And what's the future of AI in terms of your current model and potentially expanded capabilities?
[00:17:45] John Dick: Well, first, would this really even be a podcast if we didn't talk about AI? You know, it's probably not. I don't want to be that guy, but I gotta go there.
[00:17:55] John Dick: So, of course, we think about it [00:18:00] a lot. You know, our roots are at Carnegie Mellon, and our roots are as an engineering-first company. My co-founder and CTO was kind of doing AI before they called it AI. So it's really integral in the way we think about it.
[00:18:16] John Dick: We often roll our eyes at things that are called AI that are just machine learning, and things that were called machine learning that are just statistics. So we're a little bit irreverent about some of it. But there are a few ways that we employ it.
[00:18:31] John Dick: I mentioned the first question we ask people. Say maybe they're reading an article on, you know, Variety or RollingStone.com. What pulls them into the survey is the first question we ask them, and we know that the more germane that question is to the content they're reading, the more likely they are to come into the survey.
[00:18:50] John Dick: Now, we presume that first, quote, engagement question has minimal research value, because it's explicitly creating an opt-in bias. [00:19:00] But it is the candy that we use to lure people into the experience. And so we've used large language models to increasingly optimize that first engagement question, and it's produced a 3x increase in our engagement rates over the last 18 months,
[00:19:16] John Dick: just getting smarter and smarter about matching the question to the context of the article. So there are operational ways like that that we use it. It's also, of course, creating all kinds of efficiencies in terms of code writing across the company, and even some of the underlying analytics that we do.
[00:19:34] John Dick: But I would say that where we're thinking about it, bigger picture, is this: we own this database. We own this data set. And I can say very confidently, and I'll say it publicly here and on the record, we have the largest attitudinal database that's ever been created in the history of mankind.
[00:19:48] John Dick: Full stop. Over 6 billion collected answers, 165 million respondents, millions of new responses every day, the things that we talked about. But we own all of that data, as [00:20:00] opposed to being a broker in the middle between a survey buyer and a survey taker. It is our data set, and it's built to be really complementary and additive to the larger universe of data that's out there. And so where we see ourselves
[00:20:15] John Dick: fitting in the longer-term picture of AI is as a very valuable ingredient in AI models, as a tuning element, as an input to analysis that's being done. Now, of course, all the protections and security have to be built to make us feel safe doing that, but we really feel that that's the future we're aiming toward.
[00:20:38] John Dick: And as the market starts to settle itself, where you're gonna have four or five, maybe fewer, big companies that own the models, the differentiators across those models are gonna be what data sets they have in their models. And we believe we're really well positioned to play there.
[00:20:56] John Dick: So for us, it's all net [00:21:00] great. The always-on-ness and recency of our data is something of a protective moat, because even if a model could recreate what we asked yesterday, it can't recreate what we're asking tomorrow. So we feel pretty good about where the world is going.
[00:21:17] Jon Last: Yeah, and there's obviously the need for currency and fluidity as well, which, having been both on the client side and now, for the last 16 years, on the provider side, is always a battle that you have to fight. We're almost up against the end of our time here.
[00:21:33] Jon Last: We could go on for quite a bit, and I don't want to let you go before we talk a little bit about the talent pool, the necessary tools that people need to be successful in the industry. We have a lot of folks who listen to this series who are earlier in their careers. And you've certainly challenged convention, and that means building a team around you that buys into that vision.
[00:21:56] Jon Last: What are the most important skills or mindsets that you look for [00:22:00] in people joining your team? And, complementary to that, what advice would you give to somebody who's getting into the insights industry today?
[00:22:10] John Dick: It's a good segue from your last question about AI. It's a thing I think a lot about, also with a daughter about to head into college with an interest in economics and data science, so I'm kind of coaching her on gaining sustainable skills for this future. Number one
[00:22:32] John Dick: is a storytelling capability. She yelled at me the other day, Dad, stop telling me what to do, but I was like, you need to take some communications and public speaking classes alongside the rest, because your ability to take the numbers and all these kinds of things and build narratives around them is really the defensible skill you're going to have.
[00:22:51] John Dick: We always say here that data or insights are ever more impactful the more personally relatable you can make them to people, [00:23:00] and that takes a lot of work and creativity and really strong, elegant communication skills to be able to do it.
[00:23:10] John Dick: Because, even more than the consumer insights folks in these companies, though I'd urge them too, since they're the ones that also need these skills, the decision makers, the C-level folks, don't want more numbers. They don't want more tools. They want somebody to tell them a story.
[00:23:25] John Dick: They wanna know: what's happening, why is it happening, so what, and now what? And so we can't stop short, just producing tone-deaf numbers for people. We have to be able to contextualize them. And so we really prioritize people, even at the engineering and data science level,
[00:23:48] John Dick: people who are pretty far in the weeds in the work, who can understand and contextualize the information we're bringing in and the information that we're then producing back to our clients. So that's absolutely number one. [00:24:00] And I probably couldn't even think of a close number two.
[00:24:02] Jon Last: Yeah, I'm with you a hundred percent. It always amazed me, and still does to this day. It's something I always took for granted, because I kind of cut my teeth from that perspective: my first-ever job out of undergrad was in public relations, and then I got into research. It just amazed me how there was a lack of that skill set among people in our profession.
[00:24:21] Jon Last: It's good to see that you and others are transforming that, because I think it makes us more bulletproof as a profession going forward.
[00:24:29] John Dick: And I'll add one additional thing: a lot of companies shy away from that, what I'll call the consultative element of the work, because it's seen as being less profitable.
[00:24:46] John Dick: There's great annual recurring revenue in producing a software-as-a-service platform that people just log into, where maybe you provide a little bit of technical support but leave it up to them to do all the interpretation. But the [00:25:00] world's moving away from that right now.
[00:25:01] John Dick: So what we focus on is this: given the cost structure we've achieved in the way we collect our data, we can invest some of those resources in that more service-based, white-glove part of our business that I think the clients really need and really appreciate.
[00:25:19] Jon Last: Yeah, and it adds value. I always felt that without that, you're creating a very easily replicable commodity. If all you're doing is providing a platform, how are you adding value? You will be replaced. It's great to hear that perspective, and great that that's part of the model you've built.
[00:25:35] Jon Last: John, I wish we could chat further, and I'm sure we will in other contexts, but thank you so much for being a part of Insights and Innovators, and thanks for providing your perspective and the bold choices you've made. This has been one of my favorites that we've done.
[00:25:50] Jon Last: Thanks for being part of it.
[00:25:51] John Dick: It's very kind of you. Thanks, Jon. I really appreciated it and enjoyed the time together.
[00:25:55] MRII Announcer: Thanks for joining the Insights and Innovators podcast from Market Research [00:26:00] Institute International. Click subscribe to never miss an episode, and visit us at mrii.org for more market research insights.