[PMC Weekly Consulting Insight] Survey marketing wrap-up

Philip Morgan

Back in May, I started an experiment I referred to as "survey marketing". Today I want to wrap that all up in a bow. This will be what I hope is the kind of navel-gazing email at least some of you can learn something from. A "learn from my mistakes" email. :)

What I did

I figured that if I undertook some research, I could get permission from folks not currently in my audience to market to them by sharing back the results of the research.

I still believe this idea is sound. My first implementation of it could be improved.

Here's what I did:

  1. I chose a question I wanted to answer.
  2. I designed a survey. It was OK, but could have been better.
  3. I recruited participants in my study from LinkedIn and my email list.
  4. I turned the survey results into a brief(ish) report, which I'll link to momentarily.
  5. I shared the report back with survey participants last week.
  6. Now I'm sharing the report, along with lessons learned, with y'all.

So that's the process that took about 5 months (of very part-time work) to complete.

The results of what I did

Here's a link to the report that came out of this survey marketing process: https://drive.google.com/file/d/195-JqXKQWZqldOIhZmezXGi-cOuuXhd6/view

I directly emailed this report to all the survey participants who, when they filled out the survey, indicated that they wanted to receive this report.

You'll notice the final page of the report credits me for the work, briefly describes my business, and links to my site. Did this soft CTA result in a torrent of leads or new business for me?

No. :) It might have generated a few email signups at most. Maybe.

So by the numbers, this first iteration of research-as-marketing was not an impressive success. It's fair to say it was slow and did not move the needle very much.

Even so, I believe it's the start of something good. It's something I can learn from and iterate on, and I think a lot of successes start out in the same unimpressive way.

What I should have done instead

As I say in this episode of the Expertise Incubator podcast, if you're going to conduct research, your question is everything: https://share.transistor.fm/s/7f8c489f

Ask the wrong question, get un-interesting or irrelevant results.

Ask too big a question, get crushed by the work of answering it.

Ask too small a question... actually, I'm not sure there's anything wrong with that. It might provide limited insight but set you up for additional research, and that's fine. In fact, building up momentum from small wins might be ideal for folks like us. It might help us complete our research in less than 5 months. :) So don't be afraid to start small.


My client Bob illustrates what I need to do with future research: own a question (and incidentally, own the right one).

I think I'm doing justice to Bob's insight by simplifying it to this: become the best and most authoritative resource for answering a question that matters to your audience. In fact, this might be the best way of serving that audience and -- as a second-order consequence -- furthering your own mission. That's the essence of what "owning a question" means, and you should still read Bob's writing on it because my brief summary necessarily glosses over important detail. Furthermore, this idea is just so relevant to folks who want to become advisors rather than implementors. Bob's articles are written for research-driven organizations. Read. them. anyway. You'll see the relevance.

The question I'd like to own is this: How do implementors become advisors? How do you marshal the credibility and access needed to change your market position from implementor to advisor?

I've been dancing around this question for years. Time to own it.

One category of answer to this question that I find completely unsatisfactory is Alan's.

Alan's a smart cookie, but his answer here is incomplete and evasive. I think I can answer this question better, at least for folks who need more than a "Just Do It" slogan in response to the question of moving from implementor to advisor.

There are also related sub-questions. Here are just a few of many:

  • How do advisors bootstrap lead generation? For example, the folks at Newfangled have been very clear in their message that an email list smaller than ~2k won't help your firm thrive. Why is this? Why do I know people with much smaller lists who have no problem generating advisory services business? Can I gather, organize, and interpret data that better answers this apparent mystery?
  • How do established self-made expert advisors identify opportunity? How do they know where to focus their expertise cultivation efforts? How do they measure progress? How much will they invest before calling something a dead end?

Answering these and many related questions in a transparent, applicable way using data that meets a reasonable standard of integrity is where I want to direct my future research effort.

You may have noticed that the question I asked in my first round of survey marketing was not the question I want to own. It has almost nothing to do with the question I want to own. :)

My question was a not-terrible first attempt, and it is an interesting one, but not for the purposes of helping my audience and inspiring others to join my audience. It was an interesting one for me. As Lyle Lovett sang, it was then I knew I had made my first mistake.

This research wasn't a waste; it was just more useful to me than to my audience, so it's no wonder it didn't move the marketing needle for my business.

Still, I learned something useful.

According to my research, books, courses, and email lists are the triumvirate of reaching developers. My business needs to place much more emphasis on books.[1] Courses are a nut I (and a lot of others) haven't satisfactorily cracked, so it'll be a while before I consider trying again with a course. And -- at least within the confines of my imagination -- I'm already doing a decent job of using email. :) So the findings of my research on my first question are a useful feedback mechanism for me and others trying to reach self-employed software developers. My research will lead to better decision making in my business. The immediate change is that I need to get systematically better at conceiving, writing, and publishing books.

Future research that I conduct, however, will be organized under the umbrella of what will become the question Philip Morgan Consulting owns: How do implementors become advisors?

By the way, all of my Expertise Incubator participants are doing an excellent job of aligning their research questions with the question they want to own:

  • One is in the throes of writing up a killer report on what his study found.
  • Another is in the throes of coding some survey and interview data.
  • Another is implementing a GraphQL data store to help spot patterns in scraped data.
  • Another is collecting qualitative data for a solutions clearinghouse to serve her target market.
  • Yet another is building a software tool based on his study's findings while also finding speaking opportunities to share those same findings.

This is an impressive group of people, and they have avoided the mistake of asking the wrong question. Perhaps the question I asked served as a warning to them. :)


PS: If you're new to this list and curious why I'm so into research, it's because I hypothesize that using research to build decision making tools for your clients is one of two accelerants on the journey from implementor to advisor. The other is publishing.

A client recently told me that 2 months of the right kind of weekly publishing to an email list has increased his acceptance rate for talk proposals by something like 50%!

Here's what's been happening on my paid Daily Consulting Insights email list:

Try it for free for 2 weeks: https://pmc.substack.com/trial2w


  1. And I'm pretty sure these do not need to be traditionally published books. I think the self-published, self-distributed book model works quite well for reaching devs. F500 CEOs? They might need more in terms of what signals authority, including the imprint of a traditional publisher. But self-employed software developers? Self-published, self-distributed books as the mouth of a respectful, well-intentioned direct response marketing campaign work very well in my personal experience, and -- to an extent -- this approach is supported by the research I'm writing about here.