Middle of Six

The Shortlist Episode 56: Brand and Client Perception Surveys




"How did we do?" It's a question that often remains unasked but holds vital importance. Without periodic touchpoints to gather candid performance feedback, strong relationships and potential repeat work can slip through our fingers. But how do we solicit honest feedback without things getting awkward? How do we select the right questions and deliver them appropriately? And once we have the feedback, how do we distill it into solid findings and actionable recommendations?


Launching a client perception survey might seem intimidating and daunting, but the payoff can profoundly impact our firm's financial and cultural health. Join Middle of Six Principals, Wendy, Melissa, and Allison, as they explore why perception surveys matter and how to extract the most value from this indispensable activity.


CPSM CEU Credits: 1.0 | Domain: 1


Podcast Transcript


Welcome to The Shortlist.


We're exploring all things AEC marketing to help your firm win The Shortlist.


I'm your host, Wendy Simmons, and each episode, I'll be joined by one of my team members from Middle of Six to answer your questions.


Today, we're talking with Melissa Richie and Allison Tivnon to discuss surveys, specifically brand and client perception surveys.


Hi, Melissa, hi, Allison.


Hi, there.


Hey, Wendy.


Which one of you thought of this brilliant topic to talk about?


Or raised your hand and thought, surveys, yes, we haven't hit on this.


Who was, I don't know, inspired by this topic?


Well, I know at our retreat, we had a big brainstorming session and just lined out a bunch of topics.


And then I went and looked at them and thought, I can talk about that.


I've performed those before.


It feels timely because we actually recently did our own survey at Middle of Six.


So I wasn't sure if it came from the retreat, you know, just thinking about what do people care about, or was it something more recent than that?


And I feel like just generally within the industry, companies are taking stock right now.


That seems to be a trend that's happening.


So I think this is a really timely topic for folks to dig into.


Yeah, well, it's a best practice, that's for sure.


You know, surveys of all kinds, right?


Talking to your clients or surveying your team members, your employees, anything like that, just to gauge how people are feeling and thinking.


If we focus in on the brand and client perceptions, that becomes even more targeted to marketing, right?


Less of the HR side of things or kind of more general, so you can get really focused.


And I know we have a ton of notes about how to plan for this.


Is it something you want to do in house, or do you bring on someone who can be a third party to go through that?


So we'll go through all of our tips and recommendations and probably some anecdotes about what we've done ourselves.


But before we get into that, I dug up a little trivia.


I know everyone is dying to find out what question I'm going to ask at the beginning.


And this one seems pretty straightforward today.


So my question to you all, since you guys get to guess on behalf of our audience, how many people do you need to survey to get a statistically significant sample size?


Any thoughts on that?


Taking me back to statistics that I took in college.


I know.


I knew that answer at one point in time.


That was only like five, ten years ago, right?


Yeah.


Oh, yeah, totally.


Totally.


I got an A in the class.


I'm going to throw out 50.


I don't know why.


50.


I think it's lower than that.


I want to say it's like 20 percent.


I think so.


25.


Oh, percent.


Percent.


Interesting.


In my research, I always, always found varying information.


We're not going to tell you the answer yet, but most of it was actually surprisingly a number, a hard number, not a percentage.


So TBD.


Let's talk about that more later.


Oh, gosh.


Now it's coming back to me.


Now I'm like, oh, we have to know what the total population of the audience is.


And it's actually a really huge number if we want a truly statistically significant sample size.
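For anyone curious about the math behind the trivia question, one commonly cited approach is Cochran's sample-size formula, which depends on the confidence level, the margin of error, and, for small audiences, the total population size. The sketch below uses illustrative numbers, not figures from the episode:

```python
import math

def cochran_sample_size(z, margin_of_error, proportion=0.5, population=None):
    """Cochran's formula: n0 = z^2 * p * (1 - p) / e^2, with an optional
    finite-population correction for small audiences."""
    n0 = (z ** 2) * proportion * (1 - proportion) / (margin_of_error ** 2)
    if population is not None:
        # A small total population shrinks the required sample considerably.
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# 95% confidence (z = 1.96), +/-5% margin of error, conservative p = 0.5:
print(cochran_sample_size(1.96, 0.05))                  # 385 for a large population
print(cochran_sample_size(1.96, 0.05, population=200))  # 132 for a client base of 200
```

This is why the answer is usually quoted as a hard number rather than a percentage, and why it drops sharply when the total client base is small.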


Well, I know we're going to leave everybody hanging because we're not going to say the answer yet.


We got to get through the rest of this stuff.


But let's kick things off as we always do and talk about why this topic.


Why do we think it's important?


Either one of you go for it.


Your brand is how you're perceived in the marketplace.


So it's one way to see whether the messages we're putting out there match the reality of what clients are experiencing when they interact with our firm and our staff.


I think the other thing about it is that the client's perception is their reality.


So regardless of what you think people are perceiving you to be, what you're putting out there, what the client is experiencing, that's what they believe.


So that's the other thing: you're not going to find that out unless you ask.


And so then you can find areas where you're as strong as you thought you would be, or strengths you didn't realize you had, or you might find opportunities for improvement.


So it's all good data that you can then use to make decisions and guide strategy going forward.


Yeah.


And I think, to your point about sampling sizes and going back to statistics, a lot of this work isn't so much quantitative analysis as it is qualitative analysis of your clients' experience with your brand or experience working with you.


And right now, we are seeing competition heat up on projects and we're taking stock of where we are.


That's pretty typical when you start to see a slowdown in work that some of the marketing activities turn inward, which is a good thing.


It's really important to kind of dust off your understanding of how you're perceived in the industry.


And one of your biggest key differentiators from your counterparts is what your brand stands for.


And really getting a sense of whether your insides match your outsides is very important.


And sometimes we talk a big game in our proposals and on our websites and social media, and the client's experience on the actual project work might differ from that.


So it isn't just about defining your key differentiators and making sure that you're amplifying them appropriately.


There is an opportunity here to course correct if something isn't quite measuring up to what you want it to measure up to.


Yeah, those are all good thoughts on why you would go through this effort.


We'll get into how, if you're going to poll your clients, you want to go into it with a purpose in mind, and that is probably to get a sense of what they're thinking so that you can make micro adjustments, or just have a better understanding of whether the things you're putting out into the world are being perceived the way you intend.


And then do you want to make some small changes? It's a way to tap into really the most important thing out there, which is your target audience.


What do they think?


For the marketer, I'm sure we're obsessed with this, and only being able to do it every so often does not feel like enough, right?


We could always use more of the hard data and then also the quotes and the anecdotes that come out of those conversations.


Curious if either of you have an opinion or a preference on this, but we started this off by saying it's client and brand perception surveys.


Do you think those are one and the same?


Use those terms interchangeably?


Is it one big topic?


How do you use that?


What do you call it?


Because I think we're going to be all over the map as we have this conversation.


Well, my preference would be just to call it a perception survey, because you might want to also include your teaming partners and others in the industry you're trying to stand out from.


And usually, I'd say more often than not, firms are doing that by really leaning hard on their mission, vision and values, and putting that at the forefront, tying it very closely with the services that they provide, the types of clients they work with.


So it's almost a holistic perception of what your firm stands for that includes your brand.


It also includes the people and it includes the quality of the work that you do.


What kind of a partner are you?


Do you collaborate well?


If you're saying that you're one thing, do others actually read that and feel that when they're interacting with you?


So I think it's just overall perception and brand is a part of it.


But I don't think that it's necessarily the most critical aspect.


It's a key aspect, but it's the overall experience.


Yeah, that makes a lot of sense.


I think that I probably change my language a little bit, depending on the situation.


If a client is asking us for something or we see a need, it might be adjusted a little bit.


So if it's a brand perception survey, in my experience, it's usually tied to branding work that we're doing, because the questions and the way we got to that survey is really related to the brand.


That's a huge topic, by the way.


A brand isn't just a logo or anything like that, but there can be lots of things.


So they can have components of their name or their services or their competitors.


There's a lot that can be wrapped up in that.


And then maybe a slight nuance.


Again, this is just my experience.


It's not from a dictionary or anything, but then the client survey, client perception survey, might focus slightly more on services provided, experience with the employees, with the company.


Now, that is all part of your brand, so you can't take that out.


But I sometimes feel like we might propose something to our clients and focus it one way or the other.


But I'm just curious, and maybe for our listeners too, as we go back and forth and use those terms, I think it's safe to say pretty interchangeable.


And also another level here is if we're pushing out the survey, we might call it a client survey or a perception survey to the client, because the people who are receiving that invite to do that will understand kind of what that is a little bit better than the brand survey.


Melissa, any thoughts on that?


Yeah, that's kind of where my head was, is one, what's the goal?


What are you trying to achieve?


And then since it is going to an audience that you want to respond, then what is going to most likely resonate with them so that they'll respond?


And probably being a little more tidy and calling it a client survey, even if it's got questions about really specific branding aspects, might get a little more response.


I think so.


I think that they're excited.


They're like, okay, well, I'm valued.


I'm on your list.


I'm now at the top of the list, I would assume.


So, you know, they want to provide their input.


Well, let's get into the whole list of things that we brainstormed.


It's like, what would we recommend?


Where do you start?


Why would you be compelled to go through this effort?


Well, I guess we maybe hit on the why, why we would be compelled.


But like, then once you decide to go for it, like what are all of the elements you need to start considering?


Well, I think this goes to what Melissa just said about determining what your goal is.


It's in any endeavor like this, coming up with a problem statement, I think is the most important thing.


Because sometimes we start solving before we actually understand what we're solving for.


I see this both in the public aspects of the work I do on City Council.


I see this in the private sector as well: sometimes you put all this effort into something without quite knowing to what end you're doing it.


So really defining that internally and getting complete alignment amongst your leadership is going to be important, because this is an investment of time and resources, and dialing it in to serve a clear purpose is an important aspect of this.


So is it something that's going to inform your strategic planning efforts?


Is it something that's going to lay the groundwork for perhaps getting that buy-in to do a complete brand overhaul?


Is it something where you're starting to see that you're losing more work to certain competition, or the amount of interactions you're having with key clients seems to be dropping off?


Are you trying to solve for something in the relationship aspect of things?


But dialing it into that will not only help you figure out who you want to survey, but also the questions that you want to ask them.


Is it always because there's some underlying problem or would there be a case for just doing it much more frequently, like after a project completion or annually, so you just pile up the data year after year?


Well, and I think the term problem is really just what are you trying to solve for?


So it's an equation of some sort.


Is it that the feedback loop isn't robust enough?


You're trying to get feedback and input to inform decisions that you're making internally, or just to use this as confirmation that the ways in which you're messaging your firm are effective and accurate.


What other reasons or situations might you consider conducting a survey?


There's a few ways to go about it.


You can do one-on-one client feedback surveys as part of your kind of client satisfaction program where partway through the project, you're checking in and then checking in at the end.


That may give you some clues about things being experienced by your clients, and then that might be fodder for creating a more global client survey or teaming partner survey that goes out.


Yeah, and a firm that I used to work at, we automated it and had a very easy to fill out survey with a set number of questions on it that released at the moment that we closed a project.


It was officially done, the final invoice was out, and while it was still fresh in the client's mind, they would get this survey deployed to them with a nice tailored email in advance of it, so that we had a higher rate of people actually doing it.


And then there was always a check-in with the project manager internally to ask, like, do we need to send the survey out if we just sent one to them on the last project we worked on?


Or is there a reason not to send the survey out?


Sometimes someone internally would say, we're not going to work with them again.


You don't need to send it out.


Or, you know, I don't know what they're going to say.


I don't feel like this project went well.


And they would still want it sent out, though, to get the feedback, which I thought was a good thing: this wasn't a fear-based thing for internal staff, where whatever the client said was going to come back to haunt them.


But it was primarily on project performance.


Did the project meet their needs and expectations?


That kind of thing.


Was it on schedule and budget?


But there were also aspects of it, like, were the reporting materials grammatically clean and free of typos?


Did the format of the deliverables help you message things to your stakeholders?


There were things that were branded elements of it that were captured in there, too.


And then, of course, there was the, you know, "do you have anything else to say?", which is a great place where we captured glowing testimonials that we would then use to capture more work.


But doing that and having it be on this regular deployment that was tied directly to project completion gave us a lot of good information coming in the door.


But it also, to that point about quantitative versus qualitative, it did provide a certain amount of quantitative information that's really hard to get if you're doing these surveys one-off.


So you can kind of see consistency across the board, like, oh, what is our rate of returning emails in a timely manner?


Or is there something internally?


Do we need to do time management training for our staff?


Or are there little things that you can glean?


Yeah, you can see trends over time.


And that might relate to changes in the organization or staffing and different things.


I feel like there's a couple of directions that we can go.


We definitely wanna get into the content and how you're asking the questions and what to ask, for sure.


But before we move off of that, because it sort of popped into my mind as you were describing that, Allison, you can do automated surveys.


Like you were just saying, at the end of a project, and it can be pushed out and it's linked to their invoice or who knows how that's happening.


That is one way for sure.


You can also do interviews and it can be at the end of a project.


It can be on the phone.


It can be with the project manager or it could be with a third party.


So many options there.


Do either of you have any thoughts on how you would choose one versus the other, or make a recommendation if a client wants ongoing feedback, the pros and cons of all those options?


Well, one of the biggest upsides of doing face-to-face or screen-to-screen, one-on-one interviews is that, if everything came out relatively positive, it gives you an opportunity at the end to ask what other work they have coming up.


So that's kind of the pro of doing more individual one-on-one on a specific project or with a specific client.


So more actionable to get future work, less ability to kind of aggregate results across clients.


Although if you ask consistent questions and give them a consistent way to answer that, you can start to do that, but it's a little more qualitative that way.


So what you described might be a business development manager's role, someone maybe not the project manager or...


I don't know.


I'm just thinking through how a client is giving feedback and their comfort level sharing.


Now, some clients are very happy to be asked.


They see that it's for the betterment of the business, but sometimes things are awkward and they don't know how to share that feedback, and so there needs to be a separation.


So maybe as you're considering how those in-person surveys are being done, you have to take that into account.


Is it a best practice that you have someone in your firm do all of those who is not on the project, who can be almost like a, you know, unbiased surveyor, even though they're internal?


Or, you know, do you have project managers as part of their closeout?


So anyways, that just kind of came to mind.


I think you would want to think about how many of these surveys are being done, and what the ability is to actually do them consistently with your team, providing guidelines and even a framework, right?


A form or something so that it can be done very consistently.


Otherwise, it should be one person's job to gather all of that.


Yeah, and I think it's certainly not a one-size-fits-all.


Sometimes it's project-specific, sometimes it's client-specific, sometimes it's service area-specific, sometimes it's tied to market forces, and the fact that maybe the work is getting more competitive and just wanting a better sense of that comparative analysis of the experience with others versus experience with your firm.


But as far as going internal or third-party, obviously third-party is going to cost money in a way that you're not going to necessarily feel as directly if you do it internally.


It's still costing you to do an internal survey.


But this one is a little more in-your-face for the folks that are writing the checks.


And we've performed those.


I think that those are usually tied to really needing to suss something out with the client and take time to truly talk to them in a very safe, cone of silence kind of way, because we've even said to the clients, like, would you like this feedback to be anonymous?


And we've had some take us up on that, say, don't tell them that I said this.


But, you know, you can't get to that if you work internally.


It's very hard to get to that, because, you know, there is a natural tension between the client side and the consulting side.


It's competitive, procured work.


So they're covering a little.


They don't want to open themselves up to protests or anything like that.


But if you're able to, say, do a video conference where you can see them, you can read their body language, you can read between the lines on what they're saying, sometimes you can cull information from that that you wouldn't be able to get otherwise.


Those are all really good thoughts on why you would select each one.


And I think when we kind of summarize this, too, after we get through some of the content, the types of questions you would be asking, too, that might help clarify that in the minds of the marketers who are thinking about how to implement that.


I will say, on the quote-unquote bigger effort, because it's all done at once and maybe at one specific time, before strategic planning or before a major report, and you're using a third party, that effort takes a lot of consideration about who's going to be on that client list.


If you're not sending a link with a survey after every project, then you are being selective about who you're talking to and how many you want to survey.


So let's talk about creating that client list.


Let's say it's not the version where it's just at the end of every single project, but you're looking back one year, two years, three years, how do you cull down that list to something that feels like the right size?


Do we know?


I probably should have answered the trivia question to tell you how many people need to be in your survey sample before we get to decide what that is.


Well, I think that there is some secret sauce here.


You kind of have to read the tea leaves a little bit on this.


Again, we talk about time is finite, resources are finite.


You can't interview everybody, so you have to winnow the list down.


I'd say always start with your strategic plan.


Always start with where the money is going to be, kind of like Wayne Gretzky: skate to where the puck is going to be.


Because there is a subtle psychological aspect to surveying where you're actually marketing to your client as you're doing the survey.


Oh, definitely.


You know, it shows that you're being proactive, that you're being thoughtful, that you value their opinion, that what they say might actually be absorbed and used to the betterment of the firm.


There's a whole bunch of stuff around that that shows intention, and intention keeps you front of mind.


It just casts you in a good light.


So there is that component.


So, you know, maybe it's a mix.


Maybe you go with the folks that you know are huge fans of your firm and get that confirmation from them of the stuff you're doing great, and maybe some constructive feedback as well.


And then maybe go to the ones that are a little tougher nuts to crack, or that you know are really hard to please, and get that kind of information.


So I think just do an assessment of: are these clients that we actually like working with?


Do we know that they have future work?


Are we positioning ourselves subtly for something?


Is it that we just want them to give it to us straight, no matter how hard it is to hear?


Are we trying to just strengthen already strong relationships?


I think there's a lot of different nuances that could go into how you winnow that list down to the final few that you're going to be reaching out to.


Yeah, generally speaking, I like to not have too many huge fans of the firm on the list.


I want some of them because I want those great sound bites and things to report back on there.


But if I had to guess, I'd be aiming for like 20% of the list being big fans.


And maybe they shouldn't be your biggest fans either, because, really, they're probably telling you all of the good stuff all of the time.


As I think back to working with clients on figuring out that list, usually we get the whole leadership team together in a room.


We might pull up the BD list and look at projects and organizations.


We want to be talking to this school district or this university, making sure you have all of the agencies that you want to be talking to on that list.


But then when we're actually looking at the individuals, and you were kind of getting to this point, Allison, there's not a perfect formula for it, but it's like, who will talk to us, right?


It doesn't help to have a list of people who will not answer your call because they don't know who you are, or, you know, they just can't, you know, be bothered or whatever.


So that group.


But then, for the people on that list, the project team or the technical team has to be able to advise that these folks will have something to say, and not just, "I don't know," you know?


And we have, we've surveyed some people in the past where they're like, I can't answer that.


I don't know.


I don't have enough experience.


And that feels like that's sort of a wasted person to have talked to.


It's fine, you know, maybe in some cases that paints a bigger picture, but I think you actually have to start big and bring it down to those individual folks who will be commenting.


And not too many cheerleaders on that list.


Mm-hmm.


This gets back to that confirmation bias that we have covered so many times on the podcast in the past.


Mm-hmm.


Okay, so take your master list, break it down into some form that feels like this makes sense, you know, this is who we're going to survey, and then make notes of why you picked those so that if you end up doing this twice a year or, you know, every other year, whatever that might be, you might consider, like, how did we get to this group anyway?


For a lot of the surveys we do, we're just one group, but I would say we might be reaching out to somewhere between 30 and 50 people.


It could be, it could actually be fewer.


We have done smaller surveys before, but you might start off with that size of a list, with the aim of actually having about 20 or 30 conversations.


So, again, that's just sort of our past experience, and if you were automating this and it was, I don't know, a SurveyMonkey link, you could talk to way more.


But having 15 to 30 minute conversations in a timely manner, that's probably about the limit of what we would recommend.


Client communication: how do you ask them to spend time with you, or with a link, or, you know, with a consultant?


Any best practices for teeing that up and following up and all of that good stuff?


Definitely wanna give your clients a heads up if a consultant is gonna be calling.


So typically we contact them one week and let them know that the following week, so-and-so will be reaching out to schedule a time, giving them an expectation of the time commitment: this is gonna be 15 to 20 minutes, which feels very doable for people.


If you make it longer than that, then you're gonna start running into difficulties getting people's attention and schedule.


Yeah, just that thoughtful heads up.


And if it is gonna be a link to a survey, you do have the option in different survey platforms to do an e-blast to an entire list, but the click-through rate on things like that is not gonna be as robust as sending a personal message, which does take time.


I used to send out like 75 of these a quarter, so it gets old, you know, having to tailor each one.


So I came up with boilerplate language that's primarily the same, like: you're gonna be receiving the survey.


It's a series of questions.


It should only take you five minutes to fill out or whatever the time is for it.


We'd love to know how your experience went on, and then blank project with blank staffer.


So you could boilerplate most of it, but then put in some key identifiers into it that's gonna make it feel really tailored to them.


And that way, it still is customized for the person.


It still is coming from a human, you know, well, from an email of a human.


But it's that little extra thing that is probably gonna get their attention and give them a little bit more incentive to click on the link and actually fill it out.
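A minimal sketch of that boilerplate-plus-identifiers approach, using Python's standard string.Template for the merge fields. The field names and wording here are hypothetical, not the actual email described above:

```python
from string import Template

# Hypothetical boilerplate; the $-fields are the "key identifiers"
# swapped out per recipient so each message still feels tailored.
BOILERPLATE = Template(
    "Hi $first_name,\n\n"
    "You'll be receiving a short survey from us shortly. It's a series of\n"
    "questions and should only take about five minutes to fill out.\n"
    "We'd love to know how your experience on $project with $staffer went.\n\n"
    "Thanks,\n"
    "$sender"
)

email = BOILERPLATE.substitute(
    first_name="Jordan",
    project="the Riverfront Library project",
    staffer="our design team",
    sender="Middle of Six",
)
print(email)
```

The body stays constant while only the identifiers change, which is what keeps 75 sends a quarter manageable without losing the personal touch.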


I think that's proven again and again that personal requests, whether that's coming through email or a phone call or if you, I don't know, happen to be seeing people in person, that works really well for our clients, it works really well for us.


So absolutely giving them a heads up and then making it more of a personal request.


And I have just seen it so many times where people are really jumping at the opportunity to share their thoughts and you're not gonna get everyone because people are busy and things are going on.


But I think even if they don't participate, it's a pretty positive experience to be asked to participate.


So how about we talk a bit about the survey design, the questions themselves?


We have the goal in mind, we know which way we're gonna go.


So we can't get into the nitty gritty of what you're actually going to ask, but how do you figure out what you want to address and what is the appropriate length?


How many multiple choice?


How many fill in the blank?


You know, all of that stuff.


I know all of us have drafted surveys multiple times.


Where do you start?


I think a lot goes into figuring this part out.


It is length.


It's making sure that it's not too bloated with too many questions, no matter how good they are, because your response rate will increase if people can actually fill it out fairly quickly.


And you gotta find a good balance between open-ended questions versus drop-downs that have preloaded answers in them, because one of them is gonna give you really concrete feedback, and the other is going to be anecdotal.


Anecdotal can be very, very helpful, but it's gonna be hard to compare apples to apples when you're trying to create consistency across clients.


So having a good balance between those two things, I think is really important.


And another one, and this is a little bit of a geeky opinion of mine: if you're gonna have an always, never, somewhat, or those types of responses, try and limit it to three things and not give the nuanced in-between options.


So with poor, somewhat poor, all the somewhats, there has to just be one option in the middle, because you're starting to make people respond by degrees at that point, and you're gonna get too much variance, I think, in the answers.


So instead of giving them a scale of one to five, with the two and the four in there, consider a one to three; it just simplifies things.


And it's easier when you're doing the data analysis later to make some concrete decisions on what you're seeing, in my opinion.
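As a rough illustration of why the compact scale pays off at analysis time, here's one way you might tally one-to-three responses per question. The questions and scores below are invented for the example, not data from the episode:

```python
from collections import Counter

# Hypothetical 1-3 responses (1 = poor, 2 = okay, 3 = good) per question.
responses = {
    "Responded to emails in a timely manner": [3, 3, 2, 3, 1, 3],
    "Deliverables were free of typos": [2, 3, 3, 3, 3, 3],
}

for question, scores in responses.items():
    counts = Counter(scores)
    average = sum(scores) / len(scores)
    print(f"{question}: avg {average:.2f}, counts {dict(sorted(counts.items()))}")
```

With only three buckets, a low average or a pile-up of ones points directly at a concrete decision, rather than having to interpret the difference between a two and a three on a five-point scale.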


Yeah, and I guess if you only use three options, maybe the ends of those can't be really, really strong, like always or never.


I think you'd have to soften that, like mostly or less often, right?


Because I think some people take a survey and they can't check the box that says never or always.


That just is impossible to be true, and some people read it quite literally in that way.


But I could also imagine having five answers gives a range.


I'm not going to give five stars, I'm gonna give four.


Five might be impossible in someone's mind, but they're happy to have the four and it's not middle of the road.


So I think that's a really good point, Allison.


I was just suggesting if you only use three, do not have a never or always as part of that.


Well, and what we're talking about here, it truly is opening up a can of worms because statisticians have very strong opinions on this.


You could Google it and find full-on articles written on this particular subject, because it is all about trying to capture people's perception and what they feel comfortable attaching to their own experience.


And so it's really, do you want to give them as many options as possible, so they find the sweet spot of exactly how they're feeling?


Or do you want to make the data analysis easier on yourself when you turn it around?


It's a really nuanced, almost philosophical conversation on what's going to get you the most bang for your buck in terms of feedback that you can then do something with.


But I think that's for a podcast about economics.


We don't need to bring in a different expert for that conversation.


Exactly, yeah.


The other part of that is not giving a neutral option, kind of forcing people into a statement that is not neutral.


Maybe there's an NA, if that truly doesn't apply to them, they can skip that question.


But I've never done only a one-two-three scale.


But I am a proponent of not giving a neutral, push them into a choice.


If you're using a platform like SurveyMonkey or others, they provide a lot of guidance, and they will even flag things: this question has bias in it; this question is confusing; consider rewriting it this way.


Here are the best types of multiple choice answers for that type of question.


So luckily, I mean, if you're creating one for the first time, you're going to have a lot of help thinking it through, but that doesn't eliminate the need for QC and having other people look at it and read the questions so that they can confirm: was that clear or not?


Or, this is NA, this is not applicable to me, but that's not a radio button for me to choose.


So those types of things can be really helpful if you're creating that digital survey.


When you're doing it in person or over the phone and you're asking questions, those multiple-choice options probably change a bit, right?


Because people can't remember all 10 options you've given them.


They probably need fewer, at least I think, if I'm thinking back to my experiences with those conversations; they can't see it in front of them, they're just having a conversation.


So you almost have to make it simple.


On a scale of one to five, you know, for all of these questions or whatever it might be, keep it easy for people to actually answer and not get lost in the question itself.


I had someone who had used letter grades, A through F.


Oh, sure.


Using the letter grading system gave a good comparison.


Although, at least at my kids' middle school, they don't use those anymore.


So maybe that's gonna be obsolete.


There's no more Fs in this world.


That's a good idea because most people can relate to that pretty easily and they don't have to remember.


Is 10 the highest or is one the best?


I can't remember what you said at the start of that.


That's really good.


Well, I was just gonna mention that also in the asking of questions, try not to have two questions that need to be answered in one, especially when you're doing the interview style.


If you have the, "and what else do you want to tell me about that," that's fine, but it should come after you've asked that initial question, because otherwise it can mess up your comparison, where people will blend their answers for two questions that are closely related.


So anyways, simplify, simplify as much as possible.


That'll make it better.


Anything else that kind of comes to our mind about designing the survey and creating the questions, vetting that?


If you're doing an electronic survey, I always like to have a comment box at the end, "anything else you'd like to add," because you get some great nuggets, or maybe some real actionable feedback. As opposed to placing an "other" dialogue box on every single question, give a space at the end.


And you might even tee that up at the beginning of the survey that there'll be a spot at the end for open-ended comments.


Hmm, yeah, might compel someone to finish the survey.


Like, well, I really want to share this tidbit here, so you got to get to the end of it.


You'd mentioned SurveyMonkey earlier, Wendy; that's the platform I prefer.


I think it's got the most intuitive tools, and they've come a long way at adding more bells and whistles if you really want to get statistical with your surveying.


But they do have a free option, and for smaller firms, where paying for a license might be a bit of a lift, that could be a hard sell.


Mm-hmm.


For the powers that be at your firm.


So there is a free option.


The thing is that not every single type of response or way of collecting feedback is gonna be available to you.


They put some of it behind the paywall.


So while you might have multiple choice, you might have very simple yes or nos, some of those more nuanced tools are not gonna be available to you.


So it's just something to keep in mind when you're going in there.


And going into SurveyMonkey and building out a very quick survey isn't hard to do.


So you can just kind of test it out to see as far as length goes, the types of questions, and then ask yourself, is it worth it for us to get a subscription to one of these services to give me the access to certain tools?


And I'll give an example of this.


I built a survey once that we just went with the free option, 10 questions or less.


It served us very well, but there was a certain type of feedback that we really wanted to get, that we weren't able to, with the functionality of the free version.


And so we just bit the bullet and decided, okay, we're gonna pay for it.


And then we just found other ways to send surveys out into the world.


We're like, well, now we have the ability to do it.


Let's get as much bang for our buck as we can.


That's smart.


And I also wanna say that there are ways to use some of the extra tools in SurveyMonkey without having to have an annual subscription and pay several thousand dollars.


Sorry, SurveyMonkey. But if you poke around enough, you can clean it up and get your logo on there.


There are some ways around and you can keep it simple.


You may not have access to every single detail, but there's a lot you can do with that.


So don't give up if you feel like that's the right platform for you and you can't commit to that annual fee or whatever.


And also sometimes bite the bullet, pay $100 for the one month that you need it.


You don't have to continue it the rest of the time.


Wherever your firm is, start where you are and then go from there.


And you might find that actually this has been invaluable.


We want to do more and more.


I quickly hit on, and I heard you say too, Allison, that you need to take the survey yourself and test and proofread and all of that.


But let's just talk about the importance of sending it out to a group, a focus group who can see it with fresh eyes and try to answer the questions, right?


I mean, that's pretty important.


What about if it were an interview survey?


Same thing?


How do you kind of vet that these questions are not confusing or that they're on the right track for your audience?


You can use the same process, testing it out with some other people in your network, in your firm, does this seem clear?


For me, my style when I did one-on-one surveys in-house was a bit more free-flowing, where some clients were just real talkative; I'd ask one question, and then they'd just download for 15 minutes while I took copious notes.


So that's less scientific and harder to compare results, but I wasn't going to stop someone from giving me all this good information just so I could get to all my questions.


But usually what I found happened was, if I got someone that was real verbose in their answers, they kind of ended up answering those questions anyway.


And then I would kind of review my notes.


Oh, well, can you touch on this aspect or that aspect?


And then I think when you're speaking to a person, they'll just straight up tell you, I don't know what you mean by that.


So that's been another thing that's happened.


But having another group look at it is helpful.


Yeah, and having a set of pretty common questions just ready to go, that's one thing.


But then there's also the, say your firm has been trying to break into a new type of service, or you're going after a body of work and you need to start building that reputation that this is something you do.


Sometimes you can ask a question like, when you think of name of firm, what comes to mind?


A question like that, and then it's totally open-ended and they'll tell you what they think.


It's gonna let you know about your marketing efforts to start branding the firm.


Are they effective?


Are they not?


And again, to the psychological aspects of this, you're reinforcing it for that particular client you're talking to, for them to start thinking about the firm.


That way, if they weren't thinking about the firm before, they are now. And getting back to that client perception versus brand perception question, there is an intersection between the two of them.


So asking questions like that, especially in the open-ended format of an interview-style survey, is where you're probably going to get the most nuanced answer.


All right, you've deployed your survey in whatever form.


You've gotten results back in a beautiful report from SurveyMonkey or from your consultant, or you've compiled a year of individual interviews with clients, however that is.


The important thing is what you do with that information.


Thoughts on where you start?


There's gonna be a lot to read through, and some of it's probably reading tea leaves, but how do you approach looking at all that information?


Yeah, I mean, definitely looking for themes.


Is there something related to the staff?


Something that's going really great, like, "they involve us in the process to decide what the best options are."


So then maybe that's something you start digging into with your staff, like, okay, I need to talk to you.


Give me some case studies.


Where did you do this?


So you can start playing that up in your marketing materials and your SOQs.


If it's something negative, like "my experience working with one project manager to the next is very inconsistent."


So then it's like, okay, maybe we need to invest in some internal project management training.


But I find reading the results is the most fun part.


I mean, for one, it's just so fascinating to me to hear what people are thinking, and then taking that all in: where are the nuggets?


Where are the themes?


What are the things we can do to improve?


I think one of the worst things you could do is get negative feedback and do nothing about it.


So if there are some real strong negative things, that's a pretty important issue to escalate with the leadership team and really dig into and figure out how you're going to address that.


Yeah, and speaking of the leadership team, it's like, what format did they want to see the results in?


Some of them might want to see the raw data, but a lot of them want it short and snappy: tell me what it said.


So it's drafting up an executive summary that gives the percentage values for the quantitative information you found, some key takeaways from the qualitative data, and then ends with a series of recommendations on how to make good on this.


That's also something to make sure if you are hiring a third party to do the surveying for you, is to get really clear in the type of deliverable that you need to make informed decisions based on all of the feedback that they are going to accrue during the surveying.


And that could take the shape of a PowerPoint where they walk you through all of the findings, but you really do want your third party to give you recommendations.


It's not just that they're gathering this information.


They're going to be in a really good position to kind of see it holistically across all of it because they were on the front lines of getting the information.


So asking them, "what do you recommend?" is valuable.


You might have intelligence on the inside that will winnow their list of recommendations down or focus only on one particular aspect of the recommendations, but getting that, I think, is a key part of why you do go to a third party to help you with this type of surveying.


But getting really clear on that, I think it's going to give you the most value out of putting this in someone else's hands.


If you have had in-person meetings, clearly those survey responses are not anonymous.


I mean, you could disconnect the person or the project from the comments and serve those up to the leadership team in some way, and that would be acceptable.


Most of the surveys we're doing, all the answers are anonymous.


I mean, on my end it's a numbered list.


I know which client we were talking to, and I've definitely had clients be very curious, especially when there's feedback that is so specific that they really want to address it, and it can be painful for them to not have a name.


But we end up providing all of the data, just with no identifying elements; we take those out so they can see it verbatim, without a person's name or a specific project, and they can kind of infer more from that.


But really, that presentation deck that has the key highlights, the executive summary, and then the recommendations at the end, that's like huge value in that piece of it.


And I think it also, keeping things anonymous and a little barrier between the people we were talking to and the internal staff, I think makes it a little easier to hear the feedback.


Because it can be a stressful moment when you get into that meeting to talk through the feedback.


A lot of times it's 90% positive or 85% positive, I mean, really high marks.


But when it's not the positive stuff, that can be very personal to business owners and to leaders and that sort of thing.


So if you're trying to figure out the value of having an anonymous survey, I mean, it's huge.


You're going to get pretty transparent feedback.


And then for us that get to deliver that, it just makes the conversation a little bit easier because we're not making excuses or connecting it to something.


"Well, in that circumstance, there was this or that or the other thing."


So something to consider when you're thinking about the results and sharing the results.


It could be anonymous, but you could still have things like, what type of client are you?


A municipal client, an architect.


Or a market sector. Like, there's still some things you can do.


There's some things you can do so that the results can be categorized without giving out specifically who said what.


Yeah, it is true.


People fixate on the negative feedback.


It's like on social media: you can have a hundred people give a glowing comment, and then one cranky, grumpy, trolly person says something, and that's the thing that you think of and it affects you.


So I think to your point, Wendy, about delivering the information back to the client, it's figuring out how to weave in the critical feedback, but in a way where you're still steering them to where the best return is going to come from through the recommendations.


Because sometimes there's just someone who had a bad experience and it's a one-off or it doesn't speak to a larger structural issue within the company.


Is it worth mentioning?


Sure.


Even if it's just to help with one particular person and their performance at the firm. There was a technique that you introduced to me: sometimes using the plus/delta way of delivering good versus not-so-good information is easier for them to digest than harsher wording like, they like this, they don't like this.


You're doing this well, you're not doing this well.


So maybe incorporating that is something to consider too, just to help your leadership focus on what they really need to help make whatever improvements or corrections or moves that they want to make based on the results of the survey.


Yeah, I like to look at the answers, especially the open-ended, narrative-type responses, and group them: here are a lot of the positive comments that we heard.


Here are some of the more critical ones, the areas of opportunity.


You're always putting a positive spin on it, but those are going to be the sharper words.


So we're looking at the things on either end of the spectrum, but then looking at the middle too, which is probably the most accurate reflection of what most people are feeling.


Or some of those things are like, oh, you know, I don't really know.


I've never really thought about that, that sort of thing; there's a lot of good in understanding that too.


So, yeah, we all need to hear the whole spectrum.


We want to look at all of the information, but grouping it different ways can help frame it up and make some things easier to digest and also not get us stuck on only the sunshiny part, right?


Depending on your personality, maybe that's what you gravitate to, but then you might be missing some really good stories in there.


Melissa, you mentioned earlier that if you're gonna ask for feedback and if you're gonna hear something and maybe it's negative, but you don't do anything about it, that's like a waste of a survey, kind of the worst thing you could do there.


How do you make change, or, I mean, show an anonymous client out there in the world that you are making changes?


Do you have any anecdotes from your experience or just how would you bring that to your leadership team?


And then hopefully see that something changed over time?


Well, my comment was a little bit more specific to a one-on-one interview experience, where there was a very specific issue, a very specific person.


And so then, even though I set up those surveys as, you know, I want your candid feedback and so on, that one was more like, okay, I would like to follow up with the principal in charge on this project so they can call you to discuss this.


So would that be okay?


And of course, who doesn't want to hear from the leader of the project if you've identified a specific issue?


So yeah, with the anonymous surveys, it's a little bit harder.


And is it an anomaly?


Like, is it the, you know, redoing-the-bell-curve thing, where we're throwing out the very best rating and the very worst rating?


And are we seeing a trend?
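As a rough sketch of that throw-out-the-extremes idea, which is essentially a trimmed mean, assuming ratings on a one-to-five scale:

```python
from statistics import mean

def trimmed_mean(ratings):
    """Average the ratings after dropping the single highest and
    single lowest response (the 'redo the bell curve' trim)."""
    if len(ratings) < 3:
        return mean(ratings)  # too few responses to trim anything
    trimmed = sorted(ratings)[1:-1]  # drop one from each end
    return mean(trimmed)

# One glowing 5 and one cranky 1 no longer dominate the average:
print(trimmed_mean([5, 4, 4, 3, 1]))  # ≈ 3.67 instead of 3.4
```

This only makes sense once you have enough responses that a single outlier could skew the picture; with a handful of interviews, you'd read every answer individually instead.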


But I think, you know, if it's a comment with a person in an anonymous survey, then I would definitely want to take that to leadership and let's problem solve, let's think about this.


And what do we think is the best course of action?


Yeah, so to keep the promise that this was an anonymous survey, there may not be a way to actually connect; there's not really a perfect way to reach back out to that client.


But you can, you know, the leadership team could also send out a general thank you note.


You know, if they asked people to participate, they could send a little bit of a summary, if summary is even the right word, just to say that we appreciate your time, we heard some really helpful feedback, and we are excited to implement it over the next couple of years.


And of course, our door and the email is always open.


So, you know, that can be another way to just finish that up too and let people know that their time was appreciated.


How about we get back to my little trivia question from the top of the podcast: how many people do you need to survey to get a statistically significant sample size?


And Allison, I forgot your number, and Melissa, I think you started small and you wanted it to be way bigger by the end of that.


Allison, did you say 50?


And then get smaller?


I did, I'm probably really wrong.


You know what, I think both of you had pretty good gut instincts when you started small.


Those are basically the numbers that I saw most frequently.


There's a lot of different survey apps and tools and companies who will do this.


So they're writing this information in blogs and just making recommendations.


So basically 100 was the ideal survey size.


And some even said not more than a thousand, although I couldn't figure out why you would recommend against these larger numbers.


SurveyMonkey has a sample size calculator, so you can go on there and you can put in your population size and then how many responses you would need to get an accurate survey.
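For anyone curious what's under the hood of calculators like that, the standard approach is Cochran's formula with a finite population correction; a sketch, assuming a 95% confidence level (z = 1.96) and a 5% margin of error, which are the usual calculator defaults:

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Rough sample-size estimate: Cochran's formula with a finite
    population correction. Defaults assume a 95% confidence level
    and a 5% margin of error; p=0.5 is the most conservative guess
    for how responses will split."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # correct for a finite list
    return math.ceil(n)

# For a client list of 1,000 contacts:
print(sample_size(1000))  # 278 responses needed
```

A list of 100 contacts would need about 80 responses by the same math, which is why, for most AEC client lists, "statistically significant" is more of an aspiration than a requirement.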


So, I mean, I think if you're a huge company, you could have thousands of clients who you were surveying.


So I'm not sure where that one odd claim that you shouldn't have a thousand came from, but basically a hundred responses would be an ideal starting point, and many of these companies that do surveys were saying 20, 30, or 50 were also really good.


So it's not a perfect answer because it was a little bit all over the map, but it was a lot smaller than I thought it would be.


I thought for sure we were going to need thousands of surveys to be able to get a good response.


To nerd out, a statistically significant survey has a very specific meaning in terms of market research and statistics, and for our AEC friends, I don't think anyone requires that or needs that.


Yeah, I think you're spot on with that.


But you could get valuable feedback from one person responding, right?


One conversation is valuable, but then to start to see a trend, obviously you want to have more and more and add to that over time.


And then seeing, like we talked about at the beginning, the variances between years and like, what was going on?


What was the market like?


What was our mix of clients?


All of those things over time, I think would be pretty valuable.


So most of the surveys that we've done at Middle of Six, really they're about 20, 30, 40 people max.


You know, we're not talking to 100 people for any client.


And I don't know that that would be necessary.


We get such good valuable feedback in that way as it is.


So I think that'd be fine.


Anyways, there's the answer to the question.


And definitely check out that calculator on SurveyMonkey if you have larger groups and you do want to actually figure out what that would be if you're just doing massive surveys over multiple years.


Well, we could wrap this up by highlighting a few of the tips that we felt are the most helpful for people thinking about surveys.


Anyone want to rapid fire them out?


I'd say if you're having trouble convincing leadership that this is a good and worthwhile thing to invest in, just press in on the need to verify that your clients' experiences are measuring up to what you're promising in your proposals.


I think that's a really interesting exercise to do occasionally, because the marketing staff spend so much time in proposal land on what we're telling the client we're gonna give to them, in terms of deliverables and approaches, but also just the experience.


Like what is the brand promise that we're infusing in our proposals and are they truly getting that?


I think that is definitely something that if you can crack the code on it and get that feedback from your clients, it's probably going to not only reinforce what you're promising, but maybe even give you some new ideas on how you can amplify it further.


For the right leadership team, and a lot of them that we've talked to, they are very interested in what you just said, Allison.


They want to hear from their clients.


They want to understand how they might make different choices related to how the brand is being represented out there.


Is it time to invest in the website?


Are they telling those stories?


Or do they need to focus on a new project management system because things are falling through the cracks?


It's like, we might start with a brand perception survey and it highlights some really great things and some areas that are big gaps.


Getting those insights to, you know, well, the competition is doing it this way and that might be a blind spot.


So I haven't had any issues getting clients on board with it, but if your leadership team is not thinking in that way, realizing that there can be benefits across the whole organization from this would be worth sharing.


Melissa, what would you say is your kind of like number one takeaway from this conversation?


Don't write the questions in a vacuum.


Collaborate with others on your marketing team, your business development team, your leadership team, and then have it proofread, because you don't want to have multiple questions in one, you know.


You want to make sure you've got a good survey design.


Mm-hmm.


Or a massive typo.


You know, again, it's marketing: this survey, even just the experience of taking it, is going to be branded to you, as the experience of interacting with your firm.


So making it easy for them to take, making it streamlined and intuitive will also reflect well on you.


I agree 100%, and then I'd also say just the practice of looking at who you would want to survey and why, and maybe some people aren't on the survey list, but they're on the list for other things, to follow up or just general client care.


That's just a great mindset to be in.


So whether it's the annual survey that makes that top of mind or end-of-project stuff, that's an important piece of this whole thing, connecting with clients, making them feel like they have a voice and appreciating them and their feedback.


I don't think you'll be disappointed by going through that effort.


So if you haven't done it, consider when the right time might be; it doesn't take too long to implement.


Just give yourself a quarter to get that in place, and it could be something that would be a very useful tool in the future.


All right, well, thank you so much for talking through surveys of all shapes and sizes all over the map.


It's a fun topic.


We've seen the benefits of it, for sure, so it's been something that we would definitely encourage other people to do.


Thanks for being my partners and teeing up this topic.


Yeah, thank you.


All right, thanks, Wendy.




The Shortlist is presented by Middle of Six and hosted by me, Wendy Simmons, Principal Marketing Strategist.


Our producer is Kyle Davis, with digital marketing and graphic design by the team at Middle of Six.


We wanna hear from you.


If you have a question or a topic you'd like us to discuss, send an email or voice memo to theshortlist@middleofsix.com.


If you're looking for past episodes or more info, check out our podcast page at middleofsix.com/theshortlist.


You can follow us on LinkedIn and Instagram at middleofsix.


Thanks so much for listening.


We hope you'll tell your friends and colleagues about the show, and be sure to subscribe so you don't miss any of our upcoming episodes.


Until next time, keep on hustling.


See you later.


Bye.


The Shortlist is a podcast that explores all things AEC marketing. Hosted by Middle of Six Principal, Wendy Simmons, each episode features members of the MOS team, where we take a deep dive on a wide range of topics related to AEC marketing including: proposal development, strategy, team building, business development, branding, digital marketing, and more. You can listen to our full archive of episodes here.
