designing-surveys
Help users design effective surveys. Use when someone is creating customer surveys, NPS measurements, product-market fit surveys, or feedback collection mechanisms.
Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
Install command
npx @skill-hub/cli install refoundai-lenny-skills-designing-surveys
Repository
Skill path: skills/designing-surveys
Best for
Primary workflow: Design Product.
Technical facets: Full Stack, Designer.
Target audience: everyone.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: RefoundAI.
This is still a mirrored public skill entry. Review the repository before installing into production workflows.
What it helps with
- Install designing-surveys into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/RefoundAI/lenny-skills before adding designing-surveys to shared team environments
- Use designing-surveys for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.
Original source / Raw SKILL.md
---
name: designing-surveys
description: Help users design effective surveys. Use when someone is creating customer surveys, NPS measurements, product-market fit surveys, or feedback collection mechanisms.
---

# Designing Surveys

Help the user design effective surveys using frameworks from 9 product leaders who have built rigorous research and feedback systems.

## How to Help

When the user asks for help with surveys:

1. **Clarify the goal** - Determine if they're measuring satisfaction, identifying problems, or prioritizing features
2. **Choose the right metric** - Help them select between NPS, CSAT, PMF survey, or custom approaches
3. **Design clean questions** - Ensure each question measures one thing precisely
4. **Target the right respondents** - Help them reach users with fresh, relevant experience

## Core Principles

### NPS is scientifically flawed

Judd Antin: "NPS is the best example of the marketing industry marketing itself. The consensus in the survey science community is that NPS makes all the mistakes. Customer satisfaction, a simple CSAT metric, is better. It has better data properties, it is more precise, it is more correlated to business outcomes."

Use CSAT with 5-7 item scales instead.

### Force prioritization with constraints

Nicole Forsgren: "Let them pick three, just three. Of those three, how often does this affect you? Is this hourly? Is this daily? Is this weekly?"

Limit respondents to their top barriers to keep data clean, then measure frequency to weight impact.

### Survey your best customers at the right time

Gia Laudi: "Very importantly, they signed up for your product recently enough that they remember what life was like before. Generally, we say that's in the three to six-month range."

Target customers who have been using the product 3-6 months so their memory of the 'before' state is fresh.

### Onboarding surveys improve conversion

Laura Schaffer: "We just asked for forgiveness and put these questions into the signup flow. It improved conversion by like 5%, just improved signups."

Adding 'good friction' in the form of targeted questions can increase conversion by reassuring users they're in the right place.

### Avoid double-barreled questions

Nicole Forsgren: "You're asking four different questions there. If someone answers yes, was it the build? Was it the test? Was it slow or was it flaky?"

Ensure each survey question only asks about one specific variable.

### Use MaxDiff for feature prioritization

Madhavan Ramanujam: "Identify the most important for you, and the least important. If you do this a few times, you will be able to prioritize the entire feature set in a relative fashion."

MaxDiff (Most/Least) surveys are superior to simple ranking for identifying value drivers.

## Questions to Help Users

- "What specific decision will this survey inform?"
- "Are you asking about one thing per question, or multiple things?"
- "Who are your 'best' customers and when did they sign up?"
- "Are all scale options visible on mobile without scrolling?"
- "How will you force respondents to prioritize rather than rate everything high?"

## Common Mistakes to Flag

- **Double-barreled questions** - Asking about speed AND complexity in one question
- **Too many options** - Allowing respondents to select unlimited items instead of forcing prioritization
- **Wrong timing** - Surveying customers who are too new (no experience) or too old (forgot the 'before')
- **NPS worship** - Relying on a metric with known scientific flaws over simpler, better alternatives
- **Hidden scale options** - Mobile surveys where users can't see all options create response bias

## Deep Dive

For all 10 insights from 9 guests, see `references/guest-insights.md`

## Related Skills

- Writing North Star Metrics
- Defining Product Vision
- Prioritizing Roadmap
- Setting OKRs & Goals

---

## Referenced Files

> The following files are referenced in this skill and included for context.
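The MaxDiff counting mentioned in the skill can be illustrated in a few lines. This is a hypothetical sketch, not part of the original skill: the feature names are invented, and the simple count-based scoring rule (times picked "most" minus times picked "least", divided by times shown) is one common approximation, not the only way to score MaxDiff data.

```python
from collections import defaultdict

def maxdiff_scores(responses):
    """Compute a simple count-based MaxDiff score per feature.

    Each response records the subset of features shown and which one
    the respondent picked as most / least important.
    Score = (times picked most - times picked least) / times shown.
    """
    most = defaultdict(int)
    least = defaultdict(int)
    shown = defaultdict(int)
    for r in responses:
        for feature in r["shown"]:
            shown[feature] += 1
        most[r["most"]] += 1
        least[r["least"]] += 1
    return {f: (most[f] - least[f]) / shown[f] for f in shown}

# Hypothetical answers over rotating subsets of four invented features
responses = [
    {"shown": ["export", "sso", "api", "themes"], "most": "api", "least": "themes"},
    {"shown": ["export", "sso", "api"], "most": "api", "least": "export"},
    {"shown": ["sso", "api", "themes"], "most": "sso", "least": "themes"},
]
ranked = sorted(maxdiff_scores(responses).items(), key=lambda kv: -kv[1])
```

With enough rotated subsets per respondent, this produces a full relative ordering of the feature set, which is the property the skill attributes to MaxDiff over simple ranking.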
### references/guest-insights.md

```markdown
# Designing Surveys - All Guest Insights

*9 guests, 10 mentions*

---

## Chris Hutchins

*Chris Hutchins*

> "run an ad for your podcast... what was my click-through rate on the ad? Which will tell you if someone doesn't click, it's either not a good description or it's not a good set of content... if they don't subscribe... my content probably sucks."

**Insight:** Paid advertising metrics can serve as a quantitative survey to validate product descriptions and content quality.

**Tactical advice:**

- Use click-through rates (CTR) to test the appeal of your product's description or imagery
- Use conversion rates to validate if the actual product meets the expectations set by the marketing

*Timestamp: 01:01:57*

## Elena Verna

*Elena Verna 2.0*

> "Profile your people, know who you're talking to. It's more important than a couple percentage points of drop-off from users who were never going to activate in the first place. No, no, no, dear user, please, because all of us are tired of receiving outbound that is not relevant to us."

**Insight:** Onboarding surveys are essential for identifying the 'buyer' vs. the 'user' and preventing irrelevant sales outreach.

**Tactical advice:**

- Ask about company size, department, seniority, and use case during sign-up
- Limit onboarding profiling to 3-4 screens to maintain completion rates

*Timestamp: 01:10:36*

## Gia Laudi

*Gia Laudi*

> "We identified SparkToro's best customers. Now, what I mean by best customers is those that get a ton of value from your product as it exists today, pay obviously. They're happy. They're low maintenance. And very importantly, they signed up for your product recently enough that they remember what life was like before. So generally, we say that's in the three to six-month range."

**Insight:** To get accurate data on the customer journey, survey 'best' customers who signed up 3-6 months ago so their memory of the 'before' state is fresh.

**Tactical advice:**

- Target survey participants who have been customers for 3-6 months
- Ask what was going on in their life when they first started seeking a solution
- Identify the 'trigger moment' that led them to search for a product

*Timestamp: 00:22:45*

## Judd Antin

*Judd Antin*

> "NPS is the best example of the marketing industry marketing itself... the consensus in the survey science community is that NPS makes all the mistakes... Customer satisfaction, a simple CSAT metric, is better. It has better data properties, it is more precise, it is more correlated to business outcomes."

**Insight:** NPS is often a flawed and imprecise metric; Customer Satisfaction (CSAT) is a more scientifically rigorous and business-correlated alternative.

**Tactical advice:**

- Replace NPS with CSAT questions (e.g., 'Overall, how satisfied are you with your experience?').
- Use 5 to 7 item scales for better precision.
- Ensure all scale options are visible on mobile screens to avoid bias.

*Timestamp: 01:03:50*

## Laura Schaffer

*Laura Schaffer*

> "We just asked for forgiveness and put these questions into the signup flow and ran an A/B test with a small group... I'm not kidding, it improved conversion. There's no personalization, nothing past it, just the questions. It improved conversion by like 5%, just improved signups."

**Insight:** Adding 'good friction' in the form of targeted questions can increase conversion by reassuring users they are in the right place.

**Tactical advice:**

- Ask questions about the user's coding language or specific use case early in the flow
- Use questions to alleviate the user's fear that the product won't support their needs
- Test adding questions to the signup flow even if it seems counterintuitive to reducing friction

*Timestamp: 00:22:50*

## Madhavan Ramanujam

*Madhavan Ramanujam*

> "Identify the most important for you, and the least important... If you do this a few times, you will be able to prioritize the entire feature set in a relative fashion, and truly understand what drives willingness to pay."

**Insight:** MaxDiff (Most/Least) surveys are superior to simple ranking for identifying value drivers.

**Tactical advice:**

- Present subsets of features and ask respondents to pick the 'most important' and 'least important'
- Use purchase probability scales (1-5) to build demand curves, discounting '4s' and '5s' to reflect real-world behavior

*Timestamp: 00:30:06*

## Naomi Ionita

*Naomi Ionita*

> "We would make a list of our features that we had and maybe new things we wanted to build and have people rank them as a must-have, nice to have, or not necessary to help us understand the relative prioritization. You can also get at it with a hundred point question where you give users a hundred points and say, 'Spend them across these different features.'"

**Insight:** Use forced-ranking or point-allocation surveys to identify which features actually drive conversion and value.

**Tactical advice:**

- Use a '100-point question' to force users to prioritize feature value
- Categorize features as 'must-have', 'nice-to-have', or 'not necessary'

*Timestamp: 00:21:23*

## Nicole Forsgren

*Nicole Forsgren 2.0*

> "You can just ask a few questions, 'How satisfied are you? What are the biggest barriers to your productivity, or what are the biggest challenges to getting work done?' and let them pick either from a set of tools or maybe a set of processes and then say... Let them pick three, just three. Of those three, how often does this affect you? Is this hourly? Is this daily? Is this weekly? Is this quarterly?"

**Insight:** Effective developer surveys should force prioritization and measure the frequency of friction to provide clear signals for improvement.

**Tactical advice:**

- Limit respondents to picking their top three barriers to keep data clean.
- Measure the frequency of issues (e.g., hourly vs. quarterly) to weight their impact.

*Timestamp: 00:54:12*

---

> "A lot of folks go to write a survey question and they'll say something like, 'Were the build and test system slow or complicated in the last week?' You're asking four different questions there. If someone answers yes, was it the build? Was it the test? Was it slow or was it flaky or complicated or something?"

**Insight:** Avoid 'double-barreled' questions in surveys to ensure the data collected is specific and actionable.

**Tactical advice:**

- Ensure each survey question only asks about one specific variable (e.g., separate 'speed' from 'complexity').
- Review survey questions with an LLM or expert to identify and fix ambiguous phrasing.

*Timestamp: 00:55:17*

## Nilan Peiris

*Nilan Peiris*

> "We ask customers, is the short answer. So, we have an attribution model, as you can imagine, and we've had one from the early days, and it overlays all the referrer data and cookie data you have on visits that come to the website. So you kind of know that. And then you obviously have the soundtrack stuff, and we sample, and ask customers a set of questions on this, and then overlay that onto what turns up in your web tracking as direct traffic, to give us a sense of how big that word-of-mouth number is."

**Insight:** Combine in-product attribution surveys with digital tracking data to accurately measure the impact of word-of-mouth growth.

**Tactical advice:**

- Integrate attribution questions directly into the user flow
- Overlay survey responses with referrer and cookie data to validate direct traffic
- Sample customers regularly to maintain a clear picture of acquisition sources

*Timestamp: 00:06:28*
```
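Nicole Forsgren's "pick three, then ask how often" pattern above amounts to a simple weighted aggregation. The sketch below is a hypothetical illustration: the barrier names are invented, and the frequency weights (rough occurrences per quarter) are assumptions chosen for the example, not figures from the transcript.

```python
from collections import Counter

# Illustrative frequency weights: rough occurrences per quarter.
# These numbers are assumptions for the sketch, not from the skill.
FREQ_WEIGHT = {"hourly": 520, "daily": 65, "weekly": 13, "quarterly": 1}

def barrier_impact(responses):
    """Aggregate 'pick three' barrier answers into frequency-weighted scores.

    Each response maps at most three barriers to how often each one bites.
    """
    impact = Counter()
    for picks in responses:
        if len(picks) > 3:
            raise ValueError("force prioritization: at most three barriers")
        for barrier, freq in picks.items():
            impact[barrier] += FREQ_WEIGHT[freq]
    return impact

# Hypothetical responses: barrier -> reported frequency
responses = [
    {"flaky tests": "daily", "slow builds": "hourly", "unclear specs": "weekly"},
    {"slow builds": "daily", "flaky tests": "weekly"},
]
ranked = barrier_impact(responses).most_common()
```

Capping each response at three picks keeps the data clean, and the frequency weighting surfaces the barrier that bites most often rather than the one mentioned most.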