Meeting Feedback: How to Get Honest Opinions From Your Team
Your team will not tell you which meetings suck unless you ask anonymously. Here is how to collect meeting feedback that actually leads to change.
Key Takeaways
- Only 3.7% of companies regularly collect meeting feedback. The other 96.3% are guessing which meetings work and which ones waste time.
- Power dynamics kill honesty. Nobody tells their manager "your weekly sync is useless," even when the whole team thinks it. Anonymity is the unlock.
- One question is enough: "How valuable was this meeting?" on a 1-5 scale. More questions mean fewer responses.
- Collecting feedback without acting on it is worse than not collecting at all. Every feedback cycle must end with a visible change.
You have 15 recurring meetings on your calendar. You think you know which ones are good and which ones need work. You're probably wrong. And without meeting feedback, you'll never know.
Not because you're a bad manager. Because you're the manager. Your experience of a meeting is fundamentally different from your team's experience. You run the weekly sync, you set the agenda, you speak the most. Of course it feels productive to you. But the three engineers sitting on mute, waiting for the part that's relevant to them, have a very different view.
The problem is that they'll never tell you. Not in the meeting, not in a 1:1, not in a retro. The social cost of saying "your meeting is a waste of my time" is too high, even in teams with strong trust. Without a safe channel for meeting feedback, bad meetings persist for months because the people suffering through them have no way to flag the problem.
Only 3.7% of companies always collect feedback on their meetings. The overwhelming majority have no systematic way to know which meetings are working and which ones are wasting time.
— Fellow.app, State of Meetings Report, 2024
Why your team won't tell you which meetings are bad
This is not a trust problem. It's a structure problem.
In most teams, there's no mechanism for meeting feedback at all. Nobody asks. No survey goes out. No rating is collected. The meeting happens, everyone moves on, and the only signal the organizer gets is attendance (which means nothing, since most people attend out of obligation).
Employees want to decline 31% of the meetings they attend but actually decline only 14%. The gap between what people think and what they do is enormous, and it applies to feedback too.
— Otter.ai, 2023
Even when managers ask for feedback directly, the responses are filtered. "How was the meeting?" in a team setting produces "it was good" or "maybe we could be more focused." Nobody says "this meeting has no purpose and we should cancel it." The power dynamic makes honest feedback socially expensive. The person who speaks up risks being seen as difficult, disengaged, or not a team player.
Anonymous meeting feedback changes this equation completely. When people know their responses can't be traced back to them, they say what they actually think. The weekly sync goes from "it's fine" to "I've been multitasking through this for three months." The design review goes from "it could be shorter" to "only two people in this meeting actually need to be here."
This isn't speculation. It's the consistent pattern every time a team switches from direct feedback to anonymous feedback: the data gets honest, and the honest data is useful.
3 ways to collect meeting feedback
There are three approaches, ranging from manual and free to automated and continuous. Each has trade-offs.
Method 1: Post-meeting survey (manual)
The simplest approach. After each meeting (or after a batch of meetings at the end of the week), send a one-question survey: "Rate this meeting 1-5." Use Google Forms, Typeform, or a Slack poll.
Pros: Free, flexible, works with any team size.
Cons: Someone has to create and send the survey every time. Response rates drop quickly because people get survey fatigue. You have to compile the data manually into something actionable. And unless you use an anonymous form, responses are filtered by the same social dynamics that prevent honest feedback in person.
Best for: Teams testing the concept for the first time. Run it for 2-3 weeks to see if meeting feedback is valuable before investing in a more permanent approach.
Method 2: Quarterly meeting audit
Once per quarter, send a spreadsheet listing every recurring meeting and ask the team to score each one on value (1-5). Compile the scores, identify the lowest-rated meetings, and make changes.
Pros: Comprehensive snapshot. Covers every meeting in one exercise. The meeting audit template provides the exact framework and scoring system.
Cons: Only happens quarterly, so problems linger for months between audits. Easy to procrastinate or skip. The data is a snapshot, not a trend, so you can't see if a meeting is improving or declining over time. And unless you use an anonymous form for the scores, the same honesty problems apply.
Best for: Teams that want a structured, periodic review. Works well as a complement to lighter ongoing feedback.
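If you run the audit in a spreadsheet, the "compile the scores" step is simple enough to script. Here is a minimal sketch, assuming you have exported the anonymous 1-5 ratings into a dict keyed by meeting name; the meeting names, numbers, and the 2.5 action threshold are illustrative, not prescriptive:

```python
# Hypothetical export of anonymous 1-5 ratings from a quarterly audit.
ratings = {
    "Weekly sync": [2, 1, 3, 2, 2],
    "Design review": [4, 5, 4, 4],
    "Wednesday all-hands": [2, 2, 1, 3, 2, 2],
    "Sprint planning": [4, 3, 4, 5],
}

def rank_meetings(ratings, threshold=2.5):
    """Rank meetings worst-first by average rating; flag those below threshold."""
    ranked = sorted((sum(s) / len(s), name) for name, s in ratings.items())
    return [
        {"meeting": name, "avg": round(avg, 2), "needs_action": avg < threshold}
        for avg, name in ranked
    ]

for row in rank_meetings(ratings):
    marker = "FIX" if row["needs_action"] else "ok "
    print(f"{marker} {row['meeting']}: {row['avg']}")
```

Sorting worst-first keeps the output aligned with how you will act on it: the meeting at the top of the list is the one to fix or kill first.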
Method 3: Automated anonymous feedback
A tool that sends rating prompts to team members after each recurring meeting. Ratings are anonymous, data aggregates automatically over time, and the team lead sees a live ranking of meetings from worst to best.
Pros: Continuous data without manual effort. True anonymity by design. Trends over time show whether changes are working. No survey fatigue because the prompt is a single tap (1-5 scale), not a multi-question form.
Cons: Requires a tool. There's a small adoption curve to get the team in the habit of rating.
Best for: Teams that want ongoing, actionable data without the overhead of managing surveys or spreadsheets. This is the approach that scales.
What questions to ask
Less is more. The fewer questions you ask, the higher your response rate. And a high response rate with one question gives you better data than a low response rate with five.
The essential question: "How valuable was this meeting?" (1-5 scale)
That's it. This one question captures everything that matters. If a meeting consistently scores 2/5, you don't need to know whether the agenda was clear or whether it started on time. You know it's not working, and you can investigate why.
Optional follow-up: "Any suggestions?" (free text, optional)
Keep this optional. Most people will skip it, and that's fine. The ones who do write something will give you the qualitative context behind the number. A rating of 2/5 tells you there's a problem. A comment that says "same three people talk, the rest of us just listen" tells you what the problem is.
Questions to avoid:
- "Was the agenda clear?" / "Did it start on time?" / "Were the right people present?" These are useful in theory, but they tank response rates because nobody wants to fill out a 5-question survey after every meeting. They also dilute the signal: you get mediocre scores on five dimensions instead of a clear answer on one.
- "How would you improve this meeting?" This requires too much effort per response. Save it for the quarterly meeting audit or for 1:1 conversations where you can dig deeper.
The golden rule: one question with 80% response rate beats five questions with 20% response rate. Optimize for participation, not comprehensiveness.
If your response rate drops below 50%, the data isn't reliable. The two most common causes are too many questions and prompts that arrive at the wrong time. Keep it to one question, delivered within an hour of the meeting ending.
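The golden rule is easy to check in practice. A tiny sketch, with made-up team size and response counts, of the arithmetic behind the 50% reliability floor:

```python
def survey_signal(responses_received, team_size, min_rate=0.5):
    """Response rate for one prompt, plus whether the average is trustworthy.
    min_rate=0.5 mirrors the 50% reliability floor described above."""
    rate = responses_received / team_size
    return {"rate": rate, "reliable": rate >= min_rate}

# Hypothetical team of 10: a 1-question prompt vs. a 5-question survey.
one_q = survey_signal(8, 10)   # 80% responded: 8 usable data points
five_q = survey_signal(2, 10)  # 20% responded: 2 usable data points

print(one_q, five_q)
```

The point of the comparison: the single question yields four times as many voices heard, which matters far more than the extra dimensions the longer survey would have measured.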
Get the meeting feedback survey template
A pre-built 1-question rating survey ready to duplicate. Just copy it and send it to your team after any meeting.
How to act on meeting feedback
Collecting feedback without acting on it is worse than not collecting at all. It signals that you asked but don't actually care. The team stops participating, and you've lost the channel permanently.
The action loop has five steps. Follow all five or don't start.
Step 1: Share the results. Post the meeting rankings with your team. Transparency builds trust and shows that the feedback is being taken seriously. You don't need to share individual ratings, just the averages. "Here are our 12 recurring meetings ranked by team rating. Three scored below 2.5."
Step 2: Pick the worst one. Don't try to fix five meetings at once. Pick the single lowest-rated meeting and focus there. One change, done well, builds momentum for the next.
Step 3: Fix or kill it. You have five options for any low-rated meeting:
- Cancel it. If nobody would miss it, remove it from the calendar.
- Shorten it. A 60-minute meeting might work as a 25-minute meeting with a tighter agenda.
- Reduce frequency. Weekly to biweekly. The meeting cadence guide covers when this makes sense.
- Cut attendees. A 10-person meeting where only 3 contribute should be a 3-person meeting with async notes for the rest.
- Go async. Replace the meeting entirely with a structured Slack post or a short async update.
Step 4: Tell the team what changed. "Based on your feedback, we're canceling the Wednesday all-hands and replacing it with a written update in Slack." This closes the loop. It proves that feedback leads to action, which is the single most important thing you can do to keep people participating.
Step 5: Repeat monthly. One meeting fixed per month is 12 meetings improved per year. That compounds into a transformed calendar. The key is consistency: same process, same cadence, visible results every time. Without this rhythm, meeting overload creeps back within months.
If your team has given feedback before and nothing changed, you have a trust deficit. Acknowledge it. Say "I know we've asked for feedback before and didn't act on it. This time is different, and here's the change we're making this week." Then follow through.
You can do this manually with surveys and spreadsheets. Many teams do, and it works. But it requires discipline: creating the surveys, compiling the data, chasing response rates, and repeating the cycle every month.
Kill One Meeting automates the entire loop. It sends anonymous rating prompts to your team after every recurring meeting, aggregates the scores into a live ranking, and surfaces the meetings your team rates lowest. No surveys to create, no spreadsheets to maintain, no data to compile. Your team rates, you act. One meeting fixed per month, compounding into a better calendar over time. Free for 30 days.
Frequently asked questions
- How do you get honest feedback on meetings?
- Anonymity is the key. People filter their feedback when it can be traced back to them, especially when the meeting organizer is their manager. Use anonymous surveys, anonymous rating tools, or at minimum a Google Form that does not collect email addresses. The question should be simple: "How valuable was this meeting?" on a 1-5 scale.
- What questions should you ask about meetings?
- One question is enough: "How valuable was this meeting?" on a 1-5 scale. An optional free-text field for suggestions captures qualitative context. Avoid multi-question surveys because they reduce response rates without proportionally improving the data quality. A high response rate on one question beats a low response rate on five.
- How do you give feedback about a bad meeting?
- If your team has anonymous meeting feedback in place, use it. A low rating speaks for itself. If not, raise it in your next 1:1 with the organizer using specific, constructive language: "I noticed the last three syncs covered topics I could have read async. Would you be open to trying a Slack update instead?" Focus on the format, not the person.