How to run an employee engagement survey for foundation and nonprofit ops teams

People & HR · For Foundation and Nonprofit Ops Teams · 3 apps · 10 steps · ~20 min to set up

Your four-person ops team runs an annual staff engagement survey the same way it's been done for years: a SurveyMonkey form someone built, results exported to a Google Sheet, a week of manual pivot tables to cut the data by department or tenure, and a slide deck assembled the night before the all-hands. You never have a clean picture of whether grant program staff feel differently than operations staff, whether remote employees answered at lower rates, or whether scores moved year-over-year — because stitching that together from a spreadsheet and a presentation tool takes more time than you have. The purpose-built HR survey platforms assume an HR department. You don't have one.

Outcome

What you'll set up

An automated survey distribution workflow that sends personalized invitations, tracks response rates by team or role, and sends follow-up reminders to non-responders — without manual list management
A live results dashboard that breaks down engagement scores by department, tenure band, and employment type, updating as responses come in, so you're not waiting until the survey closes to see the picture
A board-ready summary presentation generated from the final results, with trend comparisons to prior cycles if you have historical data, formatted for your all-hands or audit committee without Sunday-night slide work
The Starch recipe

Apps, data, and prompts

The combination of Starch apps, the data sources they pull from, and the prompts you use to drive them.

Data sources & config

Starch syncs your Paylocity data on a schedule (employee list, departments, employment types, tenure) to power response-rate tracking. SurveyMonkey results are pulled via browser automation — no API needed — or exported to Google Drive and connected from Starch's integration catalog; the agent queries it live when the dashboard runs. Gmail is connected directly by Starch to handle survey invitation and reminder emails. Slack is available from Starch's integration catalog for daily status pings.

Prompts to copy
Build me an employee engagement survey tracker that shows response rates by department (program, operations, finance) and employment type (full-time, part-time, contractor), pulls employee list from our Paylocity data, flags departments below 70% response rate, and sends me a daily Slack summary while the survey is live.
Build me a survey results dashboard that takes our SurveyMonkey export from Google Drive, calculates an overall engagement score and sub-scores for manager effectiveness, mission alignment, and workload, compares to last year's numbers, and breaks everything down by tenure band (under 1 year, 1–3 years, 3+ years).
Draft an all-staff email announcing the engagement survey opens Monday, explain it's anonymous, takes 8 minutes, closes in two weeks, and that results will be shared at the June all-hands. Use a warm but direct tone appropriate for a 22-person foundation team.
Run these in Starch → or paste them into your favorite agent
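Under the hood, the first prompt amounts to a simple aggregation: group the roster by department, divide responders by headcount, and flag anything under the threshold. A minimal sketch of that logic — the roster rows and field names here are hypothetical stand-ins for what a Paylocity sync might provide, not Starch's actual internals:

```python
from collections import defaultdict

# Hypothetical roster rows as a Paylocity sync might surface them:
# (employee_id, department, has_responded)
roster = [
    ("e1", "program",    True),
    ("e2", "program",    True),
    ("e3", "operations", False),
    ("e4", "operations", True),
    ("e5", "finance",    True),
]

THRESHOLD = 0.70  # flag departments below a 70% response rate


def response_rates(rows):
    """Compute per-department response rate from roster rows."""
    totals, responded = defaultdict(int), defaultdict(int)
    for _, dept, done in rows:
        totals[dept] += 1
        responded[dept] += int(done)
    return {dept: responded[dept] / totals[dept] for dept in totals}


rates = response_rates(roster)
flagged = sorted(d for d, r in rates.items() if r < THRESHOLD)
# With the sample rows above, only "operations" (1 of 2) falls below 70%.
```

The daily Slack summary in the prompt is just this dictionary formatted into a message, sent on a schedule while the survey is open.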
Walkthrough

Step-by-step

1 Connect Paylocity — Starch syncs your employee roster on a schedule, so your distribution list is always current without a manual export the morning you launch.
2 Build the survey distribution tracker: describe your team structure to Starch ('we have program, operations, and finance departments; I want to track response rates by department and flag anyone below 70%') and Starch builds the tracking app on top of your Paylocity data.
3 Use the Email Agent app to draft and send the survey launch email — describe the tone and content in one sentence, review the draft, and send directly from Starch without switching to Gmail.
4 Schedule automated reminder emails at day 5 and day 10 for non-responders: tell Starch 'send a reminder to anyone who hasn't completed the survey by next Tuesday, subject line: Quick reminder — survey closes Friday,' and it handles the segmentation and send.
5 As responses arrive, point Starch at your SurveyMonkey results — either via browser automation or a Google Drive export — and tell it to calculate your engagement sub-scores: manager effectiveness, mission alignment, psychological safety, and workload.
6 Build the results dashboard: describe the breakdowns you need ('cut by department, tenure band, and whether the employee is program-facing or operations-facing') and Starch assembles the view with year-over-year comparison if you have a prior export.
7 Use the Knowledge Management app to document your survey methodology — which questions map to which sub-scores, what your benchmark is, what 'action required' means for a given score — so next year's ops lead isn't starting from scratch.
8 When the survey closes, tell Starch: 'Build me a 6-slide summary of the engagement results for our June all-hands: overall score, top 3 strengths, top 2 concerns, year-over-year trend, and one slide on what we're committing to change.' Review and export.
9 For the board or audit committee, generate a separate 2-page narrative summary: 'Write a brief engagement survey summary for our board packet — factual, no spin, note response rate, overall score, any significant changes from last year, and what management is doing about the lowest-scoring areas.'
10 Archive the final results file and methodology notes in Knowledge Management tagged by survey cycle, so you have an institutional record that survives staff turnover — especially important for a small team where one person often holds all the context.
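The sub-score and year-over-year math in steps 5–6 is also straightforward to reason about: each survey question maps to one sub-score, the sub-score is the mean of its questions' averages, and the trend is a subtraction against the prior cycle. A sketch under assumed inputs — the question IDs, mapping, and scores below are illustrative, not a real export format:

```python
from statistics import mean

# Hypothetical mapping of survey questions to sub-scores,
# plus per-question average scores (0-100) for this cycle.
QUESTION_MAP = {
    "q1": "manager effectiveness",
    "q2": "manager effectiveness",
    "q3": "mission alignment",
    "q4": "workload",
}
current_averages = {"q1": 80, "q2": 70, "q3": 85, "q4": 60}

# Prior-cycle sub-scores, e.g. recovered from last year's spreadsheet.
prior = {"manager effectiveness": 72, "mission alignment": 88, "workload": 58}


def sub_scores(question_averages, question_map):
    """Roll per-question averages up into named sub-scores."""
    buckets = {}
    for q, score in question_averages.items():
        buckets.setdefault(question_map[q], []).append(score)
    return {name: mean(vals) for name, vals in buckets.items()}


scores = sub_scores(current_averages, QUESTION_MAP)
yoy = {name: scores[name] - prior[name] for name in scores if name in prior}
```

The same roll-up runs once per breakdown dimension (department, tenure band), which is exactly the pivot-table work the manual process spent hours on.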

See this running on Starch

Connect your tools, describe what you want, and the agent builds it. Closed beta is free.

Try it on Starch →
Worked example

Spring 2026 All-Staff Engagement Survey — Grantwell Foundation (22 staff)

Sample numbers from a real run
Overall engagement score: 74
Program staff score: 79
Operations staff score: 67
Response rate — program: 92%
Response rate — operations: 71%
Year-over-year change (overall): +3

Grantwell's ops team launched a 22-question survey in May 2026. Starch pulled the current employee list from Paylocity — 22 staff across program, operations, and finance — and built a response-rate tracker that flagged operations at 71% on day 8 (below the 75% target), triggering an automatic reminder email drafted by the Email Agent and sent the same afternoon. Final response rate came in at 86% overall. The results dashboard, built on a Google Drive export of the SurveyMonkey data, showed an overall score of 74 — up 3 points from Spring 2025 — but flagged a 12-point gap between program staff (79) and operations staff (67), driven by low scores on 'workload is manageable' and 'I have the tools I need to do my job.' The board packet summary, generated in Starch in under five minutes, led with the response rate and overall score, named the program/operations gap directly, and noted management's commitment to a Q3 ops team working-conditions review. Total staff time to run the full cycle: about 4 hours, down from the 14 hours it took the prior year with manual pivot tables.

Measurement

How you'll know it's working

Survey response rate by department (target: 80%+ for results to be statistically meaningful across a 22-person team)
Overall engagement score and year-over-year change (foundation boards increasingly ask about staff health as a governance indicator)
Score gap between program-facing and operations staff (ops teams at small foundations are often the most under-resourced and hardest to retain)
Time from survey close to results shared with all-staff (a proxy for whether the ops team can act on feedback quickly enough to matter)
Percentage of prior-cycle action items completed before next survey (accountability metric for the leadership team)
Comparison

What this replaces

The other ways teams handle this today, and how the Starch version compares.

SurveyMonkey + manual Google Sheets analysis
Fine for collecting responses; the pain is everything after — cutting by department, comparing to last year, building the board summary — which SurveyMonkey doesn't do and Sheets requires hours of manual work to approximate.
Lattice or Culture Amp
Purpose-built for engagement surveys with benchmarks and manager dashboards, but priced and structured for HR teams; a 22-person foundation will pay for features it doesn't need and still have to manually connect results to its board reporting workflow.
Qualtrics
Powerful analytics and branching logic, but the cost and implementation complexity are sized for enterprise HR functions — not an ops generalist at a small foundation who also handles grants compliance and board prep.
Google Forms + Slides
Zero cost and already in your stack, but analysis is entirely manual; there's no automated response tracking, no year-over-year comparison, and building the board deck from raw Form data takes a full afternoon every cycle.
On Starch (recommended)

One platform — task manager, knowledge management, email agent all running on connected data. Setup in plain English; numbers stay current via scheduled syncs and live agent queries.

Try it on Starch →
FAQ

Frequently asked questions

We only have 22 staff. Is it worth building an automated survey workflow for something this small?
The automation pays off less in the distribution step and more in the analysis step. For a team of 22, response tracking and reminder emails are maybe an hour of work. But cutting results by department and tenure, comparing to last year, and generating a board-ready summary — that's 6–10 hours of work with spreadsheets. That's the part Starch collapses to 30 minutes.
We currently use SurveyMonkey. Do we have to switch?
No. Keep SurveyMonkey as your collection tool. Starch connects to your SurveyMonkey results via browser automation — no API needed — or you can export to Google Drive and have the agent query it live when your dashboard runs. The survey instrument doesn't change; Starch handles what happens with the data after.
Our employee list lives in Paylocity. Does Starch actually connect to that?
Yes. Starch syncs your Paylocity data on a schedule — employees, departments, employment types — so your distribution list reflects actual headcount without a manual export. If someone joined last month or changed departments, the tracker knows.
Our board wants engagement data in the annual audit committee report. Can Starch format it for that?
Yes. Describe what the audit committee expects — response rate, overall score, year-over-year change, management response to low-scoring areas — and Starch generates a formatted narrative summary you can drop into the board packet. You review and edit; you're not starting from a blank page.
We don't have historical survey data in a consistent format. Can Starch still do year-over-year comparisons?
If you have prior results in any structured form — a spreadsheet, a PDF export, even a prior slide deck — Starch can work with what you have. The comparison will be as clean as your historical data allows. If year one is this survey cycle, you're building the baseline now.
Is Starch SOC 2 certified? Survey responses include staff sentiment data we'd want to handle carefully.
Starch is not SOC 2 Type II certified today. If your foundation has strict data handling requirements for employee feedback data — particularly if your HR policy treats engagement responses as sensitive personnel information — that's worth knowing upfront. For many small foundations, the practical risk profile of a hosted SaaS tool for internal survey analysis is manageable, but we'd rather you make that call with accurate information.

Ready to run an employee engagement survey on Starch?

Request closed-beta access. Everything is free during beta.
