How to run a performance review cycle on Starch
A performance review cycle is how you formally assess how people on your team are doing, set expectations for what comes next, and make decisions about compensation, role changes, or development. Most operators run them once or twice a year, and most find them harder than they should be, not because the conversations are hard (they are), but because the logistics fall apart before you get there. Scheduling, collecting input, tracking who's submitted what, synthesizing feedback into something usable, following through on the action items that come out of the conversation. The mechanics eat weeks.
What this looks like in practice depends on your team size, your industry, and how structured your process already is. A 12-person services firm running 360 reviews has a different problem than a 30-person e-commerce brand doing manager-only assessments. The guides below go deeper on each of those variations.
On Starch, the cycle stays organized without you holding it together manually. Review requests go out on schedule, meeting notes from each 1:1 are captured and archived, follow-up tasks land in the right person's queue automatically, and the documentation for each employee lives in one searchable place — not scattered across your inbox and a shared drive you'll never find again. When the cycle closes, you have a clean record of what was said, what was decided, and what happens next.
Why it matters
Reviews done badly produce two outcomes: the feedback is vague enough to be useless, or it arrives too late to change anything. Both erode trust. People leave managers who can't tell them where they stand. Reviews done well create a documented track record you can point to when you're promoting someone, managing out a low performer, or onboarding a replacement. That paper trail also matters legally. The process isn't bureaucracy — it's infrastructure for every hard people decision you'll make.
Common pitfalls
- Running the cycle on a calendar but not a system. Relying on manual email follow-ups to collect self-reviews means half of them arrive late and you spend a week chasing.
- Letting meeting notes live in your head and nowhere else, so action items from the review conversation quietly die.
- Writing feedback that describes personality instead of behavior, which makes it unactionable and legally risky.
- Treating the cycle as an event rather than a loop: closing out the review without scheduling the 30-day check-in where the development plan actually gets used.
See this running on Starch
Connect your tools, describe what you want, and the agent builds it. Closed beta is free.
Choose your operator
A version of this guide tailored to your role — same recipe, different starting context.
The AI stack built for small HR teams.
The AI stack built for the founder's office.
The AI stack built for boutique professional services firms.
The AI stack built for small law and accounting practices.
The AI stack built for independent clinic owner-operators.
The AI stack built for solo media and creator businesses.
The AI stack built for educators, coaches, and course creators.
The AI stack built for CPG brands.
The AI stack built for DTC founders.
The AI stack built for event planners and agencies.
The AI stack built for foundation and nonprofit ops teams.
The AI stack built for small customer success teams.
Related workflows in People & HR
Benefits enrollment is one of those operator workflows that looks manageable until it isn't. Read guide →
Employee offboarding is the set of steps you run every time someone leaves — voluntary or not. Read guide →
Onboarding a new hire is the first real test of whether your company runs on systems or on your memory. Read guide →
Headcount planning is the process of deciding who you need to hire, when, and what you can actually afford to pay them — and then holding that plan up against your cash position often enough that you don't get surprised. Read guide →