When you start using a new type of assay with Reshape, you (most likely) want to ensure you can trust the results from the platform – we understand that. That's where our validation tool comes in.
Before you start validating your analysis models, you can choose how you want to learn:
🎥 Watch our Model Validation video below, or keep reading for the step-by-step written guide.
Why validation matters
Model validation helps you confirm that the AI-automated counts are accurate and reproducible.
By comparing manual counts against the AI’s analysis, you confirm the accuracy of results and ensure your models remain consistent across experiments.
Adding manual validation data
To validate a model, you'll manually count colonies on a selection of plates and compare them to the AI-generated results.
Open a completed job with an analysis model applied. Click Validate analysis data.
This will start the validation flow and you can begin to add your manual counts. This can be done in two different ways:
You can click on the image to bring up a counter, then click directly on colonies to count them. Zoom in with your mouse wheel or the +/- buttons, and hold Shift to pan around the plate.
You can also directly add the count in the input field.
If a plate has too many colonies to count (TNTC), use the slider to mark it accordingly.
Note: when validating plates in a time-lapse job, first navigate to a point on the timeline where the plates are easy to count. When you then click Validate analysis data, it will take you to that time point to add manual counts.
Building your validation set
A solid validation set usually includes 30–60 plates from different jobs. For best results, group these jobs in a dedicated project to make tracking easier.
Validate in teams of three, with each person providing counts independently, to help ensure data accuracy and consistency.
Reviewing validation performance
Once manual counts are added, open the Library from the sidebar to review your model’s performance.
Find the analysis you validated in the list, click it, and scroll down to the Performance score section.
This shows how closely the AI results align with manual counts, expressed as a percentage of data points within the acceptable margin of error.
You can hover over results to see comparison details for each plate. Click View details for a closer look at the manual counts, including any that were added inside the tool. From the plate details, click View in job to open that specific plate in the job in a new tab.
Understanding the performance score
The performance score gives you a quick, visual readout of model reliability.
For plates with fewer than 30 colonies, the margin of error is ±3 colonies. For plates with more than 30 colonies, the margin is ±10%.
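The scoring rule above can be sketched as a small helper. This is an illustrative sketch only, not Reshape's actual implementation; in particular, how a plate with exactly 30 colonies is treated is an assumption here.

```python
def within_margin(manual: int, ai: int) -> bool:
    """Check whether an AI count falls within the acceptable margin of error:
    +/-3 colonies for plates with fewer than 30 colonies, +/-10% otherwise.
    (Treating exactly 30 colonies under the percentage rule is an assumption.)"""
    if manual < 30:
        return abs(ai - manual) <= 3
    return abs(ai - manual) <= 0.10 * manual

def performance_score(pairs: list[tuple[int, int]]) -> float:
    """Percentage of plates whose AI count is within the margin of error."""
    hits = sum(within_margin(manual, ai) for manual, ai in pairs)
    return 100.0 * hits / len(pairs)

# Example: four validated plates as (manual count, AI count) pairs.
plates = [(12, 14), (25, 29), (80, 86), (150, 140)]
print(performance_score(plates))  # 75.0 -> 3 of 4 plates within margin
```

For instance, a plate with 25 manual colonies and 29 AI-counted colonies falls outside the margin (difference of 4 exceeds the +/-3 allowance), while 150 vs. 140 is acceptable (difference of 10 is within +/-10%, i.e. 15 colonies).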
Feeling inspired? Automate more of your assays with Reshape. Click a link to one of our public jobs below to get a sense of what other assays you can run using Reshape:
And that's the end of our User onboarding, our 5-part Reshape 101 series. We hope you now understand the fundamentals of Reshape after watching or reading our guides.
We can't wait to see what you will achieve with Reshape 🙌