CQC inspectors arrive at your practice with a framework, but they also arrive with instincts. Certain things they see, hear, or read in the first hour trigger a shift from routine assessment to focused scrutiny. These are not obscure technicalities. They are visible, practical indicators that suggest a practice's governance may not be as strong as its paperwork claims.
In brief: during GP practice inspections, CQC inspectors look for five specific red flags: out-of-date risk assessments, reactive-only significant event analysis, policies that do not match actual practice, missing staff training records, and complaints systems without evidence of learning. Each signals a governance gap that prompts deeper investigation. This guide explains what inspectors are looking for, why each flag matters, and the specific steps to fix each one.
Knowing what these red flags are does not mean gaming the system. It means understanding what a well-governed practice looks like from the outside, so you can identify genuine gaps before an inspector does.
What counts as a red flag
A CQC red flag is not a single failing. It is a pattern that suggests the practice's governance systems are not working as intended. Inspectors are trained to look for disconnects: between what is documented and what is happening, between what staff say and what evidence shows, between what policies promise and what practice delivers.
The five red flags below are drawn from CQC inspection reports, GP Mythbuster guidance, and the CQC assessment framework for GP practices. Each one is something an inspector can identify quickly, often within the first hour of being on site.
1. Risk assessments that are out of date or generic
What inspectors look for
CQC inspectors check risk assessments under both the Safe and Well-led key questions. They are looking for three things:
Currency. Is the assessment dated within the last 12 months, or after the last significant change to the area it covers?
Specificity. Does it describe this practice, or could it apply to any practice in England?
Evidence of action. Have the control measures been implemented, and is there evidence of review?
Why this is a red flag
A risk assessment dated 2023 that does not mention same-day urgent access (contractual from April 2026) or the latest cleanliness standards tells the inspector that the practice is not reviewing its risks in response to change. A generic assessment downloaded from a template library tells them the practice has not thought about its own specific circumstances.
Both suggest the governance system is performative rather than functional.
What to do
Review your general health and safety risk assessment. If it is more than 12 months old, it needs updating regardless of whether anything has changed. If the practice has implemented operational changes (new access model, new digital systems, building work), those changes should be reflected
Check specificity. Every risk assessment should reference your practice by name, describe your specific arrangements, and identify your specific hazards. "Staff should wash hands regularly" is generic. "Clinical rooms 1-4 have wall-mounted sanitiser dispensers; reception has a freestanding unit by the front desk" is specific
Record reviews. When you review a risk assessment and decide no changes are needed, record that decision with a date and signature. "Reviewed [date], no changes required" is better than nothing. "Reviewed [date], no changes required because [reason]" is better still
For a detailed guide to writing risk assessments that meet CQC expectations, see our GP practice risk assessment guide. It covers the five elements every assessment needs, common mistakes, and the review cycles for each assessment type.
2. Significant events that are only analysed when something goes wrong
What inspectors look for
CQC expects six elements from a significant event analysis process:
Staff identify and report events
Information is gathered systematically
Events are analysed as a team
Changes are implemented as a result
Analysis is documented in a written report
Learning is shared across the practice
Inspectors check for all six. Most practices have some of these. Few have all six working as a connected system.
Why this is a red flag
A practice that formally analyses three or four significant events a year, all of them serious incidents, suggests a reactive culture. The near-misses and system failures that would have flagged problems earlier are not being captured.
Inspectors will ask: "How many significant events have you analysed this year?" and "Can you show me an example of where a significant event led to a change?" If the answer to the first question is "two or three", and the examples are all serious incidents, the inspector will conclude that the practice only investigates when forced to.
A healthy reporting culture shows more events, not fewer. Near-misses should outnumber harm events. All staff groups should be reporting, not just clinicians.
What to do
Implement a proportionate triage system. Not every event needs a full formal analysis. Use a harm grading framework so your response matches severity: quick discussions for low-harm events, structured investigation for moderate harm, full formal SEA for serious incidents
Start a central event log. Every event, from near-misses to positive outcomes, goes in one place. This becomes your evidence of a learning culture
Hold monthly learning meetings. Protected time, documented agenda, actions recorded and followed up. This is the single most powerful piece of evidence you can show an inspector
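The proportionate triage idea above can be sketched in a few lines of code. This is an illustrative sketch only: the three harm bands and the responses mapped to them are invented examples for clarity, not CQC-mandated wording, so adapt them to whatever harm grading framework your practice adopts.

```python
# Minimal sketch of a harm-grading triage: map a graded harm level to a
# proportionate response. Bands and responses are hypothetical examples.
def triage_event(harm_level: str) -> str:
    """Return the proportionate response for a graded harm level."""
    responses = {
        "none_or_low": "quick team discussion, logged in central event log",
        "moderate": "structured investigation with written summary",
        "severe": "full formal significant event analysis (SEA)",
    }
    if harm_level not in responses:
        raise ValueError(f"Unknown harm level: {harm_level!r}")
    return responses[harm_level]

print(triage_event("none_or_low"))
# A near-miss gets a quick discussion; a serious incident gets a full SEA.
```

The point of encoding the triage, even informally, is consistency: two staff members grading the same event should reach the same response.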
For a step-by-step implementation plan, see our improving significant event analysis guide. For a quick-start approach with worked AI-generated SEA examples mapped to CQC Mythbuster 3, see the significant event analysis template guide.
3. Policies that do not match what the practice actually does
What inspectors look for
Inspectors do not just read policies. They test them against reality. Common tests:
Ask a receptionist how they handle a patient who calls with chest pain. Compare the answer to the emergency management policy
Ask a nurse how they report a clinical incident. Compare the answer to the significant event policy
Ask the practice manager when the business continuity plan was last tested. Check the plan for a test date
Walk through the building and compare what they see with the health and safety risk assessment
Why this is a red flag
A policy that says one thing while staff do another is worse than having no policy at all. It suggests the practice creates documents for compliance purposes without embedding them in operations. This is a governance failure, not a documentation failure.
The disconnect is often invisible to the practice because the policies were written once, filed, and never revisited. Staff develop their own working practices over time, and those practices drift from the documented version.
What to do
Test your own policies. Pick three policies at random. Ask a staff member how they handle the situation each policy covers. If the answers do not match the documents, update either the policy or the practice
Date and version your policies. Every policy should show when it was last reviewed and by whom. A policy with no review date is a red flag in itself
Make policies accessible. If staff cannot find the policy quickly, they are not using it. Whether you use a shared drive, a printed folder, or a compliance platform, the test is: can any staff member locate any policy within two minutes?
For a framework covering all 11 compliance domains and their policy requirements, see our complete GP practice compliance guide.
4. Staff training that cannot be evidenced
What inspectors look for
Inspectors ask to see training records. Specifically:
Mandatory training completion rates. Safeguarding (adults and children), basic life support, fire safety, infection control, information governance. Who has completed what, and when?
Role-specific training. Has the person doing clinical triage been trained to do clinical triage? Has the fire warden completed fire warden training?
Induction records. For new staff and locums, is there a documented induction process with sign-off?
Why this is a red flag
A practice that cannot produce training records on request has one of two problems: the training has not happened, or it happened but was not recorded. Both are governance failures. The inspector cannot distinguish between them, and the burden of proof is on the practice.
This is particularly acute for safeguarding training, where CQC expects all clinical staff to be trained to the appropriate level and all non-clinical staff to have basic awareness training. A safeguarding gap is one of the findings most likely to result in a requirement notice.
What to do
Create a training matrix. A simple spreadsheet listing every staff member, the training they need, the date they last completed it, and when it expires. Update it whenever training happens
Record everything. External course certificates, internal training sessions (date, attendees, content covered, trainer), e-learning completion records. If it is not recorded, it did not happen
Set up renewal alerts. Most mandatory training has an annual or three-yearly renewal cycle. Automate the reminders rather than relying on memory
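The training matrix and renewal alerts described above amount to a simple data check: for each record, is today past the completion date plus the renewal cycle? A minimal sketch, assuming invented staff names, courses, and renewal cycles (check the real renewal requirements for each course):

```python
from datetime import date, timedelta

# Hypothetical training matrix: each record holds who, what, when, and
# the renewal cycle in days. All values here are invented examples.
matrix = [
    {"staff": "A. Example", "course": "Basic life support",
     "completed": date(2025, 6, 1), "renew_after_days": 365},
    {"staff": "B. Example", "course": "Fire safety",
     "completed": date(2023, 6, 1), "renew_after_days": 365},
]

def expired(record: dict, today: date) -> bool:
    """True if the training is past its renewal date."""
    due = record["completed"] + timedelta(days=record["renew_after_days"])
    return today > due

today = date(2026, 3, 1)
overdue = [r for r in matrix if expired(r, today)]
for r in overdue:
    print(f"{r['staff']}: {r['course']} overdue")
```

The same logic works as a spreadsheet formula; the value is in running the check routinely rather than in the tooling.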
For a detailed implementation plan including delegation options, see our staff training and competency system guide.
5. Complaints handled without evidence of learning
What inspectors look for
CQC checks three things about complaints:
Process. Is there a documented complaints procedure? Are patients told how to complain? Is there an acknowledgement within three working days and a response within a reasonable timeframe?
Analysis. Are complaints categorised, trended, and reviewed for patterns? Is there a theme register?
Learning. Can the practice show specific examples where a complaint led to a change in practice?
Why this is a red flag
A practice that handles complaints individually but never reviews them collectively is missing the governance loop. Individual complaints get resolved, but the same types of complaints keep recurring because nobody is looking at the pattern.
Inspectors will ask: "What themes have you identified from complaints this year?" and "Can you give me an example of a change you made as a result of complaint trends?" If the practice cannot answer these questions, the inspector will conclude that the complaints system is administrative (processing each complaint) rather than governance-driven (using complaints to improve).
What to do
Start a theme register. Every complaint gets a category (access, communication, clinical care, staff attitude, environment). Review the categories monthly. If "access" complaints spike after an operational change, that is a governance signal
Hold monthly review meetings. Complaints themes go on the agenda alongside significant events. Document the discussion and any resulting actions
Close the loop. When a complaint leads to a change, record the change and link it back to the complaint. This creates the audit trail inspectors look for
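The theme register above is, at its simplest, a frequency count over complaint categories. A minimal sketch with invented complaint records and categories, just to show the pattern-spotting step:

```python
from collections import Counter

# Hypothetical complaint log: each entry has a reference and a category
# from the practice's own taxonomy (access, communication, etc.).
complaints = [
    {"ref": "C-01", "category": "access"},
    {"ref": "C-02", "category": "communication"},
    {"ref": "C-03", "category": "access"},
    {"ref": "C-04", "category": "access"},
]

# Count complaints per theme and surface the most frequent one.
themes = Counter(c["category"] for c in complaints)
top_theme, count = themes.most_common(1)[0]
print(f"Most frequent theme: {top_theme} ({count} complaints)")
```

Whether the register lives in a spreadsheet or a compliance platform, the monthly review of these counts is what turns individual complaints into a governance signal.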
For a framework including a claims-and-harms triage system, see our improving complaints handling guide.
The pattern across all five red flags
Every red flag on this list shares the same underlying problem: a disconnect between documentation and practice. The practice has policies, risk assessments, and processes, but they are not connected to what actually happens day to day.
CQC does not expect perfection. It expects practices to be aware of their risks, to have reasonable systems in place, and to be able to demonstrate that those systems work. The word inspectors use is "assured." They want to be assured that governance is real, not performative.
The practical test is simple: if an inspector asks about any of the five areas above, can your practice show evidence of a working system, not just a document that describes one?
What to check this week
Pull out your general health and safety risk assessment. Check the date. If it predates any significant operational change in the last year, it needs updating
Count your significant events for the last 12 months. If the number is under six, or if they are all serious incidents, your reporting culture needs attention
Pick a policy at random. Ask a member of staff how they handle that situation. If their answer does not match the policy, either the policy or the practice needs to change
Open your training matrix. If you do not have one, make one this week. If you do, check for expired mandatory training
Review your last five complaints. Can you identify a theme? Did any of them lead to a documented change?
How My Practice Manager helps
If you are managing compliance across risk assessments, policies, training, and significant events, a systematic approach saves time and reduces the chance of gaps. My Practice Manager brings your compliance tasks, risk assessments, policies, and document management into one place.
The AI Risk Assessment Generator creates practice-specific assessments from your context, not generic templates. The AI Policy Writer drafts policies tailored to your practice's arrangements. And the Compliance Library organises everything across 11 domains so nothing falls through the cracks.
Create your free account to generate your first compliance document — no credit card required. Start free (3 documents, 1 library domain, 5 tasks) or upgrade to Starter at £12/month for unlimited access to core templates.
Further reading
GP practice compliance: the complete guide to 11 essential domains
Infection control: what to check the night before CQC inspection
This article is for informational purposes only and reflects understanding as of March 2026. It does not constitute legal, financial, or medical advice. Practices should consult with relevant professional bodies and always refer to the latest official guidance from CQC, NHS England, and HSE.
