1) What is a VPAT?
A VPAT (Voluntary Product Accessibility Template) is a standardized format used to document how a product supports accessibility requirements. When completed as an ACR (Accessibility Conformance Report), it describes conformance against one or more standards (commonly Section 508 and WCAG) using the VPAT structure.
VPATs can apply to many product types: web apps, mobile apps, software, documents, hardware, and support documentation. The key is that the VPAT should match the actual features and user workflows the product provides.
2) Sections of a VPAT
A VPAT/ACR typically includes the following core sections (wording varies slightly depending on the VPAT version and edition).

Report information:
- Title: “Accessibility Conformance Report (ACR)”
- Product name, version, and platform(s)
- Report date
- Vendor contact information

Product description and scope:
- Clear description of the product and its main workflows
- Any limitations, exclusions, or assumptions
- Dependencies (browser, OS, assistive technology, integrations)

Evaluation methods:
- Manual testing approach
- Assistive technologies used (e.g., NVDA, JAWS, VoiceOver)
- Browsers / OS / devices tested
- Automated tools (optional, but should not be the only method)

Conformance levels used in the tables:
- Supports
- Partially Supports
- Does Not Support
- Not Applicable

Criteria tables:
- Tables for each applicable standard/guideline
- Each row includes: criterion + conformance level + remarks/explanations
- Remarks should explain how the product supports the criterion, or what fails and its user impact

Additional tables and sections (where applicable):
- Software (desktop), support documentation, or hardware tables
- Functional performance statements
- Exceptions / alternate versions
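The table structure above can be modeled as a small data structure. This is an illustrative sketch only: the class and field names (`CriterionRow`, `criterion`, `remarks`) are hypothetical, not part of any VPAT schema, but the four conformance terms are the standard VPAT ones.

```python
from dataclasses import dataclass
from enum import Enum


class Conformance(Enum):
    """The four conformance terms used in VPAT criteria tables."""
    SUPPORTS = "Supports"
    PARTIALLY_SUPPORTS = "Partially Supports"
    DOES_NOT_SUPPORT = "Does Not Support"
    NOT_APPLICABLE = "Not Applicable"


@dataclass
class CriterionRow:
    """One row of a criteria table (hypothetical field names)."""
    criterion: str            # e.g. "2.1.1 Keyboard (Level A)"
    conformance: Conformance  # one of the four VPAT terms
    remarks: str              # the "Remarks and Explanations" cell


# Example row matching the structure described above.
row = CriterionRow(
    criterion="2.1.1 Keyboard (Level A)",
    conformance=Conformance.PARTIALLY_SUPPORTS,
    remarks="Drag-and-drop reordering requires mouse input; "
            "up/down buttons provide a keyboard workaround.",
)
```

Modeling rows this way makes it easy to run the consistency checks described in the later sections over a whole report.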
3) How to spot red flags in a VPAT
These red flags suggest the VPAT may be incomplete, outdated, or not based on thorough testing.
- Blank rows or missing “Remarks and Explanations”
- “All Supports” with minimal or generic remarks
- “Partially Supports” but no user impact explanation
- Wrong terminology (“Pass/Fail” instead of VPAT language)
- Outdated report date (e.g., very old compared to product releases)
- Evaluation methods list only automation (no manual + no AT)
- Assistive technologies not named (e.g., “screen readers were used”)
- Browsers/devices not listed
- Scope unclear (which features, which platforms, which workflows)
- Too many “Not Applicable” for features the product clearly has
4) What questions do you ask a vendor after a VPAT review?
Use vendor questions to clarify scope, confirm testing credibility, and resolve inconsistencies.
- Which platforms are covered (web, iOS, Android, desktop)?
- Which key workflows are included/excluded (login, forms, upload, approvals, reporting)?
- Does the report cover support documentation and help content?
- Which assistive technologies were used (NVDA, JAWS, VoiceOver, TalkBack)? Versions?
- Which browsers and OS versions were tested?
- What manual test scripts or scenarios were used?
- If automation was used, can you share tool outputs (high level)?
- Why are there many “Supports” with no explanation?
- Why is a requirement marked “Not Applicable” if the feature exists?
- Why is the VPAT dated X months ago if the product has been updated since?
- Who authored the VPAT (internal vs external) and what was their approach?
- For “Partially Supports/Does Not Support”, what is the remediation plan?
- Target timelines for fixes and re-testing?
- Do you have an accessibility statement, roadmap, or release notes for accessibility?
5) What does a good VPAT look like?
- Clear product description with included platforms and key workflows
- Evaluation methods include manual testing + named AT + browsers/OS
- Every row has meaningful “Remarks and Explanations”
- “Partially Supports” explains what works vs what doesn’t + user impact
- Uses standard VPAT terms (Supports / Partially Supports / Does Not Support / Not Applicable)
- Report is recent and aligns with product release cadence
- Known tough areas (keyboard, forms, focus, errors) have detailed remarks
- Any exceptions or limitations are transparent (not hidden)
- Vendor can answer follow-up questions without contradictions
Example remark style (good):
"Partially Supports — Keyboard access is available for all primary navigation and forms.
However, the drag-and-drop reordering in the approval workflow requires mouse input.
Workaround: use up/down buttons in the list view. Fix planned in Q3 release."
↑ Back to top
6) How do you assess the accuracy of a VPAT?
You can quickly assess accuracy by combining (A) a structured read-through, (B) basic consistency checks, and (C) a few fast “spot tests”.

(A) Structured read-through:
- Does the VPAT accurately describe the product and essential features?
- Are any major workflows excluded (login, checkout, approvals, reporting)?
- Are evaluation methods specific (AT, browsers, manual steps)?

(B) Consistency checks:
- Are there blank “Remarks and Explanations” cells?
- Do “Supports” rows explain how the criterion is supported (not just “Supported”)?
- Are “Not Applicable” rows truly not applicable?

(C) Fast spot tests:
- Keyboard: can you reach everything and operate key actions?
- Focus: is focus visible and not lost or hidden?
- Forms: do labels and errors work with screen readers?
- Reflow: does content avoid horizontal scrolling at high zoom?

If a spot test finds an issue, check whether the VPAT mentions it in the relevant rows. If it doesn’t, that is a strong indicator the VPAT is inaccurate or incomplete; follow up with the vendor for clarification and updated evidence.
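The final cross-check, comparing spot-test findings against the report's claims, can be sketched as below. It assumes spot-test issues are keyed by WCAG criterion number and VPAT rows are parsed into dicts; the function name and field names are hypothetical.

```python
def unacknowledged(spot_issues, vpat_rows):
    """Return criteria where a spot test found a failure but the
    VPAT still claims full "Supports" -- an accuracy red flag."""
    claims = {row["criterion"]: row["conformance"] for row in vpat_rows}
    return [c for c in spot_issues if claims.get(c) == "Supports"]


vpat = [
    {"criterion": "2.4.7", "conformance": "Supports"},            # Focus Visible
    {"criterion": "2.1.1", "conformance": "Partially Supports"},  # Keyboard
]

# Spot test found: focus outline disappears on the modal dialog (2.4.7).
found = ["2.4.7"]
# unacknowledged(found, vpat) returns ["2.4.7"]: the VPAT claims
# "Supports" for a criterion that visibly fails.
```

An issue under a “Partially Supports” row is less alarming, since the report at least acknowledges imperfection; an issue under a “Supports” row directly contradicts the report.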