Accessibility & Inclusive Design

A research project about technology adoption barriers has a particular obligation to be accessible to everyone. If someone cannot use our website because of a disability, we have failed at the most basic level. Accessibility is not a feature we add at the end — it is tested automatically in every pull request and enforced in code review.

Our Standard: WCAG AA

We target WCAG 2.1 Level AA compliance across all pages. This means:

  • Color contrast — text meets minimum contrast ratios against backgrounds (4.5:1 for normal text, 3:1 for large text)
  • Keyboard navigation — every interactive element is reachable and operable using only a keyboard
  • Screen reader support — content is structured with semantic HTML and appropriate ARIA attributes
  • Text alternatives — all images have descriptive alt text; all icons have accessible labels
  • Focus indicators — visible focus outlines on all interactive elements using focus-visible styles
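The 4.5:1 and 3:1 thresholds come from the WCAG 2.1 relative-luminance formula. As a minimal sketch (not code from our codebase; colors as [r, g, b] arrays in 0–255, function names illustrative), the ratio can be computed like this:

```javascript
// Convert an sRGB channel (0-255) to its linearized value per WCAG 2.1.
function linearize(channel) {
  const c = channel / 255
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4)
}

// Relative luminance of an [r, g, b] color.
function luminance([r, g, b]) {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// Contrast ratio between two colors, always >= 1.
function contrastRatio(a, b) {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x)
  return (hi + 0.05) / (lo + 0.05)
}

// Black on white is the maximum possible contrast, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)) // "21.0"
```

A color pair passes AA for normal text when this ratio is at least 4.5, and for large text when it is at least 3.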

Automated Testing with jest-axe

Key components have unit tests that include accessibility checks using jest-axe, which runs the axe-core accessibility engine against rendered components:

// Every component test includes this pattern:
import { render } from '@testing-library/react'
import { axe, toHaveNoViolations } from 'jest-axe'
expect.extend(toHaveNoViolations)

it('has no accessibility violations', async () => {
  const { container } = render(<MyComponent />)
  const results = await axe(container)
  expect(results).toHaveNoViolations()
})

This runs in CI on every pull request. If a component introduces an accessibility violation — a missing ARIA label, an incorrect role, a form input without a label — the test fails and the PR cannot be merged.
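The object resolved by `await axe(container)` is a standard axe-core results object: its `violations` array holds entries with an `id`, an `impact` level, and the offending `nodes`. Purely as an illustration (this is not our CI code, and the helper name is made up), a filter that gates on impact severity could look like:

```javascript
// axe-core impact levels, from least to most severe.
const IMPACT_ORDER = ['minor', 'moderate', 'serious', 'critical']

// Return only the violations at or above a minimum impact level.
// `results` is shaped like the object resolved by `await axe(container)`.
function violationsAtOrAbove(results, minImpact) {
  const threshold = IMPACT_ORDER.indexOf(minImpact)
  return results.violations.filter(
    v => IMPACT_ORDER.indexOf(v.impact) >= threshold
  )
}

// Example with a mocked results object:
const mockResults = {
  violations: [
    { id: 'image-alt', impact: 'critical', nodes: [{}] },
    { id: 'color-contrast', impact: 'serious', nodes: [{}] },
    { id: 'region', impact: 'moderate', nodes: [{}] },
  ],
}
console.log(violationsAtOrAbove(mockResults, 'serious').map(v => v.id))
// → [ 'image-alt', 'color-contrast' ]
```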

What jest-axe Catches

  • Missing or empty alt text on images
  • Buttons and links without accessible names
  • Form inputs without associated labels
  • Incorrect ARIA roles and attributes
  • Color contrast violations
  • Missing document language
  • Duplicate IDs
  • Heading hierarchy issues
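The heading-hierarchy rule flags headings that skip levels, such as an h1 followed directly by an h3. A simplified version of that check, operating on a list of heading levels in document order (a sketch, not axe-core's actual implementation), could be:

```javascript
// Given heading levels in document order (e.g. [1, 2, 3, 2]),
// report each position where a heading jumps down more than one level.
function findSkippedHeadings(levels) {
  const skips = []
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      skips.push({ index: i, from: levels[i - 1], to: levels[i] })
    }
  }
  return skips
}

console.log(findSkippedHeadings([1, 2, 3, 2])) // → [] (valid hierarchy)
console.log(findSkippedHeadings([1, 3]))       // → [ { index: 1, from: 1, to: 3 } ]
```

Moving back up to a shallower level (h3 back to h2) is fine; only skipping downward breaks the outline that screen reader users navigate by.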

Real Fixes from Code Review

Automated Copilot code review regularly catches accessibility issues that tests miss. Here are real examples from our review history:

Icon-only buttons missing labels

A mobile menu hamburger button had an SVG icon but no aria-label. Screen readers announced it as simply "button." Fix: added aria-label="Open menu".

Breadcrumb navigation missing semantics

Breadcrumbs were using plain <span> and <a> elements. Fix: wrapped in <nav aria-label="Breadcrumb"> with <ol> structure and aria-current="page" on the current item.

Decorative separator exposed to screen readers

Breadcrumb separator characters were being announced by screen readers. Fix: added aria-hidden="true" to separator spans.

Video without captions

The TABS introductory video was added without a captions track. Fix: added a <track kind="captions"> element with the English VTT file, set as default.

Semantic HTML First

We prioritize semantic HTML over ARIA workarounds. The principle is simple: if there is a native HTML element that does what you need, use it instead of adding ARIA attributes to a generic element.

Instead of                 We use      Why
<div onClick>              <button>    Built-in keyboard support, focus, and screen reader announcement
<div role="navigation">    <nav>       Native landmark, no ARIA needed
<div class="header">       <header>    Automatic landmark region
<span class="link">        <a href>    Native link behavior, right-click menu, browser history

Keyboard Navigation

The entire site is navigable using only a keyboard. Key patterns we ensure:

  • Tab order follows visual reading order (left-to-right, top-to-bottom)
  • Focus trapping in modal dialogs — Tab cycles within the dialog until it is closed
  • Escape key closes dialogs, menus, and overlays
  • Skip-to-content link available for screen reader users to bypass navigation
  • Visible focus indicators — all interactive elements show a clear blue ring when focused via keyboard (using Tailwind's focus-visible utilities)
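Focus trapping in dialogs comes down to wrapping Tab at the ends of the dialog's focusable elements. The wrap-around arithmetic can be sketched as a pure function (a deliberate simplification with an invented name; a real trap also queries the DOM for focusable elements and listens for keydown events):

```javascript
// Given the index of the currently focused element among `count`
// focusable elements in a dialog, return the index that should
// receive focus on Tab (or Shift+Tab when `backwards` is true).
function nextFocusIndex(current, count, backwards) {
  if (count === 0) return -1 // nothing to focus
  const step = backwards ? -1 : 1
  // Adding `count` keeps the value non-negative before the modulo.
  return (current + step + count) % count
}

console.log(nextFocusIndex(2, 3, false)) // → 0 (Tab on the last element wraps to the first)
console.log(nextFocusIndex(0, 3, true))  // → 2 (Shift+Tab on the first wraps to the last)
```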

Ongoing Commitment

Accessibility is never "done." We continually improve through:

  • PR accessibility checklist — every pull request template includes an accessibility section that reviewers must check
  • Copilot review — automated reviews specifically flag ARIA and accessibility issues
  • Lighthouse CI — accessibility scoring on every merge to main
  • Manual testing — periodic keyboard-only and screen reader testing of critical paths

If you encounter an accessibility issue on this site, please let us know. We take every report seriously.

Accessibility is a core value, not a compliance checkbox.