Overview
User Acceptance Testing (UAT) is the final gate before production, and it is notoriously difficult in IAM. Why? Because testing "Identity" often requires permissions, test accounts, and access to live systems that users don't have in a test environment. Furthermore, business users are busy; they don't want to spend 4 hours logging into a QA environment. A successful UAT phase requires the consultant to act as a "Testing Concierge"—preparing everything so the user simply has to click "Approve" and confirm it worked.
Methodology & Frameworks
The "White Glove" UAT
Do not just send a spreadsheet and say "Test this."
- Pre-Test: Verify the environment yourself (Smoke Test).
- Scheduled Sessions: Book 1-hour slots with testers.
- Guided Execution: Share screen. "Okay, click here. Now check your email. Did you get it? Great."
- Witness: You document the pass/fail. The user just validates.
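The "Pre-Test" smoke test above is worth scripting so you never walk a tester into a dead environment. A minimal sketch using only the standard library; the endpoint names and URLs are placeholders for your actual QA portal and IdP health checks:

```python
import urllib.request
import urllib.error

# Placeholder endpoints: swap in your real QA portal, IdP, and mail-relay URLs.
QA_ENDPOINTS = {
    "IAM portal": "https://iam-qa.example.com/health",
    "Identity provider": "https://sso-qa.example.com/health",
}

def smoke_test(endpoints: dict) -> dict:
    """Hit each endpoint and record OK/FAIL before any tester session starts."""
    results = {}
    for name, url in endpoints.items():
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                results[name] = "OK" if resp.status == 200 else f"FAIL ({resp.status})"
        except (urllib.error.URLError, OSError) as exc:
            results[name] = f"FAIL ({exc})"
    return results
```

Run it the morning of each scheduled session; if anything reports FAIL, reschedule rather than burn the tester's hour.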
Test Scenarios vs. Test Cases
- Test Case: "Click button A, enter text B." (Too granular).
- Test Scenario: "Hire a new Sales Manager in the NY office." (Business logic).
- Focus: Focus UAT on Scenarios. Let QA (IT) handle the Cases.
The "Test Data" Dilemma
IAM needs source data (HR) and target systems (AD).
- Gold Standard: Full refresh of QA from Prod (sanitized).
- Silver Standard: Manually created "Test Personas" (e.g., "Test User 1").
- Bronze Standard: Testing in Prod with dummy accounts (Risky but sometimes necessary).
Key Decisions
| Decision | Options | Recommendation | Notes / Gotchas |
|---|---|---|---|
| Environment | Dedicated QA vs. Production | Dedicated QA. | Never test provisioning logic in Prod unless you have a "Dry Run" mode. |
| Testers | IT Staff vs. Business Users | Business Users. | IT already thinks it works. You need the Business to agree it works. |
| Data Sanitization | Mask PII vs. Real Data | Mask PII. | Real PII in a lower environment is a security breach waiting to happen. Scramble SSNs and Emails. |
| Sign-off Criteria | 100% Pass vs. Critical Pass | Critical Pass + Workarounds. | If a minor UI glitch exists, don't hold up Go-Live. Document it as a "Known Issue." |
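The "Mask PII" recommendation can be automated during the QA refresh. A minimal sketch, assuming a simple record schema; the field names and the `uat.example.com` test domain are illustrative, not from any real system:

```python
import hashlib

def mask_ssn(ssn: str) -> str:
    """Replace a real SSN with a deterministic fake, preserving the ***-**-NNNN shape."""
    digest = hashlib.sha256(ssn.encode()).hexdigest()
    fake_last4 = str(int(digest[:8], 16) % 10000).zfill(4)
    return f"***-**-{fake_last4}"

def mask_email(email: str, test_domain: str = "uat.example.com") -> str:
    """Point every address at a non-routable test domain so QA can never email real users."""
    local = email.split("@")[0]
    return f"{local}@{test_domain}"

def sanitize_record(record: dict) -> dict:
    """Mask the PII fields, leave everything else untouched."""
    masked = dict(record)
    if "ssn" in masked:
        masked["ssn"] = mask_ssn(masked["ssn"])
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    return masked
```

Deterministic hashing (rather than random values) matters here: the same source record masks to the same fake on every refresh, so joins and correlation logic still work in QA.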
Implementation Approach
Phase 1: Test Planning
Activity: Define the "Day in the Life" scenarios. Example: "New Hire," "Promotion," "Name Change," "Termination," "Rehire." Output: UAT Test Plan Document.
Phase 2: Test Data Setup
Activity: Create the "Test Army."
- 5 New Hires (different departments).
- 2 Transfers.
- 3 Terminations.
Tool: Scripts to inject this data into the Mock HR source.
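The injection script for the "Test Army" can be as simple as a CSV generator. A sketch under assumed column names; match `FIELDS` to whatever schema your actual mock HR feed uses:

```python
import csv
from datetime import date

# Hypothetical mock-HR schema: rename columns to match your real source feed.
FIELDS = ["employee_id", "first_name", "last_name", "department", "action", "effective_date"]

def build_test_army() -> list:
    """Generate the 5 hires / 2 transfers / 3 terminations described above."""
    today = date.today().isoformat()
    rows = []
    for i, dept in enumerate(["Sales", "Finance", "Engineering", "HR", "Operations"], 1):
        rows.append({"employee_id": f"T{i:03d}", "first_name": "Test",
                     "last_name": f"Hire{i}", "department": dept,
                     "action": "HIRE", "effective_date": today})
    for i in (6, 7):
        rows.append({"employee_id": f"T{i:03d}", "first_name": "Test",
                     "last_name": f"Transfer{i}", "department": "Marketing",
                     "action": "TRANSFER", "effective_date": today})
    for i in (8, 9, 10):
        rows.append({"employee_id": f"T{i:03d}", "first_name": "Test",
                     "last_name": f"Term{i}", "department": "Sales",
                     "action": "TERMINATE", "effective_date": today})
    return rows

def write_feed(path: str) -> None:
    """Write the test army as a CSV the mock HR connector can pick up."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(build_test_army())
```

Keeping the generator in version control means every UAT cycle starts from the same, known test population.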
Phase 3: Execution
Activity: Run the "UAT Workshops."
- Bring pizza (or digital equivalent).
- Gamify it: "Find a bug, get a gift card."
- Log: Record every pass/fail in Jira/Excel.
Phase 4: Sign-off
Activity: Formal approval. Document: "UAT Completion Certificate." Signatures: Project Sponsor, Key Stakeholders.
Deliverables
- UAT Test Script: Step-by-step instructions (screenshots help).
- Test Data Sheet: List of test users and credentials to use.
- Bug Tracker: List of defects found and their resolution.
- Sign-off Email/Doc: The "Green Light" to go to Prod.
Risks & Failure Modes
| Risk | Likelihood | Impact | Early Signals | Mitigation |
|---|---|---|---|---|
| Environment Drift | High | High | QA config doesn't match Prod. Test passes in QA but fails in Prod. | Infrastructure-as-Code. Regular environment refreshes. |
| Tester No-Show | High | Med | Users are "too busy" to test. | Escalate to Sponsor. "We cannot go live without your sign-off." Book time on their calendar weeks in advance. |
| Data Mismatch | Med | High | QA AD has different OUs than Prod AD. | Audit the environments. Use "Bronze" testing (dummy accounts in Prod) for read-only validation. |
| "It's different than before" | High | Low | Users report "bugs" that are actually "design changes." | Reference the signed Requirements. "This is a feature, not a bug." |
KPIs / Outcomes
- Test Coverage: % of business scenarios tested.
- Defect Density: Number of bugs found per test hour.
- Fix Turnaround: Average time to resolve a UAT defect.
- User Confidence: "On a scale of 1-10, how ready are we?"
Consultant's Notebook (Soft Skills)
The "Bug" vs. "Feature" Argument
- Users will often file bugs for things they just don't like.
- Tactic: Don't argue. "That is a valid observation. It works as designed right now, but I will log an Enhancement Request for Phase 2."
- This makes them feel heard without derailing the timeline.
Negative Testing
- Users love the "Happy Path" (everything works).
- You must force them to test the "Sad Path."
- "What happens if I reject this request?"
- "What happens if the new hire has no email?"
- These are the scenarios that break production. Make them test the errors.
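The two sad paths above are also worth encoding as automated checks alongside the manual UAT. The stub below is purely illustrative; `provision_account` and its return shape are hypothetical, standing in for whatever your IAM tool actually does:

```python
# Hypothetical provisioning stub: real logic lives in your IAM tool.
# This only illustrates the sad paths that should be scripted into UAT.
def provision_account(new_hire: dict) -> dict:
    if not new_hire.get("email"):
        # Sad path 1: HR record missing email. Fail loudly rather than
        # creating a half-built account that someone must clean up later.
        return {"status": "ERROR", "reason": "missing_email"}
    if new_hire.get("approval") == "rejected":
        # Sad path 2: approver rejected the request. No account may exist afterward.
        return {"status": "DENIED", "account_created": False}
    return {"status": "OK", "account_created": True}
```

Make the testers run these rejection and missing-data cases themselves, so they see with their own eyes that the system fails safely.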
The "UAT Pivot"
- Sometimes UAT reveals that the design was fundamentally wrong (e.g., "Oh, we can't use email as ID because shop floor workers don't have email").
- Courage: Stop the train.
- Better to delay Go-Live by a month than to deploy a broken system. "We learned something critical in UAT. We need to pause and adjust."
