
ERP Testing and UAT: Complete Checklist for Successful Go-Live

Comprehensive ERP testing methodology: 5 test types, 10-point go-live checklist, bug management, and common mistakes to avoid.


Your ERP is configured. Workshops are complete. The implementer announces “everything is ready for testing.” This is when the most underestimated phase of any ERP project begins: User Acceptance Testing.

Poor UAT execution leads to go-live disasters. Invoices won’t generate. Inventory doesn’t reconcile. Key users call support every 10 minutes. And a rollback costs more than the entire project.

This guide provides a complete methodology to organize your tests, structure your UAT process, and make go-live decisions based on facts—not optimism.


Why ERP UAT is the Most Underestimated Phase

The Cost of Failed Go-Lives

The examples are numerous. In 2018, Revlon lost $64 million in unprocessed orders after a rushed SAP go-live at their North Carolina plant (CIO.com). In 1999, Hershey compressed their implementation timeline from 48 to 30 months; the result was $100 million in blocked orders during Halloween season (TechTarget).

In both cases, the same pattern: insufficient testing, schedule pressure, and go-live triggered before end-users validated the system.

On a smaller scale, consequences for SMEs are equally concrete: manual data recovery, production stops, frozen billing for days, and loss of team confidence in the new system.

The Three Essential Test Levels

Before discussing methodology, we must distinguish three test levels that complement each other:

  • Unit tests: each function tested in isolation (create item master, enter sales order, calculate VAT). This is the implementer’s work, validated by internal IT.
  • Integration tests: modules communicate with each other. Does a sales order trigger a delivery note? Does delivery generate an invoice? Does invoicing impact accounting?
  • User Acceptance Testing (UAT): key users execute complete business scenarios, end-to-end, in production-like conditions. This is the final verdict—determining go/no-go.

Testing in the broad sense encompasses all three levels. But UAT is what makes the difference between a controlled go-live and one the teams merely endure.


Preparing UAT — Before Executing a Single Test

Building the Testing Team

UAT is not an IT project. It's a business project supported by IT. The team must include:

  • Key users: 1 per major business process (purchasing, sales, production, accounting, HR). They validate that the system matches their daily reality.
  • ERP project manager: coordinates testing schedule, prioritizes anomalies, and arbitrates edge cases.
  • IT lead: manages test environment, access rights, test data, and liaison with implementer.
  • Implementer: fixes reported issues and documents gaps between configuration and requirements.

Plan for 20 to 30% of key users’ time during UAT. This isn’t “extra” time—it’s time invested to avoid three months of struggle after go-live.

Defining Business Test Scenarios

A test scenario isn’t “test the purchasing module.” It’s a precise sequence of steps with documented expected results.

Examples of end-to-end scenarios:

  • Quote-to-cash: quote → sales order → delivery → invoice → payment → accounting entry
  • Procure-to-pay: purchase requisition → purchase order → receipt → vendor invoice → payment → bank reconciliation
  • Record-to-report: journal entries → month-end close → tax reports → financial reporting
  • Hire-to-retire (if HR module): hiring → employee record → payroll → tax filing → termination

For each scenario, document: steps, input data, expected results, modules involved, and external interfaces (banking, EDI, e-commerce).
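One way to keep scenario definitions complete and machine-checkable is a simple record per scenario. The structure below is a sketch with illustrative field names, not any tool's format:

```python
from dataclasses import dataclass, field

@dataclass
class TestScenario:
    """One end-to-end business scenario with its documented expectations."""
    scenario_id: str
    name: str
    steps: list                      # ordered business steps
    input_data: dict                 # test data the scenario starts from
    expected_results: list           # one documented expectation per step
    modules: list                    # ERP modules the flow crosses
    interfaces: list = field(default_factory=list)  # banking, EDI, e-commerce

# Illustrative quote-to-cash definition (names and data are invented)
quote_to_cash = TestScenario(
    scenario_id="E2E-01",
    name="Quote-to-cash",
    steps=["quote", "sales order", "delivery", "invoice", "payment", "accounting entry"],
    input_data={"customer": "ACME", "item": "REF-1001", "qty": 10},
    expected_results=[
        "quote priced per customer terms",
        "order reserves stock",
        "delivery note numbered in sequence",
        "invoice matches delivered quantity",
        "payment clears the open item",
        "revenue posted to the correct account",
    ],
    modules=["sales", "warehouse", "finance"],
)

# A scenario is only executable if every step has a documented expected result.
assert len(quote_to_cash.steps) == len(quote_to_cash.expected_results)
```

Keeping scenarios as structured records (rather than prose in a document) makes the coverage rate used later in the go/no-go decision trivial to compute.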

Building Realistic Test Data

This is the classic trap: testing with 10 clean items, 5 customers without history, and zero edge cases. On go-live day, the system encounters 15,000 references, customers with special pricing conditions, cascaded discounts, and multiple currencies.

Realistic test data must include:

  • Representative sample of item master (including complex cases: kit items, lot tracking, obsolete items)
  • Customers with varied payment terms (cash, net 30, early payment discounts)
  • Historical orders for data migration testing
  • Exception cases: credit memos, partial returns, multi-currency orders, multi-site deliveries

If your test data is too clean, your tests will be too—and problems will appear in production.


Executing UAT — 5-Step Methodology

Step 1 — Unit Tests by Module

The implementer starts by verifying each function individually. Create an item, modify a customer record, run MRP calculation, generate accounting entry. This is the foundation: if basic building blocks don’t work, testing the assembly is pointless.

Expected result: each function produces documented results per specifications. Unit tests are tracked (spreadsheet or dedicated tool) with OK/KO status and execution date.

Step 2 — Inter-Module Integration Tests

Verify modules communicate correctly. A sales order in the commercial module must generate a pick order in WMS, a revenue entry in accounting, and stock update.

Common friction points:

  • Rounding differences between modules (VAT, discounts)
  • Missing data in flows (required accounting field not populated from sales)
  • Inconsistent numbering sequences (invoices, delivery notes)
  • Access rights blocking inter-module flows
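The rounding trap is easy to reproduce. Whether VAT is rounded per line (typical in a sales module) or on the invoice total (typical in accounting) can change the result by a cent. The figures below are illustrative, not from any particular ERP:

```python
from decimal import Decimal, ROUND_HALF_UP

def vat(amount: Decimal, rate: Decimal) -> Decimal:
    """Round to the cent, half-up, as most finance modules do."""
    return (amount * rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

lines = [Decimal("10.01"), Decimal("10.01"), Decimal("10.01")]
rate = Decimal("0.20")  # 20% VAT

vat_per_line = sum(vat(a, rate) for a in lines)   # sales module: VAT per line
vat_on_total = vat(sum(lines), rate)              # accounting: VAT on the total

print(vat_per_line, vat_on_total)  # 6.00 6.01 — a one-cent discrepancy
```

A one-cent difference looks harmless until it blocks automatic invoice matching or bank reconciliation. Integration tests should fix one rounding convention and verify every module applies it.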

Step 3 — End-to-End Tests

Key users execute the complete business scenarios defined in the preparation phase. This is UAT's core. Each key user plays their real role: the salesperson enters the quote, warehouse staff pick the order, the accountant reconciles the invoice.

Vendor test tools facilitate this phase. SAP Solution Manager offers a complete Test Suite module for planning, executing, and tracking tests with impact analysis (SAP Support). Odoo provides a native test mode with separate sandbox databases. NetSuite supplies sandbox environments that can be refreshed from production data.

Each end-to-end test must be documented: steps executed, actual vs. expected results, screenshots if variance, and any anomaly classification.

Step 4 — Load and Performance Testing

An ERP working with 3 simultaneous users can crash with 50. Load tests verify:

  • Response time on critical transactions (order entry, inventory lookup, invoice generation) with planned concurrent users
  • System behavior during peak activity (month-end close, batch billing)
  • Database performance with realistic data volume (not 100 records—100,000 records)

For cloud ERP (NetSuite, Odoo SaaS, Sage Intacct), load testing is often restricted by the SaaS contract; verify the vendor's performance commitments.

For on-premise or hybrid ERP, load testing with tools like JMeter or Gatling on critical processes is strongly recommended.
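For illustration only, the principle behind a load test can be sketched with Python's standard library: fire the same transaction from many concurrent workers and read latency percentiles. Real campaigns should use JMeter or Gatling against the actual system; `enter_order` here is a simulated stand-in, not a real ERP call.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def enter_order(order_id: int) -> float:
    """Stand-in for one critical transaction; replace with a real API call."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server processing time
    return time.perf_counter() - start

def run_load(users: int, requests_per_user: int) -> dict:
    """Run the transaction from `users` concurrent workers, return latency stats."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = sorted(pool.map(enter_order, range(users * requests_per_user)))
    return {
        "count": len(latencies),
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
        "max": latencies[-1],
    }

stats = run_load(users=50, requests_per_user=4)
# Compare p95 against the response-time target agreed with the business,
# e.g. "order entry under 2 seconds at 50 concurrent users".
assert stats["p95"] < 2.0
```

The design point is the percentile, not the average: an average of 1 second can hide a p95 of 40 seconds, and it's the p95 that users experience at month-end.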

Step 5 — Data Migration Testing

Data migration is often the most neglected part of ERP projects. Yet it determines whether go-live happens with reliable data or with a new system filled with corrupted records.

Test:

  • Migrated data integrity: are accounting balances correct? Does inventory match?
  • Data relationships: does an open order reference the correct customer, item, price?
  • Duplicates: did migration deduplicate customer and vendor records?
  • Completeness: are all required fields in the new system populated?

Plan at least two dry runs (complete migration rehearsals) before actual go-live.
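The four checks above can be automated and re-run on every dry run. A minimal sketch, with an invented data layout, assuming the legacy and migrated extracts have been loaded as dictionaries:

```python
def reconcile(legacy: dict, migrated: dict) -> list:
    """Return the list of migration anomalies found between the two systems."""
    issues = []
    # Integrity: account balances must match exactly
    for account, balance in legacy["balances"].items():
        if migrated["balances"].get(account) != balance:
            issues.append(f"balance mismatch on {account}")
    # Relationships: every open order must reference a migrated customer
    customers = set(migrated["customers"])
    for order in migrated["open_orders"]:
        if order["customer"] not in customers:
            issues.append(f"order {order['id']} references missing customer")
    # Duplicates: customer codes must be unique after deduplication
    if len(migrated["customers"]) != len(customers):
        issues.append("duplicate customer records")
    return issues

# Illustrative extracts (invented figures)
legacy = {"balances": {"401": 1250.00, "411": -980.50}}
migrated = {
    "balances": {"401": 1250.00, "411": -980.50},
    "customers": ["C001", "C002"],
    "open_orders": [{"id": "SO-1", "customer": "C001"}],
}
assert reconcile(legacy, migrated) == []  # a clean dry run reports no issues
```

Because the checks are scripted, each dry run produces a comparable anomaly list, which makes it easy to verify that migration quality actually improves between rehearsals.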


Managing Anomalies — Prioritize, Track, Fix

Anomaly Classification

Not all anomalies carry the same weight. Use a three-level classification:

  • Blocking: business process cannot complete. Example: cannot validate invoice, incorrect VAT calculation, data lost during flow. Go-live impossible while any blocking anomaly remains open.
  • Major: process works but with painful workaround or error risk. Example: bank reconciliation report requires manual export step. Go-live possible, but correction must occur within 2 weeks.
  • Minor: usage discomfort without business impact. Example: poorly translated label, inappropriate default sort. Fix schedulable in later batch.

Tracking Tools

For SMEs, a structured spreadsheet suffices provided it’s shared and updated daily. Essential columns: anomaly ID, date, module, test scenario, description, severity, fix owner, status, resolution date.

For mid-market companies, dedicated tracking tools (Jira, Azure DevOps, or ERP’s own incident module) provide necessary traceability and facilitate steering committee reporting.

The essential rule: every anomaly has an owner and a target fix date. No orphaned anomalies.
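Whatever the tool, the record behind each anomaly is the same. A minimal sketch in Python, with illustrative field names, including the "no orphaned anomalies" check:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Severity(Enum):
    BLOCKING = 1   # go-live impossible while open
    MAJOR = 2      # workaround exists; fix within 2 weeks
    MINOR = 3      # schedulable in a later batch

@dataclass
class Anomaly:
    anomaly_id: str
    module: str
    description: str
    severity: Severity
    owner: Optional[str] = None        # who fixes it
    target_date: Optional[str] = None  # committed fix date
    status: str = "open"

def orphans(log: list) -> list:
    """An open anomaly without an owner or a target date is orphaned."""
    return [a for a in log if a.status == "open"
            and (a.owner is None or a.target_date is None)]

# Illustrative log entries
log = [
    Anomaly("A-01", "finance", "VAT rounding off by 0.01", Severity.BLOCKING,
            owner="implementer", target_date="2024-05-03"),
    Anomaly("A-02", "sales", "label left untranslated", Severity.MINOR),
]
assert [a.anomaly_id for a in orphans(log)] == ["A-02"]  # untracked item surfaces
```

The same structure maps directly onto a spreadsheet's columns or a Jira issue, so switching tools mid-project doesn't lose information.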

Go/No-Go Criteria

Go/no-go isn’t a show of hands vote. It’s a decision based on objective criteria defined before UAT starts:

  • Open blocking anomalies = 0 (non-negotiable)
  • Open major anomalies < defined threshold (typically 3-5, with documented workaround for each)
  • Test scenario coverage rate > 95% (all critical scenarios executed)
  • Data migration validated by business (balances verified, reference data correct)
  • Key users trained and operational (not just “informed”—capable of using the system alone)

If any single criterion isn't met, go-live is postponed. Postponing is costly, but far less costly than a failed go-live.
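Because the criteria are objective, the decision translates into a mechanical check. The thresholds below mirror the list above; everything else is illustrative:

```python
def go_no_go(blocking_open: int, major_open: int, coverage: float,
             migration_ok: bool, users_ready: bool,
             major_threshold: int = 5) -> bool:
    """Every criterion must pass; a single failure postpones go-live."""
    return (blocking_open == 0            # zero blocking anomalies, non-negotiable
            and major_open <= major_threshold  # each with a documented workaround
            and coverage > 0.95           # critical scenario coverage rate
            and migration_ok              # balances and reference data verified
            and users_ready)              # trained and autonomous, not just informed

assert go_no_go(0, 3, 0.97, True, True) is True
assert go_no_go(1, 0, 1.00, True, True) is False  # one blocking anomaly postpones
```

The value of writing it this way is that the thresholds are fixed before UAT starts; on decision day there is nothing left to argue about except the facts.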


Go-Live Checklist — 10 Points to Validate Before Cutover

  1. All critical UAT scenarios passed OK — with a sign-off report signed by key users
  2. Zero open blocking anomalies — remaining major ones have documented workarounds
  3. Data migration validated — final dry run executed successfully, balances verified
  4. Production environment ready — servers sized, access configured, licenses activated
  5. Training plan executed — each user trained on their processes (not just general demo)
  6. User support materials available — quick reference guides, FAQ by process
  7. Post-go-live support organization defined — who answers questions? What SLA for first 2 weeks?
  8. External interfaces tested — banking, supplier EDI, e-commerce, electronic invoicing platform
  9. Rollback plan documented — tested procedure for reverting, with point-of-no-return identified
  10. Formal steering committee validation — signed go decision, clarified responsibilities

Business Validation Signed by Key Users

The UAT sign-off isn't an administrative formality. It's the key users' commitment that the system meets the requirements validated in workshops. Each key user signs for their functional scope. If a key user refuses to sign, issues remain to address, and that's exactly what the sign-off is for: making them visible.

Documented Rollback Plan

Even with flawless UAT, Plan B must exist. Document:

  • Procedure for reverting to old system (database restore, access reactivation, flow switching)
  • Point of no return (typically: once first production transactions are recorded in new ERP)
  • Rollback timeframe (ideally < 4 hours for critical processes)
  • Rollback decision owners and escalation path

Common ERP UAT Mistakes

Testing with overly clean data. Artificial test data masks problems that will appear with real data: special characters, empty fields, inconsistent history, duplicates. Use an anonymized extract of your production data.

Not involving end users. If only consultants and IT test, UAT validates the technology, not the usage. Key users detect problems that specifications don't mention: a screen too slow for the warehouse entry pace, a validation process adding 3 unnecessary clicks.

Starting UAT too late in schedule. UAT typically represents 20 to 30% of total project effort. If compressed into the last two weeks, you test under pressure, classify major anomalies as “minor” to meet schedule, and make go-live decisions under constraint. Plan 4 to 8 weeks for SME, 3 to 6 months for mid-market.

Confusing “it works” with “it’s validated.” A technically functional test doesn’t mean it meets business requirements. Results must be verified by key users, not just implementer.

Not testing exception cases. 80% of standard flows often work well from first tests. It’s the remaining 20% (credit memos, partial returns, multi-currency orders, year-end closes) that generate post-go-live crises.

Neglecting performance testing. A system responding in 2 seconds with 5 users can take 45 seconds with 50 simultaneous users. Test with actual planned load, not just your project team.


Going Further

ERP UAT doesn’t improvise—it’s prepared like a project within the project. If you’re in scoping phase, consult our guide to writing an ERP requirements document that includes a dedicated UAT requirements section. To understand how UAT fits into the overall schedule, read our article on the 5 phases of a successful ERP project. And if you want to avoid pitfalls leading to botched UAT, our analysis of mistakes that cause ERP project failure completes the picture.

Download our ERP integrator scorecard — 10 criteria on 100 points to compare 3 vendors and structure your decision.