# Test Harness / Validation Instructions

Validate your Processing Rules and Adobe setup using the following checklist:

### Setup

1. **Staging page with AppMeasurement**: Include the [`embed script`](https://docs.hello-lisa.com/developers/analytics/abobe-appmeasurement/embed-script), specifying your report suite ID in the `data-lsc-report-suite-id` attribute.
2. **Enable Debugging**: Use Adobe Debugger or browser devtools to inspect outgoing beacons (network requests).
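When inspecting outgoing beacons in devtools, it can help to parse a captured image-request URL into key/value pairs outside the browser. The sketch below is illustrative only: `parseBeacon` is a hypothetical helper, the sample URL and report suite name are made up, and it does not unpack Adobe's nested `contextData` serialization.

```javascript
// Minimal sketch: parse an Adobe image-request URL captured from devtools
// into a flat map of query parameters. The sample URL below is illustrative.
function parseBeacon(url) {
  const params = {};
  for (const [key, value] of new URL(url).searchParams.entries()) {
    params[key] = value;
  }
  return params;
}

// Example captured beacon (hypothetical report suite and values)
const beacon =
  "https://example.sc.omtrdc.net/b/ss/my-suite/1/JS-2.22.0/s123?events=event5&pageName=staging";
const params = parseBeacon(beacon);
console.log(params.events);   // → "event5" (which custom event fired)
console.log(params.pageName); // → "staging"
```

Filtering the Network panel for requests containing `/b/ss/` is a quick way to find these beacons before copying a URL into the helper.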

### Validation Steps

1. **Fire a test event**\
   Simulate a LiSA shared event payload (e.g., a generic playback event JSON).
2. **Inspect the request**\
   Confirm the outgoing beacon includes:
   * `contextData` with expected `lsc.*` keys.
   * Correct custom event code in the `events` parameter.
3. **Processing Rules mapping**\
   In Adobe Analytics Workspace or the Debug view, verify that each alias appears in the target eVars/props you configured and that friendly labels are applied.
4. **Report-level sanity check**\
   Build a quick breakdown or funnel that includes the mapped `LiSA Event Type` and session/visitor IDs, and confirm that hits surface as expected.
5. **Edge cases**
   * Missing optional fields don’t break attribution.
   * UUIDs are passed through but not used directly in heavy ad-hoc analysis (apply classifications if needed).
   * Timestamps (`eventDate`) are present for discrepancy debugging.
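The checks in steps 2 and 5 can be sketched as a small validator over a simulated beacon. The key names (`lsc.eventType`, `lsc.sessionId`, `lsc.eventId`, `lsc.eventDate`) and the `event5` code are assumptions for illustration; substitute the keys and event codes your Processing Rules actually map.

```javascript
// Hypothetical validator for a simulated LiSA event beacon.
// Required keys must be present; optional keys may be missing (edge case above).
const REQUIRED_KEYS = ["lsc.eventType", "lsc.sessionId", "lsc.eventId"]; // assumed names
const OPTIONAL_KEYS = ["lsc.eventDate", "lsc.showId"];                   // assumed names

function validateBeacon({ contextData, events }) {
  const errors = [];
  for (const key of REQUIRED_KEYS) {
    if (!(key in contextData)) errors.push(`missing required key: ${key}`);
  }
  if (!/^event\d+$/.test(events)) {
    errors.push(`unexpected events value: ${events}`);
  }
  return errors;
}

// A playback event with one optional field absent still validates cleanly.
const simulated = {
  contextData: {
    "lsc.eventType": "playback",
    "lsc.sessionId": "3f2b-…", // UUIDs pass through untouched
    "lsc.eventId": "9a41-…",
    "lsc.eventDate": "2024-05-01T12:00:00Z",
  },
  events: "event5", // assumed custom event code
};
console.log(validateBeacon(simulated)); // → []
```

Running the validator against payloads with fields deliberately removed is a cheap way to exercise the "missing optional fields" edge case before touching Adobe.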

### Reconciliation

Use `eventId`, `visitorId`, and `eventDate` to match Adobe hits to backend logs, both for troubleshooting and to verify that no hits are duplicated. High-fidelity correlation supports debugging complex user flows.
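A reconciliation pass over exported hits can be sketched as below. The field names follow this section (`eventId`, `visitorId`); the record shapes and the `reconcile` helper are illustrative assumptions, not a LiSA or Adobe API.

```javascript
// Sketch: reconcile Adobe hits with backend logs by eventId, flagging
// duplicates and mismatches. Record shapes here are illustrative.
function reconcile(adobeHits, backendLogs) {
  const backendById = new Map(backendLogs.map((log) => [log.eventId, log]));
  const seen = new Set();
  const result = { matched: [], unmatched: [], duplicates: [] };

  for (const hit of adobeHits) {
    if (seen.has(hit.eventId)) {
      result.duplicates.push(hit.eventId); // same event counted twice in Adobe
      continue;
    }
    seen.add(hit.eventId);
    const log = backendById.get(hit.eventId);
    if (log && log.visitorId === hit.visitorId) {
      result.matched.push(hit.eventId);
    } else {
      result.unmatched.push(hit.eventId); // no backend record, or visitor mismatch
    }
  }
  return result;
}
```

An empty `duplicates` array and an `unmatched` array you can explain (e.g., ad blockers, dropped beacons) are the expected healthy outcome.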
