Test Harness / Validation Instructions
Validate your Processing Rules and Adobe setup using the following checklist:
Setup
- Staging page with AppMeasurement: include the LiSA embed script, specifying your report suite ID in the data-lsc-report-suite-id attribute (see the sketch after this list).
- Enable debugging: use the Adobe Debugger or your browser's devtools to inspect outgoing beacons (network requests).
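A minimal setup sketch for a staging page, assuming the embed script is loaded dynamically. Only the data-lsc-report-suite-id attribute name comes from the step above; the script URL and report suite ID are placeholders, so substitute the embed URL and ID from your own LiSA and Adobe configuration.

```typescript
// Hypothetical staging-page setup: inject the LiSA embed script and point it
// at your report suite. The URL and ID below are placeholders.
const lisaEmbed = document.createElement("script");
lisaEmbed.src = "https://cdn.example.com/lisa-embed.js"; // placeholder embed URL
lisaEmbed.async = true;
// Report suite ID is read from this attribute (per the setup step above).
lisaEmbed.setAttribute("data-lsc-report-suite-id", "mycompany-staging");
document.head.appendChild(lisaEmbed);
```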
Validation Steps
1. Fire a test event: simulate a LiSA shared event JSON (e.g., a generic playback event); a sample payload is sketched after this list.
2. Inspect the request: confirm the outgoing beacon includes:
   - contextData with the expected lsc.* keys.
   - The correct custom event code in the events parameter (a rough beacon-URL check is sketched after this list).
3. Processing Rules mapping: in Adobe Analytics Workspace or the Debug view, verify that each alias shows up in the target eVars/props you configured and that friendly labels are applied.
4. Report-level sanity check: build a quick breakdown or funnel including the mapped LiSA Event Type and session/visitor IDs, and confirm that hits surface as expected.
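A rough sketch for steps 1 and 2, assuming you copy the beacon URL from the network panel. The payload fields and lsc.* key names are illustrative rather than the exact LiSA schema, and the check only scans parameter names for the lsc. prefix because AppMeasurement's exact contextData serialization is not reproduced here.

```typescript
// Illustrative playback event payload (field names are assumptions, not the
// exact LiSA shared-event schema).
const testEvent = {
  type: "playback",
  eventId: "3f6b2c9e-0000-0000-0000-000000000000",
  visitorId: "a1b2c3d4-0000-0000-0000-000000000000",
  eventDate: new Date().toISOString(),
};
console.log("Simulated LiSA event:", JSON.stringify(testEvent, null, 2));

// Rough check of a beacon URL copied from the browser's network panel.
// It looks for parameter names carrying the lsc. prefix and for the expected
// custom event code in the events parameter.
function checkBeacon(beaconUrl: string, expectedEventCode: string): void {
  const params = new URL(beaconUrl).searchParams;
  const lscKeys = [...params.keys()].filter((k) => k.includes("lsc."));
  const events = params.get("events") ?? "";

  console.log("lsc.* context data keys found:", lscKeys);
  console.log(
    events.split(",").includes(expectedEventCode)
      ? `events parameter contains ${expectedEventCode}`
      : `events parameter is missing ${expectedEventCode}: "${events}"`
  );
}

// Example: checkBeacon(copiedBeaconUrl, "event12");
```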
Edge cases
- Missing optional fields don't break attribution (see the sketch after this list).
- UUIDs are passed but not used directly in heavy ad-hoc analysis (classifications are applied if needed).
- Timestamps (eventDate) are present for discrepancy debugging.
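A small sketch of the missing-optional-field check, assuming contextData has already been captured from a test beacon. The specific lsc.* keys treated as required are assumptions; substitute the keys your attribution actually depends on.

```typescript
// Keys assumed to be required for attribution (substitute your own).
const REQUIRED_LSC_KEYS = ["lsc.eventType", "lsc.eventId", "lsc.visitorId"];

// Given contextData captured from a test beacon (possibly built from a payload
// with optional fields omitted), report which required keys are missing.
function missingAttributionKeys(contextData: Record<string, string>): string[] {
  return REQUIRED_LSC_KEYS.filter((key) => !(key in contextData) || contextData[key] === "");
}

// Example: an event with optional fields omitted should still carry the required keys.
const missing = missingAttributionKeys({
  "lsc.eventType": "playback",
  "lsc.eventId": "3f6b2c9e-0000-0000-0000-000000000000",
  "lsc.visitorId": "a1b2c3d4-0000-0000-0000-000000000000",
});
console.log(missing.length === 0 ? "Attribution keys intact" : `Missing: ${missing.join(", ")}`);
```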
Reconciliation
Use eventId, visitorId, and eventDate to match Adobe hits to backend logs for troubleshooting and to verify there is no duplication; a reconciliation sketch follows below. High-fidelity correlation supports debugging complex user flows.
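A minimal reconciliation sketch, assuming exported Adobe hits and backend log entries that both carry eventId, visitorId, and eventDate (the record shape is an assumption, not a defined export format). It flags Adobe hits with no backend match and duplicated eventIds.

```typescript
// Assumed record shape for exported Adobe hits and backend log entries.
interface HitRecord {
  eventId: string;
  visitorId: string;
  eventDate: string; // ISO timestamp, kept for manual discrepancy debugging
}

// Match Adobe hits to backend logs by eventId and flag duplicates on the Adobe side.
function reconcile(adobeHits: HitRecord[], backendLogs: HitRecord[]) {
  const backendById = new Map(backendLogs.map((log) => [log.eventId, log]));
  const seen = new Set<string>();
  const unmatched: HitRecord[] = [];
  const duplicates: HitRecord[] = [];

  for (const hit of adobeHits) {
    if (seen.has(hit.eventId)) duplicates.push(hit);
    seen.add(hit.eventId);

    const log = backendById.get(hit.eventId);
    // A missing log entry or a visitorId mismatch counts as unmatched;
    // eventDate differences are left to manual review.
    if (!log || log.visitorId !== hit.visitorId) unmatched.push(hit);
  }
  return { unmatched, duplicates };
}
```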