Test Harness / Validation Instructions
Validate your Processing Rules and Adobe setup using the following checklist:
Setup
- Staging page with AppMeasurement: Include the embed script, specifying your report suite ID in the `data-lsc-report-suite-id` attribute.
- Enable debugging: Use the Adobe Debugger or your browser's devtools to inspect outgoing beacons (network requests).
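As a minimal sketch of the staging-page setup, the embed might look like the following; the script URL is a placeholder, and only the `data-lsc-report-suite-id` attribute name is taken from this checklist:

```html
<!-- Hypothetical embed snippet: the src URL is a placeholder, not the real
     embed location; only data-lsc-report-suite-id is from this checklist. -->
<script
  async
  src="https://cdn.example.com/lisa-embed.js"
  data-lsc-report-suite-id="your-report-suite-id">
</script>
```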
Validation Steps
- Fire a test event: Simulate a LiSA shared event JSON (e.g., a generic playback event).
- Inspect the request: Confirm the outgoing beacon includes:
  - `contextData` with the expected `lsc.*` keys.
  - The correct custom event code in the `events` parameter.
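To make these two steps concrete, here is a hedged sketch of a simulated event and the `contextData` shape to look for on the beacon. Apart from the `lsc.` prefix and the `eventId`/`visitorId`/`eventDate` fields named elsewhere on this page, every field name and value below is an illustrative assumption, not the real event schema:

```javascript
// Hypothetical LiSA shared event payload (field names beyond eventId,
// visitorId, and eventDate are illustrative assumptions).
const testEvent = {
  eventId: "00000000-0000-4000-8000-000000000000",
  visitorId: "11111111-1111-4111-8111-111111111111",
  eventDate: "2024-01-01T12:00:00Z",
  eventType: "playback", // a generic playback event
};

// Flatten the event into lsc.*-prefixed contextData keys, i.e. the shape
// you would expect to find on the outgoing AppMeasurement beacon.
function toContextData(event) {
  const contextData = {};
  for (const [key, value] of Object.entries(event)) {
    contextData[`lsc.${key}`] = String(value);
  }
  return contextData;
}

console.log(toContextData(testEvent));
```

Every key in the logged object should carry the `lsc.` prefix (e.g. `lsc.eventType`), which is what the inspect step verifies on the real beacon.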
- Processing Rules mapping: In Adobe Analytics Workspace or the debug view, verify that each alias shows up in the target eVars/props you configured and that friendly labels are applied.
- Report-level sanity check: Build a quick breakdown or funnel including the mapped `LiSA Event Type` and session/visitor IDs, and confirm that hits surface as expected.

Edge cases
- Missing optional fields don’t break attribution.
- UUIDs are passed but not used directly in heavy ad-hoc analysis (classifications applied if needed).
- Timestamps (`eventDate`) are present for discrepancy debugging.
Reconciliation
Use `eventId`, `visitorId`, and `eventDate` to match Adobe hits to backend logs for troubleshooting and to verify that no hits are duplicated. High-fidelity correlation supports debugging of complex user flows.