Events Tracking
The Lens SDK includes an analytics feature that automatically tracks and records user interaction events within the Lens. This data is crucial for understanding user behavior and improving the user experience.
How does tracking work?
Upon integration, the SDK begins to monitor specified events within the Lens. These events can range from simple interactions, such as tapping a button, to more complex sequences like completing a task. The collected data provides insights into how users are engaging with the Lens.
How to enable this feature
Analytics tracking is enabled by default when you integrate the Lens SDK. There's no additional setup required to start collecting analytics data. Below is the list of the specific events that the SDK can track:
EVENT | DESCRIPTION | PARAMS |
---|---|---|
lens_session_start | A new session started | source=[camera,gallery,browser] |
lens_session_end | Camera is closed | |
lens_screen_show | Screen open | type=[submit,settings,gallery,crop] |
lens_camera_show | Camera is shown | N/A |
lens_camera_close | Camera is closed | N/A |
lens_camera_close_confirmation | Cancel capturing alert | N/A |
lens_camera_menu_open | More menu is opened | N/A |
lens_camera_menu_close | More menu is closed | N/A |
lens_camera_document_type_selected | Document type selected to scan | document_type=[receipt,long_receipt,bill,other,credit_card,business_card,check,code,w2,w9,barcode,bank_statements] |
lens_camera_gallery_open | The gallery button is clicked | source=[gallery] |
lens_camera_capture | The capture button is clicked | source=[capture_button] |
lens_camera_flash | The flash button is clicked | source=[flashButton] |
lens_camera_autocapture_start | The auto-capture mode started | N/A |
lens_camera_autocapture_end | The auto-capture mode ended | source=[menu] |
lens_stitching_start | Stitching started for long receipts | N/A |
lens_stitching_end | Stitching ended for long receipts | N/A |
lens_gallery_import_image | An image was imported from the gallery | N/A |
lens_gallery_open | The gallery is opened | type=[gallery] |
lens_gallery_close | The gallery is closed | N/A |
lens_submit_document_close | The confirm screen is closed | source=[btnSubmit] |
lens_browser_open | The browse document is open from the menu | N/A |
lens_browser_close | The browse document is closed | N/A |
lens_browse_import_document | A document was selected from the browser | N/A |
lens_submit_package_submitted | The user confirms the package to submit | source=[btnSubmit] |
lens_submit_document_detection_status | Status of the document captured | status=[detected,not_detected] |
lens_submit_no_document_alert_show | No document detected alert is shown | action=[try_again,continue] |
lens_submit_blur_alert_show | Blur-detected alert is shown | action=[try_again,continue], status=[detected,not_detected] |
lens_submit_glare_alert_show | Glare-detected alert is shown | action=[try_again, continue] |
lens_submit_add_document | The stitch button is clicked to add a new document | document_queue=[(amount)], source=[buttonStitchMore] |
lens_submit_back_check | The user accepts to add the back side of the check | N/A |
lens_submit_document_rotate | The user rotated the document | index=[(position)] |
lens_submit_document_scrolled | The document detected has been scrolled | N/A |
lens_submit_document_remove | The user wants to remove a document | index=[(position)], action=[true, false] |
lens_submit_document_crop | The user wants to crop the document | index=[(position)] |
lens_settings_doc_detection_changed | The option “Auto Doc Detec & Crop” from the menu was changed | status=[true,false] |
lens_settings_blur_detection_changed | The option “Auto Blur Detection” from the menu was changed | status=[true,false] |
lens_settings_glare_detection_changed | The option “Auto Glare Detection” from the menu was changed | status=[true,false] |
lens_settings_autotorch_changed | The option “Auto Torch (ambience detection)” from the menu was changed | status=[true,false] |
lens_settings_autocapture_changed | The option “Handsfree Document Capture” from the menu was changed | status=[true,false] |
lens_settings_backup_changed | The option “Backup Scans to Photo Gallery” from the menu was changed | status=[true,false] |
lens_settings_autoclose_changed | The option “After Scan Close Camera” from the menu was changed | status=[true,false] |
lens_crop_show | The Crop screen is open | N/A |
lens_crop_guide_show | The crop guide alert is shown | |
lens_crop_alert_show | The confirm crop alert is shown | action=[true,false] |
lens_veryfi_lens_success | The document was processed successfully | N/A |
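If you consume these events in a typed codebase, the table above can be sketched as a discriminated union. The TypeScript below models only a few rows, and the type and function names are ours for illustration; they are not exported by the SDK:

```typescript
// A few rows of the event table modeled as a discriminated union.
// Type and function names are illustrative, not part of the SDK.
type LensEvent =
  | { event: "lens_session_start"; params: { source: "camera" | "gallery" | "browser" } }
  | { event: "lens_submit_document_detection_status"; params: { status: "detected" | "not_detected" } }
  | { event: "lens_veryfi_lens_success" };

// The switch narrows `e.params` per event name, so each branch is type-safe.
function describeEvent(e: LensEvent): string {
  switch (e.event) {
    case "lens_session_start":
      return `session started from ${e.params.source}`;
    case "lens_submit_document_detection_status":
      return `document ${e.params.status}`;
    case "lens_veryfi_lens_success":
      return "processed successfully";
  }
}
```

Modeling events this way lets the compiler catch a misspelled event name or a missing param instead of failing silently at runtime.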
JSON Model
This is an example of the JSON payload you will receive on each platform for every event.
If the event doesn't have params:
{
    "event": "lens_veryfi_lens_success"
}
If the event has params:
{
    "event": "lens_session_start",
    "params": {
        "source": "camera"
    }
}
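In a TypeScript integration, the payload above can be modeled with a small interface and a type guard that distinguishes the two shapes. This is a minimal sketch; the names `LensAnalyticsEvent` and `hasParams` are ours, not SDK types:

```typescript
// Shape of an analytics payload as described above.
// These names are illustrative; the SDK delivers plain JSON.
interface LensAnalyticsEvent {
  event: string;
  params?: Record<string, string>;
}

// Narrow a payload to one that carries params.
function hasParams(
  e: LensAnalyticsEvent
): e is LensAnalyticsEvent & { params: Record<string, string> } {
  return e.params !== undefined;
}

const start: LensAnalyticsEvent = {
  event: "lens_session_start",
  params: { source: "camera" },
};
const success: LensAnalyticsEvent = { event: "lens_veryfi_lens_success" };

console.log(hasParams(start));   // true
console.log(hasParams(success)); // false
```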
Implementation
- iOS
- Android
- Capacitor
- React Native
- Flutter
- .Net for iOS
- .Net for Android
- Cordova
// Initialize the observer
NotificationCenter.default.addObserver(
    self,
    selector: #selector(logLensEvent(from:)),
    name: NSNotification.Name.init(rawValue: "VeryfiLensAnalyticsEvent"),
    object: nil
)

@objc private func logLensEvent(from notification: Notification) {
    // Catch the event
}
You can register a BroadcastReceiver to start listening to the events:
class LensAnalyticsReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context?, intent: Intent?) {
        if (intent == null) return
        // EVENT, PARAMS, VALUE and ANALYTICS_EVENT are the intent extra/action keys
        if (intent.hasExtra(EVENT)) {
            val event = intent.getStringExtra(EVENT) ?: ""
            if (intent.hasExtra(PARAMS)) {
                val params = intent.getStringExtra(PARAMS)
                val value = intent.getStringExtra(VALUE)
            }
        }
    }
}

val filter = IntentFilter(ANALYTICS_EVENT)
val receiver = LensAnalyticsReceiver()
registerReceiver(receiver, filter)
// Initialize the observer
VeryfiLensCapacitor.observeAnalyticsEvents()

// Start listening for analytics events
VeryfiLensCapacitor.addListener("veryfiLensAnalytics", (data: Object) => {
    // Catch the event
});
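Inside the callback above, you will typically forward the payload to your own analytics layer. The routing function below is a sketch of one way to do that; the event names come from the table above, while `Tracker` and the renamed event labels are our own assumptions, not SDK API:

```typescript
// Sketch of routing Lens analytics events into your own tracker.
// `Tracker` is a hypothetical callback; event names come from the table above.
type Tracker = (name: string, params?: Record<string, string>) => void;

function routeLensEvent(
  data: { event: string; params?: Record<string, string> },
  track: Tracker
): string {
  switch (data.event) {
    case "lens_session_start":
      track("session_start", data.params); // e.g. source=camera
      return "session";
    case "lens_veryfi_lens_success":
      track("scan_success");
      return "success";
    default:
      track(data.event, data.params); // forward everything else as-is
      return "other";
  }
}
```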
// Add the listener
VeryfiLensEmitter.addListener(VeryfiLens.Events.onVeryfiLensAnalytics, (event) => {
    // Catch the event
});

// Initialize the observer
VeryfiLens.observeAnalyticsEvents();
// Declare this property in your widget class
StreamSubscription<Map<String, dynamic>>? _streamSubscription;

// In the initState() of your widget, initialize the property declared above
_streamSubscription = Veryfi.analyticsStream.listen((analyticsEvent) {
    // Catch the event
    String name = analyticsEvent["event"];
    Map<Object?, Object?>? params = analyticsEvent["params"];
});

// Initialize the observer
Veryfi.observeAnalyticsEvents();
// Initialize the observer
private void SetUpAnalyticsEvents()
{
    NSString notificationName = new NSString("VeryfiLensAnalyticsEvent");
    NSObject observer = NSNotificationCenter.DefaultCenter.AddObserver(notificationName, AnalyticsNotification);
}

private void AnalyticsNotification(NSNotification notification)
{
    if (notification.Object != null)
    {
        // Catch the event
    }
}
// Declare an interface to communicate with the BroadcastReceiver
public interface IAnalyticsEventListener
{
    void OnAnalyticsEventReceived(string eventValue, string paramsValue, string value);
}
// Declare a BroadcastReceiver
[BroadcastReceiver(Enabled = true, Exported = false)]
[IntentFilter(new[] { "com.veryfi.lens.VeryfiLensAnalyticsEvent" })]
public class AnalyticsEventReceiver : BroadcastReceiver
{
    private static IAnalyticsEventListener listener;

    public AnalyticsEventReceiver() { }

    public static void RegisterListener(IAnalyticsEventListener eventListener)
    {
        listener = eventListener;
    }

    public static void UnregisterListener()
    {
        listener = null;
    }

    public override void OnReceive(Context context, Intent intent)
    {
        string eventValue = intent.GetStringExtra("event");
        string paramsValue = intent.GetStringExtra("params");
        string value = intent.GetStringExtra("value");
        listener?.OnAnalyticsEventReceived(eventValue, paramsValue, value);
    }
}
// Your activity should implement the interface declared above
public class MainActivity : AppCompatActivity, IFragmentCommunication, IAnalyticsEventListener
{
    private AnalyticsEventReceiver receiver;

    // Implements the IAnalyticsEventListener interface method
    public void OnAnalyticsEventReceived(string eventValue, string paramsValue, string value)
    {
        // Catch the event
    }
}
// Initialize the receiver
protected override void OnResume()
{
    base.OnResume();
    receiver = new AnalyticsEventReceiver();
    AnalyticsEventReceiver.RegisterListener(this);
    IntentFilter filter = new IntentFilter("com.veryfi.lens.VeryfiLensAnalyticsEvent");
    RegisterReceiver(receiver, filter);
}
// Stop the BroadcastReceiver
protected override void OnPause()
{
    base.OnPause();
    UnregisterReceiver(receiver);
    AnalyticsEventReceiver.UnregisterListener();
}
cordova.plugins.Veryfi.Lens.observeAnalyticsEvents(
    function (response) {
        // Catch the event
        const jsonObject = JSON.parse(response);
    },
    function (error) {
        // Handle the error
    }
);
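Since the Cordova callback delivers the payload as a JSON string, a defensive parser avoids crashing on malformed input. The helper below is a sketch under that assumption; `parseLensEvent` is our own name, not SDK API:

```typescript
// Defensively parse the JSON string delivered to the Cordova callback.
// Returns null on malformed input instead of throwing.
function parseLensEvent(
  response: string
): { event: string; params?: Record<string, string> } | null {
  try {
    const obj = JSON.parse(response);
    // A valid payload always carries a string "event" field (see JSON Model above).
    if (typeof obj?.event !== "string") return null;
    return obj;
  } catch {
    return null;
  }
}
```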