Digital Measurement FAQ


DCR/DTVR (Digital Content Ratings/Digital Television Ratings)

SDK APIs

Should we call the play and stop methods whenever the user clicks on the pause/play button?

Yes. Call the stop method and stop sending the playhead position when the user clicks the pause button. When content resumes playback, call play and loadMetadata, then resume sending sendID3 / playheadPosition, as in the sketch below.
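
A minimal sketch of this pause / resume handling on Android, assuming the package name com.nielsen.app.sdk and the play(JSONObject), loadMetadata(JSONObject) and stop() signatures; the channelInfo and contentMetadata objects and the playhead-timer helpers are illustrative only.

import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

// Illustrative controller; the AppSdk instance and the two JSON objects are assumed
// to come from the SDK initialization shown in the Android section of this FAQ.
class PlaybackController {
    private final AppSdk appSdk;
    private final JSONObject channelInfo;
    private final JSONObject contentMetadata;

    PlaybackController(AppSdk appSdk, JSONObject channelInfo, JSONObject contentMetadata) {
        this.appSdk = appSdk;
        this.channelInfo = channelInfo;
        this.contentMetadata = contentMetadata;
    }

    // User tapped pause: stop the playhead updates first, then tell the SDK playback stopped.
    void onPauseClicked() {
        stopPlayheadTimer();
        appSdk.stop();
    }

    // User tapped play: tell the SDK playback resumed, then restart the playhead updates.
    void onPlayClicked() {
        appSdk.play(channelInfo);
        appSdk.loadMetadata(contentMetadata);
        startPlayheadTimer();
    }

    private void startPlayheadTimer() { /* resume the periodic sendID3 / setPlayheadPosition calls */ }
    private void stopPlayheadTimer()  { /* cancel the periodic playhead updates */ }
}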

Is playheadPosition needed for reporting on live streams?

Playhead position should be reported for both live and VOD content streams (see the sketch below).

  • For VOD (not live) content, the playhead position must be the number of seconds from the beginning of the content, including ads.
  • For live content, the playhead position must be the current Unix timestamp (seconds since January 1, 1970, UTC).
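
A minimal sketch of this distinction, assuming the Android setPlayheadPosition(long) signature and the package name com.nielsen.app.sdk.

import com.nielsen.app.sdk.AppSdk;  // package name assumed

class PlayheadReporter {
    // Call roughly once per second while content is playing.
    static void reportPlayhead(AppSdk appSdk, boolean isLive, long vodPositionSeconds) {
        long position = isLive
                ? System.currentTimeMillis() / 1000L  // live: current Unix timestamp (UTC seconds)
                : vodPositionSeconds;                 // VOD: seconds from the start of the content, ads included
        appSdk.setPlayheadPosition(position);
    }
}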

How to send the Pause event in an interactive radio station stream, as it does not have a stop function?

Call the Nielsen App SDK stop API to send a pause / stop event. Upon a station change (see the sketch below):

  • Call stop for the previous station, and
  • Call play and loadMetadata for the new station.
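
A minimal sketch of this station-change handling, under the same Android assumptions as the earlier sketches (package name and JSONObject-based signatures); the metadata objects are illustrative.

import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class StationSwitcher {
    // Called when the listener switches from one station to another.
    static void onStationChange(AppSdk appSdk, JSONObject newStationChannelInfo, JSONObject newStationMetadata) {
        appSdk.stop();                            // close out the previous station
        appSdk.play(newStationChannelInfo);       // start measuring the new station
        appSdk.loadMetadata(newStationMetadata);  // load the new station's metadata
    }
}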

For Events, what is the difference between play and loadMetadata?

When content starts playing, call the play and loadMetadata APIs.

  • Calling play for the first time after initialization, or after a previous stop call, prepares the SDK to measure the content being played. Consecutive play calls have no effect.
  • loadMetadata allows the metadata to be changed during playback.

Can we always send stop, then play, loadMetadata and playheadPosition, even for brief buffering, since there is no way to predict the buffering time?

For the App SDK (iOS and Android), do not call the stop and play APIs in that sequence. Always call play first and then stop, because play starts a new viewing session and stop ends it.

  • For brief buffering on all platforms, do not call stop right away. Instead, stop sending the playheadPosition during the buffer time and resume it once playback continues. If there is a way to detect that buffering has lasted 30 seconds or longer, call stop at that point (see the sketch below).
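
A minimal sketch of this buffering handling, under the same Android assumptions; the callback names are illustrative and would be wired to the player's buffering events.

import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class BufferingHandler {
    private static final long LONG_BUFFER_MS = 30_000L;  // 30-second threshold from the guidance above
    private final AppSdk appSdk;
    private long bufferStartMs = -1L;
    private boolean stoppedDuringBuffer = false;

    BufferingHandler(AppSdk appSdk) { this.appSdk = appSdk; }

    // Buffering began: pause the playhead updates, but do not call stop() yet.
    void onBufferingStarted() {
        bufferStartMs = System.currentTimeMillis();
    }

    // Invoked periodically while buffering: only call stop() once 30 seconds have elapsed.
    void onBufferingTick() {
        if (!stoppedDuringBuffer && bufferStartMs >= 0
                && System.currentTimeMillis() - bufferStartMs >= LONG_BUFFER_MS) {
            appSdk.stop();
            stoppedDuringBuffer = true;
        }
    }

    // Playback resumed: if stop() was called, start a fresh viewing session, then resume playhead updates.
    void onBufferingEnded(JSONObject channelInfo, JSONObject contentMetadata) {
        if (stoppedDuringBuffer) {
            appSdk.play(channelInfo);
            appSdk.loadMetadata(contentMetadata);
            stoppedDuringBuffer = false;
        }
        bufferStartMs = -1L;
    }
}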

Is there any debug facility to see if App SDK APIs are being called successfully?

Include nol_devDebug in the SDK initialization call to check whether App SDK API calls are successful.

Note: DO NOT activate the Debug flag in a production environment.

What are the different interrupt scenarios an app developer should take care of?

There are various interrupt scenarios that the app developer should handle:

  • Pause / Play
  • Network loss (Wi-Fi / Airplane / Cellular)
  • Call Interrupt (SIM or third-party Skype / Hangout call)
  • Alarm Interrupt
  • Content Buffering
  • Lock / Unlock device (Video players only)
  • App going Background / Foreground (Only video players without PIP mode support)
  • Channel / Station Change Scenario
  • Unplugging of headphones

What should a video app developer do during interrupt scenarios?

A video app developer must trigger / stop the API calls during interrupt scenarios, as described in the Digital Measurement Interruption Scenarios section.

How to get the opt-out URL from App SDK?

Once the App SDK is initialized successfully, use the public String userOptOutURLString() API to retrieve the opt-out URL (see the sketch below).
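
A minimal sketch, assuming userOptOutURLString() is exposed as an instance method on the Android AppSdk (package name assumed).

import com.nielsen.app.sdk.AppSdk;  // package name assumed

class OptOutHelper {
    // Retrieve the opt-out page URL after the SDK has initialized successfully.
    static String fetchOptOutUrl(AppSdk appSdk) {
        String optOutUrl = appSdk.userOptOutURLString();
        // Typically loaded into an in-app web view so the user can opt out.
        return optOutUrl;
    }
}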

Metadata

What is ChannelName? Is it the value in the Nielsen MediaView dictionary, or is it a value that corresponds to television?

ChannelName is the name of the channel for the playing content (e.g., MTV, ESPN, etc.). iOS and Android apps are required to inform the App SDK that this channel/content is being played on the player side. Refer to the play API for more details.

What value should "tag" carry?

Tag refers to the Channel name. Channel Name can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.

What format does the length field need to be in? (Minutes, Seconds, Milliseconds, etc.)?

Length of content should be in Seconds.

What are the possible types of play that can be sent?

'Type' refers to the type of play event like preroll, midroll, postroll, content, or static.

What values are typically passed for ChannelInfo?

ChannelInfo refers to the Channel name. This can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.

Example: ESPN or http://www.XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX.m3u8
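
A minimal Android sketch of passing the channel information to play; the "channelName" JSON key is an assumption for illustration and should be confirmed against the play API documentation.

import org.json.JSONException;
import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class ChannelInfoExample {
    static void startPlayback(AppSdk appSdk, String friendlyName, String streamUrl) throws JSONException {
        JSONObject channelInfo = new JSONObject();
        // Use a friendly name, or the URL string of the content if no friendly name is available.
        channelInfo.put("channelName", friendlyName != null ? friendlyName : streamUrl);  // key name assumed
        appSdk.play(channelInfo);
    }
}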

What should the value be for "Episode Name"? If the full video name in the CMS is "The Good Wife – The Deep Web", should we instead shorten it to just "The Deep Web"?

Episode Name can be client-defined or can match what appears in the CMS. The default aggregation for reporting will be Episode > Program > Channel. Program Name is defined and reported based on the category parameter, so the program name need not be included within the "Episode Name" field.

Should season and episode numbers be included in "Episode Name"? Is it the best practice to include this information?

Nielsen can now report down to the episode level. Having the ability to distinguish episodes for reports is recommended. Season/Episode numbers can be used if preferred.

Which of the following "Episode Names" best suits short-form content like http://www.cbs.com/shows/the_good_wife/video/43F8FA6E-10F1-F508-1909-C3E3150C8C7D/the-good-wife-a-true-politician/ ?

  • "The Good Wife – A True Politician"
  • "A True Politician"
  • "The Good Wife – The Deep Web"
  • "The Deep Web"

Is 'type' always hard-coded to have a value 'content'?

Depending on the type of content being played, 'type' must be set to 'content', 'preroll', 'midroll' or 'postroll' to define the content, and to 'ad' for ads.
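
A minimal Android sketch of setting 'type' (plus an illustrative assetid) in the metadata passed to loadMetadata; the full set of required metadata keys is not shown here and should be confirmed with your TAM.

import org.json.JSONException;
import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class AssetTypeExample {
    static void loadAssets(AppSdk appSdk) throws JSONException {
        JSONObject preroll = new JSONObject();
        preroll.put("type", "preroll");        // or "midroll" / "postroll" for other ad positions
        preroll.put("assetid", "ad456");       // illustrative asset identifier

        JSONObject content = new JSONObject();
        content.put("type", "content");        // main content
        content.put("assetid", "episode123");  // illustrative asset identifier

        appSdk.loadMetadata(preroll);          // when the preroll ad starts
        // ... after the preroll completes and the content begins:
        appSdk.loadMetadata(content);
    }
}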

What are the mandatory parameters to initialize the SDK?

appName, appVersion, sfcode and appid are mandatory parameters to initialize the SDK.

Ads

Is Nielsen SDK dependent on Ad Support Framework? Does Nielsen support mobile apps without ads?

No, it is not dependent. Nielsen SDK supports mobile apps without ads.

How should the playhead position be sent during ad breaks (stop sending the playhead, or send the playhead for the ad break time)?

The app should always send the playhead position, regardless of ad breaks. The app should:

  • Call stop() before starting an ad break.
  • Call loadMetadata() to load the ad.
  • Once the ad break is complete, call stop() and then loadMetadata(content).

To summarize, call loadMetadata() for every asset (both content and ads), call stop() whenever the current asset changes or completes, and call setPlayheadPosition() all the time (see the sketch below).
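
A minimal Android sketch of this ad-break sequence, under the same package and signature assumptions as the earlier sketches.

import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class AdBreakFlow {
    // The content is being interrupted by an ad break: close out the content asset, then load the ad.
    static void onAdBreakStart(AppSdk appSdk, JSONObject adMetadata) {
        appSdk.stop();
        appSdk.loadMetadata(adMetadata);
    }

    // The last ad in the break has finished: close out the ad asset, then reload the content metadata.
    static void onAdBreakEnd(AppSdk appSdk, JSONObject contentMetadata) {
        appSdk.stop();
        appSdk.loadMetadata(contentMetadata);
    }
    // setPlayheadPosition() continues to be called throughout, for ads and content alike.
}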

When running a DAR campaign, how is the "DAR Tag" added into the metadata object?

The Nielsen DAR tag (Nielsen DAR beacon) may arrive in different forms from the ad service, via VAST XML or via a player framework library integrated with the ad service. The Android developer should:

  • Identify the Nielsen beacon in the ad server response.
  • Create JSON metadata containing the Nielsen beacon, for example (see also the sketch below):

{ "type":"ad", "ocrtag":"" }
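
A minimal Android sketch of building this metadata and passing it to loadMetadata; darTagUrl stands in for the beacon URL taken from the ad server response.

import org.json.JSONException;
import org.json.JSONObject;
import com.nielsen.app.sdk.AppSdk;  // package name assumed

class DarTagExample {
    static void loadDarAd(AppSdk appSdk, String darTagUrl) throws JSONException {
        JSONObject adMetadata = new JSONObject();
        adMetadata.put("type", "ad");
        adMetadata.put("ocrtag", darTagUrl);  // pass the DAR tag as-is; the SDK handles ID retrieval
        appSdk.loadMetadata(adMetadata);
    }
}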

Our video players currently pass DAR information through the use of ad 1x1s that are fired off as each ad plays. Here's an example: http://secure-us.imrworldwide.com/cgi-bin/m?ci=nlsnci370&am=3&at=view&rt=banner&st=image&ca=nlsn11134&cr=crtve&pc=cbs_plc0001&ce=cbs&r=%n Do we need to forgo this approach and use the DAR tag with the SDK? Will the 1x1s be sufficient?

For online (web) DAR, there is no need to use the SDK. For mobile DAR, the IDFA / Google Advertising ID must be accessed, so follow one of the approaches below when the app is tagged:

  • Handle the IDFA collection yourself and append that data to the same OCR tag, using the mDAR alternative method; or
  • Have the App SDK make the request: the same tag URL (1x1) is used. Pass the DAR tag as-is, and the App SDK will handle the retrieval and concatenation.

Miscellaneous

How to handle the delay in HLS streams from the actual live stream (often by at least 30 seconds)?

This delay need not be considered since it will apply to all stream viewing instances.

What is Adobe VHL Joint SDK?

The Adobe VHL Joint SDK is a collaboration between Adobe and Nielsen to measure video viewership, app launches, app crashes, advertisements, page views, and more. The Adobe VHL Joint SDK allows clients to obtain video demographic and volumetric (quantity) data along with published Nielsen ratings.

Does the App SDK generate HTTP traffic immediately once the call is made to the SDK? How do we know if a correct call to SDK is made or not?

When the SDK starts up, there is an initial ping. HTTP traffic can be captured within a few seconds of passing the OCR tag to the SDK. For other products like mTVR, the ping is sent immediately upon the start of the video stream.

Are the Privacy Policy and opt-out URLs the same? If not, is there a method that delivers the Privacy Policy URL?

No, they are not the same. A link to our opt-out page goes into the app itself. This URL is never hardcoded into the app; use the userOptOutURLString() method to pull it from the App SDK. For our Privacy Policy, refer to Nielsen Privacy Requirements. It can also be accessed from the link to our privacy policy in the app store description (work in progress).

Is there a separate Terms & Conditions page to link to, apart from privacy policy URL? Is there a method that delivers this URL?

Terms & conditions are included on the Privacy Policy page.

iOS

Is there any debug facility to see if App SDK APIs are being called successfully?

Include nol_devDebug in the SDK initialization call to check whether App SDK API calls are successful.

NSDictionary* appInformation = @{
            @"appid": appid,
            @"appversion": appversion,
            @"appname": appname,
            @"sfcode": sfcode,
            @"nol_devDebug": @"INFO"};

Note: DO NOT activate the Debug flag in a production environment.

We are currently using App SDK 4.0.0 (for iOS). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe?

Perform the following steps to implement Adobe VHL Joint SDK.

  • Remove the NielsenAppApi.framework from the existing project since the Adobe VHL Plugin already bundles the Nielsen SDK components.
  • Import the NielsenAppApi header file into the class in order to ensure access to all of the Nielsen methods.
    • Contact Nielsen Technical Account Manager (TAM) to get the latest Nielsen header file.
  • Implement new products using the Adobe VHL methods. This keeps the previous Nielsen integration intact and allows capturing both Nielsen and Adobe tracking metrics.

Why does the SDK throw a 'selector not recognized' exception while calling loadMetadata for iOS?

This is an issue with the Apple linker. For 64-bit and iPhone OS applications, a linker bug prevents -ObjC from loading object files from static libraries that contain only categories and no classes. When those methods are missing at runtime, a 'selector not recognized' exception results. To fix this issue, add the -all_load or -force_load flag to Other Linker Flags in the build settings for the application target.

  • -all_load forces the linker to load all object files from every archive it sees, even those without Objective-C code.
  • -force_load allows finer grain control of archive loading and is available in Xcode 3.2 and later. Each -force_load option must be provided a path to an archive, and every object file in that archive will be loaded.

For more details click here.

Android

How to upgrade from pre-DCR to DCR?

The main changes required in existing client apps (integrated with the pre-DCR SDK) for integration with the DCR SDK are as follows.

  • Removal of C/C++ code
  • Removal of the Singleton design for the SDK
  • Addition / removal of a few public APIs

Click here to view the guidelines / steps for integrating the DCR SDK into existing client apps.

Initialization of App SDK object
// Build the JSON configuration string for the App SDK
String config = "{"
        + "\"appName\" : \"" + "AppName" + "\","
        + "\"appVersion\" : \"" + "1.0" + "\","
        + "\"sfcode\" : \"" + "uat-cert" + "\","
        + "\"appid\" : \"" + "PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX" + "\""
        + "}";

// Create the App SDK instance and verify that it initialized correctly
AppSdk mAppSdk = new AppSdk(context, config, iappNotifier);
if (mAppSdk == null || !AppSdk.isValid())
{
    Log.e(TAG, "Failed in creating the App SDK framework");
    return;
}

Is there any debug facility to see if AppSDK APIs are being called successfully?

Yes. Use the Debug flag to check whether an App SDK API call was successful. To activate the Debug flag, pass the argument + "\"nol_devDebug\" : \"" + "I" + "\"," while initializing the App SDK object (see the sketch below). Once the flag is active, each API call made and the data passed are logged. The log created by this flag is minimal. DO NOT activate the Debug flag in a production environment.
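
As a sketch, the configuration string from the initialization example above with the debug flag added; "I" is the level value quoted in this answer, while the iOS example earlier uses "INFO". Remove the flag before release.

// Same configuration as the initialization example above, with the debug flag added.
String config = "{"
        + "\"appName\" : \"" + "AppName" + "\","
        + "\"appVersion\" : \"" + "1.0" + "\","
        + "\"sfcode\" : \"" + "uat-cert" + "\","
        + "\"appid\" : \"" + "PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX" + "\","
        + "\"nol_devDebug\" : \"" + "I" + "\""
        + "}";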

Does App SDK need Google Play services? If Yes, why?

Yes, the App SDK needs Google Play services to be included in the Android project in order to fetch the Advertising ID on Google devices. On Amazon devices, the App SDK uses the Android ID for this purpose.

The Google Play services project is bulky; do we have a list of classes that the App SDK uses?

Yes. Edit the Google Play services project / library so that the App SDK uses only the following classes / package.

  • Library
    • google-play-services_lib
  • Classes / package
    • com.google.android.gms.ads.identifier.AdvertisingIdClient;
    • com.google.android.gms.ads.identifier.AdvertisingIdClient.Info;
    • com.google.android.gms.common.ConnectionResult;
    • com.google.android.gms.common.GooglePlayServicesUtil;
    • com.google.android.gms.common.GooglePlayServicesRepairableException;
    • com.google.android.gms.common.GooglePlayServicesNotAvailableException

We are currently using AppSDK 4.0.0 (for Android). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe?

Perform the following steps to implement Adobe VHL Joint SDK.

  • Remove the existing Nielsen-only SDK component (appsdk.jar) from the app's project; the Adobe VHL SDK (VideoHeartbeat.jar) already bundles the Nielsen SDK components.
  • Update the classpath / build settings to point to the latest Adobe VHL SDK jar. Remove the Nielsen-only appsdk.jar from the classpath to avoid runtime issues, since the API definitions changed in the latest SDK version bundled in the Adobe VHL jar.
  • Include the Adobe VHL Joint SDK component VideoHeartbeat.jar in the app's project.
  • Implement new measurement products like DCR (Digital Content Ratings) using the Adobe VHL APIs to get both Nielsen and Adobe tracking metrics.

Browser

How to upgrade from pre-DCR to DCR?

Perform the following steps to upgrade from pre-DCR to DCR.

  • Modify the current plugin URL to point to the latest URL provided by the TAM (change from the ggcmb400 URL to the ggcmb500 URL).
  • Modify the apid and the sfcode to the latest values provided by your TAM.

Note: If the implementation is done via a plugin, make the changes in the Plugin UI.

  • Initialize the SDK as follows.
var gg1 = NOLCMB.getInstance("unique_string");
gg1.ggInitialize(_nolggGlobalParams);

Note: To measure static content on a page and integrate the SDK with a player for video measurement, it is recommended to create one SDK instance.

  • To also enable static (page) measurement,
    • Repeat the above three steps across all the website page templates, and
    • Enable the static page event along with the static metadata (example below) towards the bottom of each page (as close to </body> as possible).
sendNielsenEvent('14',{type:"static",assetid:"static123",section:"sitesection",segA:"Segment1",segB:"Segment2",segC:"Segment3"});