Digital Measurement FAQ

From Engineering Client Portal

{{Breadcrumb|}} {{Breadcrumb|Digital}} {{Breadcrumb|US DCR & DTVR}} {{CurrentBreadcrumb}}
[[Category:Digital]]
== DCR/DTVR (Digital Content Ratings/Digital Television Ratings) ==
=== SDK APIs ===
==== Should we call the [[play]] and [[stop]] methods whenever the user clicks on the pause/play button? ====
Yes. Call the [[stop]] method and stop sending the playhead position when the user clicks the pause button. When content resumes playback, call [[play]] and [[loadMetadata]], and resume [[sendID3]] / [[playheadPosition]].


==== Is [[playheadPosition]] needed for reporting on live streams? ====
Playhead position should be reported for both live and VOD content streaming.
*For VOD (non-live) content, the playhead position must be the number of seconds from the beginning of the content. This includes ads.
*For live content, the playhead position must be the current Unix timestamp (seconds since Jan-1-1970 UTC).
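The two cases above can be sketched as a single helper (an illustrative sketch only; <code>isLive</code> and <code>positionMs</code> are hypothetical names, and the returned value is what would be passed to the SDK's playhead API):

```java
public class PlayheadExample {
    // Value to report as the playhead position, per the rules above.
    static long playheadSeconds(boolean isLive, long positionMs) {
        if (isLive) {
            // Live: current Unix timestamp (seconds since Jan-1-1970 UTC).
            return System.currentTimeMillis() / 1000L;
        }
        // VOD: seconds from the beginning of the content, including ads.
        return positionMs / 1000L;
    }

    public static void main(String[] args) {
        // VOD content 95 seconds in.
        System.out.println(playheadSeconds(false, 95_000));
    }
}
```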


==== How to send the Pause event in an interactive radio station stream, as it does not have a stop function? ====
Call the Nielsen App SDK [[stop]] API to send a pause / stop event. Upon a station change,
*Call [[stop]] for the previous station and
*Call [[play]] and [[loadMetadata]] for the new station.


==== For Events, what is the difference between [[play]] and [[loadMetadata]]? ====
When content starts playing, call the [[play]] and [[loadMetadata]] APIs.
*[[play]], called for the first time since initialization or following a previous [[stop]] call, prepares the SDK to measure the content being played. Calling [[play]] consecutively does nothing.
*[[loadMetadata]] allows the metadata to be changed during playback.


==== Can we always send [[stop]], then [[play]], [[loadMetadata]] and [[playheadPosition]] even if it is brief buffering, as there is no way to predict the buffering time? ====
For App SDK (iOS and Android), do not call the [[stop]] and [[play]] APIs in that sequence. Always call [[play]] first and then [[stop]], as [[play]] starts a new viewing session and [[stop]] ends it.


*For brief buffering on all platforms, do not call [[stop]] right away. Instead, stop sending [[playheadPosition]] during the buffer time and resume sending it once playback resumes. If there is a mechanism to detect buffering lasting 30 seconds or longer, call [[stop]] at that point.
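The 30-second guideline above can be sketched as a small tracker that decides when buffering has gone on long enough to warrant a [[stop]] call (illustrative only; the class name, explicit clock values and polling model are assumptions):

```java
public class BufferingTracker {
    static final long STOP_THRESHOLD_MS = 30_000; // 30 seconds

    private long bufferStartMs = -1;

    // Called when playback stalls; also stop sending the playhead here.
    void onBufferStart(long nowMs) { bufferStartMs = nowMs; }

    // Called when playback resumes; resume sending the playhead here.
    void onBufferEnd() { bufferStartMs = -1; }

    // Poll periodically while buffering: true means it is time to call stop().
    boolean shouldCallStop(long nowMs) {
        return bufferStartMs >= 0 && (nowMs - bufferStartMs) >= STOP_THRESHOLD_MS;
    }

    public static void main(String[] args) {
        BufferingTracker t = new BufferingTracker();
        t.onBufferStart(0);
        System.out.println(t.shouldCallStop(10_000)); // brief buffer: no stop yet
        System.out.println(t.shouldCallStop(35_000)); // 30s crossed: call stop
    }
}
```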


==== Is there any debug facility to see if App SDK APIs are being called successfully? ====
Include ''nol_devDebug'' in the SDK initialization call to check whether App SDK API calls are successful.
<syntaxhighlight lang="objective-c">
NSDictionary* appInformation = @{
            @"appid": appid,
            @"appversion": appversion,
            @"appname": appname,
            @"sfcode": sfcode,
            @"nol_devDebug": @"INFO"};
</syntaxhighlight>
<blockquote>'''Note:''' DO NOT activate the Debug flag in production environment.</blockquote>


==== What are the different interrupt scenarios app developer should take care of? ====
There are various interrupt scenarios which the app developer should take care of:
*Pause / Play
*Wi-Fi OFF / ON
*Airplane Mode ON / OFF
*Call Interrupt (SIM or third-party Skype / Hangout call)
*Alarm Interrupt
*Content Buffering
*Lock / Unlock device
*App going Background / Foreground
*Unplugging of headphone


==== What should a video app developer do during interrupt scenarios? ====
Video app developers must trigger / stop the API calls during interrupt scenarios, as mentioned in the [[Digital Measurement Interruption Scenarios]] section.
 
==== How to get the opt-out URL from App SDK? ====
Once the App SDK is initialized successfully, use the [[userOptOutURLString()]] API to retrieve the opt-out URL.
 
=== Metadata ===
 
==== What is ChannelName? Is it the value in the Nielsen MediaView dictionary, or is it a value that corresponds to television? ====
ChannelName is the name of the channel of the playing content (e.g., MTV, ESPN, etc.). iOS and Android apps are required to inform the App SDK which channel/content is being played on the player side. Refer to the [[play]] API for more details.
 
==== What value should "tag" carry? ====
Tag refers to the Channel name. Channel Name can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.
 
==== What format does the length field need to be in? (Minutes, Seconds, Milliseconds, etc.)? ====
Length of content should be in seconds.
 
==== What are the possible types of play that can be sent? ====
'Type' refers to the type of play event like ''preroll'', ''midroll'', ''postroll'', ''content'', or ''static''.


==== What values are typically passed for ChannelInfo? ====
ChannelInfo refers to the Channel name. This can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.


'''Example:''' <code>ESPN</code> or <code><nowiki>http://www.XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX.m3u8</nowiki></code>


==== What should the value be for "Episode Name"? If full video name in CMS is: "The Good Wife – The Deep Web" Should we instead concatenate it to just be "The Deep Web"? ====
Episode Name can be client-defined or as it appears in the CMS. The default aggregation for reporting will be Episode > Program > Channel. Program Name will be defined and reported based on the category parameter, so the program name need not be included within the "Episode Name" field.


==== Should season and episode numbers be included in "Episode Name"? Is it the best practice to include this information? ====
Nielsen can now report down to the episode level. Having the ability to distinguish episodes in reports is recommended. Season/episode numbers can be used if preferred.


==== Which of the following "Episode Names" best suits short form content like http://www.cbs.com/shows/the_good_wife/video/43F8FA6E-10F1-F508-1909-C3E3150C8C7D/the-good-wife-a-true-politician/ ? ====
*"The Good Wife – A True Politician"
*"A True Politician"
*"The Good Wife – The Deep Web"
*"The Deep Web"


==== Is ‘type' always hard-coded to have a value ‘content'? ====
Depending on the type of content being played, ‘type' must be set to ‘content', ‘preroll', ‘midroll' or ‘postroll' to define content, and ‘ad' for ads.
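The rule above can be sketched as a small helper that picks the ‘type' value per asset (illustrative only; <code>isAd</code> and <code>adPosition</code> are hypothetical inputs, and the position strings are the values named in this answer):

```java
public class PlayType {
    // Returns the 'type' metadata value for an asset, per the rule above.
    static String typeFor(boolean isAd, String adPosition) {
        if (!isAd) {
            return "content"; // main content
        }
        if (adPosition == null) {
            return "ad"; // generic ad when the position is unknown
        }
        switch (adPosition) {
            case "preroll":
            case "midroll":
            case "postroll":
                return adPosition; // the ad position defines the type
            default:
                return "ad";
        }
    }

    public static void main(String[] args) {
        System.out.println(typeFor(false, null));     // content
        System.out.println(typeFor(true, "midroll")); // midroll
    }
}
```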


==== What are the mandatory parameters to initialize the SDK? ====
<code>appName</code>, <code>appVersion</code>, <code>sfcode</code> and <code>appid</code> are mandatory parameters to initialize the SDK.


=== Ads ===


==== Is Nielsen SDK dependent on Ad Support Framework? Does Nielsen support mobile apps without ads? ====
No, it is not dependent. Nielsen SDK supports mobile apps without ads.


==== How should the playhead position be sent under adbreaks (Stop sending the playhead or send the playhead for the ad break time)? ====
The app should always send the playhead position regardless of ad breaks. The app should
*Call [[stop()]] before starting an ad break.
To summarize, the app needs to call [[loadMetadata()]] for every asset to load both content and ads, call [[stop()]] when the current asset changes or completes, and call [[setPlayheadPosition()]] all the time.
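The summary above implies a fixed call order per asset. A sketch that records that order for a content → midroll → content sequence (the <code>AssetTracker</code> class and its call log are hypothetical stand-ins for the real SDK calls named above):

```java
import java.util.ArrayList;
import java.util.List;

public class AssetTracker {
    final List<String> calls = new ArrayList<>(); // stand-in for real SDK calls
    private String currentAsset = null;

    // Call on every asset boundary: stop the old asset, load the new one.
    void onAsset(String assetId) {
        if (currentAsset != null) {
            calls.add("stop");                // current asset changed/completed
        }
        calls.add("loadMetadata:" + assetId); // every asset: content and ads
        currentAsset = assetId;
    }

    // Call on every playback tick, regardless of content vs ad.
    void onTick(long playheadSeconds) {
        calls.add("setPlayheadPosition:" + playheadSeconds);
    }

    public static void main(String[] args) {
        AssetTracker t = new AssetTracker();
        t.onAsset("content-1");
        t.onTick(1);
        t.onAsset("midroll-1"); // ad break: stop, then loadMetadata for the ad
        t.onTick(2);
        t.onAsset("content-1"); // back to content
        System.out.println(t.calls);
    }
}
```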


==== When running a DAR campaign, how is the "DAR Tag" added into the metadata object? ====
The Nielsen DAR tag, or Nielsen DAR beacon, may come in different forms from the ad service via VAST XML or an ad-service-integrated player framework library. The Android developer should
*Identify a Nielsen beacon from the ad server response
*Create JSON metadata with the Nielsen beacon
*Send this JSON to the Nielsen SDK using the [[loadMetadata()]] API


==== Our video players currently pass DAR information through the use of ad 1x1s that are fired off as each ad plays. Here's an example: http://secure-us.imrworldwide.com/cgi-bin/m?ci=nlsnci370<wbr />&am=3<wbr />&at=view<wbr />&rt=banner<wbr />&st=image<wbr />&ca=nlsn11134<wbr />&cr=crtve<wbr />&pc=cbs_plc0001<wbr />&ce=cbs<wbr />&r=%n Do we need to forego this approach and use the DAR tag with the SDK? Will the 1x1s be sufficient? ====
For online (web) DAR, there is no need to use the SDK for DAR. For mobile DAR, because the IDFA / Google Advertising ID must be accessed, follow one of the approaches below when the app is tagged.
*Handle the IDFA collection and append that data to the same DAR tag, using the mDAR alternative method. (OR)
*Pass the DAR tag (as-is) and let the App SDK handle the retrieval and concatenation; to have the App SDK make the request, the same tag URL (1×1) is used.


=== Miscellaneous ===


==== How to handle the delay in HLS streams from the actual live stream (often by at least 30 seconds)? ====
This delay need not be considered, since it will apply to all stream viewing instances.


==== What is Adobe VHL Joint SDK? ====
Adobe VHL Joint SDK is a collaboration between Adobe and Nielsen to measure video viewership, app launches, app crashes, advertisements, page views, and more. It allows clients to obtain video demographic and volumetric (quantity) data along with published Nielsen ratings.


==== Does the App SDK generate HTTP traffic immediately once the call is made to the SDK? How do we know if a correct call to SDK is made or not? ====
When the SDK starts up, there is an initial ping. HTTP traffic can be captured within a few seconds of passing the DAR tag to the SDK. For other products like mTVR, the ping is immediate upon the start of the video stream.


==== Are the Privacy Policy and opt-out URLs the same? If not, is there a method that delivers the Privacy Policy URL? ====
No, they are not the same. There is a link to our opt-out page that goes into the app itself. This URL is never hardcoded into the app; use the [[userOptOutURLString()]] method to pull the URL from the App SDK. For our Privacy Policy, refer to [[Nielsen Privacy Requirements]].


==== Is there a separate Terms & Conditions page to link to, apart from privacy policy URL? Is there a method that delivers this URL? ====
Terms & conditions are present in the Privacy Policy page.


=== iOS ===


==== Is there any debug facility to see if App SDK APIs are being called successfully? ====
Include ''nol_devDebug'' in the SDK initialization call to check whether App SDK API calls are successful.
<syntaxhighlight lang="objective-c">
NSDictionary* appInformation = @{
            @"appid": appid,
            @"appversion": appversion,
            @"appname": appname,
            @"sfcode": sfcode,
            @"nol_devDebug": @"INFO"};
</syntaxhighlight>
<blockquote>'''Note:''' DO NOT activate the Debug flag in production environment.</blockquote>


==== We are currently using App SDK 4.0.0 (for iOS). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe? ====
Perform the following steps to implement Adobe VHL Joint SDK.
*Remove the NielsenAppApi.framework from the existing project since the Adobe VHL Plugin already bundles the Nielsen SDK components.
*Import the NielsenAppApi header file into the class in order to ensure access to all of the Nielsen methods.
**Contact Nielsen Technical Account Manager (TAM) to get the latest Nielsen header file.
*Implement new products using the Adobe VHL methods. This keeps the previous Nielsen integration intact and allows capturing both Nielsen and Adobe tracking metrics.


==== Why does SDK throw a ‘selector not recognized' exception while calling loadMetadata for iOS? ====
This is an issue with the Apple linker. For 64-bit and iPhone OS applications, a linker bug prevents <code>-ObjC</code> from loading object files from static libraries that contain only categories and no classes. Methods that are missing at runtime result in the ‘selector not recognized' exception. To fix this issue, add the <code>-all_load</code> or <code>-force_load</code> flag to Other Linker Flags in the build settings for the application target.
*<code>-all_load</code> forces the linker to load all object files from every archive it sees, even those without Objective-C code.
*<code>-force_load</code> allows finer grain control of archive loading and is available in Xcode 3.2 and later. Each <code>-force_load</code> option must be provided a path to an archive, and every object file in that archive will be loaded.
For more details [http://developer.apple.com/mac/library/qa/qa2006/qa1490.html click here].


=== Android ===
==== How to upgrade from pre-DCR to DCR? ====
The main changes required in existing client apps (integrated with the Pre-DCR SDK) for integration with the DCR SDK are as follows.
*Removal of C/C++ code
*Removal of Singleton design for SDK
*Addition/removal of a few public APIs
[[Click here]] to view the guidelines / steps for integrating the DCR SDK into existing client apps.


===== Initialization of App SDK object =====
<syntaxhighlight lang="java">
String config = "{"+ "\"appName\" : \"" + "AppName" + "\","
                     + "\"appVersion\" : \"" + "1.0" + "\","
                     + "\"sfcode\" : \"" + "uat-cert" + "\","
                     + "\"appid\" : \"" + "PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
                     + "\""
                     + "}";
</syntaxhighlight>
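Hand-concatenating the JSON as above is easy to get wrong (mind the escaped quotes and commas). A sketch of assembling the same config from a map using only the JDK (the builder class is hypothetical; the appid shown keeps the same placeholder as above):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ConfigBuilder {
    // Build the JSON config string from a map. Values here are plain
    // identifiers, so no escaping is done; a real builder should escape.
    static String toJson(Map<String, String> params) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (!first) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\" : \"")
              .append(e.getValue()).append("\"");
            first = false;
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("appName", "AppName");
        params.put("appVersion", "1.0");
        params.put("sfcode", "uat-cert");
        params.put("appid", "PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX");
        System.out.println(toJson(params));
    }
}
```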


==== Is there any debug facility to see if AppSDK APIs are being called successfully? ====
Yes, you can use the Debug flag to check whether an App SDK API call is successful. To activate the Debug flag, pass the argument <code>+ "\"nol_devDebug\" : \"" + "I" + "\","</code> while initializing the App SDK object. Once the flag is active, each API call made and the data passed are logged. The log created by this flag is minimal. DO NOT activate the Debug flag in the production environment.


==== Does App SDK need Google Play services? If Yes, why? ====
Yes, App SDK needs Google Play services to be included in the Android project. App SDK needs this to fetch the Advertising ID from Google devices. For Amazon devices, App SDK uses the Android ID for this purpose.


==== Google Play services project is bulky, do we have a list of classes that App SDK uses? ====
Yes, you need to edit the Google Play services project / lib so that App SDK uses only the following classes / package.
*Library
**com.google.android.gms.common.GooglePlayServicesNotAvailableException


=== How to get the opt-out URL from App SDK? ===
==== We are currently using AppSDK 4.0.0 (for Android). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe? ====
Once App SDK is initialized successfully, use the API public String [[userOptOutURLString()]] to retrieve the opt-out URL.
 
=== What are Interrupt scenarios? ===
Interrupt scenarios are use cases that lead to interruption in content (audio/video) playback. App sdk/developers should watch for these use cases and implement the Nielsen AppSDK in a way that AppSDK does accurate measurement.
 
=== What are the different Interrupt scenarios that app developer should take care of? ===
There are various interrupt scenarios which app developer should take care of
*Pause / Play
*Wi-Fi OFF / ON
*Airplane Mode ON / OFF
*Call Interrupt ( SIM or Third party Skype / Hangout call)
*Alarm Interrupt
*Content Buffering
*Lock / Unlock device
*App going Background / Foreground
 
=== What should a radio / audio app developer do during interrupt scenarios? ===
Radio / audio app developer must trigger / stop the API calls during interrupt scenarios, as mentioned in [[Digital Measurement Interruption Scenarios]] section.
 
=== What should a video app developer do during these interrupt scenarios? ===
Video app developer must trigger / stop the API calls during interrupt scenarios, as mentioned in [[Digital Measurement Interruption Scenarios]] section.
 
=== What is Adobe VHL Joint SDK? ===
Adobe VHL JointSDK is a collaboration between Adobe and Nielsen to measure video viewer-ship, app launches, app crashes, advertisements, page views, and more. Adobe VHL JointSDK allows clients to obtain video demographic and volumetric (quantity) data along with published Nielsen ratings.
 
=== We are currently using AppSDK 1.2.3 (for Android). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do I get tracking metrics for both Nielsen and Adobe? ===
Adobe VHL Joint SDK component bundle contains '''Nielsen AppSDK 4.0.0''' which is capable of measuring all Nielsen measurement products like mTVR / DTVR, Digital Audio/Radio and mDAR measurements along with Digital Content Ratings (DCR). Starting from Nielsen AppSDK version 4.0.0, Nielsen AppSDK is no longer singleton and is capable of tracking up to 4 players at a time. To accommodate this change, Nielsen AppSDK initialization API is modified from a static API <code>AppSdk.getInstance (context, config, iappNotifier)</code> to a constructor <code>AppSdk (context, config, iappNotifier)</code>. Perform [[these steps]] to implement Adobe VHL Joint SDK.
 
=== We are currently using AppSDK 4.0.0 (for Android). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe? ===
Perform the following steps to implement Adobe VHL Joint SDK.
*Remove existing Nielsen-only SDK components ('''appsdk.jar''') from the app's project. Adobe VHL SDK ('''VideoHeartbeat.jar''') already bundles Nielsen SDK components.
*Update the classpath / build settings to point to the latest Adobe VHL SDK jar. Remove the Nielsen-only '''appsdk.jar''' from the classpath to avoid runtime issues, as the API definitions changed in the SDK version bundled in the Adobe VHL jar.
*Include the Adobe VHL Joint SDK component '''VideoHeartbeat.jar''' in the app's project.
*Concentrate on implementing new measurement products like DCR (Digital Content Ratings) using Adobe VHL APIs to get both Nielsen and Adobe tracking metrics.


=== Browser ===
==== How to upgrade from pre-DCR to DCR? ====
Perform the following steps to upgrade from pre-DCR to DCR.
*Modify the current plugin URL to point to the latest URL provided by the TAM. (change from ggcmb400 URL to the ggcmb500 URL)

Latest revision as of 15:26, 6 April 2023


DCR/DTVR (Digital Content Ratings/Digital Television Ratings)

SDK APIs

Should we call the play and stop methods whenever the user clicks on the pause/play button?

Yes. Call the stop method and stop sending the playhead position when the user clicks the pause button. When content resumes playback, call play and loadMetadata, and resume sending sendID3 / playheadPosition.

Is playheadPosition needed for reporting on live streams?

Playhead position should be reported for both live streaming and VOD content streaming.

  • For VOD content (not live) playing, the playhead position must be the number of seconds from the beginning of the content. This includes ads.
  • For live content playing, playhead position must be the current Unix timestamp (seconds since Jan-1-1970 UTC).
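The two playhead rules above can be sketched in a small helper. PlayheadHelper and its parameters are illustrative, not part of the Nielsen App SDK:

```java
// Illustrative helper (not a Nielsen API): computes the value an app would
// pass to the SDK's playhead-reporting call.
class PlayheadHelper {
    /**
     * @param isLive     true for live streams
     * @param positionMs player position in milliseconds from content start (VOD)
     * @return seconds from the beginning of the content (ads included) for VOD,
     *         or the current Unix timestamp (seconds since Jan-1-1970 UTC) for live
     */
    static long playheadValue(boolean isLive, long positionMs) {
        if (isLive) {
            return System.currentTimeMillis() / 1000L; // current Unix timestamp
        }
        return positionMs / 1000L; // offset in whole seconds
    }
}
```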

How do we send a pause event for interactive radio station streams, which do not have a stop function?

Call Nielsen App SDK stop API to send a pause / stop event. Upon a station change,

  • Call stop for the previous station and
  • Call play and loadMetadata for the new station.
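The station-change steps above can be sketched as follows. The AppSdk interface here is a minimal stand-in for the Nielsen App SDK surface used (the method names match the APIs discussed, but the wrapper class is hypothetical):

```java
// Minimal stand-in for the Nielsen App SDK methods used in this sketch.
interface AppSdk {
    void play();
    void stop();
    void loadMetadata(String metadataJson);
}

// Illustrative handler: sends stop for the old station, then play and
// loadMetadata for the new one, per the guidance above.
class StationChangeHandler {
    private final AppSdk sdk;

    StationChangeHandler(AppSdk sdk) { this.sdk = sdk; }

    void onStationChange(String newStationMetadataJson) {
        sdk.stop();                               // previous station ends
        sdk.play();                               // new viewing session begins
        sdk.loadMetadata(newStationMetadataJson); // metadata for the new station
    }
}
```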

For Events, what is the difference between play and loadMetadata?

When content starts playing, call play and loadMetadata APIs

  • Calling play for the first time after initialization, or after a previous stop call, prepares the SDK to measure the content being played. Calling play consecutively does nothing.
  • loadMetadata allows the metadata to be changed during playback.
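As an illustration of these semantics, here is a toy model (not the real SDK) in which play is a no-op while a session is already active and loadMetadata can change metadata mid-playback:

```java
// Toy model of the play / stop / loadMetadata semantics described above.
class SessionModel {
    private boolean active = false;
    private String metadata;

    /** Starts a session; consecutive calls while active do nothing. */
    boolean play() {
        if (active) return false; // consecutive play is a no-op
        active = true;
        return true;
    }

    /** Ends the current session. */
    void stop() { active = false; }

    /** Metadata may be changed at any point during playback. */
    void loadMetadata(String json) { metadata = json; }

    boolean isActive() { return active; }
    String currentMetadata() { return metadata; }
}
```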

Can we always send stop, then play, loadMetadata and playheadPosition even if it is brief buffering, as there is no way to predict the buffering time?

For App SDK (iOS and Android), do not call stop and then play for a brief interruption. Always call play first and then stop, as play starts a new viewing session and stop ends that session.

  • For brief buffering on all platforms, do not call stop right away. Instead, stop sending the playheadPosition during the buffer time and start the playheadPosition once the playback resumes. If there is a mechanism to know if buffering is crossing 30 seconds or longer, then call stop at that point.
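One way to implement the 30-second rule is a small policy object driven by the player's clock. BufferPolicy and its method names are our own sketch; the actual SDK play/stop calls are left as comments:

```java
// Illustrative buffering policy: suspend playhead reporting during any buffer,
// and signal a stop() only if buffering lasts 30 seconds or more.
class BufferPolicy {
    private static final long STOP_THRESHOLD_MS = 30_000L;
    private long bufferStartMs = -1L; // -1 means "not buffering"
    private boolean stopSent = false;

    void onBufferStart(long nowMs) { bufferStartMs = nowMs; }

    /** Poll while buffering; true means the app should call the SDK's stop(). */
    boolean shouldSendStop(long nowMs) {
        if (bufferStartMs >= 0 && !stopSent && nowMs - bufferStartMs >= STOP_THRESHOLD_MS) {
            stopSent = true; // stop() is sent at most once per buffer event
            return true;
        }
        return false;
    }

    /** While buffering, playhead position updates are suspended. */
    boolean shouldReportPlayhead() { return bufferStartMs < 0; }

    /** On resume: restart playhead updates; if stop() was sent, also call play() / loadMetadata(). */
    void onBufferEnd() { bufferStartMs = -1L; stopSent = false; }
}
```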

Is there any debug facility to see if App SDK APIs are being called successfully?

Include nol_devDebug in the SDK initialization call to check whether App SDK API calls are successful.

Note: DO NOT activate the Debug flag in production environment.

What are the different interrupt scenarios app developer should take care of?

There are various interrupt scenarios that the app developer should take care of:

  • Pause / Play
  • Network loss (Wi-Fi / Airplane / Cellular)
  • Call Interrupt (SIM or third-party Skype / Hangout call)
  • Alarm Interrupt
  • Content Buffering
  • Lock / Unlock device (Video players only)
  • App going Background / Foreground (Only video players without PIP mode support )
  • Channel / Station Change Scenario
  • Unplugging of headphone

What should a video app developer do during interrupt scenarios?

Video app developers must trigger / stop the appropriate API calls during interrupt scenarios, as described in the Digital Measurement Interruption Scenarios section.

How to get the opt-out URL from App SDK?

Once App SDK is initialized successfully, use the API public String userOptOutURLString() to retrieve the opt-out URL.

Metadata

What is ChannelName? Is it the value in the Nielsen MediaView dictionary, or is it a value that corresponds to television?

ChannelName is the name of the channel of the playing content (e.g., MTV, ESPN, etc.). iOS and Android apps are required to inform the App SDK which channel/content is being played on the player side. Refer to the play API for more details.

What value should "tag" carry?

Tag refers to the Channel name. Channel Name can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.

What format does the length field need to be in? (Minutes, Seconds, Milliseconds, etc.)?

Length of content should be in seconds.

What are the possible types of play that can be sent?

'Type' refers to the type of play event like preroll, midroll, postroll, content, or static.

What values are typically passed for ChannelInfo?

ChannelInfo refers to the Channel name. This can be a free-form value (a friendly name for the content being played). If no name is available, pass the URL string of the content being played.

Example: ESPN or http://www.XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX.m3u8

What should the value be for "Episode Name"? If full video name in CMS is: "The Good Wife – The Deep Web" Should we instead concatenate it to just be "The Deep Web"?

Episode Name can be client-defined or used as it appears in the CMS. The default aggregation for reporting is Episode > Program > Channel. Program Name is defined and reported based on the category parameter, so the program name need not be included within the "Episode Name" field.

Should season and episode numbers be included in "Episode Name"? Is it the best practice to include this information?

Nielsen can now report down to the episode level. Having the ability to distinguish episodes for reports is recommended. Season/Episode numbers can be used if preferred.

Which of the following "Episode Names" best suits the short form content like http://www.cbs.com/shows/the_good_wife/video/43F8FA6E-10F1-F508-1909-C3E3150C8C7D/the-good-wife-a-true-politician/ ?

  • "The Good Wife – A True Politician"
  • "A True Politician"
  • "The Good Wife – The Deep Web"
  • "The Deep Web"

Is 'type' always hard-coded to have a value 'content'?

Depending on the type of content being played, 'type' must be set to 'content', 'preroll', 'midroll' or 'postroll' to define content, and 'ad' for ads.

What are the mandatory parameters to initialize the SDK?

appName, appVersion, sfcode and appid are mandatory parameters to initialize the SDK.

Ads

Is Nielsen SDK dependent on Ad Support Framework? Does Nielsen support mobile apps without ads?

No, it is not dependent. Nielsen SDK supports mobile apps without ads.

How should the playhead position be sent under adbreaks (Stop sending the playhead or send the playhead for the ad break time)?

App should always send the playhead position regardless of ad break. The App should

  • Call stop() before starting an ad break.
  • Call loadMetadata() to load ad.
  • Once the ad break is complete, call stop and loadMetadata(content).

To summarize, call loadMetadata() for every asset to load both content and ads, call stop() when the current asset changes or completes, and call setPlayheadPosition() all the time.
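The per-asset sequence above can be sketched as a pure function that lists the SDK calls for an ad break; the class name and metadata placeholders are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative: the ordered SDK calls for "content -> ad break -> content",
// following the rule "loadMetadata() per asset, stop() on each asset change".
class AdBreakSequence {
    static List<String> callsFor(List<String> adMetadata, String contentMetadata) {
        List<String> calls = new ArrayList<>();
        for (String ad : adMetadata) {
            calls.add("stop");               // current asset ends
            calls.add("loadMetadata:" + ad); // load the next ad
        }
        calls.add("stop");                            // ad break complete
        calls.add("loadMetadata:" + contentMetadata); // back to content
        // setPlayheadPosition() keeps being called throughout (not shown)
        return calls;
    }
}
```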

When running a DAR campaign, how is the "DAR Tag" added into the metadata object?

Nielsen DAR tag, or Nielsen DAR beacon, may come in different forms from ad service via VAST XML or ad service integrated player framework library. The Android developer should

  • Identify a Nielsen beacon from the ad server response
  • Create JSON metadata with Nielsen beacon
    JSON: { "type": "ad", "ocrtag": "" }

Our video players currently pass DAR information through the use of ad 1x1s that are fired off as each ad plays. Here's an example: http://secure-us.imrworldwide.com/cgi-bin/m?ci=nlsnci370&am=3&at=view&rt=banner&st=image&ca=nlsn11134&cr=crtve&pc=cbs_plc0001&ce=cbs&r=%n Do we need to forego this approach and use the DAR tag with the SDK? Will the 1x1s be sufficient?

For online (web) DAR, there is no need to use the SDK. For mobile DAR, since the IDFA / Google Advertising ID must be accessed, follow one of the approaches below when the app is tagged. Either handle the IDFA collection yourself and append that data to the same OCR tag, using the mDAR alternative method, or use the App SDK to make the request: the same tag URL (1×1) would be used. Pass the DAR tag (AS-IS) and the App SDK will handle the retrieval and concatenation.

Miscellaneous

How to handle the delay in HLS streams from the actual live stream (often by at least 30 seconds)?

This delay need not be considered since it will apply to all stream viewing instances.

What is Adobe VHL Joint SDK?

Adobe VHL Joint SDK is a collaboration between Adobe and Nielsen to measure video viewership, app launches, app crashes, advertisements, page views, and more. Adobe VHL Joint SDK allows clients to obtain video demographic and volumetric (quantity) data along with published Nielsen ratings.

Does the App SDK generate HTTP traffic immediately once the call is made to the SDK? How do we know if a correct call to SDK is made or not?

When the SDK starts up, there is an initial ping. HTTP traffic can be captured within a few seconds of passing the OCR tag to the SDK. For other products like mTVR, the ping is immediate upon start of the video stream.

Are the Privacy Policy and opt-out URLs the same? If not, is there a method that delivers the Privacy Policy URL?

No, they are not the same. A link to the Nielsen opt-out page goes into the app itself. This URL is never hardcoded into the app; use the userOptOutURLString() method to pull it from the App SDK. For our Privacy Policy, refer to Nielsen Privacy Requirements.

Is there a separate Terms & Conditions page to link to, apart from privacy policy URL? Is there a method that delivers this URL?

Terms & conditions are present in the Privacy Policy page.

iOS

Is there any debug facility to see if App SDK APIs are being called successfully?

Include nol_devDebug in the SDK initialization call to check whether App SDK API calls are successful.

NSDictionary* appInformation = @{
            @"appid": appid,
            @"appversion": appversion,
            @"appname": appname,
            @"sfcode": sfcode,
            @"nol_devDebug": @"INFO"};

Note: DO NOT activate the Debug flag in production environment.

We are currently using App SDK 4.0.0 (for iOS). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe?

Perform the following steps to implement Adobe VHL Joint SDK.

  • Remove the NielsenAppApi.framework from the existing project since the Adobe VHL Plugin already bundles the Nielsen SDK components.
  • Import the NielsenAppApi header file into the class in order to ensure access to all of the Nielsen methods.
    • Contact Nielsen Technical Account Manager (TAM) to get the latest Nielsen header file.
  • Implement new products using the Adobe VHL methods. This keeps the previous Nielsen integration intact and allows capturing both Nielsen and Adobe tracking metrics.

Why does the SDK throw a 'selector not recognized' exception while calling loadMetadata for iOS?

This is an issue with the Apple linker. For 64-bit and iPhone OS applications, a linker bug prevents -ObjC from loading object files from static libraries that contain only categories and no classes. When those methods are found missing at runtime, a 'selector not recognized' exception results. To fix this issue, add the -all_load or -force_load flag to Other Linker Flags in the build settings for the application target.

  • -all_load forces the linker to load all object files from every archive it sees, even those without Objective-C code.
  • -force_load allows finer grain control of archive loading and is available in Xcode 3.2 and later. Each -force_load option must be provided a path to an archive, and every object file in that archive will be loaded.
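In Xcode terms, the fix above amounts to a build-setting entry like the following sketch (the -force_load archive path is a hypothetical example; use the path of the static library that triggers the issue):

```
Other Linker Flags (OTHER_LDFLAGS): -all_load
    (or)
Other Linker Flags (OTHER_LDFLAGS): -force_load $(PROJECT_DIR)/Libraries/libExample.a
```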


Android

How to upgrade from pre-DCR to DCR?

The main changes required in existing client apps (integrated with the Pre-DCR SDK) for integration with the DCR SDK are as follows.

  • Removal of C/C++ code
  • Removal of Singleton design for SDK
  • Addition / removal of a few public APIs

Click here to view the guidelines / steps for integrating DCR SDK for existing Client Apps.

Initialization of App SDK object
String config = "{"+ "\"appName\" : \"" + "AppName" + "\","
                    + "\"appVersion\" : \"" + "1.0" + "\","
                    + "\"sfcode\" : \"" + "uat-cert" + "\","
                    + "\"appid\" : \"" + "PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
                    + "\""
                    + "}";
AppSdk mAppSdk = new AppSdk(context, config, iappNotifier);
if (mAppSdk == null || !AppSdk.isValid())
{
          Log.e(TAG, "Failed in creating the App SDK framework");
          return;
}

Is there any debug facility to see if AppSDK APIs are being called successfully?

Yes, you can use the Debug flag to check whether an App SDK API call is successful. To activate it, pass "nol_devDebug" : "I" in the JSON configuration string when initializing the App SDK object. Once the flag is active, each API call made and the data passed are logged. The log created by this flag is minimal. DO NOT activate the Debug flag in a production environment.
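As a sketch, the configuration string from the initialization example above looks like this with the debug flag added (the appid placeholder is as shown earlier; remove the flag before shipping):

```java
// Illustrative: the initialization config JSON with nol_devDebug enabled.
class DebugConfig {
    static String withDebug() {
        return "{"
                + "\"appName\" : \"AppName\","
                + "\"appVersion\" : \"1.0\","
                + "\"sfcode\" : \"uat-cert\","
                + "\"appid\" : \"PXXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX\","
                + "\"nol_devDebug\" : \"I\"" // DO NOT ship this flag in production
                + "}";
    }
}
```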

Does App SDK need Google Play services? If Yes, why?

Yes, App SDK needs Google Play services to be included in the Android project. App SDK needs this to fetch the Advertising ID on Google devices. For Amazon devices, App SDK uses the Android ID for this purpose.

Google Play services project is bulky, do we have a list of classes that App SDK uses?

Yes, you need to edit the Google Play services project / lib so that App SDK uses only the following classes / packages.

  • Library
    • google-play-services_lib
  • Classes / package
    • com.google.android.gms.ads.identifier.AdvertisingIdClient;
    • com.google.android.gms.ads.identifier.AdvertisingIdClient.Info;
    • com.google.android.gms.common.ConnectionResult;
    • com.google.android.gms.common.GooglePlayServicesUtil;
    • com.google.android.gms.common.GooglePlayServicesRepairableException;
    • com.google.android.gms.common.GooglePlayServicesNotAvailableException

We are currently using AppSDK 4.0.0 (for Android). What is the recommended way to switch to Adobe VHL Joint SDK without affecting our current implementation? (OR) How do we get tracking metrics for both Nielsen and Adobe?

Perform the following steps to implement Adobe VHL Joint SDK.

  • Remove existing Nielsen-only SDK components (appsdk.jar) from the app's project. Adobe VHL SDK (VideoHeartbeat.jar) already bundles Nielsen SDK components.
  • Update the classpath / build settings to point to the latest Adobe VHL SDK jar. Remove the Nielsen-only appsdk.jar from the classpath to avoid runtime issues, as the API definitions changed in the SDK version bundled in the Adobe VHL jar.
  • Include the Adobe VHL Joint SDK component VideoHeartbeat.jar in the app's project.
  • Concentrate on implementing new measurement products like DCR (Digital Content Ratings) using Adobe VHL APIs to get both Nielsen and Adobe tracking metrics.

Browser

How to upgrade from pre-DCR to DCR?

Perform the following steps to upgrade from pre-DCR to DCR.

  • Modify the current plugin URL to point to the latest URL provided by the TAM. (change from ggcmb400 URL to the ggcmb500 URL)
  • Modify the apid and the sfcode to the latest provided by your TAM

Note: If the implementation is done via a plugin, make the changes on the Plugin UI

  • Initialize the SDK as follows.
var gg1 = NOLCMB.getInstance("unique_string");
gg1.ggInitialize(_nolggGlobalParams);

Note: To measure static content on a page and integrate the SDK with a player for video measurement, it is recommended to create one SDK instance.

  • To also enable static (page) measurement,
    • Repeat the above three steps across all the website page templates, and
    • Enable the static page event along with the static metadata (example below) towards the bottom of each page (as close to </body> as possible).
sendNielsenEvent('14', {type: "static", assetid: "static123", section: "sitesection", segA: "Segment1", segB: "Segment2", segC: "Segment3"});