Important: The information in this article is restricted and should not be shared with anyone outside of Sizmek.


Data filtration excludes robot traffic from data processing for reporting and billing.

Levels of Filtration

The Sizmek data validation process is based on several levels of filtration.

IAB Blacklist

Sizmek is fully integrated with the IAB: every call (for example, an impression or click) to the Sizmek server is cross-referenced against the IAB blacklist by user-agent or IP. If there is a match, the call is filtered to the robots logs. Sizmek retrieves the list from the IAB whenever the list is updated.
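The cross-referencing described above can be sketched as follows. This is an illustrative outline only, not Sizmek's actual implementation; the list contents, names, and routing labels are hypothetical.

```python
# Hypothetical sketch: cross-reference an incoming server call against an
# IAB-style blacklist by user-agent and by IP. Matched calls are routed
# to the robots logs instead of normal processing.

IAB_BLACKLISTED_USER_AGENTS = {"BadBot/1.0", "crawler-x"}  # sample entries
IAB_BLACKLISTED_IPS = {"203.0.113.7"}                      # sample entries

def is_iab_blacklisted(user_agent: str, ip: str) -> bool:
    """Return True if the call matches the blacklist by user-agent or IP."""
    return user_agent in IAB_BLACKLISTED_USER_AGENTS or ip in IAB_BLACKLISTED_IPS

def route_call(user_agent: str, ip: str) -> str:
    # A matched call is filtered to the robots logs; otherwise it is
    # processed normally for reporting and billing.
    return "robots" if is_iab_blacklisted(user_agent, ip) else "valid"
```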

Blanks

When a tag is called and the server has no answer or ad to return, Sizmek logs the call as a blank.

Unknown User-Agent Filtration

If the on-site setup feature Filtered Unknown User-Agent is enabled, data from all uncommon user-agents (anything other than desktop, mobile, or tablet) is filtered out.
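A minimal sketch of this behavior, assuming a simple keyword-based device classifier (the classification rules below are illustrative, not Sizmek's actual logic):

```python
# Hypothetical sketch of Filtered Unknown User-Agent: when the feature is
# enabled, only calls classified as desktop, mobile, or tablet are kept.

KNOWN_DEVICE_TYPES = {"desktop", "mobile", "tablet"}

def classify_device(user_agent: str) -> str:
    """Crude keyword classification; real user-agent parsing is far richer."""
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "mobile" in ua or "android" in ua or "iphone" in ua:
        return "mobile"
    if "windows" in ua or "macintosh" in ua or "x11" in ua:
        return "desktop"
    return "unknown"

def keep_event(user_agent: str, filter_unknown_enabled: bool) -> bool:
    # With the feature disabled, nothing is filtered on this level.
    if not filter_unknown_enabled:
        return True
    return classify_device(user_agent) in KNOWN_DEVICE_TYPES
```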

IP Lists

  • Whitelisted user IPs: Events coming from whitelisted IPs are considered non-robot, regardless of traffic thresholds.
  • Blacklisted user IP ranges: Events coming from blacklisted IP ranges, or from IP ranges that represent cloud traffic (such as those starting with 54), are considered robot traffic, regardless of traffic thresholds.
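The precedence of these checks can be sketched as below. The sample ranges and return labels are hypothetical; only the 54.x cloud range is taken from the text above.

```python
import ipaddress

# Hypothetical sketch of the IP list checks: whitelisted IPs always pass,
# blacklisted ranges (including cloud ranges such as 54.0.0.0/8) are
# always treated as robot traffic, and everything else falls through to
# the threshold logic.

WHITELISTED_IPS = {"198.51.100.10"}          # sample whitelist entry
BLACKLISTED_RANGES = [
    ipaddress.ip_network("54.0.0.0/8"),      # cloud traffic (starts with 54)
    ipaddress.ip_network("203.0.113.0/24"),  # sample blacklisted range
]

def classify_ip(ip: str) -> str:
    """Return 'non-robot', 'robot', or 'check-thresholds'."""
    if ip in WHITELISTED_IPS:
        return "non-robot"           # regardless of traffic thresholds
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in BLACKLISTED_RANGES):
        return "robot"               # regardless of traffic thresholds
    return "check-thresholds"        # defer to the threshold logic
```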

Thresholds (DB)

The system uses internal logic to filter out abnormal activity.

Note: See Sizmek Source for more information regarding thresholds.
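One common shape for such logic is a per-IP event count compared against a threshold. The sketch below is a hypothetical illustration; the threshold value, the aggregation window, and the function names are assumptions, since the actual internal logic is not documented here.

```python
from collections import Counter

# Hypothetical sketch of threshold-based filtering: if an IP produces
# more events in the window than the threshold allows, its events are
# treated as robot traffic.

EVENTS_PER_HOUR_THRESHOLD = 1000  # illustrative value only

def find_robot_ips(hourly_event_ips: list[str]) -> set[str]:
    """Given one hour of event IPs, return the IPs exceeding the threshold."""
    counts = Counter(hourly_event_ips)
    return {ip for ip, n in counts.items() if n > EVENTS_PER_HOUR_THRESHOLD}
```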

Filtered Data Mapping

  • Server filtration (EDS): SizmekIP, IABBlacklist, and blank calls are filtered to the Server Robots folder.
  • DB filtration (GP): When users exceed the thresholds, their static IPs are filtered to the robots table.
  • S3: When users exceed the thresholds, their static IPs are filtered to the robots folder.

Note: The system saves a reason for Server and DB filtration, but not for S3.
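The mapping above, including the fact that a reason is saved for Server and DB filtration but not for S3, can be sketched as follows. The source codes are taken from the list above; the destination strings and function name are illustrative.

```python
# Hypothetical sketch of the filtered-data mapping: each filtration
# source routes to its own destination, and only Server (EDS) and DB
# (GP) filtration record a reason.

def map_filtered_event(source: str, reason: str) -> dict:
    if source == "EDS":   # server filtration
        return {"destination": "Server Robots folder", "reason": reason}
    if source == "GP":    # DB filtration
        return {"destination": "robots table", "reason": reason}
    if source == "S3":
        return {"destination": "robots folder", "reason": None}  # no reason saved
    raise ValueError(f"unknown filtration source: {source}")
```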
