
CLIENT OVERVIEW

Our client is a leading digital media trading company providing highly targeted web advertising services. The client's media trading platform provides the technology, strategy, services, and insights to optimize interactions across the leading display advertising exchanges on behalf of advertisers. The platform allows marketers to activate data, automate execution, and transform the way marketing performs.

ENGAGEMENT SITUATION

Our client had partnered with major display exchanges to optimize consumer interactions across multiple channels. As the number of advertiser and ad-exchange partners grew, data volumes increased rapidly from roughly 50 GB to 2.5-3 TB, leaving the existing systems slow and ineffective against the rapid flow of data. Analyzing this data in real time was imperative for the client to gauge campaign performance for individual partners and end customers.

Due to the lack of effective data gathering and management, the client was losing potential customers and new opportunities. This created an immediate requirement for a robust data warehouse with reporting capabilities. The client wanted to merge and transform the data received from various ad exchanges and enable easy, common analysis for quick decision making and action on the intelligence gained. As an intermediary between ad agencies and advertisers, the client's business required extensive data analysis to ensure the success of its campaigns.

In addition to these data management needs, our client was looking for operations support for their production system.

KEY REQUIREMENTS

  • Implement a scalable data warehouse to handle data from an increasing number of advertisers and ad exchanges
  • Implement a data management process and a complete analytics system to enable deriving insights from the received data
  • Convert the existing web-based application into an SOA application
  • Provide QA and testing for the operations system
  • Design the system to accommodate rapidly increasing data volumes, from 50-80 GB to 1.5-3 TB
  • Migrate all data from the Oracle database to another environment for scalability and flexibility
  • Maintain data quality and ensure a maximum 1% error rate
  • Re-architect the existing web-based application for scalability and robustness
  • Develop an intuitive UI that seamlessly captures and retrieves data from the database

XORIANT CONTRIBUTION

After a thorough analysis of the client's requirements, Xoriant assembled three teams for this engagement: data warehousing, GUI engineering, and QA experts. The team delivered a unified, powerful data warehousing solution that ingests structured and unstructured information as well as operational and transactional data in real time. Xoriant's Big Data and analytics experts developed a customized attribution module addressing the client's unique campaign coverage and real-time event collection to gauge campaign performance. This required thorough data collection, data cleansing, and runtime analytics on the incoming volumes of data. At the customer's request, our experts migrated the data from Oracle to Netezza (version 6.0.3). Given the complexity and dynamically changing requirements, the Xoriant development team leveraged the Xoriant Continuous Delivery Accelerator (XCDA) to deliver a highly scalable solution. Continuous integration and rapid deliveries ensured transparent, error-free reporting and powerful analytics that provide insight into the performance drivers.

KEY CONTRIBUTIONS

  • Using Oracle Warehouse Builder (OWB), the team designed the databases and data warehousing processes such as ETL, along with other data management jobs
  • Used the cron scheduler for jobs that automate system maintenance and administration
  • The database development team built database objects and related scripts using Unix shell scripting, and developed and tested specific data warehouse processes. Adhering to proven in-house methodologies, the Xoriant team consistently maintained data quality with a minimum of 99% accuracy
  • Developed daily and weekly automated jobs using cron and Oracle Workflow (OWF) to execute the transformation process. Once data was populated in the warehouse schema with validation checks, TIBCO Spotfire was used to report actual ad conversions against impressions on a daily basis
  • Created new data models and migrated the code to the Netezza environment
  • This enabled the client to respond rapidly and effectively to the intelligence gained from structured and unstructured data, without any downtime
  • Migrated the ETL data warehouse from OWB to IBM Netezza and Hadoop for archiving and managing large volumes of data, with 24x7 production support
  • Designed and implemented the attribution process in Impala to map the latest impressions to click data, enabling faster processing of over 15 TB of data every hour
  • Used Hive on the Hadoop Distributed File System (HDFS) to process ~50 million rows of data every hour, and developed scrubbing logic for deduplication and IP scrubbing
  • Developed the Netezza data replication process from the production server to the reporting server
  • Using Perl, PHP, and Python, the team developed an intuitive UI that captures ad-specific information such as likes, number of clicks, impressions, action servers, users viewing the ad, and ad usage. This data is retrieved in CSV format and stored in the database. Scripts written in Perl fetch data from the APIs of the RMX (Right Media Exchange) and ADX (Ad Exchange) processes
  • Used MicroStrategy for reporting and visualizations
  • Implemented the ABE concept for major BI reports such as Merkle and Artemis
  • Leveraged XCDA's comprehensive "Continuous Quality Control" module, which provided a unique way to create automated testing scripts in Sahi Pro and JIRA
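To illustrate the last-touch attribution step described above (mapping each click to the latest preceding impression), here is a minimal Python sketch. In production this logic ran as Impala queries over much larger volumes; the tuple layout `(timestamp, user, ad)` here is an assumption for illustration, not the client's actual schema:

```python
from bisect import bisect_right

def attribute_clicks(impressions, clicks):
    """For each click, find the latest impression for the same (user, ad)
    pair that occurred at or before the click time (last-touch attribution).
    Records are (timestamp, user_id, ad_id) tuples; layout is illustrative."""
    # Index impression timestamps per (user, ad), sorted by time.
    by_key = {}
    for ts, user, ad in sorted(impressions):
        by_key.setdefault((user, ad), []).append(ts)

    attributed = []
    for ts, user, ad in clicks:
        times = by_key.get((user, ad), [])
        # Rightmost impression timestamp <= click timestamp, if any.
        i = bisect_right(times, ts) - 1
        attributed.append((user, ad, ts, times[i] if i >= 0 else None))
    return attributed

impressions = [(100, "u1", "a1"), (180, "u1", "a1"), (250, "u1", "a1")]
clicks = [(200, "u1", "a1")]
print(attribute_clicks(impressions, clicks))  # → [('u1', 'a1', 200, 180)]
```

The per-key sorted index makes each click lookup a binary search rather than a scan, which is the same idea the SQL-side windowed join exploits at scale.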
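The deduplication and IP-scrubbing step can be sketched in a few lines of Python. The field names, the dedup key, and the zero-the-host-octet-then-hash rule are all hypothetical stand-ins; the actual Hive scrubbing logic is not described in detail in this case study:

```python
import hashlib
import ipaddress

def scrub_ip(ip: str) -> str:
    """Anonymize an IPv4 address: zero the host octet (/24 network),
    then hash it so raw IPs never reach the warehouse.
    Illustrative policy only, not the client's actual rule."""
    net = ipaddress.ip_network(f"{ip}/24", strict=False)
    return hashlib.sha256(str(net.network_address).encode()).hexdigest()[:16]

def dedupe_events(rows):
    """Drop repeated events, keyed on (user, ad, event type, timestamp);
    scrub the IP of every surviving row."""
    seen = set()
    for row in rows:
        key = (row["user_id"], row["ad_id"], row["event"], row["ts"])
        if key not in seen:
            seen.add(key)
            row["ip"] = scrub_ip(row["ip"])
            yield row

events = [
    {"user_id": "u1", "ad_id": "a9", "event": "click", "ts": "t1", "ip": "203.0.113.7"},
    {"user_id": "u1", "ad_id": "a9", "event": "click", "ts": "t1", "ip": "203.0.113.7"},  # duplicate
]
clean = list(dedupe_events(events))
print(len(clean))  # → 1
```

In Hive the same effect is typically achieved with a `GROUP BY` or windowed `row_number()` over the dedup key plus a UDF for the scrubbing.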
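The exchange-API fetch scripts were written in Perl; the parse-and-load half of that flow might look like the following Python sketch. The column names are placeholders, not the actual RMX/ADX report schema:

```python
import csv
import io

def parse_exchange_csv(payload: str):
    """Parse a CSV payload (as returned by an exchange reporting API)
    into typed dict rows ready for database insertion.
    Column names are illustrative, not the real RMX/ADX schema."""
    reader = csv.DictReader(io.StringIO(payload))
    return [
        {
            "ad_id": r["ad_id"],
            "clicks": int(r["clicks"]),           # cast counters to int
            "impressions": int(r["impressions"]),  # before loading
        }
        for r in reader
    ]

# A response body as the fetch script might receive it:
sample = "ad_id,clicks,impressions\na1,3,120\na2,0,45\n"
rows = parse_exchange_csv(sample)
print(rows[0]["clicks"])  # → 3
```

The real scripts would wrap this with an authenticated HTTP fetch per exchange and a bulk insert into the staging schema.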

ARCHITECTURE DIAGRAM

TOOLS AND TECHNOLOGIES

  • Netezza Release 6.0.3
  • PostgreSQL
  • Oracle 10g
  • Oracle Warehouse Builder
  • Oracle Workflow
  • PL/SQL
  • Unix shell scripting
  • Cron scheduler
  • WinRunner
  • Selenium
  • Perl
  • Python
  • PHP
  • Java/ J2EE
  • JIRA
  • Sahi Pro

RESULTS/CLIENT BENEFITS

  • The new data warehouse implementation reduced data analysis time from 20 hours to 3 hours
  • Rapid, accurate analysis provided a cost-effective way of reaching potential customers and buyers, which increased the customer base significantly
  • 24x7 end-to-end production and operations support handled dynamically changing scenarios from user to user

CLIENT APPRECIATIONS

Technical Product Manager – "Thanks for delivering all of the caught-up data on time. This is a huge accomplishment for us and we could not have done it without ebw's commitment. We are excited about the accomplishments of the development focus."

VP, Information Strategy – "Excellent job. Thanks for your persistence in finding a solution that supports our business. This allows the reporting team to turn their attention on a major pillar of our business which is insight development."
