Who We Are
DoubleVerify is an Israeli-founded big data analytics company (NYSE: DV). We track and analyze tens of billions of ads every day for the world's biggest brands, including
Nike, Apple, Disney, and Vodafone, and we partner with leading platforms such as
Meta, TikTok, YouTube, and others. If you've seen an ad on your phone, laptop, or digital TV, we likely analyzed it.
We operate at massive scale: we handle over 100B events per day and over 1M RPS at peak, process events in real time at millisecond latencies, and analyze over 2.5M video years
every day. We verify that ads are fraud-free, appear next to appropriate content, and reach people in the right geography, and we measure viewability and user engagement throughout the ad's lifecycle.
We are global, with HQ in NYC and R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium, and San Diego. We work in a fast-paced environment and have plenty of challenges to solve. If you like working at huge scale and want to help us build products with a major impact on the industry and the web, then your place is with us.
What Will You Do
As a Senior Big Data Engineer, you will take a central technical leadership role in designing and implementing our new big data lakehouse infrastructure (petabytes of data) as part of our effort to migrate our current on-prem data solutions to Google Cloud (GCP).
You will own the data strategy. As such, you will learn how the data serves our goals; find ways to improve processes that handle tens of billions of events and terabytes of data while maintaining high data quality; and conduct proofs of concept with the latest data tools. In doing so, you will help our clients make smarter decisions that continuously improve their ad-impression quality.
You will work with a wide array of languages and technologies, such as GCP, Databricks, Spark, Python, Scala, SQL, BigQuery, Vertica, Kafka, Docker, Kubernetes, GitLab, and more.
We believe in our people's ability to take things end to end: working with product managers, designing solutions, and continuously building, deploying, and analyzing data. In short, getting things done.
Who You Are
- A team player who can work independently, with good communication skills.
- Actively seeks ways to improve software processes and interactions.
- A versatile developer with decision-making capability and a getting-things-done attitude.
- 4+ years of experience with one of the following languages: Python, Scala or Java.
- Hands-on experience with one of the following: Kafka/Kafka Streams, Spark, Flink, or Beam.
- Extensive experience working with SQL/NoSQL databases and data warehouses such as Databricks, BigQuery, Snowflake, Redshift, Vertica, etc.
- Experience working with public cloud providers such as GCP, AWS, or Azure.
- Interest in learning about the ad tech industry and a general willingness to learn new things.