
Data Engineer (Latin America/Remote, Non-U.S.)

Company: PulsePoint


Details of the offer

PulsePoint is a leading technology company that uses real-world data in real-time to optimize campaign performance and revolutionize health decision-making. Leveraging proprietary datasets and methodology, PulsePoint targets healthcare professionals and patients with an unprecedented level of accuracy—delivering unparalleled results to the clients we serve. The company is now a part of Internet Brands, a KKR portfolio company and owner of WebMD Health Corp.
The PulsePoint Data Engineering team plays a key role in a technology company that's experiencing exponential growth. Our data pipeline processes over 80 billion impressions a day (more than 20 TB of data, 220 TB uncompressed). This data is used to generate reports, update budgets, and drive our optimization engines. We do all this while running against extremely tight SLAs, providing stats and reports as close to real time as possible.
The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth. We are always seeking new and better tools to help us meet challenges such as adopting proven open-source technologies to make our data infrastructure more nimble, scalable and robust. Some of the cutting-edge technologies we have recently implemented are Kafka, Spark Streaming, Presto, Airflow, and Kubernetes.
What you'll be doing:
Design, build, and maintain reliable and scalable enterprise-level distributed transactional data processing systems for scaling the existing business and supporting new business initiatives
Optimize jobs to utilize Kafka, Hadoop, Presto, Spark, and Kubernetes resources in the most efficient way
Monitor and provide transparency into data quality across systems (accuracy, consistency, completeness, etc.)
Increase accessibility and effectiveness of data (work with analysts, data scientists, and developers to build/deploy tools and datasets that fit their use cases)
Collaborate within a small team with diverse technology backgrounds
Provide mentorship and guidance to junior team members
Team Responsibilities:
Ingest, validate, and process internal and third-party data
Create, maintain, and monitor data flows in Spark, Hive, SQL, and Presto for consistency, accuracy, and lag time
Maintain and enhance the framework for jobs (primarily aggregate jobs in Spark and Hive)
Tool evaluation/selection/implementation
Backups/retention/high availability/capacity planning
Review and approve DDL for databases, Hive framework jobs, and Spark Streaming jobs to make sure they meet our standards
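As an illustration of the data-flow monitoring described above (checking completeness and lag time against tight SLAs), here is a minimal sketch in plain Python. All function names, thresholds, and numbers are invented for this example and are not PulsePoint's actual tooling:

```python
from datetime import datetime, timedelta

def check_partition(expected_rows: int, actual_rows: int,
                    partition_ts: datetime, now: datetime,
                    max_lag: timedelta = timedelta(hours=2),
                    min_completeness: float = 0.99) -> list[str]:
    """Return human-readable alerts for one data partition; empty list means healthy.

    Checks two of the qualities mentioned above: completeness
    (actual vs. expected row counts) and lag time against an SLA.
    """
    alerts = []
    completeness = actual_rows / expected_rows if expected_rows else 0.0
    if completeness < min_completeness:
        alerts.append(f"completeness {completeness:.2%} is below the {min_completeness:.0%} target")
    lag = now - partition_ts
    if lag > max_lag:
        alerts.append(f"lag {lag} exceeds the {max_lag} SLA")
    return alerts

# Example: a partition that is complete enough but three hours late,
# so only the lag check fires.
alerts = check_partition(
    expected_rows=1_000_000, actual_rows=999_500,
    partition_ts=datetime(2024, 5, 14, 9, 0),
    now=datetime(2024, 5, 14, 12, 0),
)
```

In a real pipeline, a check like this would run per partition from the scheduler and push its alerts to a monitoring system such as the Graphite/Beacon setup listed below, rather than returning strings.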
Technologies We Use:
Airflow - job scheduling
Docker - packaged container images with all dependencies
Graphite/Beacon - monitoring data flows
Hive - SQL data warehouse layer for data in HDFS
Presto - fast parallel data warehouse and data federation layer
SQL Server - reliable OLTP RDBMS
GCP BigQuery - cloud data warehouse
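For readers unfamiliar with Airflow's role in a stack like this, a minimal DAG definition for the kind of ingest-then-aggregate scheduling described above might look like the following sketch. The DAG id, schedule, task names, and shell commands are hypothetical placeholders, not PulsePoint's actual jobs:

```python
# Hypothetical Airflow DAG: schedule a raw-log ingest followed by a
# Spark aggregate job every hour. Requires an Airflow 2.x installation.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",   # invented owner name
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="hourly_impression_aggregates",  # invented DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    default_args=default_args,
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_logs",
        bash_command="echo 'pull raw impression logs from Kafka'",  # placeholder
    )
    aggregate = BashOperator(
        task_id="run_spark_aggregate",
        bash_command="echo 'spark-submit aggregate_job.py'",  # placeholder
    )
    ingest >> aggregate  # aggregate runs only after ingest succeeds
```

This is scheduling configuration rather than standalone code: Airflow's scheduler discovers the file, materializes an hourly run, and enforces the ingest-before-aggregate dependency with retries.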
Requirements:
5+ years of data engineering experience
Fluency in Python; experience in Scala/Java is a huge plus (polyglot programmers preferred!)
Proficiency in Linux
Strong understanding of RDBMSs and SQL
Passion for engineering and computer science around data
Knowledge of and exposure to distributed production systems (e.g., Hadoop) is a huge plus
Knowledge of and exposure to cloud migration is a plus
Willing and able to work East Coast U.S. hours (9 am-6 pm EST); you can work remotely
Willingness to participate in 24x7 on-call rotation
Selection Process:
1) Initial Screen (30 mins)
4) Team Interview (60 mins + 3 x 45 mins) + SVP of Engineering (15 mins)
WebMD and its affiliates are an Equal Opportunity/Affirmative Action employer and do not discriminate on the basis of race, ancestry, color, religion, sex, gender, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law.


Source: Jobleads
