- Design, build, and support big data infrastructure and an end-to-end platform for customer analytics needs
- Design, build, and support batch processing and streaming data pipelines
- Write and maintain extract, transform, and load (ETL) scripts
- Work with cross-functional stakeholders to understand their analytics needs and build efficient, scalable data solutions
- Own data quality for assigned areas
- Triage infrastructure issues and drive them to resolution
- Bachelor’s or Master’s Degree in Computer Science or Software Engineering
- Proficient in computer science fundamentals, algorithms & data structures
- Proficient in programming languages such as Java and Python
- Experience with database technologies (SQL/NoSQL) and big data technologies (Hadoop, Hive)
- Nice to have: experience with cloud platforms such as AWS, GCP, or Azure
- Proficient in architecting and designing batch processing and streaming data pipelines using Spark and Kafka
- Familiar with working in an agile software development framework with test-driven development approaches
- Excellent communication, analytical, and problem-solving skills
ABOUT WESTERN DIGITAL
The future. It’s on you. You & Western Digital.
We’ve been storing the world’s data for more than 50 years. Once, storage was the most important thing we could do for data. Now we’re helping the world capture, preserve, access, and transform data in a way only we can.
The most game-changing companies, consumers, professionals, and governments come to us for the technologies and solutions they need to capture, preserve, access, and transform their data.
But we can’t do it alone. Today’s exceptional data challenges require your exceptional skills. It’s You & Us. Together, we’re the next big thing in data.
Western Digital® data-centric solutions are found under the G-Technology™, HGST, SanDisk®, Upthere™, and WD® brands.
SNDK Data Analytics Engineering (G&A)
SanDisk India DDC Pvt Ltd
Requisition # JR-0000059193