Senior Data Engineer - Remote Opportunity at Cognizant
About the Role
We're hiring a Senior Data Engineer to join our team at Cognizant. This fully remote position offers you the chance to work with cutting-edge technologies and build innovative data solutions for our clients. You'll be part of a collaborative environment that values your expertise and encourages professional growth.
What You'll Do
- Independently design, build, and configure big data pipelines to specification.
- Identify and solve complex big data problems using your strong technical skills.
- Collaborate with senior developers to report and resolve issues during the build process.
- Participate in defect fixes during QA and UAT phases.
- Manage the lifecycle of code from low-level design to production and post-production support.
Requirements
- Strong technical and analytical skills to tackle complex big data challenges.
- Experience with the Spark software stack, including Python, Scala, or Java.
- Hands-on knowledge of Hadoop ecosystem and NoSQL/streaming tools, including HDFS, MapReduce, Hive, Impala, Sqoop, Pig, Oozie, HBase, Cassandra, MongoDB, and Kafka.
- Familiarity with CI/CD and DevOps concepts.
- Experience working in cloud environments (Azure, AWS, GCP) using Agile methodologies.
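As a toy illustration of the MapReduce programming model referenced in the requirements, here is a minimal pure-Python sketch (hypothetical and for orientation only; real workloads would run on Hadoop MapReduce or Spark, not in a single process):

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit a (word, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce step: group pairs by key and sum their counts.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# Tiny word-count job over two in-memory "records".
records = ["big data pipelines", "data pipelines at scale"]
word_counts = reduce_phase(map_phase(records))
```

In a distributed engine the map and reduce phases run in parallel across many workers, with the framework handling the shuffle between them; the logic per record, however, stays this simple.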
Nice to Have
- Experience with Apache Phoenix, Spark Streaming, and Cloudera tools.
- Strong communication skills and ability to manage stakeholder expectations.
- Experience in mentoring junior engineers.
What We Offer
- Competitive salary and benefits package.
- Opportunities for professional development and training.
- A supportive and diverse work environment.
- Flexible work arrangements with a focus on work-life balance.
- Access to cutting-edge technology and insight into industry trends.
This Senior Data Engineer role at Cognizant is a unique opportunity to do impactful data work remotely, with modern tooling and a supportive team behind you.
Who Will Succeed Here
- Proficient in building efficient data pipelines using Databricks and PySpark, with a deep understanding of big data processing frameworks and optimization techniques.
- Self-motivated and disciplined, thriving in a remote work environment while effectively managing time and prioritizing tasks to meet project deadlines without direct supervision.
- Extensive experience with Hadoop ecosystem tools such as HDFS, Hive, and MapReduce, plus a strong problem-solving mindset for addressing complex data challenges with scalable solutions.