About the Role
We are looking for a Remote Senior Software Engineer to own, maintain, and evolve our production data and machine learning platform. You will be responsible for the reliability, architecture, and scalability of the pipelines and systems that power our core data products, turning millions of data points into actionable insight. In this role you will support work ranging from electing Democrats to combating climate change worldwide. Join us and put your skills to use for innovation that powers social good.
What You'll Do
- Design, develop, and maintain scalable data pipelines and systems.
- Ensure the reliability and performance of our data platform.
- Collaborate with product teams to integrate machine learning models into production.
- Implement best practices for data governance and security.
- Contribute to the architecture and design of new features and enhancements.
Requirements
- 5+ years of experience as a software engineer, with a focus on data platforms.
- Strong proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud platforms (AWS, GCP, or Azure) and big data technologies (Hadoop, Spark).
- Knowledge of machine learning concepts and frameworks.
- Excellent problem-solving skills and a passion for social impact.
Nice to Have
- Experience with data visualization tools (Tableau, Power BI).
- Familiarity with CI/CD practices.
- Previous work in a non-profit or social impact organization.
What We Offer
- Competitive salary ranging from $140,000 to $180,000 per year.
- Fully remote work environment with flexible hours.
- Opportunities for professional development and training.
- Health and wellness benefits.
- A chance to work on projects that make a difference in the world.
Who Will Succeed Here
- Proficient in Python and Scala for building scalable data pipelines, with hands-on experience in frameworks like Apache Spark for processing large datasets.
- Strong understanding of cloud platforms (AWS, GCP, Azure) for deploying data solutions, including serverless services such as AWS Lambda and GCP Dataflow.
- Experience designing and implementing machine learning models, with a problem-solving mindset that embraces iterative development and experimentation.