Duration: 5 months (likely to extend)
Start Date: ASAP
Location: Westbrook, ME (2x per week onsite)
W2/C2C: W2 Only
Additional Notes:
- This role requires an on-call rotation of approximately one week out of every five (roughly one week per month)
- The on-call coverage is the primary reason for the role; the product requires a 24/7 support presence
TOP (3) REQUIRED SKILLSETS:
- Strong background in AWS cloud services, with a focus on data storage and pub/sub platforms (S3, SNS, SQS, Lambda, etc.)
- Proven experience in building and maintaining operational data pipelines, particularly with modern technologies like dbt, Airflow, and Snowflake.
- Strong proficiency in Python and SQL.
NICE TO HAVE SKILLSETS:
- Familiarity with Infrastructure as Code solutions like Terraform
- Experience with Quality Engineering processes
Job Description:
Key Responsibilities:
- Design and implement ingestion and storage solutions using AWS services such as S3, SNS, SQS, and Lambda.
- Develop and implement analytical solutions leveraging Snowflake, Airflow, and Apache Iceberg.
- Collaborate with cross-functional teams to understand and meet their data processing needs.
- Develop and maintain scalable and reliable data solutions that support operational and business requirements.
- Document data flows, architecture decisions, and metadata to ensure maintainability and knowledge sharing.
- Design and implement fault-tolerant systems, ensuring high availability and resilience in our data processing pipelines.
- Actively participate in testing and quality engineering (QE) processes, collaborating closely with the QE team to ensure the reliability and accuracy of data solutions.
Required Skills:
- Strong problem-solving skills and the ability to operate independently, sometimes with limited information.
- Strong communication skills, both verbal and written, including the ability to communicate complex issues to both technical and non-technical users in a professional, positive, and clear manner.
- Initiative and self-motivation with strong planning and organizational skills.
- Ability to prioritize and adapt to changing business needs.
- Proven experience in building and maintaining operational data pipelines, particularly with modern technologies like dbt, Airflow, and Snowflake.
- Strong proficiency in Python and SQL.
- Strong background in AWS cloud services, with a focus on data storage and pub/sub platforms (S3, SNS, SQS, Lambda, etc.).
- Familiarity with Infrastructure as Code solutions like Terraform is a plus.
- Familiarity with a broad range of technologies, including:
  - Cloud-native data processing and analytics
  - SQL databases, specifically for analytics
  - Orchestration and ELT technologies like Airflow and dbt
  - Scripting and programming with Python or similar languages
  - Infrastructure as Code languages like Terraform
- Ability to translate complex business requirements into scalable and efficient data solutions.
- Strong multitasking skills and the ability to prioritize effectively in a fast-paced environment.
Success Metrics:
- Meeting delivery timelines for project milestones.
- Effective collaboration with cross-functional teams.
- Maintaining high standards of data accuracy and accessibility in a fast-paced, dynamic environment.
- Reduction in data pipeline failures or downtime through resilient and fault-tolerant design.
- Demonstrated contribution to the stability and scalability of the platform through well-architected, maintainable code.
- Positive feedback from stakeholders (engineering, product, or customer-facing teams) on delivered solutions.
- Active contribution to Agile ceremonies and improvement of team velocity or estimation accuracy.
- Proactive identification and mitigation of data-related risks, including security or compliance issues.