Sr Data Engineer

Contract Type: Contract
Location: Westbrook, Maine
Industry: Information Technology
Contact Name: Bailey Vela
Contact Email: bvela@dewintergroup.com
Contact Phone: 669 289 3352
Date Published: 09-11-2025
Salary: $65.00 - $70.00 Hourly
Job ID: BH-37504

Title: Sr Data Engineer
Duration: 5 months (likely to extend)
Start Date: ASAP
 
Location: Westbrook, ME (2x per week onsite)
 
W2/C2C: W2 Only
 
Additional Notes:

  • This role requires an "On Call" rotation of roughly 1 week per month (about 1 week in every 5)
  • The on-call coverage is the primary reason for the role; the product requires a 24/7 support presence
 
TOP (3) REQUIRED SKILLSETS:
  • Strong background in AWS cloud services, with a focus on data storage and pub/sub platforms (S3, SNS, SQS, Lambda, etc.)
  • Proven experience in building and maintaining operational data pipelines, particularly with modern technologies like dbt, Airflow, and Snowflake (a rough sketch of this stack follows this list).
  • Strong proficiency in Python and SQL.
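
For orientation only, a minimal sketch of how this stack typically fits together. This is not the client's actual code: the DAG name, schedule, and dbt project path are assumptions, and it presumes Airflow 2.4+ with a dbt project already configured against a Snowflake profile.

```python
# Minimal illustrative Airflow DAG: run dbt models against Snowflake daily.
# All names (dag_id, paths, schedule) are placeholders, not the client's real setup.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="instrument_data_elt",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["dbt", "snowflake"],
) as dag:
    # dbt reads its Snowflake connection from profiles.yml; here we just shell out.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/instrument",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/instrument",
    )

    dbt_run >> dbt_test
```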
 
NICE TO HAVE SKILLSETS:
  • Familiarity with Infrastructure as Code solutions like Terraform
  • Experience with Quality Engineering processes
 
 
Job Description:
We are seeking a highly motivated and experienced Senior Data Engineer to join our team and support the accelerated advancement of our global instrument data pipelines. This role is pivotal in developing changes that deliver instrument data to stakeholders.
 
 
Key Responsibilities:
  • Design and implement ingestion and storage solutions using AWS services such as S3, SNS, SQS, and Lambda (see the sketch after this list).
  • Develop and implement analytical solutions leveraging Snowflake, Airflow, and Apache Iceberg.
  • Collaborate with cross-functional teams to understand and meet data needs for processing.
  • Develop and maintain scalable and reliable data solutions that support operational and business requirements.
  • Document data flows, architecture decisions, and metadata to ensure maintainability and knowledge sharing.
  • Design and implement fault-tolerant systems, ensuring high availability and resilience in our data processing pipelines.
  • Actively participate in testing and quality engineering (QE) processes, collaborating closely with the QE team to ensure the reliability and accuracy of data solutions.
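
To illustrate the ingestion bullets above, here is a minimal, hypothetical sketch of an SQS-triggered Lambda that unpacks S3 event notifications and copies newly landed objects into a curated bucket. The bucket name and the assumption that S3 events are delivered directly to the queue are illustrative, not taken from this description.

```python
# Hypothetical SQS-triggered Lambda: unpack S3 event notifications and copy
# newly landed objects into a curated bucket. Bucket names are placeholders.
import json

import boto3

s3 = boto3.client("s3")
CURATED_BUCKET = "curated-instrument-data"  # illustrative name only


def handler(event, context):
    """Lambda entry point for messages delivered by an SQS event source mapping."""
    for record in event.get("Records", []):
        body = json.loads(record["body"])          # SQS body wraps the S3 notification
        for s3_record in body.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]

            # Copy the raw object into the curated bucket under the same key.
            s3.copy_object(
                Bucket=CURATED_BUCKET,
                Key=key,
                CopySource={"Bucket": bucket, "Key": key},
            )
    # Returning normally lets Lambda delete the processed SQS messages.
    return {"copied": True}
```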
 
Required Skills:
  • Strong problem-solving skills and the ability to operate independently, sometimes with limited information.
  • Strong communication skills, both verbal and written, including the ability to communicate complex issues to both technical and non-technical users in a professional, positive, and clear manner.
  • Initiative and self-motivation with strong planning and organizational skills.
  • Ability to prioritize and adapt to changing business needs.
  • Proven experience in building and maintaining operational data pipelines, particularly with modern technologies like dbt, Airflow, and Snowflake.
  • Strong proficiency in Python and SQL.
  • Strong background in AWS cloud services, with a focus on data storage and pub/sub platforms (S3, SNS, SQS, Lambda, etc.); a brief publish/consume sketch follows this list.
  • Familiarity with Infrastructure as Code solutions like Terraform is a plus.
  • Familiarity with a broad range of technologies, including:
    • Cloud-native data processing and analytics
    • SQL databases, specifically for analytics
    • Orchestration and ELT technologies like Airflow and dbt
    • Scripting and programming with Python or similar languages
    • Infrastructure as Code languages like Terraform
  • Ability to translate complex business requirements into scalable and efficient data solutions.
  • Strong multitasking skills and the ability to prioritize effectively in a fast-paced environment.
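
As a hedged illustration of the pub/sub background called out above (not a prescribed design), the following sketch publishes an event to an SNS topic and polls a subscribed SQS queue with boto3. The topic ARN, queue URL, and event fields are placeholders.

```python
# Illustrative SNS publish / SQS consume round trip with boto3.
# The topic ARN and queue URL below are placeholders, not real resources.
import json

import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:instrument-events"                # placeholder
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/instrument-events"  # placeholder


def publish_event(instrument_id: str, status: str) -> None:
    """Publish a small JSON event to the SNS topic."""
    sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps({"instrument_id": instrument_id, "status": status}),
    )


def drain_queue(max_messages: int = 10) -> None:
    """Poll the subscribed SQS queue once and delete what was processed."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=max_messages,
        WaitTimeSeconds=10,          # long polling
    )
    for message in response.get("Messages", []):
        envelope = json.loads(message["Body"])      # SNS wraps the payload
        payload = json.loads(envelope["Message"])
        print(f"received {payload}")
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])


if __name__ == "__main__":
    publish_event("IDX-001", "results_ready")
    drain_queue()
```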
 
Preferred Background:
  • Candidates should have a minimum of eight years of experience in a similar role, preferably within a technology-driven environment.
  • Experience building data services and ETL/ELT pipelines in the cloud using Infrastructure as Code and large-scale data processing engines
  • Strong experience with SQL and one or more programming languages (Python preferred)
 
Success Metrics:
  • Meeting delivery timelines for project milestones.
  • Effective collaboration with cross-functional teams.
  • Maintaining high standards of data accuracy and accessibility in a fast-paced, dynamic environment.
  • Reduction in data pipeline failures or downtime through resilient and fault-tolerant design.
  • Demonstrated contribution to the stability and scalability of the platform through well-architected, maintainable code.
  • Positive feedback from stakeholders (engineering, product, or customer-facing teams) on delivered solutions.
  • Active contribution to Agile ceremonies and improvement of team velocity or estimation accuracy.
  • Proactive identification and mitigation of data-related risks, including security or compliance issues.


DeWinter Group and Maris Consulting is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. We post pay scales which are based on our client pay ranges. DeWinter, Maris, and our clients have the right to modify the requirements of the role which can impact the pay ranges posted.
