

Sr Data Engineer – AWS Lambda & PostgreSQL - Need Local
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a "Sr Data Engineer – AWS Lambda & PostgreSQL" based in NY or Columbus, OH, on a 12-month contract with an undisclosed pay rate. It requires 10+ years of experience and strong AWS Lambda, PostgreSQL, and Looker skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
July 23, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Columbus, Ohio Metropolitan Area
🧠 - Skills detailed
#API (Application Programming Interface) #Data Architecture #dbt (data build tool) #Security #Scala #S3 (Amazon Simple Storage Service) #AWS Lambda #Looker #Indexing #IAM (Identity and Access Management) #BigQuery #Data Engineering #Airflow #Data Ingestion #GitLab #Cloud #Python #AWS (Amazon Web Services) #Data Pipeline #Data Exploration #Lambda (AWS Lambda) #Data Modeling #Schema Design #R #Debugging #PostgreSQL #ETL (Extract, Transform, Load) #Data Governance #Redshift #Snowflake #SQL (Structured Query Language) #Fivetran #Jenkins #Logging
Role description
Job Title: Sr Data Engineer – AWS Lambda & PostgreSQL (local candidates only)
Location: NY or Columbus, OH (on-site)
Duration: 12 months
Role Overview:
We are seeking a skilled Sr Data Engineer to design and implement scalable data pipelines using AWS Lambda that push structured and semi-structured data into a PostgreSQL data store. The role also requires experience in data modeling, Looker dashboard development, and strong SQL/database expertise to support reporting and analytics needs across the organization.
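For context, a serverless ingestion function of this kind might look like the minimal sketch below. It assumes Python with psycopg2, connection details supplied via environment variables, and an illustrative `ingest.events` table and event shape; none of these specifics come from the posting.

```python
import json
import os

import psycopg2  # assumes psycopg2 (or psycopg2-binary) is packaged with the function

# Connect once per container so warm invocations reuse the connection.
# Host/database/credential variable names are illustrative assumptions.
_conn = psycopg2.connect(
    host=os.environ["PG_HOST"],
    dbname=os.environ["PG_DATABASE"],
    user=os.environ["PG_USER"],
    password=os.environ["PG_PASSWORD"],
)


def lambda_handler(event, context):
    """Clean incoming records and push them into PostgreSQL.

    The event shape (a list of JSON records under "records") and the
    target table/columns are assumptions for illustration only.
    """
    records = event.get("records", [])
    rows = [
        (r.get("id"), r.get("source"), json.dumps(r.get("payload", {})))
        for r in records
    ]

    # Connection context manager commits on success; assumes a unique
    # constraint on event_id so replays are deduplicated.
    with _conn, _conn.cursor() as cur:
        cur.executemany(
            """
            INSERT INTO ingest.events (event_id, source, payload)
            VALUES (%s, %s, %s)
            ON CONFLICT (event_id) DO NOTHING
            """,
            rows,
        )
    return {"statusCode": 200, "inserted": len(rows)}
```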
Key Responsibilities:
• Design, develop, and deploy serverless data ingestion pipelines using AWS Lambda
• Write and optimize Lambda functions to clean, transform, and push data into PostgreSQL
• Develop and maintain scalable, efficient data models supporting analytical workloads
• Create LookML models and build dashboards in Looker to enable self-service analytics
• Maintain database integrity, indexing, and performance optimization
• Collaborate with product, engineering, and analytics teams to understand data needs
• Build robust error-handling, logging, and retry mechanisms for data pipelines (see the sketch after this list)
• Ensure data governance, quality, and security best practices are followed
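The error-handling, logging, and retry bullet above could be approached as in the following sketch: a small backoff decorator wrapped around each pipeline step, with failures logged and the final failure re-raised so Lambda's own retry/DLQ behavior applies. The attempt counts, delays, and helper names are illustrative assumptions.

```python
import logging
import time
from functools import wraps

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def with_retries(max_attempts=3, base_delay=1.0, retriable=(Exception,)):
    """Retry a pipeline step with exponential backoff, logging each failure."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except retriable as exc:
                    logger.warning(
                        "step=%s attempt=%d/%d failed: %s",
                        fn.__name__, attempt, max_attempts, exc,
                    )
                    if attempt == max_attempts:
                        raise  # let Lambda's retry / dead-letter handling take over
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator


@with_retries(max_attempts=3)
def push_to_postgres(rows):
    ...  # insert logic as in the earlier ingestion sketch (hypothetical helper)
```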
Required Skills:
• 10+ years of overall experience
• 3–5 years of experience in Data Engineering or Backend Engineering roles
• Strong experience with AWS Lambda and serverless data architecture
• Proficient in Python or Node.js for writing Lambda functions
• Solid experience with PostgreSQL – schema design, optimization, and advanced SQL
• Proven expertise in data modeling for analytics and reporting
• Hands-on experience with Looker (LookML, dashboards, data exploration)
• Familiarity with AWS services like S3, CloudWatch, API Gateway, and IAM
• Excellent debugging, problem-solving, and communication skills
Good to Have:
• Experience integrating third-party APIs or webhooks into Lambda functions
• Familiarity with data warehousing concepts (e.g., Snowflake, Redshift, or BigQuery)
• Exposure to CI/CD for data pipelines using tools like GitLab or Jenkins
• Understanding of modern data stack tools (Fivetran, dbt, Airflow, etc.)