

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of 6+ months, located in the DFW area. Requires 7+ years of experience, strong Python and Terraform skills, and expertise in data architecture and machine learning. US citizenship is mandatory.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 11, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas-Fort Worth Metroplex
Skills detailed
#Compliance #Agile #Programming #Tableau #Anomaly Detection #Observability #Kafka (Apache Kafka) #Visualization #Data Warehouse #Data Architecture #Cloud #Documentation #Data Pipeline #GIT #Automated Testing #AWS (Amazon Web Services) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Version Control #Computer Science #Data Ingestion #Data Engineering #ML (Machine Learning) #Data Quality #Deployment #Security #Data Science #Apache Spark #Storage #Data Lake #Python #Data Processing #Apache Kafka #AI (Artificial Intelligence) #Data Security #Data Modeling #Terraform #AWS Kinesis #Azure #Automation
Role description
The client is committed to providing innovative solutions and exceptional service. Their mission is to empower businesses through technology and data-driven insights, fostering a culture of collaboration and continuous improvement.
About the Role
As a Data Engineer, you will be responsible for the development, optimization, and management of data ingestion, transformation, and storage processes using modern frameworks. The successful candidate will implement infrastructure-as-code and automation solutions to streamline and build resiliency into data processing workflows. You will work closely with our data science and analytics teams to create and optimize data and machine learning models that support stakeholder needs.
Responsibilities
• Design, build, and maintain large-scale data systems, including data warehouses, data lakes, and data pipelines
• Develop and implement data architectures that meet the needs of our business stakeholders
• Collaborate with data scientists and analysts to develop and deploy machine learning models and data products
• Ensure data quality, security, and compliance with standards
• Develop and maintain technical documentation of data systems and architectures
• Troubleshoot and resolve data-related issues and optimize system performance
• Develop and implement automated testing and deployment scripts to ensure smooth and efficient delivery of data solutions
• Collaborate with cross-functional teams to identify and prioritize data requirements and develop solutions to meet business needs
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field (or an equivalent combination of education and experience)
• 7+ years of experience in data engineering, software development, or a related field
• Strong programming skills in Python and experience with infrastructure-as-code tools such as Terraform
• Strong understanding of data architecture, data modeling, and data warehousing concepts
• Experience with data pipeline tools such as Apache Spark, Apache Kafka, and AWS Kinesis
• Experience with Artificial Intelligence and Machine Learning technologies
• Strong understanding of data security and compliance principles and practices
• Excellent problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement
• Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams
Required Skills
• US citizenship is required for this role
Preferred Skills
• Experience with cloud-based data platforms such as AWS and Azure
• Experience with agile development methodologies and version control systems such as Git
• Experience with data visualization tools such as Tableau and Cognos Analytics
• Familiarity with complex data regulatory requirements such as NIST 800-171 and SOX
• Familiarity with OpenTelemetry standards
• Experience optimizing observability data and real-time anomaly detection
Pay range and compensation package
Duration: 6+ months, contract-to-hire
Location: DFW area
The resource must be willing to convert to an employee after a period of contracting, assuming fit and performance expectations are met. The resource must be located within a 30-50 minute drive of one of the following LM offices:
DFW area, TX
Central Florida
King of Prussia, PA
Stratford, CT
Denver area
Marietta, GA.
The resource must be willing to go into the office on some frequency (not fully defined, but the client appears to be moving back toward a hybrid remote/in-office model). The resource must be a US citizen only, with no dual citizenship.
Contact Information
Shashi Parashar
Talent Acquisition Manager
sparashar@calance.com
Calance Jobs: www.calancejobs.com
Corporate: www.calanceus.com