

Senior IT Big Data Developer - ITAJS
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior IT Big Data Developer in Washington, DC, on a contract basis. The position requires 10+ years of experience in programming, ETL/ELT development, and data architecture, with expertise in cloud and hybrid environments. The day rate is $720 (USD).
Country: United States
Currency: $ USD
Day rate: 720
Date discovered: July 29, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Logan Circle, DC
Skills detailed: #SAP #DevOps #Informatica #Scripting #Agile #Java #MongoDB #Big Data #Data Lake #AWS (Amazon Web Services) #Hibernate #CouchDB #Python #Storage #SSIS (SQL Server Integration Services) #Data Architecture #Lean #SAP BusinessObjects #Redshift #Spark (Apache Spark) #Aurora #Scala #HBase #Databases #Databricks #Oracle #ETL (Extract, Transform, Load) #Infrastructure as Code (IaC) #Shell Scripting #Data Engineering #Informatica PowerCenter #Apache NiFi #Cloud #.Net #MS SQL (Microsoft SQL Server) #NiFi (Apache NiFi) #Amazon Redshift #Terraform #SQL Server #C# #JavaScript #Programming #NoSQL #Data Warehouse #Snowflake #SQL (Structured Query Language)
Role description
Job Title: Senior IT Big Data Developer
Location: Washington, DC
Type: Contract
Job Summary: We are seeking a highly skilled Senior IT Big Data Developer to support advanced data engineering initiatives using modern cloud and big data technologies. The ideal candidate will bring deep experience in building scalable data platforms, leveraging infrastructure as code, and working across hybrid data environments (cloud and on-premises). This role is critical for driving innovative solutions in a fast-paced, collaborative environment.
Key Responsibilities:
Develop and maintain applications utilizing cloud-based and hybrid big data systems such as Amazon Aurora, Snowflake, Databricks, Amazon Redshift, and MS SQL Server.
Implement infrastructure as code (IaC) solutions using tools such as Terraform and AWS CloudFormation as part of a broader DevOps program.
Design, build, and optimize automated ETL/ELT pipelines that process large volumes of structured and unstructured data (a minimal example of this kind of pipeline appears after this list).
Apply modern software development lifecycles (SDLCs) such as Agile, DevOps, Lean, Iterative, and Waterfall in delivery of data-centric solutions.
Architect and implement solutions using Big Data best practices and design patterns for data lakes, data warehouses, and hybrid storage solutions.
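For illustration of the ETL/ELT work described above, the sketch below shows a minimal batch pipeline in PySpark (Python and Spark both appear in the required qualifications). It reads raw JSON from a data lake, applies basic cleansing, and writes partitioned Parquet to a curated zone. The paths, column names, and schema are hypothetical stand-ins for whatever sources and targets the actual project uses; a production pipeline would also need orchestration, incremental loads, and data quality checks.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical lake locations used only for this sketch.
RAW_PATH = "s3://example-data-lake/raw/orders/"
CURATED_PATH = "s3://example-data-lake/curated/orders/"

def run_pipeline() -> None:
    spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

    # Extract: semi-structured JSON landed in the raw zone.
    raw = spark.read.json(RAW_PATH)

    # Transform: deduplicate, type key columns, and drop unusable rows.
    curated = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .withColumn("order_date", F.to_date("order_ts"))
        .filter(F.col("order_id").isNotNull())
    )

    # Load: partitioned Parquet in the curated zone of the lake.
    curated.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)

    spark.stop()

if __name__ == "__main__":
    run_pipeline()

The same extract-transform-load shape carries over to the managed platforms named in the posting (Databricks, Snowflake, Amazon Redshift), with the read and write connectors swapped accordingly.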
Required Qualifications:
10+ years of hands-on experience with object-oriented programming and scripting languages including:
C#, Java, Python, Scala, .NET, Spark, Spring, Hibernate
JavaScript, Node.js, and Shell scripting
10+ years of experience with ETL/ELT development using tools such as:
Informatica PowerCenter, Apache NiFi, Oracle Data Integrator, SSIS, IBM InfoSphere, or SAP BusinessObjects Data Integrator
10+ years of experience working with relational databases (MS SQL Server, Oracle) and NoSQL databases such as MongoDB, Cassandra, CouchDB, HBase, Cosmos DB, Oracle NoSQL
7+ years of experience delivering software using formal SDLC methodologies
Deep understanding of Big Data ecosystems, performance optimization, and data architecture best practices
Experience with cloud-native data platforms and hybrid data environments
Preferred Certifications:
AWS Certified Developer - Associate
AWS Certified Data Engineer - Associate