Senior IT Big Data Developer- Federal Labor Category

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior IT Big Data Developer on a contract basis, offering $60.00 - $70.00 per hour for 40 hours per week. Key requirements include 10 years of experience in Big Data, ETL, and federal projects, with expertise in C#, Java, Python, and cloud-based systems. Certifications in AWS Developer and AWS Data Engineer are required. Work location is in-person.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
560
-
πŸ—“οΈ - Date discovered
August 16, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Washington, DC 20551
-
🧠 - Skills detailed
#Microsoft SQL Server #Cloud #Terraform #Databricks #DevOps #Amazon Redshift #Data Warehouse #Hibernate #CouchDB #Agile #JavaScript #SAP #Databases #Data Lake #Informatica #MS SQL (Microsoft SQL Server) #AWS (Amazon Web Services) #MongoDB #Java #Lean #Big Data #Snowflake #Scala #Infrastructure as Code (IaC) #Redshift #Spark (Apache Spark) #Python #HBase #SAP BusinessObjects #NiFi (Apache NiFi) #Data Architecture #Scripting #ETL (Extract, Transform, Load) #Apache NiFi #.Net #SSIS (SQL Server Integration Services) #C# #Data Engineering #Informatica PowerCenter #Microsoft SQL #Oracle #SQL Server #Aurora #Programming #SQL (Structured Query Language) #NoSQL
Role description
Senior IT Big Data Developer

Personnel Qualifications:
- At least seven years of demonstrated experience developing software according to software development lifecycles (SDLCs), including DevOps, Agile, Lean, Iterative, or Waterfall.
- Demonstrated understanding of Big Data architecture, design patterns, and repositories such as data lakes and data warehouses.
- In-depth understanding of developing against Big Data technology stacks.
- In-depth understanding of how to optimize automated ETL processes when working with large structured and unstructured data sets (TB+ range).
- At least ten years' demonstrated experience with:
- Object-oriented programming languages and frameworks such as C#, Java, Python, Scala, .NET, Spark, Spring, or Hibernate.
- Scripting languages such as JavaScript, Node.js, or shell script.
- Developing automated extract, transform, and load (ETL) and extract, load, and transform (ELT) processes using major tools such as Informatica PowerCenter, Apache NiFi, Oracle Data Integrator, Microsoft SQL Server Integration Services (SSIS), IBM InfoSphere Information Server, or SAP BusinessObjects Data Integrator.
- At least ten years of demonstrated experience with relational databases such as Microsoft SQL Server or Oracle, and NoSQL databases such as MongoDB, CouchDB, HBase, Cosmos DB, Oracle NoSQL Database, or Cassandra.

Capabilities:
- Ability to develop applications based on cloud-based data and big data systems, including Amazon Aurora, Snowflake, Databricks, and Amazon Redshift.
- Ability to use infrastructure as code (IaC) tools such as Terraform or AWS CloudFormation to build and configure data systems as part of a DevOps program.
- Ability to develop applications utilizing hybrid data systems, including a mixture of cloud and on-premises data systems such as MS SQL Server.
Certifications:
- AWS Developer - Associate
- AWS Data Engineer - Associate

Job Type: Contract
Pay: $60.00 - $70.00 per hour
Expected hours: 40 per week

Experience:
- Big Data Developer: 10 years (Required)
- ETL: 10 years (Required)
- Federal: 10 years (Required)
- Labor: 10 years (Required)
- SDLC: 10 years (Required)
- C#: 10 years (Required)
- Java: 10 years (Required)
- Python: 10 years (Required)
- Scala: 10 years (Required)
- .Net: 10 years (Required)
- Spark: 10 years (Required)
- Spring: 10 years (Required)
- Hibernate: 10 years (Required)
- JavaScript: 10 years (Required)
- Node.js: 10 years (Required)

Work Location: In person