

Senior IT Big Data Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior IT Big Data Developer based in Washington, DC; the contract length and pay rate are listed as "Unknown." It requires 7+ years of software development experience, expertise in Big Data architecture, and AWS certifications.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 31, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Washington, DC
Skills detailed
#Amazon Redshift #Microsoft SQL #Aurora #Lean #SSIS (SQL Server Integration Services) #Python #Big Data #CouchDB #HBase #Spark (Apache Spark) #AWS (Amazon Web Services) #Programming #Databases #MongoDB #Oracle #Snowflake #Scala #Informatica #NoSQL #SAP BusinessObjects #Informatica PowerCenter #Cloud #Databricks #Java #ETL (Extract, Transform, Load) #Data Engineering #MS SQL (Microsoft SQL Server) #DevOps #Data Architecture #NiFi (Apache NiFi) #SQL Server #Terraform #Microsoft SQL Server #.Net #Agile #Apache NiFi #Hibernate #Infrastructure as Code (IaC) #Redshift #Data Warehouse #Scripting #Data Lake #JavaScript #SAP #SQL (Structured Query Language) #C#
Role description
Job Title: Senior IT Big Data Developer
Location: Washington, DC
Personnel Qualifications
• At least seven years of demonstrated experience developing software according to software development lifecycles (SDLCs), including DevOps, Agile, Lean, Iterative, or Waterfall.
• Demonstrated understanding of Big Data architecture, design patterns, and repositories such as data lakes and data warehouses.
• In-depth understanding of developing against Big Data technology stacks.
• In-depth understanding of how to optimize automated ETL processes when working with large structured and unstructured data sets (TBs+ range); a minimal sketch of such a process follows this list.
• At least ten years' demonstrated experience with:
  o Object-oriented programming languages and frameworks such as C#, Java, Python, Scala, .NET, Spark, Spring, or Hibernate.
  o Scripting languages such as JavaScript, Node.js, or shell scripting.
  o Developing automated extract, transform, and load (ETL) and extract, load, and transform (ELT) processes using major tools such as Informatica PowerCenter, Apache NiFi, Oracle Data Integrator, Microsoft SQL Server Integration Services (SSIS), IBM InfoSphere Information Server, or SAP BusinessObjects Data Integrator.
• At least ten years of demonstrated experience with relational databases such as Microsoft SQL Server or Oracle, and NoSQL databases such as MongoDB, CouchDB, HBase, Cosmos DB, Oracle NoSQL Database, or Cassandra.
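To ground the ETL expectation above, here is a minimal PySpark sketch of the kind of automated batch ETL step the posting describes: read semi-structured raw data, normalize it, and write a partitioned columnar table to a data lake. The paths, column names, and job name (s3://example-data-lake/..., event_id, event_ts) are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch: extract raw JSON events, apply light
# transformations, and load a partitioned Parquet table into a data lake.
# All paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-events-etl")
    .getOrCreate()
)

# Extract: semi-structured JSON landed by an upstream ingestion job.
raw = spark.read.json("s3://example-data-lake/raw/events/2025-07-31/")

# Transform: drop malformed rows, normalize types, derive a partition column.
clean = (
    raw
    .filter(F.col("event_id").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write columnar, partitioned output so downstream warehouse loads
# (e.g. Redshift COPY or Snowflake external tables) only scan what they need.
(
    clean.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-data-lake/curated/events/")
)

spark.stop()
```

At TB scale, the partitioned, columnar output is what lets downstream loads and queries skip most of the data instead of rescanning it.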
Capabilities
• Ability to develop applications based on cloud-based data and big data systems, including Amazon Aurora, Snowflake, Databricks, and Amazon Redshift.
• Ability to use infrastructure as code (IaC) tools such as Terraform or AWS CloudFormation to build and configure data systems as part of a DevOps program; a minimal sketch follows this list.
• Ability to develop applications utilizing hybrid data systems, including a mixture of cloud and on-premises data systems such as MS SQL Server.
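To ground the IaC capability above: the posting names Terraform and AWS CloudFormation; the sketch below stays in Python and drives CloudFormation through boto3 from a pipeline step, declaring a small data-lake bucket stack. The stack name, bucket name, and template contents are assumptions for illustration, not requirements from the posting.

```python
# Sketch: provision a data-lake bucket via CloudFormation from a DevOps
# pipeline step, using boto3. The template, stack name, and bucket name
# are illustrative assumptions, not values from the posting.
import json
import boto3

TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Example data lake bucket managed as code",
    "Resources": {
        "DataLakeBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {
                "BucketName": "example-org-data-lake-raw",
                "VersioningConfiguration": {"Status": "Enabled"},
            },
        }
    },
    "Outputs": {
        "BucketArn": {"Value": {"Fn::GetAtt": ["DataLakeBucket", "Arn"]}}
    },
}


def deploy(stack_name: str = "data-lake-storage") -> None:
    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=json.dumps(TEMPLATE),
    )
    # Block until CloudFormation reports the stack is fully created.
    waiter = cfn.get_waiter("stack_create_complete")
    waiter.wait(StackName=stack_name)


if __name__ == "__main__":
    deploy()
```

The Terraform equivalent would be an aws_s3_bucket resource; either way, the point of the capability is that data infrastructure is declared in version-controlled code rather than configured by hand.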
Certifications:
• AWS Certified Developer - Associate
• AWS Certified Data Engineer - Associate