

Senior Data Engineer - Python, SQL, ETL (Airflow), Visualization
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in San Jose, California, requiring 8-10 years of experience. Key skills include Python, SQL, ETL (Airflow), and data visualization (Tableau, Power BI). A Bachelor's degree in Computer Science or related field is necessary.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 1, 2025
Project duration
Unknown
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
San Jose, CA
Skills detailed
#GIT #Databases #Microsoft Power BI #Consulting #Data Pipeline #Database Performance #Version Control #SQL (Structured Query Language) #Oracle #PostgreSQL #Hadoop #Tableau #Data Integrity #Luigi #SQL Queries #Data Cleaning #Spark (Apache Spark) #Data Science #MySQL #Visualization #AWS (Amazon Web Services) #Python #Data Engineering #Airflow #Scala #Azure #Big Data #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Computer Science #Apache Airflow #Cloud
Role description
Who We Are
Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.
Job Description
Job Title : Senior Data Engineer - Python, SQL, ETL (Airflow), Visualization
Job Type : W2/C2C
Experience : 8-10 years
Location : San Jose, California (On-Site)
Requirements
• 5-7 years of experience in data engineering or a related field.
• Strong analytical and problem-solving skills.
• Experience with data pipeline orchestration tools (e.g., Apache Airflow, Luigi).
• Experience with data visualization tools such as Tableau or Power BI.
• Experience with version control systems (e.g., Git).
• Excellent communication skills, both verbal and written.
• Proficiency in Python for data pipeline development.
• Strong SQL skills with experience in relational databases (MySQL, PostgreSQL, Oracle).
• Knowledge of data warehousing concepts and ETL processes.
• Ability to work collaboratively in a team environment.
• Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
• Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus.
Responsibilities
• Design, develop, and maintain scalable data pipelines using Python.
• Write efficient SQL queries to extract, manipulate, and analyze data from relational databases such as MySQL, PostgreSQL, or Oracle.
• Collaborate with data scientists, analysts, and business stakeholders to understand and gather data requirements.
• Automate data collection and reporting processes to enhance efficiency.
• Ensure data integrity and optimize database performance through best practices.
• Develop scripts and tools for data cleaning, transformation, and visualization using tools like Tableau or Power BI.
• Document code, processes, and workflows to ensure maintainability and knowledge sharing.
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.