Overview
Job Title: Data Engineer (Remote)
Company: EmpireTech Solutions
Location: Remote
Job Type: Full-Time
Salary: Competitive, based on experience

About EmpireTech Solutions:
EmpireTech Solutions is a leading IT solutions provider delivering innovative software development, mobile apps, and data-driven services to global clients. We are committed to building high-performance systems that harness the power of data to drive business results. We are looking for a skilled Data Engineer to help scale our data infrastructure and pipelines in a remote-first work environment.

Job Summary:
We are seeking a Remote Data Engineer to design, build, and optimize our data pipelines and infrastructure. You will work closely with data analysts, software developers, and project managers to ensure data is accessible, accurate, and efficiently processed. This role requires engineering expertise as well as an understanding of data architecture, ETL processes, and cloud-based technologies.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for batch and real-time data processing.
- Build and maintain data architectures, including databases, data lakes, and data warehouses.
- Ensure data quality, integrity, and consistency across systems.
- Collaborate with analysts and business teams to understand data needs.
- Optimize database queries and performance for large datasets.
- Implement and manage ETL/ELT workflows using tools such as Apache Airflow.
- Monitor, troubleshoot, and debug production data issues.
- Ensure compliance with data governance and security standards.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
- 2+ years of experience in a Data Engineer or similar role.
- Strong experience with SQL and data modeling.
- Proficiency in Python, Java, or Scala for data processing.
- Experience with ETL tools and frameworks.
- Knowledge of cloud platforms (AWS, GCP, or Azure) and services such as S3, Redshift, or Snowflake.
- Familiarity with big data technologies (e.g., Spark, Hadoop, Kafka).
- Experience with version control tools such as Git.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Experience with containerization and orchestration tools (Docker, Kubernetes).
- Working knowledge of CI/CD for data engineering workflows.
- Exposure to machine learning pipelines and data science collaboration.
- Familiarity with data privacy and regulatory compliance (e.g., GDPR, HIPAA).

What We Offer:
- 100% remote work flexibility
- Competitive salary and benefits
- Access to the latest tools and cloud infrastructure
- A collaborative, innovation-driven culture
- Career growth opportunities in a fast-paced tech environment