Software Engineer (ETL Data Engineer) – JOB ID – ABT25080001

Strength in unity. Purpose in progress. Innovation for your future.

We are thought leaders, problem solvers and knowledge seekers. Each day, we look for opportunities to expand technology, uncover trends and discover new ways of doing business. We partner with industry-leading companies in pursuit of the next great idea. Our experts provide actionable insights that power our clients’ most critical projects. It is through knowledge sharing—powered by strong relationships, industry-leading data and innovative technology—that we empower our clients to reimagine how business gets done. When companies want knowledge, leadership and flexibility, they look to AmBright Tech. Together, let’s do great things.

Company: AmBright Tech LLC

Job Title: Hadoop Developer
Client: One of AmBright Tech LLC’s Premier Financial Clients
Location: Plano, TX (Hybrid – 3 Days Onsite per Week)
Duration: 12 Months (with Possible Extension)

About the Role:

AmBright Tech LLC is seeking a Hadoop Developer to support one of our key financial services clients. This mid- to senior-level role focuses on production support and development within a high-performance data engineering environment. The ideal candidate will have strong experience in Big Data ecosystems, ETL workflows, and DevOps practices, especially on Credit Risk or other financial data platforms.


Key Responsibilities:

  • Manage and support large-scale Big Data applications in a production support and DevOps environment.
  • Perform data analysis and troubleshoot data quality/integrity issues across the Hadoop ecosystem.
  • Develop and optimize Apache Spark jobs for data processing and transformation.
  • Write efficient Hive, Impala, and SQL queries.
  • Collaborate with cross-functional teams to support Credit Risk platforms and ensure smooth ETL operations.
  • Handle root cause analysis for data-related incidents and performance bottlenecks.
  • Work within Agile teams, participate in sprint planning, and proactively identify risks and mitigation strategies.
  • Implement and manage CI/CD pipelines using tools like Bitbucket, Gradle, Jenkins, Ansible, and Artifactory.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field.
  • 7+ years of experience in ETL development, Big Data technologies, and Data Warehousing.
  • Hands-on experience with:
    • Hadoop (HDFS, Hive, Impala)
    • Spark (optimization, troubleshooting, and development)
    • Unix/Linux scripting
    • SQL and query performance tuning
  • Strong knowledge of DevOps tools and automation pipelines.
  • A proven track record with Credit Risk or financial-industry data platforms is a strong plus.
  • Exceptional communication and collaboration skills.

Preferred Skills:

  • Previous experience working with financial clients, especially in Credit Risk domains.
  • Understanding of regulatory requirements and risk modeling data pipelines.

If you’re ready to bring your Big Data expertise into a critical production support environment and make a real impact on risk data operations, apply now and join AmBright Tech LLC’s growing team of data professionals!

Job Category: Visa Sponsorship Available
Job Type: Contract, Full-time, W2
Job Location: United States
