Data Engineer
**Job Description**
**This role is categorized as hybrid. This means the successful candidate is expected to report to Austin IT Innovation Center three times per week, at minimum \[or other frequency dictated by the business if more than 3 days].**
**What You’ll Do**
* Design, develop, and maintain scalable data engineering pipelines and backend services built on **Quarkus** and **Spring Boot** frameworks.
* Build and manage **cron\-based orchestration services** that retrieve data from multiple enterprise systems via REST APIs.
* Work with event streaming and messaging platforms such as **Kafka** and **Azure Event Hubs** for real\-time data processing.
* Design and optimize database solutions using **PostgreSQL** and other relational data stores.
* Deploy, monitor, and manage applications on **Azure Kubernetes Service (AKS)**.
* Implement CI/CD pipelines using **GitHub Actions workflows** and automate deployments using **ArgoCD**.
* Write and maintain **Terraform scripts** for infrastructure automation and cloud resource provisioning.
* Design observability solutions using **Prometheus metrics**, **Datadog monitoring**, and alerting systems.
* Build and maintain data processing workflows using **Databricks** and distributed data frameworks.
* Collaborate with cross\-functional teams to gather requirements and translate them into technical implementations.
* Optimize application performance, reliability, and scalability across data and service layers.
* Build and maintain email services, templates, and customer communication workflows using **Adobe Journey Optimizer (AJO)** or similar tools.
* Troubleshoot production issues and implement proactive monitoring and resiliency improvements.
**Your Skills \& Abilities (Required Qualifications)**
* Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or related field, or equivalent practical experience.
* 4\+ years of software or data engineering experience.
* Strong proficiency in **Java development** and object\-oriented programming.
* Hands\-on experience with **Quarkus** and **Spring Boot** application development.
* Experience building and consuming **REST APIs** and microservices architectures.
* Strong knowledge of event\-driven architectures using **Kafka** or **Azure Event Hubs**.
* Experience with relational databases, especially **PostgreSQL**, including performance tuning and query optimization.
* Hands\-on experience with **Azure cloud services**, especially **AKS**, networking, and managed services.
* Experience implementing CI/CD pipelines using **GitHub Actions**.
* Infrastructure\-as\-Code experience using **Terraform**.
* Experience with observability tools such as **Prometheus** and **Datadog**.
* Understanding of containerization technologies such as **Docker** and **Kubernetes**.
**What Can Give You a Competitive Advantage (Preferred Qualifications)**
* Strong knowledge of **Azure platform services and architecture patterns**.
* Experience with **email marketing or customer communication platforms**, including template design and orchestration workflows.
* Experience integrating enterprise marketing tools with backend services.
* Knowledge of security best practices in cloud and API development.
* Familiarity with telemetry and log analytics in cloud environments.
* Experience working with large\-scale customer engagement or notification systems.
This job may be eligible for relocation benefits.
**Compensation:**
* The expected base compensation for this role is: $95,200 \- $139,050\. Actual base compensation within the identified range will vary based on factors relevant to the position.
* Bonus Potential: An incentive pay program offers payouts based on company performance, job level, and individual performance.
* Benefits: GM offers a variety of health and wellbeing benefit programs. Benefit options include medical, dental, vision, Health Savings Account, Flexible Spending Accounts, retirement savings plan, sickness and accident benefits, life insurance, paid vacation \& holidays, tuition assistance programs, employee assistance program, GM vehicle discounts and more.
GM DOES NOT PROVIDE IMMIGRATION\-RELATED SPONSORSHIP FOR THIS ROLE. DO NOT APPLY FOR THIS ROLE IF YOU WILL NEED GM IMMIGRATION SPONSORSHIP NOW OR IN THE FUTURE. THIS INCLUDES DIRECT COMPANY SPONSORSHIP, ENTRY OF GM AS THE IMMIGRATION EMPLOYER OF RECORD ON A GOVERNMENT FORM, AND ANY WORK AUTHORIZATION REQUIRING A WRITTEN SUBMISSION OR OTHER IMMIGRATION SUPPORT FROM THE COMPANY (e.g., H\-1B, OPT, STEM OPT, CPT, TN, J\-1, etc.)
\#LI\-CC1
**About GM**
Our vision is a world with Zero Crashes, Zero Emissions and Zero Congestion and we embrace the responsibility to lead the change that will make our world better, safer and more equitable for all.
**Why Join Us**
We believe we all must make a choice every day – individually and collectively – to drive meaningful change through our words, our deeds and
Posted: 2026-03-18