Kettle Engineer as 100% Remote
Job Title: Kettle Engineer
Location: 100% Remote
Duration: 12 Months
Pay Rate: $90.50/hr on W2
Please reply with your most recent resume.
Summary
We're seeking a Kettle Engineer to design, operate, and continuously improve our Pentaho Data Integration (PDI/Kettle) platform and the data movement processes that underpin business-critical workflows.
You will own end-to-end lifecycle management, from environment build and configuration to orchestration, monitoring, and support, partnering closely with application teams, production operations, and data stakeholders.
The ideal candidate combines strong hands-on Kettle expertise with solid SQL, automation, and production-support practices in a fast-moving, highly collaborative environment.
Roles And Responsibilities
Platform ownership: Install, configure, harden, and upgrade Kettle/PDI components (e.g., Spoon) across dev/prod.
Process engineering: Migrate, re-engineer, and optimize existing jobs and transformations.
Reliability & support: Document all workflows, including ownership and escalation protocols; lead knowledge transfer with the Automation and Application Support teams.
Observability: Implement proactive monitoring, logging, and alerting for the Kettle platform, including all dependent processes.
Collaboration: Partner with application, data, and infrastructure teams to deliver improvements to existing designs.
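For context on the observability responsibility above, here is a minimal sketch of scanning Kettle/PDI log output for ERROR lines that could feed an alerting pipeline. The log excerpt, job names, and messages are invented for illustration; the line layout approximates PDI's "timestamp - channel - LEVEL" logging format.

```python
import re

# Illustrative Kettle/PDI log excerpt (entries are made up; the layout
# approximates PDI's "timestamp - channel - LEVEL ..." logging format).
KETTLE_LOG = """\
2024/01/15 02:00:01 - load_customers - Start of job execution
2024/01/15 02:00:04 - stg_customers.0 - Finished processing (I=5000, O=0, R=0, W=5000, U=0, E=0)
2024/01/15 02:00:05 - load_customers - ERROR (version 9.4) : Unable to connect to database [dwh]
2024/01/15 02:00:05 - load_customers - ERROR (version 9.4) : Job entry [Load staging] failed
2024/01/15 02:00:06 - load_customers - Finished job entry [Load staging] (result=[false])
"""

# Capture the logging channel and, when present, the text after the colon.
ERROR_RE = re.compile(r"^\S+ \S+ - (?P<channel>.+?) - ERROR\b(?:.*?:\s*(?P<msg>.*))?$")

def extract_errors(log_text: str) -> list[dict]:
    """Return one dict per ERROR line: the logging channel and message."""
    errors = []
    for line in log_text.splitlines():
        m = ERROR_RE.match(line)
        if m:
            errors.append({"channel": m.group("channel"), "message": m.group("msg") or ""})
    return errors

if __name__ == "__main__":
    for err in extract_errors(KETTLE_LOG):
        # In a real setup this would emit a metric or page an on-call rotation.
        print(f"ALERT [{err['channel']}]: {err['message']}")
```

In practice the same filter would run against log files written by Kitchen/Pan or a Carte server rather than an inline string.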
Required Qualifications
Kettle/PDI expertise: Experience installing and configuring a Kettle instance (server and client tools, repositories, parameters, security, and upgrades).
Kettle/PDI expertise: Experience creating, maintaining, and supporting Kettle processes (jobs/transformations, error handling, recovery, and performance tuning).
4+ years of hands-on SQL (writing, diagnosing, and optimizing queries).
Strong communication skills for both technical and non-technical audiences; effective at documenting and sharing knowledge.
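As a small illustration of the SQL diagnosis skill listed above, this sketch uses SQLite (standard library only; the table and index names are invented) to show how a query plan confirms whether a query hits an index:

```python
import sqlite3

# In-memory database with an invented orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

def plan(sql: str) -> str:
    """Return SQLite's query plan as a single string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql, (42,)).fetchall()
    return " | ".join(row[-1] for row in rows)

before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # with the index: an index search
print("before:", before)
print("after: ", after)
```

The same before/after comparison (via each engine's EXPLAIN facility) is the everyday workflow for diagnosing slow queries on any database Kettle reads from or writes to.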
Preferred Qualifications
Experience integrating Kettle with cloud platforms (AWS and/or Azure); familiarity with containers or Windows/Linux server administration.
Exposure to monitoring/observability stacks (e.g., DataDog, CloudWatch, or similar).
Scripting/automation for operations (Python, PowerShell, or Bash); experience with REST APIs within Kettle.
Background in financial services or other regulated/mission critical environments.
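On the REST-API point above: Kettle's Carte server exposes HTTP endpoints such as /kettle/status/. A hedged sketch of preparing such a request with the standard library follows; the host, port, and the default cluster/cluster credentials are placeholders, and nothing is actually sent over the network:

```python
import base64
import urllib.request

def build_carte_status_request(host: str, port: int,
                               user: str = "cluster",
                               password: str = "cluster") -> urllib.request.Request:
    """Prepare (but do not send) a GET against Carte's status endpoint.

    /kettle/status/?xml=Y asks Carte for server and job status as XML.
    The cluster/cluster defaults here are placeholders; substitute your
    environment's real credentials.
    """
    url = f"http://{host}:{port}/kettle/status/?xml=Y"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req

req = build_carte_status_request("carte.example.internal", 8080)
print(req.full_url)
# Sending it would require a live Carte server:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```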
Key Outcomes (First 90 Days)
Stand up or validate a hardened Kettle environment with baseline monitoring and runbooks.
Migrate at least two high-value Kettle workflows using shared templates and standardized error handling.