Remote - Lead Data Warehouse Engineer
We’re looking for talented professionals, anywhere in the United States, to join us in bringing smart money management and payment solutions to everyone’s fingertips.
At Green Dot, we are evolving to a new and permanent “Work from Anywhere” model designed to maximize the benefits of remote work, promote and enable a strong culture of performance and connectedness, and attract the best and brightest talent who align with our entrepreneurial spirit and mission.
JOB DESCRIPTION
This is an exciting opportunity for an experienced data engineer at one of the leading FinTech companies in the US. Our mission is to deliver trusted, best-in-class money management and payment solutions to our customers and partners, seamlessly connecting people to their money. You will join an engineering team delivering a transformative next-generation data platform that enables faster, more secure, and more efficient data solutions.
The ideal candidate will have a proven track record of implementing next-generation cloud data solutions and data transformation pipelines for large-scale organizations with a “cloud-first” mindset. We are seeking someone with deep technical skills across a variety of data warehouse technologies to play an important role in developing and delivering early proofs of concept and production implementations. The successful candidate will have hands-on experience across multiple data domains, including, but not limited to, enterprise cloud data warehouse and reporting solutions; data warehouse model design and build; ETL development; performance tuning; service delivery; incident tracking; and change management.
Responsibilities
- Designing and implementing an enterprise-wide data strategy for data modeling, data pipelines, and data integrations across various environments and technology stacks, including operational data stores (ODS) and the enterprise data warehouse (EDW)
- Bringing thought leadership by understanding key business objectives and strategically providing data integration solutions to support various data domains
- Designing and implementing performant cloud data warehouse platforms (Azure Databricks, AWS Redshift, or similar)
- Ensuring data warehouse features and capabilities are incorporated into data model designs to optimize performance, resiliency, and scalability
- Developing and maintaining data pipelines using cloud-based ETL tools (Informatica, ADF, Databricks, dbt, etc.)
- Developing performant data movement pipelines using real-time event data streams (Apache Kafka, Azure Event Hub, AWS Kinesis, etc.)
- Developing scalable, reusable frameworks for ingesting structured and semi-structured data, and implementing large-scale settlement and reconciliation frameworks
- Delivering and presenting proofs of concept for key technology components to project stakeholders and internal technology leaders
- Developing data architecture and governance models for systems of engagement, providing the framework for integrating source systems with transactional databases and/or data warehouses, and ensuring data architectural designs are consistent, maintainable, flexible, extensible, and cost-effective
- Ensuring data security and privacy standards are implemented, including role-based security, encryption, tokenization, and obfuscation
- Troubleshooting application issues and production errors, including high-level critical production issues that require immediate attention
Qualifications
- Bachelor’s and/or master’s degree in Computer Science, Computer Engineering, or a related field preferred
- 8+ years of overall software development experience required
- 6+ years of experience in a data engineering / architecture role
- 4+ years of experience designing, developing, and deploying cloud-based solutions that leverage cloud provider platforms (Azure/AWS)
- 3+ years of hands-on development experience with cloud-based data warehouse platforms (Redshift, Snowflake, Azure SQL Data Warehouse, etc.)
- 3+ years of hands-on experience with ETL tools (Informatica, ADF, SSIS, etc.) and Python scripting
- 3+ years of hands-on development experience with relational databases (MS SQL Server, PostgreSQL, Oracle, etc.), including complex stored procedures and functions in SQL and optimization of SSIS packages and stored procedures/T-SQL
- Experience designing ETL and ELT pipelines to support enterprise data warehouses and data marts, with a strong understanding of data lake, data fabric, and semantic layer design concepts
- Demonstrated expertise in performance tuning across various database environments with high data volume and velocity
- Experience working in a DevOps environment with tools such as Azure DevOps, Microsoft Visual Studio Team Services, Chef, Puppet, or Terraform
Pluses:
- Financial technology domain knowledge
- Microsoft Azure Data Engineer or other cloud certifications
PAY RANGE
The targeted base salary for this position is $108,600 to $165,900 per year. Final compensation will be determined by factors such as qualifications, expertise, and the candidate’s geographic location.