Senior Manager, Data Operations & Engineering
United States (Remote)
HashiCorp is a fast-growing startup that solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build products to give organizations a consistent way to manage their move to cloud-based IT infrastructures for running their applications. Our products enable companies large and small to mix and match AWS, Microsoft Azure, Google Cloud, and other clouds as well as on-premises environments, easing their ability to deliver new applications for their business.
About the role:
At HashiCorp, centralized Systems Operations entails streamlining the value flow across the entire customer journey. We need a leader to help us drive the entire data operations lifecycle for our portfolio of enterprise and cloud products, spanning data acquisition, manipulation, analytics, governance, and delivery, culminating in the launch of v1 of unified identity.
This is a great role for a seasoned Data Operations lead or a seasoned data engineer looking to lead a team. You will work closely with Product Management, Engineering, and Operations to define the vision and strategy for data accessibility across our complex systems stack, making visible the unified customer journey from OSS user to cloud or enterprise customer to expanding account. Additionally, you will work cross-functionally to advance data engineering and DataOps standards, processes, and methodologies for operating and maintaining successful continuous delivery while ensuring capabilities are reusable, scalable, and cost-effective. You will help identify and deliver new patterns and technologies to meet emerging requirements for democratizing data across our enterprise, cloud, and GTM systems. You are able to operate in a highly matrixed, fast-paced, agile organization and deliver highly visible, impactful work for senior leadership.
In this role, you can expect to:
- Lead a team of data engineers and systems integrators to architect data pipelines and build applications at scale
- Own the data pipeline across our entire systems stack consisting of core GTM systems and Cloud applications
- Evolve our existing data definitions and governance framework, recommending ways to improve the accuracy, integrity, and reliability of our entire data platform
- Define the core marketing tech stack and core data management platform
- Architect solutions for complex data platforms and large-scale CI/CD data pipelines using a variety of technologies (REST APIs, advanced SQL, Amazon S3, data lakes, etc.), relational databases (MySQL), and data warehouse solutions (Amazon Redshift)
- Work with analytics, data science and wider engineering teams to democratize data and help automate visualization and analysis needs
- Advise on transformation processes to populate data models, and explore ways to design, develop, and scale data infrastructure
You may be a good fit if you have:
- Experience leading and building a team of platform and data operators as well as systems integrators
- Proven success in driving data operations programs and architecting data pipelines across a complex systems stack
- Ability to gather requirements; coordinate org-wide efforts to scope, schedule, and deploy new feature sets; and analyze costs and benefits and communicate results throughout the organization
- Strong technical and product/business judgment
- 5+ years as a data engineer, backend software engineer, or related experience, including experience developing in a variety of object-oriented and scripting languages to integrate different data systems
- Architect-level experience as a data engineer developing and deploying cloud technologies is a strong plus
- Experience architecting a unified identity/customer data model at a cloud enterprise company is also a strong plus
- Advanced SQL knowledge, experience with relational and non-relational databases and schema design, and excellent SQL troubleshooting skills on large datasets
- Extensive experience developing for elastic, highly scalable, 24x7x365 high-availability marketing and go-to-market systems (e.g., Marketo, Salesforce, Gainsight, cloud applications)
- Strong background in ETL development, data modeling, metadata management, data quality, data retention, and data cleansing, with exposure to GDPR/CCPA compliance requirements
- Experience implementing CI/CD processes using tools such as Maven, Git, and Jenkins, and with monitoring and alerting for production systems
HashiCorp embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We believe the more inclusive we are, the better our company will be. #LI-SY1