Lead Specialist Solutions Architect
While candidates in the listed location(s) are encouraged for this role, candidates in other locations will be considered.
As a Specialist Solutions Architect and Migration Practice lead on the Field Engineering global migrations team, you will guide customers in migrating existing workloads to Databricks and in building data and AI solutions on Databricks that span a wide variety of use cases with complex or specialized needs. This is a customer-facing role, working with and supporting the Solution Architects, and it requires hands-on production experience with Hadoop, SQL, and data warehousing, as well as expertise in other data technologies. Reporting to the Field Engineering Senior Manager, you will collaborate with customers, product teams, and the broader customer-facing teams to develop architectures and solutions using our platform and APIs. You will guide customers through the competitive landscape, best practices, and implementation, developing technical champions along the way.
- In this role, the Hadoop Migration Practice lead will own the strategy and framework for Hadoop migrations globally, ensuring standardization and scale
- Conduct initial discovery calls and strategize with customer and account teams on the key opportunities you directly own, while providing oversight for others on the broader specialist teams
- Be the go-to technical expert on both technology platforms: Hadoop and Databricks
- Work internally to formulate Hadoop best practices and enable the broader specialist team to run the Hadoop migration play expertly and pursue opportunities independently
- Lead the creation of technical assets used in the sales cycle for migration opportunities, such as demo notebooks, feature analyses, use cases, and migration patterns
- Help build the Hadoop migration practice through thought leadership, demonstrating a strong understanding of the Lakehouse architecture and articulating its benefits over legacy Hadoop and data warehouse architectures
- Build a strong foundation for the migration practice by defining the framework and its essential components including but not limited to best practices for Hadoop migration, available tooling, technology component mapping, GTM assets and more
- Maintain a thorough understanding of migration implementation; clearly articulate the Hadoop migration journey to customers, address gaps in technology mapping, and recommend optimal solutions using other cloud services
- Maintain strong knowledge of currently available migration solutions, and actively pursue and recommend migration solutions and tooling that can accelerate Hadoop migrations
- Stay current with Databricks product advancements and work actively with the product team to understand new Lakehouse features; feed learnings from the field back to the product team to improve the product and recommend new features
The impact you will have:
- Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural and security design to development and deployment best practices.
- Provide guidance, best practices, and oversight for large-scale enterprise data warehouse migrations, serving as a trusted technical advisor to senior tech leads and executives
- Become a technical expert in areas such as cloud platforms, data management, performance, and architecture
- Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, workload sizing, and custom architectures
- Create best practices assets and thought leadership content to be used by the wider field engineering team
- Improve community adoption (through tutorials, training, hackathons, conference presentations)
What we look for:
- Pre-sales or post-sales experience working with external clients across a variety of industry markets
- Ability to design highly performant, scalable, secure, and cost-effective cloud-based data solutions, and to articulate architectures and design choices to customers' senior stakeholders
- Experience with design and implementation of a broad range of data technologies such as Hadoop, Spark, NoSQL, OLTP, OLAP, and ETL/ELT.
- Hands-on experience working with MPP data warehouse appliances (Oracle Exadata, Teradata, IBM Netezza) or cloud data warehouses (Amazon Redshift, Azure Synapse, Snowflake)
- Familiarity with common data modeling methodologies such as dimensional modeling, Data Vault, and Inmon
- Extensive knowledge of and experience with SQL or any SQL dialect (PL/SQL, Transact-SQL, etc.)
- Experience with BI tools such as Power BI, Tableau, Qlik, or others
- Deep knowledge of development tools and best practices for data engineers, including CI/CD, unit and integration testing, and automation and orchestration
- Production programming experience in one or more of the following languages: Python, Scala, or R
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent work experience
Benefits:
- Comprehensive health coverage including medical, dental, and vision
- 401(k) Plan
- Equity awards
- Flexible time off
- Paid parental leave
- Family Planning
- Gym reimbursement
- Annual personal development fund
- Employee Assistance Program (EAP)
Databricks is the data and AI company. More than 9,000 organizations worldwide — including Comcast, Condé Nast, and over 50% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world’s toughest problems. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.