Snowflake Architect

LakeFusion seeks a Snowflake Architect to design scalable data pipelines, models, and AI-integrated solutions, ensuring seamless Snowflake–Databricks integration for enterprise-grade data management.

India, Remote
Full-Time
1 Opening
Apply Now
About the Role

LakeFusion is seeking a Snowflake Architect to spearhead the design and implementation of a multi-platform backend strategy that extends our Databricks-native foundation to include Snowflake. In this role, you will lead the architectural design, requirements gathering, and gap analysis needed to ensure LakeFusion’s platform integrates seamlessly with Snowflake, enabling scalability, reliability, and cost efficiency for enterprise-grade data management.

You will be responsible for architecting high-performance data pipelines, robust data models, and comprehensive solutions within the Snowflake ecosystem, applying best practices in data quality, governance, and dimensional modeling. This role also involves exploring and designing architectural patterns for integrating Generative AI and Retrieval-Augmented Generation (RAG) capabilities at scale within Snowflake, aligning closely with LakeFusion’s AI-first product vision.

As the primary technical expert for Snowflake within LakeFusion, you will independently solve complex architectural challenges, advise on re-architecture where necessary, and provide expert guidance to product and engineering teams. You will also create clear technical documentation and communicate strategic recommendations to leadership and stakeholders, ensuring alignment across the organization.

This is a senior, highly autonomous role for a data architecture leader who thrives on tackling complex technical problems and shaping scalable enterprise solutions at the intersection of modern data platforms and AI.

What You’ll Do
  • Lead Snowflake Backend Architecture: Spearhead the architectural design and strategic planning for integrating Snowflake as a backend for the LakeFusion platform, expanding our current Databricks-native capabilities to support a multi-platform strategy.
  • Conduct Comprehensive Analysis: Execute a thorough requirements gathering and technical gap analysis, meticulously comparing LakeFusion's existing application architecture and data interaction patterns with Databricks against the requirements for robust Snowflake compatibility.
  • Design Scalable Data Solutions: Architect high-performance, cost-optimized data pipelines and comprehensive data management solutions within Snowflake, applying industry best practices for data modeling (including Slowly Changing Dimensions, or SCDs), data quality, and reliability.
  • Plan and Roadmap Migration/Expansion: Develop detailed solution architecture designs, technical specifications, and a comprehensive work plan for the phased migration and expansion required to support Snowflake, identifying potential challenges and proposing innovative mitigation strategies.
  • Integrate AI/ML in Snowflake: Design architectural patterns for leveraging Generative AI models and Retrieval-Augmented Generation (RAG) architectures at scale within the Snowflake ecosystem, aligning with LakeFusion's AI-first product vision.
  • Problem Solve and Advise: Act as the primary technical expert for Snowflake, independently solving complex architectural and integration challenges, and providing expert guidance on potential re-architecting needs for the LakeFusion application to enable seamless communication with a Snowflake backend.
  • Document and Communicate: Create clear, concise, and thorough technical documentation for all architectural designs, analysis findings, and work plans. Effectively communicate complex technical concepts and strategic recommendations to engineering leadership, product teams, and other key stakeholders.
What We're Looking For
  • Extensive experience (10+ years) as a Senior Data Architect, Solutions Architect, or similar lead architectural role, with a significant focus on Snowflake implementations.
  • Expert-level proficiency in Snowflake's ecosystem, including advanced data warehousing, data lake, data sharing, and security capabilities, along with a deep understanding of its cost optimization strategies.
  • Demonstrated experience in designing and documenting large-scale data pipelines and data management solutions for complex enterprise applications.
  • Expert-level SQL proficiency and significant experience with Python for data engineering, scripting, and automation.
  • Practical experience with PySpark is highly desirable for understanding LakeFusion's existing architecture and facilitating effective gap analysis.
  • Proven ability to architect and implement solutions involving Generative AI models and RAG architectures within Snowflake at scale.
  • Knowledge of data modeling principles, including Slowly Changing Dimensions (SCDs), Kimball dimensional modeling, and other modern data modeling approaches.
  • A highly self-motivated, independent, and driven problem-solver with a proven track record of owning ambiguous, complex workstreams from analysis through strategic planning and architectural design.
  • Exceptional communication and technical documentation skills, with the ability to articulate complex architectural decisions and strategic roadmaps to both technical and non-technical audiences.
Nice-to-Have
  • Previous experience as an independent contractor or consultant delivering high-impact architectural engagements.
  • Familiarity with the Databricks Lakehouse Platform (Delta Lake, Unity Catalog, Databricks SQL) and its architectural patterns to better understand LakeFusion's current operational environment.
  • Experience with Master Data Management (MDM), Entity Resolution, or enterprise data quality platforms.
  • Knowledge of other cloud platforms (AWS, Azure) and their respective data and AI/ML services.
About LakeFusion

LakeFusion is the modern Master Data Management (MDM) company. Global enterprises across industries ranging from retail and manufacturing to financial services rely on the LakeFusion platform to unify, govern, and deliver trusted data entities such as customers, products, suppliers, and employees. Built natively on the Databricks Lakehouse, LakeFusion creates a single source of truth that powers analytics and AI, enabling organizations worldwide to accelerate innovation with trusted, governed data.

Apply Now