Position Details: Snowflake Solution Architect

Location: Baltimore, MD
Openings: 1

Description:

Must Have Technical/Functional Skills
  • Snowflake expertise: Warehouses, databases, roles, RBAC, SCIM, MFA.
  • Data Engineering: ELT/ETL tools (dbt, Talend), orchestration (Airflow).
  • Cloud Platforms: AWS, Azure, or GCP with Snowflake integration.
  • Programming: SQL, Python; familiarity with ML frameworks.
  • Security & Compliance: Data masking, encryption, and audit processes.
  • Strong experience with LLMs (OpenAI, Anthropic, Hugging Face, LangChain).
  • Proficiency in Python and modern AI frameworks.
  • Familiarity with vector databases, prompt engineering, and AI best practices.
  • 5+ years of product-focused engineering experience.
  • Knowledge of cloud deployment and scaling AI systems.
  • Strong SQL and Python skills.
  • Hands-on experience with dbt and Snowflake.
  • Familiarity with cloud platforms (AWS).
  • Knowledge of CI/CD, DevOps practices, and data orchestration tools (Airflow, Prefect).
  • Ability to create lineage graphs, documentation, and validation frameworks.
Must Have Skills: Snowflake, Cortex AI, AWS, dbt

Roles & Responsibilities
  • Design and manage data pipelines using dbt, Airflow, and CI/CD frameworks. 
  • Implement Snowpipe for continuous ingestion and Streams & Tasks for real-time processing (see the illustrative sketch after this list).
  • Enable AI/ML integration: Support predictive analytics and generative AI use cases. 
  • Leverage Snowflake Cortex and Copilot for LLM-based applications. 
  • Ensure data governance, RBAC, and security compliance across Snowflake environments. 
  • Optimize performance and implement Time Travel, Zero Copy Cloning, and Secure Data Sharing.
  • Build production-ready AI applications and LLM-powered features. 
  • Collaborate with AI + Data teams to develop agentic AI workflows. 
  • Experiment with open-source models and translate prototypes into production systems. 
  • Implement RAG pipelines, fine-tuning, and observability for AI models. 
  • Design and deploy secure, scalable, and highly available architectures on AWS. 
  • Select appropriate AWS services for application design and deployment. 
  • Implement cost-control strategies and disaster recovery plans. 
  • Collaborate with teams to integrate systems and ensure compliance. 
  • Develop Infrastructure as Code (IaC) using Terraform or CloudFormation.
  • Design, build, and maintain data pipelines using dbt for analytics and operational use cases. 
  • Implement standards for data quality, consistency, and reliability.
  • Optimize query performance and manage compute costs.
  • Collaborate with analysts and stakeholders to understand data requirements. 
  • Build automation into workflows and ensure compliance with governance policies.
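For illustration only, the sketch below shows the kind of Streams & Tasks and Cortex work referenced in the responsibilities above, driven from Python with the snowflake-connector-python package. The credentials, warehouse, table, stream, and task names (TRANSFORM_WH, ORDERS, ORDERS_CLEAN, etc.) and the model choice are placeholder assumptions, not details taken from this posting.

  # Illustrative sketch only: all identifiers and credentials are placeholders.
  import os
  import snowflake.connector

  conn = snowflake.connector.connect(
      account=os.environ["SNOWFLAKE_ACCOUNT"],
      user=os.environ["SNOWFLAKE_USER"],
      password=os.environ["SNOWFLAKE_PASSWORD"],
      warehouse="TRANSFORM_WH",   # assumed warehouse name
      database="RAW_DB",          # assumed database name
      schema="PUBLIC",
  )
  cur = conn.cursor()

  # 1. Track changes on a landing table (e.g. rows loaded continuously by Snowpipe).
  cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS")

  # 2. Schedule a task that processes new rows only when the stream has data.
  cur.execute("""
      CREATE OR REPLACE TASK PROCESS_ORDERS
        WAREHOUSE = TRANSFORM_WH
        SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
      AS
        INSERT INTO ORDERS_CLEAN
        SELECT order_id, amount, updated_at
        FROM ORDERS_STREAM
        WHERE METADATA$ACTION = 'INSERT'
  """)
  cur.execute("ALTER TASK PROCESS_ORDERS RESUME")  # tasks are created suspended

  # 3. Example Cortex LLM function call against a hypothetical prompt.
  cur.execute(
      "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', "
      "'Summarize: late shipment, customer requested refund')"
  )
  print(cur.fetchone()[0])

  cur.close()
  conn.close()

In practice the same DDL would typically live in version-controlled SQL or dbt models rather than inline strings; the Python wrapper here simply keeps the example self-contained.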
