We've launched the dbt_semantic_view package, integrating Snowflake semantic views with dbt's disciplined workflow. This means you can now:
- Centralize Metrics: Define key metric logic (FACTS, DIMENSIONS, METRICS) natively in Snowflake semantic views for a single source of truth.
- Enable CI/CD: Apply dbt's strengths (declarative YAML, testing, and version control) directly to your semantic layer definitions.
- Ensure Governance: Automatically manage privileges and validate relationships so every consumer, from BI tools to AI agents, uses trusted data.
Unify your governance framework and deliver consistent, high-performance metrics across your enterprise. Learn more and get started: https://lnkd.in/eZqiwT-g
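For context, here is a minimal sketch of the kind of semantic view DDL the package manages for you, issued from Snowpark Python. The connection settings, tables, and columns are hypothetical, and the exact CREATE SEMANTIC VIEW clauses may vary by release, so treat this as an illustration rather than the package's generated output.

```python
# Sketch only: a Snowflake semantic view of the kind dbt_semantic_view manages.
# Table/column names are hypothetical; verify exact DDL against the
# CREATE SEMANTIC VIEW documentation for your account.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "role": "ANALYST", "warehouse": "WH", "database": "SALES", "schema": "CORE",
}).create()

session.sql("""
CREATE OR REPLACE SEMANTIC VIEW sales.core.revenue_sv
  TABLES (
    orders    AS sales.core.orders    PRIMARY KEY (order_id),
    customers AS sales.core.customers PRIMARY KEY (customer_id)
  )
  RELATIONSHIPS (
    orders_customers AS orders (customer_id) REFERENCES customers
  )
  FACTS      (orders.order_amount AS amount)
  DIMENSIONS (customers.region AS region, orders.order_date AS order_date)
  METRICS    (orders.total_revenue AS SUM(orders.order_amount))
  COMMENT = 'Single source of truth for revenue metrics'
""").collect()
```

With the definition in version control as dbt-managed YAML, the same FACTS, DIMENSIONS, and METRICS can be tested and deployed like any other dbt asset.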
Snowflake Developers
Software Development
Menlo Park, California 50,427 followers
Build Massive-Scale Data Apps Without Operational Burden #PoweredBySnowflake #SnowflakeBuild
About us
Snowflake delivers the AI Data Cloud — mobilize your data apps with near-unlimited scale and performance. #PoweredbySnowflake
- Website: https://www.snowflake.com/en/developers/
- Industry: Software Development
- Company size: 5,001-10,000 employees
- Headquarters: Menlo Park, California
- Founded: 2012
- Specialties: big data, sql, data cloud, cloud data platform, developers, ai data cloud, agentic ai, ai, and data engineering
Updates
-
Join this instructor-led lab to learn how to build agent-based AI applications using the GPT model within Snowflake Cortex AI’s secure environment. Expect live coding, real-world use cases and an interactive demo to get you started.
-
You can't have great AI without a solid data foundation. Join Senior Developer Advocate Vino Duraisamy for a hands-on bootcamp at #SnowflakeBUILD and learn to build pipelines that businesses can actually trust. In this session, you'll build a production-grade, enterprise-ready data pipeline in Python that feeds your AI models. You'll cover:
1️⃣ Multimodal Ingestion: Ingesting and transforming structured and unstructured data using the Snowpark and pandas DataFrame APIs.
2️⃣ Automation: Automating the entire workflow with Snowflake Tasks.
3️⃣ CI/CD & Monitoring: Integrating with Git, implementing testing, and setting up monitoring for reliable data delivery.
Master the skill set that separates you from the pack in the age of AI. It's virtual and free for all. Register today: https://lnkd.in/gVw6qqHg
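To get a feel for the pattern before the session, here is a minimal Snowpark sketch of the ingest-transform-automate flow. The source table, target table, warehouse, and schedule are hypothetical placeholders, not the bootcamp's actual lab code.

```python
# Minimal Snowpark pipeline sketch: transform raw data, then automate with a Task.
# Table names, warehouse, and schedule are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count, avg, lit

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "TRANSFORM_WH", "database": "ANALYTICS", "schema": "RAW",
}
session = Session.builder.configs(connection_parameters).create()

# Transform: aggregate raw events into a curated table.
raw = session.table("RAW.EVENTS")
curated = (
    raw.filter(col("EVENT_TYPE").is_not_null())
       .group_by(col("EVENT_TYPE"), col("EVENT_DATE"))
       .agg(count(lit(1)).alias("EVENT_COUNT"),
            avg(col("LATENCY_MS")).alias("AVG_LATENCY_MS"))
)
curated.write.save_as_table("CURATED.EVENT_DAILY", mode="overwrite")

# Automate: schedule the same refresh logic with a Snowflake Task.
session.sql("""
CREATE OR REPLACE TASK CURATED.REFRESH_EVENT_DAILY
  WAREHOUSE = TRANSFORM_WH
  SCHEDULE = '60 MINUTE'
AS
  INSERT OVERWRITE INTO CURATED.EVENT_DAILY
  SELECT EVENT_TYPE, EVENT_DATE, COUNT(*), AVG(LATENCY_MS)
  FROM RAW.EVENTS
  WHERE EVENT_TYPE IS NOT NULL
  GROUP BY EVENT_TYPE, EVENT_DATE
""").collect()
session.sql("ALTER TASK CURATED.REFRESH_EVENT_DAILY RESUME").collect()
```

The bootcamp goes further, adding Git integration, tests, and monitoring on top of a flow like this.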
-
We recently hosted our inaugural AI Agents Hackathon at the #SVAIHub, bringing together over 120 builders for a day of non-stop innovation, and the results were incredible. The winning teams proved the power of multimodal agents applied directly to enterprise data:
🥇 SnowOptima AI: An autonomous marketing agent that ingested campaign data, analyzed optimization vectors, and orchestrated a Creative Agent to generate high-conversion ads. Talk about multimodal output driven by data insights!
🥈 Plexfort: A clean, effective email-to-database agent that securely converts natural language queries into SQL and synthesizes the data into a conversational response.
🥉 Accessibilify: An agentic ADA compliance checker that analyzed a webpage's visual layout and DOM, then autonomously suggested the exact code-level fixes needed.
A huge thank you to everyone who joined, built, and shared their genius at our inaugural Silicon Valley AI Hub hackathon. We can't wait for the next one. Learn all about it: https://lnkd.in/g74e6V-k
-
Want to get data directly to your business users, right where they work? 🚀 Join us on October 21 to learn how to build a conversational app that seamlessly integrates into Microsoft Teams and makes data insights accessible to everyone in your organisation. You'll learn how to:
❄️ Integrate Snowflake Cortex features
❄️ Build a natural language conversational interface
❄️ Extract insights from structured and unstructured data
Ready to boost your team's productivity? Register here: https://lnkd.in/eq6exY72
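As a rough illustration of the conversational piece, here is a sketch of how a bot backend could answer a natural-language question with Snowflake Cortex. This is not the webinar's reference architecture: the table, prompt, and model choice are assumptions, and the Microsoft Teams / Bot Framework plumbing is omitted.

```python
# Sketch: answer a natural-language question by grounding Cortex COMPLETE in a
# governed query result. Table, model, and prompt are assumptions; the Teams
# integration itself is not shown.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "APP_WH", "database": "SALES", "schema": "CORE",
}).create()

def answer(question: str) -> str:
    # Pull a small, governed slice of structured data as context.
    context_rows = session.sql(
        "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region"
    ).collect()
    context = "\n".join(f"{r['REGION']}: {r['REVENUE']}" for r in context_rows)
    prompt = f"Answer using only this data:\n{context}\n\nQuestion: {question}"
    result = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large2', ?) AS reply",
        params=[prompt],
    ).collect()
    return result[0]["REPLY"]

print(answer("Which region had the highest revenue?"))
```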
-
We're excited to announce that Anthropic's Claude Haiku 4.5 is now available for customers on Snowflake Cortex AI to use natively within the Snowflake secure perimeter. With its near-frontier performance and lightning-fast speed, it's great for enterprise use cases:
❄️ AI-powered pipelines – Build high-volume, multimodal analytics pipelines directly in Cortex using AISQL. Claude Haiku 4.5 delivers Sonnet 4-level performance at lower cost and faster speeds, which is ideal for scaling data processing workloads. 👉 https://lnkd.in/gEZm-KK7
❄️ Agentic Systems – Power low-latency, cost-efficient agents that reason, act, and adapt in real time, built for enterprise scale and secured within Snowflake's trusted, governed environment. 👉 https://lnkd.in/gzbNXN_T
❄️ Actionable Insights – Coming soon in Snowflake Intelligence, giving business users more precise, state-of-the-art insights from structured and unstructured data, particularly in finance and research. 👉 https://ai.snowflake.com/
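As a rough illustration of the AISQL pattern, here is a sketch that triages support tickets with the new model from Snowpark Python. The table and column names are hypothetical, and both the model identifier string and whether your account exposes AI_COMPLETE or SNOWFLAKE.CORTEX.COMPLETE should be checked against the current Cortex docs for your region.

```python
# AISQL sketch: summarize and label support tickets with Claude Haiku 4.5.
# Table/column names are hypothetical; verify the model identifier and function
# name (AI_COMPLETE vs. SNOWFLAKE.CORTEX.COMPLETE) before running.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "AI_WH", "database": "SUPPORT", "schema": "RAW",
}).create()

scored = session.sql("""
SELECT
  ticket_id,
  AI_COMPLETE(
    'claude-haiku-4-5',                       -- assumed identifier; check docs
    'Summarize this support ticket in one sentence and label its urgency '
    || '(low/medium/high): ' || ticket_text
  ) AS triage
FROM support.raw.tickets
LIMIT 100
""")
scored.write.save_as_table("SUPPORT.CURATED.TICKET_TRIAGE", mode="overwrite")
```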
-
Confidently power your most critical data pipelines with a connected and governed view of your entire Iceberg ecosystem. Build a Snowflake Lakehouse with a single pane of glass, using catalog-linked databases to federate to Microsoft OneLake and other Iceberg REST Catalogs. With Snowflake Horizon Catalog, apply policies to all your data assets across regions and clouds in one experience, regardless of the metadata catalog.
[LIVE] Build a Connected and Governed Lakehouse
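For a sense of the setup involved, here is a sketch of linking a Snowflake database to an external Iceberg REST catalog from Snowpark Python. The integration name, endpoint, and auth settings are placeholders, and the exact CREATE CATALOG INTEGRATION and catalog-linked database options vary by provider (OneLake, Glue, Unity), so consult the docs for your catalog before running anything like this.

```python
# Sketch: federate an external Iceberg REST catalog into Snowflake via a
# catalog-linked database. Names, URIs, and auth settings are placeholders;
# exact options differ per catalog provider, so verify against current docs.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "role": "ACCOUNTADMIN", "warehouse": "ADMIN_WH",
}).create()

# 1) Describe how to reach the external Iceberg REST catalog.
session.sql("""
CREATE OR REPLACE CATALOG INTEGRATION lake_rest_catalog
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  REST_CONFIG = (CATALOG_URI = 'https://<catalog-endpoint>/iceberg')
  REST_AUTHENTICATION = (TYPE = OAUTH, OAUTH_CLIENT_ID = '<id>', OAUTH_CLIENT_SECRET = '<secret>')
  ENABLED = TRUE
""").collect()

# 2) Create a catalog-linked database so remote namespaces and tables are
#    discovered automatically and queryable like local objects.
session.sql("""
CREATE OR REPLACE DATABASE lakehouse
  LINKED_CATALOG = (CATALOG = 'lake_rest_catalog')
""").collect()

# Query a (hypothetical) remote table as if it were local.
session.sql("SELECT COUNT(*) FROM lakehouse.analytics.page_views").show()
```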
-
We're announcing the General Availability (GA) of advanced data engineering capabilities for open table formats on Snowflake, empowering you to build a connected, governed, and high-performance lakehouse for the AI era. This release removes the tradeoff between open formats and a managed platform by delivering:
- Catalog-Linked Databases (GA): Zero-ETL connection to any Iceberg REST catalog (AWS Glue, Unity, OneLake) from a single Snowflake environment for automatic discovery.
- Write to Any Iceberg Table (GA): Full data engineering support for ingestion, transformation, and modeling, centralized on Snowflake's fully managed platform.
- Automatic Optimization (GA): Get the flexibility of open formats without the operational overhead; Snowflake now optimizes file sizes and partitions across your Iceberg ecosystem.
- Secure Zero-ETL Sharing (GA): Share Iceberg and Delta Lake tables across regions and clouds, with governance policies persisting for the data consumer.
Full details: https://lnkd.in/gGmF_x4C To learn more, join us at Data Engineering Connect: https://lnkd.in/gxezCyb4
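Writing to an Iceberg table from Snowflake looks like regular data engineering. Here is a minimal sketch assuming a hypothetical external volume and source table, with Snowflake as the Iceberg catalog; per this release, externally cataloged Iceberg targets can be written to in a similar way once the database is linked.

```python
# Sketch: create a Snowflake-managed Iceberg table and load it with Snowpark.
# EXTERNAL_VOLUME, database, and table names are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "ETL_WH", "database": "LAKE", "schema": "CURATED",
}).create()

session.sql("""
CREATE ICEBERG TABLE IF NOT EXISTS lake.curated.orders_iceberg (
  order_id   STRING,
  order_date DATE,
  amount     NUMBER(12,2)
)
CATALOG = 'SNOWFLAKE'
EXTERNAL_VOLUME = 'lake_ext_vol'
BASE_LOCATION = 'curated/orders'
""").collect()

# Standard Snowpark transformations write into the Iceberg table like any other.
staged = session.table("lake.raw.orders").filter(col("AMOUNT") > 0)
staged.write.mode("append").save_as_table("lake.curated.orders_iceberg")
```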
-
The future of AI is agentic, and we'll be discussing it at #SnowflakeBUILD! Join us for the Opening Keynote: Building the Agentic AI Future, with EVP of Product Christian Kleinerman, Director of Product Management Jeff Hollan, and Senior Product Manager Doris Lee. Discover how developers can unlock real-time value from data by building flexible AI agents, from data ingestion and transformation to modern app development and deployment. See live demos and real-world examples from Snowflake experts, and learn how to start building your own AI agents today. Register today and tune in November 4 👉 https://lnkd.in/gRq5X-br
-
Snowflake Openflow: Powering Modern Data Integration Across Every Use Case
Snowflake Openflow is a managed, extensible data integration platform built for all data types, architectures, and workloads, batch or streaming, structured or unstructured. Explore how Openflow accelerates your data movement and unlocks new possibilities:
❄️ Change Data Capture (CDC) from OLTP Databases for Operational Analytics
Stream changes in real time from MySQL, PostgreSQL, or SQL Server directly into Snowflake using managed CDC connectors. Create near real-time replicas of your operational data for analytics without impacting source system performance or building complex pipelines.
❄️ Unlocking Unstructured Enterprise Data for Generative AI
Ingest and process unstructured data (documents, images, audio, and more) at scale. Openflow enables seamless movement of unstructured assets into Snowflake, powering advanced AI and LLM workloads with enterprise-grade governance and observability.
❄️ Batch and Streaming Data Pipelines for Analytics and AI
Unify batch and streaming data integration in a single platform. Openflow supports high-throughput, low-latency pipelines for both historical and real-time analytics, enabling faster, more accurate insights for your business.
Connect to virtually any data source or destination with Openflow's open, extensible framework. Build and monitor complex data workflows with built-in reliability, governance, and limitless scalability, all managed within Snowflake. Explore our Snowflake Openflow solution: https://lnkd.in/gqTpQehN
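Openflow connectors themselves are configured in the Openflow interface rather than in code, but once a CDC connector is landing changes in Snowflake, the downstream work is ordinary Snowflake engineering. A rough sketch follows, assuming a hypothetical replicated ORDERS table; the stream, task, and target table names are placeholders.

```python
# Sketch: consume CDC data that an Openflow connector has landed in Snowflake.
# The replicated table, stream, task, and target names are hypothetical; the
# Openflow connector itself is configured in the Openflow UI and not shown here.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "CDC_WH", "database": "OPS", "schema": "REPLICA",
}).create()

# Track incremental changes on the replicated table with a stream...
session.sql("""
CREATE STREAM IF NOT EXISTS ops.replica.orders_changes
  ON TABLE ops.replica.orders
""").collect()

# ...and fold them into an analytics table on a schedule.
session.sql("""
CREATE OR REPLACE TASK ops.replica.apply_order_changes
  WAREHOUSE = CDC_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ops.replica.orders_changes')
AS
  INSERT INTO analytics.core.order_events
  SELECT order_id, status, updated_at, METADATA$ACTION
  FROM ops.replica.orders_changes
""").collect()
session.sql("ALTER TASK ops.replica.apply_order_changes RESUME").collect()
```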