Last week, we made OpenAI's new models gpt-oss-20B and gpt-oss-120B available natively on Databricks, and early adoption has been incredible! These open-weight models deliver chain-of-thought reasoning and tool use, best-in-class latency and cost efficiency through a Mixture-of-Experts architecture, and a massive 131k context window for long documents and RAG. Build custom agents, automate tasks, or run real-time copilots securely next to your data, with governance baked in → https://lnkd.in/g7AdqZnX
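If you want to try it from Python, here is a minimal sketch that calls one of these models through Databricks' OpenAI-compatible serving endpoints. The endpoint name "databricks-gpt-oss-120b" and the workspace URL are placeholders you would swap for the ones shown on your own Serving page.

# Minimal sketch: query a gpt-oss serving endpoint on Databricks
# via the OpenAI-compatible API. The endpoint name and workspace
# URL below are assumptions; check your Serving page for yours.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],  # Databricks personal access token
    base_url="https://<your-workspace-url>/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-gpt-oss-120b",  # assumed endpoint name
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the key obligations in this contract ..."},
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)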
Excited for this 🔥
Margaret Amori - You are welcome to share your channel positions in this great LinkedIn group of 23K channel pros, the largest for AI MSPs (https://www.linkedin.com/groups/121739/), along with channel partner recruiting and solutions promotions.
Big thanks for sharing 👏
Open-weight adoption at this scale shows just how fast the AI ecosystem is moving.
Incredible step! In my opinion, what makes this really unique is not just "having new models," but how they're delivered:
• Open-weight LLMs → full control and customization
• Mixture-of-Experts → lower costs and faster latency
• 131k context window → long docs and RAG at scale
• And above all: models run natively next to governed data, with security and compliance by design

That's a game changer for every industry, from financial services to utilities, where trust, governance and cost efficiency are mission-critical.

Many competitors claim similar capabilities, but here's the difference:
• Their models usually run outside the data plane → higher data-movement costs, latency and risk.
• Governance is bolted on later, not unified across data + AI.
• More tools and connectors = more complexity, more lock-in.

With Databricks it's native: one platform where data, governance and AI live together. That's what makes adoption faster, safer and truly scalable. Just my take 🇮🇹☕