Made with Nova Canvas. Prompt: "Cascading waves of light weave through blue machinery while magenta circuit paths dance like digital constellations"

August 2025

Our latest newsletter features the DeepFleet multirobot coordination model, breakthrough research in machine-based reasoning, Wanda++ LLM compression technology, and Amazon Aurora's decade-long journey in database innovation.

Deep dives

Three challenges in machine-based reasoning: Amazon VP and distinguished scientist Byron Cook explains how Amazon's new Automated Reasoning checks tackle the challenges of language translation, defining truth, and definitive reasoning.

A better path to pruning large language models: Amazon researchers present Wanda++, a novel LLM compression method that prunes 7-billion-parameter models in under 10 minutes on a single GPU and improves performance by 32% over previous compression methods.

Amazon builds first foundation model for multirobot coordination: DeepFleet, trained on millions of hours of data from fulfillment centers, predicts future traffic patterns for robot fleets and increases deployment efficiency by 10%, helping deliver packages faster at lower cost.

Sample models of a fulfillment center (top) and a sortation center (bottom).

News and updates

A decade of database innovation: The Amazon Aurora story: What started as a vision to combine the cost effectiveness and simplicity of MySQL with the speed and availability of high-end commercial databases has evolved into a fully serverless solution trusted by tens of thousands of customers.

Agentic AI Summit: Amazon researchers presented their work on a distributed multi-agent framework for autonomous IT support and maintenance. The framework orchestrates agents across edge and cloud systems to diagnose and resolve IT issues, a critical solution for Amazon's massive scale.

Winners of the Amazon Nova AI Challenge: Amazon recently announced the winners of the first-ever Nova AI Challenge, where university teams competed to develop and hack AI coding assistants; teams from UIUC and Purdue won the defender and attacker categories, respectively.

Conference roundup

Best Paper Award at ACL 2025 Industry Track: Amazon researchers were recognized for a paper demonstrating significant reductions in LLM latency through knowledge distillation and speculative decoding, achieving up to a 25× speed-up and a 180× cost reduction.

LinkedIn | X/Twitter | Facebook | Instagram | GitHub | RSS

© 1996-2025 Amazon.com, Inc. or its affiliates | Privacy | Conditions of Use

Diamond Redmond MSc., MBA

Much appreciated! Does Amazon Science have any research papers or notes to share related to the Neurosymbolic AI article? We'd love to hear more about the architecture, findings, and limitations of this research!
