Center for Project-Based Learning D-ITET reposted this
Very happy to see our work on hybrid RGB-DVS sensing for autonomous drones and beyond, carried out in collaboration with Sony (Lyes Khacef) within my Sony Research Award Program, highlighted by PROPHESEE as a paper not to miss! 😃 Excellent work from Christian Vogt, Pietro Bonazzi, and Michael Jost at the Center for Project-Based Learning, D-ITET.
👀 5 event-based vision papers you can’t miss this month!

Our Academic Research Library is continuously expanding, featuring over 200 papers from well-recognized institutions and researchers using Prophesee’s event-based Metavision® technologies.

🧠 From advancing event-based sensing and predictive vision for dynamic scenes to novel imaging applications and neuromorphic benchmarks, these papers provide ideas and inspiration for academics, developers, and enthusiasts alike:

📚 Paper #1 – "FRED: The Florence RGB-Event Drone Dataset"
This paper introduces FRED, a multimodal dataset for drone detection, tracking, and trajectory forecasting that combines RGB video and event streams.
👉 http://bit.ly/4nair3e
Authors: Gabriele Magrini, Lorenzo Berlincioni, Federico Becattini, et al.

📚 Paper #2 – "Looking into the Shadow: Recording a Total Solar Eclipse with High-resolution Event Cameras"
This paper presents the first recording of a total solar eclipse with a pair of high-resolution event cameras, along with the accompanying methodology.
👉 http://bit.ly/41RhnJ7
Authors: Fernando Cladera, Kenneth Chaney, Caroline Pritchard, et al.

📚 Paper #3 – "The NeuroBench framework for benchmarking neuromorphic computing algorithms and systems"
This article introduces NeuroBench, a common set of tools and a systematic methodology for inclusive benchmark measurement, providing an objective reference framework for quantifying neuromorphic approaches in both hardware-independent and hardware-dependent settings.
👉 http://bit.ly/3JZ6hM2
Authors: Jason Yik and a global team of nearly 100 researchers, including our very own CTO Christoph Posch.

📚 Paper #4 – "RGB-Event Fusion with Self-Attention for Collision Prediction"
This paper proposes a neural network framework that predicts the time and position of a collision between an unmanned aerial vehicle and a dynamic object, using RGB and event-based vision sensors.
👉 http://bit.ly/41O3Fqw
Authors: Haotong Qin, Michele Magno, Lyes Khacef, et al.

📚 Paper #5 – "Asynchronous Multi-Object Tracking with an Event Camera"
This paper presents the Asynchronous Event Multi-Object Tracking (AEMOT) algorithm for detecting and tracking multiple objects by processing individual raw events asynchronously.
👉 http://bit.ly/3V8UhtQ
Authors: Angus Apps, Tim Molloy, Robert Mahony, et al.

🔗 Explore the full library and discover research advancing event-based vision: http://bit.ly/4nrfJ9s

#EventBasedVision #NeuromorphicVision #AcademicResearch #ComputerVision #AIResearch #ResearchLibrary #UAV