Give your Autonomous Mobile Robots the power to dynamically navigate complex environments. 🦾
Introducing our Integrated Sense and Compute Package, designed to power the next generation of compact, scalable, and sensor-rich AMR systems. It pairs Advantech AFE-R360/R760 solutions based on Intel® Core™ Ultra processors with customizable D3 Embedded DesignCore® Discovery ISX031 PRO Series GMSL2 cameras and RealSense™ D457 GMSL2 depth cameras.
⚡ Accelerated computing performance
🎯 Enhanced vision sensor accuracy
Read the press release to learn more ➡️ https://lnkd.in/edj5D5tw
#AMR #Robotics #EmbeddedVision #AI #AutonomousSystems #3DVision
The term “collaborative robot” is gone — the future is all about collaborative applications.
Without accurate force sensing, a robot marketed as “collaborative” could still pose a safety risk. With robust sensing, even high-power robots can be deployed in safe, collaborative applications.
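As a toy illustration of that point, here is a minimal Python sketch of a protective-stop check driven by a multi-axis force reading. The threshold and sample values are assumptions for illustration, not standardized limits or XJCSENSOR specifications.

```python
# Toy illustration: turn a multi-axis force reading into a protective stop.
# FORCE_LIMIT_N is an assumed application-specific limit, not a standard value.
import math

FORCE_LIMIT_N = 140.0

def force_magnitude(fx, fy, fz):
    """Magnitude of the measured Cartesian contact force, in newtons."""
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def check_contact(sample):
    fx, fy, fz = sample
    if force_magnitude(fx, fy, fz) > FORCE_LIMIT_N:
        return "PROTECTIVE_STOP"  # halt motion and hold position
    return "CONTINUE"

# Simulated force samples from a multi-axis sensor
for sample in [(12.0, 3.0, 40.0), (90.0, 80.0, 75.0)]:
    print(sample, "->", check_contact(sample))
```

The point is the sensing, not the loop: without a trustworthy force signal, no threshold logic can make a high-power robot safe to share space with people.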
At XJCSENSOR, we see this shift as a welcome clarification. Collaboration is not a marketing buzzword — it’s an engineering achievement powered by advanced sensing and system-level design.
👉 As the industry evolves, our focus remains the same: delivering high-precision multi-axis force sensors that enable the safe, intelligent, and truly collaborative applications of tomorrow.
#Robotics #Sensors #Collaboration #Safety #ForceSensor #XJCSENSOR
It has already been one month in the K Youth Programme at Renesas Semiconductor together with Ahmad Fahmi, where we're part of the Smart Factory team. I've learned a lot about Autonomous Mobile Robots (AMRs) and how they are applied in real factory environments.
Here, we need to wear what's called the Apollo attire to prevent dust from entering the process line, ensuring a clean and controlled environment. There are two types of AMRs in the factory. The first one carries trolleys containing magazines, a system that has been running for about three years. Recently, a Modbus system has been implemented to allow communication between the AMR, automatic doors, and lifts, enabling the robot to move across different floors and improving production efficiency.
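For context, here is a minimal sketch of the kind of Modbus TCP exchange that lets an AMR ask a door controller to open. The IP address and register map are hypothetical, and it assumes the pymodbus library; the actual deployment details at the factory will differ.

```python
# Hypothetical example: an AMR requesting an automatic door to open
# over Modbus TCP. Addresses and coil numbers are made up for illustration.
from pymodbus.client import ModbusTcpClient

DOOR_CONTROLLER_IP = "192.168.0.50"  # hypothetical door-controller address
OPEN_DOOR_COIL = 0                   # hypothetical coil mapped to "open door"
DOOR_OPEN_INPUT = 0                  # hypothetical input: door fully open

client = ModbusTcpClient(DOOR_CONTROLLER_IP, port=502)
if client.connect():
    client.write_coil(OPEN_DOOR_COIL, True)  # request the door to open
    status = client.read_discrete_inputs(DOOR_OPEN_INPUT, count=1)
    if not status.isError() and status.bits[0]:
        print("Door reports fully open; AMR may proceed.")
    client.close()
```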
The second AMR is more complex, equipped with a robotic arm currently being set up for integration with the wire bonding machine. This system requires careful consideration of factors such as lighting and light reflection, as many sensors rely heavily on these conditions. The robot uses LiDAR, ultrasonic, camera, and infrared sensors, each playing a vital role in executing precise movements and actions. Its programming is done using a block diagram-based system, making it easier to visualize and manage each process.
Feeling grateful to gain hands-on experience in automation, robotics, and smart manufacturing technologies!
#kyouth #roadtosmartfactory #robotics #mechatronics
New from Sony: the IMX927 Series of Global Shutter CMOS sensors is ideal for robotics and AMRs, delivering distortion-free imaging at high speed with Pregius S back-illuminated pixels and the SLVS-EC high-speed interface for rapid readout. The lineup scales to ~105 MP and ~100 fps for high-throughput perception. It is not limited to robotics: the same capabilities support broader industrial imaging, from factory automation to inspection and infrastructure monitoring.
Evaluate with Macnica Americas. Our Sensor to Solution model provides solder-free evaluation, drivers, ISP tuning, interface bridging, and lifecycle planning, so your team can hit its go-to-market objectives. Read more about the IMX927 series:
https://lnkd.in/gXwgxBrQ
#SonyImageSensors #IMX927 #GlobalShutter #Robotics #AMR #MacnicaAmericas
🚗 AI at the Intelligent Edge — NXP at MATLAB EXPO
I’m excited that NXP Semiconductors is part of this year’s MATLAB EXPO Germany, and especially looking forward to the keynote by Daniel Weyl.
In his talk, “AI and Machine Learning at the Intelligent Edge”, Daniel explores how vehicles are evolving into intelligent, connected systems — delivering advanced safety, autonomous driving, predictive maintenance, and efficient power management. He highlights how powerful hardware and cutting-edge AI software come together to meet automotive-grade quality standards and unlock new capabilities for the road ahead.
View the agenda and register: https://spr.ly/6047A5kS7
#MATLABEXPO #Semiconductors #SoftwareDefined #Safety #AIinEngineering
Imagine robotic fingers that can tell what's inside a non-transparent container without cameras, just by touch. A multimodal tactile sensor system combining thermal-conduction and vibration sensors, processed via CNNs, has achieved ~97.5% accuracy in classifying both the liquid type and the fill volume.
Nature - https://lnkd.in/gEiQWNzu
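For intuition, here is a tiny PyTorch sketch of the general idea: two convolutional branches, one per sensing modality, fused into shared features with separate heads for liquid type and fill volume. The architecture and sizes are my assumptions, not the model from the paper.

```python
# Illustrative two-branch CNN fusing thermal-conduction and vibration
# signals; layer sizes and class counts are assumptions, not the paper's.
import torch
import torch.nn as nn

class MultimodalTactileNet(nn.Module):
    def __init__(self, n_liquids=4, n_volumes=3):
        super().__init__()
        self.thermal = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5), nn.ReLU(), nn.AdaptiveAvgPool1d(8))
        self.vibration = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5), nn.ReLU(), nn.AdaptiveAvgPool1d(8))
        fused_dim = 16 * 8 * 2
        self.liquid_head = nn.Linear(fused_dim, n_liquids)  # which liquid
        self.volume_head = nn.Linear(fused_dim, n_volumes)  # how full

    def forward(self, thermal_sig, vib_sig):
        fused = torch.cat([self.thermal(thermal_sig).flatten(1),
                           self.vibration(vib_sig).flatten(1)], dim=1)
        return self.liquid_head(fused), self.volume_head(fused)

# Batch of 2 windows, 256 samples per modality
model = MultimodalTactileNet()
liquid_logits, volume_logits = model(torch.randn(2, 1, 256),
                                     torch.randn(2, 1, 256))
```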
This is huge for applications in smart manufacturing, quality control, and robotics where optical sensing fails (fog, darkness, obscured view).
What’s one industrial or biomedical scenario where this tactile sensing could outperform traditional vision-based systems?
#TactileSensing #MachineLearning #Robotics #SmartManufacturing #Mechatronics
SkipcrossNets: Revolutionizing Road Detection with Adaptive Multi-Modal Fusion 🚗💡
Tired of rigid multi-modal fusion in autonomous driving? Meet SkipcrossNets—a game-changing architecture that dynamically fuses Altitude Difference Images (ADIs) from point clouds and camera data across all network layers! 🌉🔄
Unlike traditional two-stream networks stuck at a single fusion point, SkipcrossNets uses a skip-cross strategy to connect every layer bidirectionally. This lets features propagate freely, picking the most complementary layers from both modalities to boost road detection accuracy. 🧠✨
By projecting point clouds into ADIs, it preserves critical height/depth details often lost in 2D. The result? 96.85% MaxF on KITTI and 84.84% F1 on A2D2—all while running at 68.24 FPS with just 2.33 MB of memory. 🚀📊
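For a rough feel of the mechanism, here is a simplified PyTorch sketch of a skip-cross stage: the camera stream and the ADI stream each consume the other's features at every stage, rather than fusing at one point. This is my own approximation of the idea, not the authors' code.

```python
# Simplified skip-cross fusion: two streams exchange features every stage.
# Channel counts and depths are arbitrary choices for illustration.
import torch
import torch.nn as nn

def conv_block(channels):
    return nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1),
                         nn.BatchNorm2d(channels), nn.ReLU())

class SkipCrossStage(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.cam = conv_block(channels)
        self.adi = conv_block(channels)

    def forward(self, cam_feat, adi_feat):
        # Each stream gets its own convolved features plus the other
        # stream's, so fusion happens at every stage, not just once.
        return self.cam(cam_feat) + adi_feat, self.adi(adi_feat) + cam_feat

stages = nn.ModuleList([SkipCrossStage(16) for _ in range(3)])
cam = torch.randn(1, 16, 64, 64)  # camera-branch features
adi = torch.randn(1, 16, 64, 64)  # altitude-difference-image features
for stage in stages:
    cam, adi = stage(cam, adi)
fused = torch.cat([cam, adi], dim=1)  # final fused representation
```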
Perfect for real-time edge devices! 📱🔋 Dive into the future of adaptive fusion:
https://lnkd.in/gR97TZ6n
#AutonomousDriving #ComputerVision #AI
Thinking about the future of home automation? Dive into the technology behind the latest robotic vacuums!
Our new blog post explores the critical design elements that are transforming these devices from simple cleaners into intelligent, autonomous platforms. We cover:
⚡ The evolution from "bump-and-go" to advanced SLAM technology.
⚡ The power of multi-modal sensor fusion for seamless navigation (a toy fusion sketch follows this list).
⚡ How precision motor control enhances efficiency and reduces noise.
⚡ Strategies for optimizing battery life and overall system performance.
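As a taste of the fusion topic, here is a toy complementary-filter sketch, one of the simplest ways a vacuum-class robot can blend a fast-but-drifting gyro with slow-but-stable wheel odometry for heading. The blend factor and sample values are made up for illustration; the blog post covers the broader design space.

```python
# Toy complementary filter for heading: trust the gyro short-term and
# wheel odometry long-term. ALPHA and all sample values are assumptions.
ALPHA = 0.98

def fuse_heading(prev_heading, gyro_rate, odom_heading, dt):
    """Return a fused heading estimate in radians."""
    gyro_heading = prev_heading + gyro_rate * dt  # integrate angular rate
    return ALPHA * gyro_heading + (1 - ALPHA) * odom_heading

heading = 0.0
samples = [(0.10, 0.001), (0.11, 0.003), (0.09, 0.005)]  # (rad/s, rad)
for gyro_rate, odom_heading in samples:
    heading = fuse_heading(heading, gyro_rate, odom_heading, dt=0.01)
    print(f"fused heading: {heading:.5f} rad")
```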
Read the full guide and join the conversation on how smarter design is shaping the future of robotics!
💻 Elevate Your Robotic Vacuum Designs
From navigation smarts to efficient control, these elements shape the next generation of vacuum cleaner robots. Join our webinar on October 9, 2025, at 8:30 PM IST for a deep dive, featuring a live A89301 demo from Allegro MicroSystems.
Links in comment section.
#Robotics #HomeAutomation #VacuumDesign #Engineering #TechInnovation #AI #SensingTechnology #TenxerLabs
We are happy to share this technical video demonstrating updates to our #driving #scenarios extraction software for #simulation of #autonomous driving and #ADAS systems. Here we show object size estimation (#3D bounding boxes) as an additional feature of our solution.
Our software collects real-world driving scenarios and road-user trajectories at scale, 24x7, from #CCTV cameras, and detects critical situations, making it possible to test #automotive software against corner-case scenarios while optimizing the amount of testing in the #simulator.
Additionally, our approach provides a scenario data stream large enough to train a wide range of #AI-based functions, enabling the next generation of #artificial #traffic solutions.
We'd be glad to hear your questions and comments.
#Mobility #Validation #Algorithms #SmartCity #SmartRoad #AD_Simulation #ASAM #OpenScenario
Something remarkable just happened in edge AI and robotics🤖
Our partner Everfocus Electronics Corporation has launched the EAR-100T Robotic Controller, the first standard industrial computer powered by NVIDIA Jetson Thor🤩
In short: it’s like giving robots and autonomous machines a super-brain that can think fast, process huge amounts of data, and act in real time.
Highlights:
🔺 Handles 12 cameras at once with perfect sync (by iSSA Technology)
🔺 Built for fast decision-making in robotics, drones, and fleets
🔺 Delivers up to 2070 FP4 TFLOPS of AI computing power
🔺 128 GB LPDDR5X high-bandwidth memory for heavy workloads
🔺 Enough performance to support future-proof AI applications
For developers and robotics teams, this is a ready-to-use, off-the-shelf solution to build smarter, safer, and more autonomous systems🦾
Want to know more? Contact sales@blue-line.com.
#blueline #AI #Robotics #EdgeAI #AutonomousTechnology #JetsonThor #EverFocus #Innovation #SmartMachines #AIatTheEdge #IndustrialComputing
Founder and CEO at RealSense:
Great work D3 team 👏