🚀 Lessons teach faster than lines of code. Robots are entering a new era—where they learn by watching, listening, and practicing, not by waiting for a programmer. Show a task. Speak a goal. Give feedback. The robot refines its behavior like a student, not a script. This is learning from demonstrations, corrections, and natural language—opening robotics to teachers, nurses, technicians, and kids. Fewer barriers. Safer behavior. Real skills that transfer from lab to life. Because the future of AI isn’t just smart—it’s teachable.

Speaker: Dr. Qamar Ul Islam, D.Engg., B.Tech., M.Tech., Ph.D., FHEA

#Robotics #AI #ImitationLearning #LearningFromDemonstration #VisionLanguageAction #HumanRobotInteraction #TeachNotCode #InfiniteMind
More Relevant Posts
-
💡 Curiosity is the true engine of learning — and remembering your “why” is what keeps you moving forward. Our GM Tylor Ng recently joined 《盈知學問》 with host Crystal Fung🌟 to share Campus X’s mission: empowering the next generation with robotics, AI, and entrepreneurial skills to think big, stay curious, and innovate beyond the classroom. From robotics education to real-world entrepreneurial experiences, we believe that children can become not just learners, but future creators and problem-solvers. 🎥 Watch the full interview here: https://lnkd.in/gzY3r4Ms #CampusX #EducationInnovation #AI #Robotics #CuriosityDriven #NeverForgetYourWhy
-
Awesome news to share. 🎉 Our paper, “Grounded Instruction Understanding with Large Language Models: Toward Trustworthy Human-Robot Interaction,” has been accepted to the AAAI 2025 Fall Symposium! We study how grounding LLMs in visual context improves a robot’s ability to interpret and execute natural-language instructions reliably. Using our SCOUT++ human-robot instruction dataset (vision + language instructions with execution traces), we benchmark grounded vs. text-only models, analyze failure modes, and propose a framework for safer instruction following, with gains on real-world, HRI-style tasks. The goal: more dependable, transparent robot behavior. Grateful to my co-authors and mentors for the support and late-night iterations. If you’ll be at the symposium, let’s connect! #AAAI #AAAI2025 #AAAIFallSymposium #HRI #Robotics #TrustworthyAI #VisionLanguageModels #LLM #MultimodalAI #Research #IRAL #UMBC
-
Watch a robot receive instructions, analyze its surroundings, plan the sequence, then execute. Gemini Robotics 1.5 turned science fiction into workplace reality. As a CS student passionate about AI and robotics, this breakthrough excites me.

Google DeepMind launched two powerful models:
- Gemini Robotics-ER 1.5: plans complex tasks using embodied reasoning
- Gemini Robotics 1.5: executes actions from visual input

The combination is game-changing. Robots can now understand natural-language commands, analyze their environment, and complete multi-step tasks independently.

Imagine the applications:
- Manufacturing automation
- Healthcare assistance
- Household robotics
- Warehouse operations

This tech bridges the gap between AI reasoning and physical action. From my experience with Python and Java, I know how challenging it is to make systems work together seamlessly. DeepMind tackled a massive integration problem.

The future of robotics just accelerated. Robots that think, plan, and act like humans are no longer distant dreams. They're here.

What excites you most about AI-powered robotics? How do you see this impacting your field?

#AI #Robotics #DeepMind
Overview of Google's Gemini Robotics Capabilities for Interactive and Dexterous AI Agents
-
Control robot movements using the human voice? Yes, it's possible ✅ #AI 👉 Natural Language Robot Programming ✅ Robotics👉🅰️🅱️🅱️✅ Peter Wirth Constantin Weiss Billy Cogum Marcin Gwóźdź Vanessa Loiola Stay up to date with me 🛎️💥👉 Miloš Kučera 👉Activate Bell 🛎️ for all posts 🌃Join to 100k+ followers on my #LinkedIn | 153M+ impressions | Post 592 | 2025 #abb #robotics #ai #abbrobotics
-
At Eduflier, we believe education should go beyond textbooks. Our mission is to bring Robotics, AI, and Coding into schools, so students don’t just learn theory — they create, innovate, and solve real-world problems. 🤖💡 By making classrooms more future-focused and skill-driven, we aim to prepare young minds for the opportunities of tomorrow. 🚀 Because the future belongs to learners who can think, build, and lead. #Eduflier #FutureOfLearning #AI #Robotics #CodingForKids #InnovationInEducation
-
This is what the future looks like when robotics meets emotion.

Disney Research revealed a BD-1-like robot that learns to move using reinforcement learning, and it feels alive. Every tilt of the head. Every curious pause. Every bounce in its step. None of it is pre-programmed. It learned through thousands of trial-and-error movements until it feels human.

What used to be mechanical now looks expressive. What used to be code now looks like character.

This is not about entertainment anymore. It is about robots learning to communicate through behavior.

Watch this short clip and tell me this does not change how you see robotics forever.

🤖 Emotion is now a design choice.
🤖 Personality is now a dataset.
🤖 Movement is now a language.
-
Grateful to receive a Certificate of Participation for the “Coding, AI & Robotics” webinar by SCITECHE × DoNoCode.In — a great session on practical, no-code ways to explore AI and robotics fundamentals, workflows, and career pathways.

Key takeaways:
- No-code approaches can rapidly prototype AI/robotics ideas and lower the barrier to entry.
- Building blocks: logical flows, sensors/actuators, and automation with visual tools.
- Next steps: apply concepts to mini-projects and share learnings with the community.

Thanks to the organizers and instructor for an engaging experience — excited to build, iterate, and contribute to real-world solutions. #AI #Robotics #NoCode #Learning #IITPatna #StudentDeveloper #TechCommunity
-
Living My Childhood Dream!! ✨ Wanna disappear like Harry Potter!? 🪄

Note: glitches because of the semi-transparent cloth — couldn’t find a perfect red one lol 😅

OpenCV is real fun! Maybe I’m late to this trend, but I’m super happy I finally tried it. This project has been sitting on my “to explore” list for a long time… and now, here comes the magic! 🪄

About the Project: Using OpenCV (computer vision), I created an illusion of invisibility by detecting a specific cloth color and replacing it with the background, so it looks like I’ve disappeared!

Why it’s cool: 🎯 OpenCV shows how computers “see” the world — detecting, segmenting, and processing what our eyes take for granted. It’s a small glimpse into the world of computer vision, which powers everything from facial recognition to autonomous cars. 🤖

Need the source code of this fun magic experiment? Drop #interested in the comments!

📌 Follow Atchaya Senthilkumaran for more fun tech experiments, GenAI, and hands-on learning!
🔁 Repost this if you love projects that turn code into magic!

#OpenCV #Python #ComputerVision #ArtificialIntelligence #AI #MachineLearning #AIProjects #Innovation #LearningByDoing #TechFun #C2XAI #CodingIsFun #HarryPotter #InvisibilityCloak #ProjectShowcase #WomenInTech #AICommunity #TechJourney #DataScience #STEM #Magic #Automation #ATSpeaks #AtchayaSenthilkumaran #LinkedIn #LinkedPosts
-
🌍 Meet Yanshee – The Open-Source Humanoid for Advanced STEM & AI Learning

At iRobotics EduAI, we believe learning robotics should go beyond theory — it should inspire creativity, innovation, and real-world problem solving. 🤖 Yanshee is more than just a humanoid robot — it’s a complete open-source platform that empowers students, educators, and researchers to explore the frontiers of STEM, AI, and robotics.

🔍 Key Highlights
- Sensor suite: cameras, gyro, ultrasonic — live data
- AI interaction: face, NLP, and emotion recognition ready
- Open-source: Python, C, Blockly, ROS
- Motion: high-torque, human-like control

🎓 Impact on Learning
- AI on-robot: train & deploy models
- Hands-on robotics: mechanics, sensors, control
- IoT projects: connect robots to devices
- Ethical AI: privacy, safety, impact

#Robotics #AI #STEMEducation #HumanoidRobot #Yanshee #iRoboticsEduAI #EdTech #Innovation #OpenSource #AIinEducation #FutureOfLearning #TechnologyInEducation #STEMLearning #SmartEducation #21stCenturySkills #RobotForEducation #ResearchAndDevelopment #LearningThroughRobotics
-
We've all prompted AI agents to write code or generate images. But what if you could prompt a physical robot to clean your room or patrol your house?

That's the promise of MARS, the new personal AI robot just launched by Y Combinator-backed startup Innate from founders Axel Peytavin and Vignesh Anand. It's a general-purpose, teachable robot designed to finally bring robotics to the rest of us.

Instead of writing complex code for every action, you teach MARS physical "skills" (like picking up a sock) just by demonstrating the movement with a controller. Then you create a "behavior" with a simple prompt, like "You are a security guard. Patrol the house and watch for intruders." The AI brain connects the skills to the mission.

What I think is cool about MARS:
1. All-in-one and accessible: it comes fully assembled with an onboard GPU (Jetson Orin Nano) for under $2k. No powerful external computer is needed to run it.
2. Truly teachable: its "teach-by-showing" model and simple SDK drastically lower the barrier to entry beyond traditional coding.
3. Open and robust: built on ROS2, it's open-source and designed to be modded and to withstand real-world use by the community.

This isn't just a robot; it's a powerful, affordable platform for experimenting with real-world AI agents. Every new skill taught can be shared, helping the entire community grow.

What's the first task you would teach your personal robot?

#Robotics #AI #Hardware #OpenSource
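The teach-then-prompt pattern described above can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration, not the actual Innate SDK (whose API isn't shown in the post); the `Skill`, `Behavior`, `teach`, and `plan` names are invented for the example, and the keyword matching stands in for the LLM/VLA planner a real system would use.

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    name: str
    waypoints: list  # poses recorded while a human demonstrates the motion

@dataclass
class Behavior:
    prompt: str                           # the natural-language mission
    skills: dict = field(default_factory=dict)

    def teach(self, skill):
        # "Teach by showing": store the demonstrated skill under its name.
        self.skills[skill.name] = skill

    def plan(self, observation):
        # A real system would send prompt + observation to a language model
        # and get back a skill to execute; here we just keyword-match.
        for name in self.skills:
            if name in observation:
                return name
        return "idle"

guard = Behavior(prompt="You are a security guard. Patrol the house.")
guard.teach(Skill("pick up sock", waypoints=[(0.1, 0.2), (0.3, 0.4)]))
```

The key design idea the post describes is this separation: low-level motions come from demonstration, while the high-level mission is just a prompt that selects among them.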