How to make Maze Solver robot in 5 mins : https://youtu.be/MXioRPCVOF8
Maze Solver Demonstration : https://youtu.be/sV5-sxzwgt4
Maze solving is an old problem, but it remains an important area of robotics built around decision-making algorithms. The aim of this project is to build an efficient, Arduino-based autonomous maze-solving robot. Two simple maze-solving algorithms, the wall-following algorithm and the flood-fill algorithm, are used. The project covers hardware development, software development, and maze construction. For performance testing, the robot solves a 4×4 maze, and its ability to find the shortest path is also verified.
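As a sketch of the flood-fill idea: every cell is labelled with its step distance to the goal, and the shortest path is read off by walking downhill on those labels. The adjacency encoding of the maze below is an illustrative assumption, not the project's actual implementation.

```python
from collections import deque

def flood_fill(adj, goal):
    """Label every cell reachable from the goal with its step distance."""
    dist = {goal: 0}
    queue = deque([goal])
    while queue:
        cell = queue.popleft()
        for nxt in adj[cell]:
            if nxt not in dist:          # first visit is the shortest distance
                dist[nxt] = dist[cell] + 1
                queue.append(nxt)
    return dist

def shortest_path(adj, start, goal):
    """Walk downhill on the distance labels from start to goal."""
    dist = flood_fill(adj, goal)
    path = [start]
    while path[-1] != goal:
        path.append(min(adj[path[-1]], key=dist.__getitem__))
    return path

# A tiny 2x2 maze as an adjacency map: each cell lists the neighbouring
# cells it can reach (there is a wall between (0,1) and (1,1)).
maze = {
    (0, 0): [(0, 1), (1, 0)],
    (0, 1): [(0, 0)],
    (1, 0): [(0, 0), (1, 1)],
    (1, 1): [(1, 0)],
}
print(shortest_path(maze, (0, 1), (1, 1)))  # [(0, 1), (0, 0), (1, 0), (1, 1)]
```

The same fill scales directly to the 4×4 maze used in the project: the breadth-first pass guarantees the labels are true shortest distances, so the downhill walk is a shortest path.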
-- Musfiqur Rahman; email: [email protected]
Obstacle detection Robot using Ultrasonic Sensor and Arduino UNO -- Sanjay Kumar
This document describes how to build an obstacle detection robot using an Arduino UNO, ultrasonic sensor, and motor driver module. It explains the components used, including the Arduino, ultrasonic sensor to detect obstacles from 2-400cm away, and an L298N motor driver module to control DC motors. It provides details on connecting the components, programming the ultrasonic sensor to trigger and receive echo signals to determine distances, and controlling the motor's direction depending on detected obstacles to help the robot navigate. Code and more details are available at the provided GitHub link.
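The trigger/echo timing mentioned above reduces to a small piece of arithmetic: the echo pulse width spans the round trip to the obstacle and back, so distance is half the pulse time multiplied by the speed of sound. A host-side sketch of that conversion (the 583 µs example value is an assumption):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_distance_cm(pulse_width_us):
    """HC-SR04-style conversion: the echo pulse covers the round trip,
    so halve it before multiplying by the speed of sound."""
    return pulse_width_us * SPEED_OF_SOUND_CM_PER_US / 2

# A 583 us echo pulse corresponds to roughly 10 cm.
print(round(echo_to_distance_cm(583), 1))  # 10.0
```

On the Arduino itself the pulse width would come from timing the echo pin; the arithmetic is identical.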
School LAC Plan for school and classroom uses.docx -- IvyLood1
The document is a School Learning Action (SLAC) Plan submitted by Baring Elementary School for the 2023-2024 school year. It outlines the school's plan to conduct monthly Learning Action Cell (LAC) sessions for teachers to improve teaching competencies. The plan details the objectives, participants, budget, and scheduled activities for the LAC sessions, which will cover topics like instructional strategies, assessment techniques, and developing teaching skills. The goal is to continuously build the capacity of teachers through collaboration and sharing of best practices.
Schizophrenia is a mental disorder characterized by abnormal social behavior and failure to recognize what is real. It is believed to be caused by a combination of genetic and environmental factors. Common symptoms include false beliefs, unclear thinking, hearing voices, reduced social engagement, and lack of motivation. Diagnosis is based on observed behavior and reported experiences, and involves meeting criteria in diagnostic manuals. Treatment primarily involves antipsychotic medication, which can help reduce positive symptoms within weeks but has limited impact on negative symptoms and cognitive dysfunction.
The document presents information on the Mercedes Benz PLD + ADM system, including a description of its main components (PLD module, ADM module), their functions (injection control, sensors, power supply), and diagrams of its configuration.
This robot follows a black line on a bright surface or white line on a dark surface using IR sensors to detect the line. It uses a microcontroller, IR sensors, motor driver, and DC motors to sense the line and drive the wheels to stay on the line. When the sensors detect the line on one side, the microcontroller stops that side's motor to turn the robot.
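The stop-one-motor steering described above can be sketched as a single decision function; the sensor flags and motor commands here are illustrative placeholders, not the actual firmware:

```python
def line_follow(left_on_line, right_on_line):
    """Two-sensor differential-drive decision: stop the motor on the
    side whose sensor sees the line, so the robot pivots back onto it."""
    if left_on_line and not right_on_line:
        return ("stop", "run")    # (left motor, right motor): pivot left
    if right_on_line and not left_on_line:
        return ("run", "stop")    # pivot right
    return ("run", "run")         # centred (or line lost): go straight

print(line_follow(left_on_line=True, right_on_line=False))  # ('stop', 'run')
```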
This document describes a 4-bit synchronous binary counter. It contains the truth table for a JK flip-flop, diagrams of the counter circuit using 4 JK flip-flops connected in series with a common clock, and tables showing the output logic states and timing diagram as the counter counts from 0 to 15 over 16 clock pulses.
This presentation is presented by Naveed Ahmed, Rizwan Mustafa and Muzaffar Ahmad at the Robot Expo at Information Technology University of Punjab, Lahore.
Autonomous robots are robots that can perform tasks intelligently on their own, without any human assistance. The maze solving robot is one of the most popular autonomous robots. It is a small, self-reliant robot that can solve a maze from a known starting position to the center area of the maze in the shortest possible time.
This document describes the design and functioning of a light following robot. The robot uses light dependent resistors (LDRs) to sense light and an op-amp circuit to compare the light readings from the LDRs. When more light falls on one LDR, the op-amp output activates the corresponding transistor which drives the motor on that side, causing the robot to turn towards the light source. The robot aims to follow a light source such as a flashlight by moving its motors based on the LDR sensor readings processed by the op-amp circuitry. Applications include uses in street lights, alarms, and devices that adjust screen brightness based on ambient lighting.
This document describes a line following robot project built using an Arduino microcontroller. It lists the components used, which include the Arduino UNO, IR sensors, an L298N motor driver, DC motors, and a chassis. It explains the working principle of how the IR sensors detect a line and the motor driver is used to control the DC motors to follow the line. Diagrams of the circuit, programming code, potential applications, and advantages/disadvantages of the line following robot are also provided.
The document outlines requirements for a line following robot and discusses methods for line detection. It lists key requirements as being able to follow and take turns along a line, while being insensitive to lighting and noise. It also notes the line color does not matter as long as it is darker or lighter than the surroundings. The document further explains that infrared sensors produce analog outputs that need to be converted to digital signals, which can be done using analog to digital converters or comparators. It also provides an overview of features of the 8051 microcontroller, including memory, serial communication ports, timers, I/O pins, interrupts and clock speed.
Introduction to PLL - phase-locked loop diagram -- Hai Au
This document provides an introduction and overview of phase-locked loops (PLLs). It discusses the basic components of a PLL including the phase detector, voltage controlled oscillator (VCO), and loop filter. It describes how PLLs are used for applications such as frequency synthesis and data recovery. The document also provides examples of PLL design, including selecting component values for a VCO and calculating PLL parameters like voltage output and frequency sensitivity.
This document describes the design of a line-following robot that uses an ATMega8 microcontroller. The robot uses IR sensors to detect a black or white line and follow it, taking turns automatically. It includes IR sensors, a comparator IC, motor driver IC, DC motors, and a microcontroller board to process sensor input and control the motors accordingly to follow the line. The robot is able to detect the line with the IR sensors, send the sensor signals to the microcontroller via comparators, and have the microcontroller turn the appropriate motor on or off to steer the robot along the line.
The document describes the design and development of a 4-legged walking robot. It discusses the use of an Arduino Uno microcontroller and servo motors to control the robot's legs. The pantograph leg mechanism is employed to simplify the robot's kinematics and reduce computational complexity for controlling the multi-degree of freedom movement required for walking. Software like the Arduino IDE is used to program the microcontroller to coordinate the servo motors to enable the robot's walking abilities.
A proximity sensor is a sensor able to detect the presence of nearby objects without any physical contact. It detects an object when the object approaches within the sensor's detection range and boundary. Proximity sensors cover all sensors that perform non-contact detection, in contrast to sensors such as limit switches that detect an object by physically contacting it.
These slides are on the project: a line follower robot using an Arduino board and a PID algorithm. If-else control made the robot oscillate far too often, so I took the time to learn PID, and this is the result.
This document summarizes key concepts about combinational logic circuits. It defines combinational logic as circuits whose outputs depend only on the current inputs, in contrast to sequential logic which also depends on prior inputs. Common combinational circuits are described like half and full adders used for arithmetic, as well as decoders. The design process for combinational circuits is outlined involving specification, formulation, optimization and technology mapping. Implementation of functions using NAND and NOR gates is also discussed.
The document outlines a 5-step process for developing a PLC program for a paint spray application system. The steps are: 1) define the task, 2) define inputs and outputs, 3) develop a logical sequence of operation using a flowchart, 4) develop the PLC program, and 5) test the program. It then provides details for each step as applied to a conveyor system that feeds boxes through a spray nozzle, with inputs like start/stop buttons and outputs like conveyor motor and spray valve control.
The document describes a line follower robot that uses infrared sensors to detect and follow a black line on a white surface. It uses an L293d motor driver IC to control two DC motors and drive the wheels. An LM324 comparator IC compares the output of the IR sensors to a reference voltage to determine if the sensor is over the black line or white surface. The robot also uses an L7805 voltage regulator to maintain a constant voltage supply for the components. The robot is able to navigate tight curves by sensing the line with the IR sensors and maneuvering accordingly using the closed-loop control system.
This project report summarizes the design and working of a line follower robot. It discusses the components used including an LM324 comparator IC, AT89C51 microprocessor, L293D H-bridge motor driver, and IR transmitter and receiver. It explains how the IR sensors detect the line and the microprocessor controls the motors to follow the line by turning when sensors detect line edges. The working principle section describes the robot's line detection and movement logic in detail. Applications mentioned include industrial transport, automated vehicles, and museum tour guides.
This document describes the design and working of an intelligent line following robot. It uses infrared sensors to detect a black line on a white surface and a microcontroller to control motors that navigate the robot along the line. The microcontroller receives sensor input and determines whether the robot should move straight, turn right, or turn left to stay on the line. The line following robot demonstrates principles of sensing, feedback control, and programming intelligence into machines.
The document describes a self-balancing two-wheeled robot project. The goals of the project are to demonstrate techniques for balancing an unstable robotic platform on two wheels and to design a digital control system using a state space model. The robot uses motors, sensors like an accelerometer and gyroscope, and a microprocessor to automatically balance itself in the upright position like an inverted pendulum. It classifies the robot system into three main parts: an inertial sensor unit to read angular velocity and position, an actuator unit with motors driven by analog signals from the controller, and a logical processing unit that processes sensor inputs and controls the actuators to return the robot to vertical position when tilted.
JK flip-flops have two outputs, Q and Q', and four modes of operation: hold, set, reset, toggle. The primary output is Q. There are two stable states that can store state information. JK flip-flops are used for data storage in registers, counting in counters, and frequency division. They can divide the frequency of a periodic waveform in half by toggling on each input clock pulse.
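The divide-by-two behaviour (J=K=1 toggles the output on every clock pulse) is easy to see in a short behavioural simulation; this is a sketch of the logic, not a timing-accurate model:

```python
def jk_flipflop(j, k, q):
    """One clocked update of a JK flip-flop given inputs J, K and state Q."""
    if j and k:
        return not q      # toggle
    if j:
        return True       # set
    if k:
        return False      # reset
    return q              # hold

# With J=K=1 the output toggles on each pulse: 8 clock edges produce
# 4 full output cycles, i.e. the input frequency is halved.
q = False
outputs = []
for _ in range(8):
    q = jk_flipflop(True, True, q)
    outputs.append(int(q))
print(outputs)  # [1, 0, 1, 0, 1, 0, 1, 0]
```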
The document describes a proposal for a line maze solver robot project. It includes an introduction to line mazes, the objectives of the project to build an autonomous robot that can solve a line maze, and the key components and methodology. The robot will use 6 light sensors to detect the black line on a white surface and make decisions at intersections. It will use an Arduino microcontroller to process sensor input and control the motors. The first run will record wrong turns to avoid on the second run when it can solve the maze quickly.
The document summarizes a maze solving robot project that uses a wall-following algorithm. The robot, built with a Lego Mindstorms kit, uses an infrared sensor to follow the right wall and navigate through the maze. It initializes its position and direction, then checks for a wall on the right with its sensor. If no wall is detected on the right, it turns right and moves forward; if a wall is detected, it moves forward along it, turning left when blocked ahead. This allows it to hug the right wall and solve the maze without needing memory storage to map the maze. The algorithm was tested by having the robot navigate an actual physical maze, and its path was reconstructed in a Matlab plot.
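A standard right-hand wall-following decision step can be sketched as follows; the sensor flags and action names are illustrative assumptions, not the Lego robot's actual code:

```python
def wall_follow_step(wall_on_right, wall_ahead):
    """One decision of a right-hand wall follower.

    The two boolean readings stand in for the robot's distance sensors;
    the returned string is the action for this control cycle.
    """
    if not wall_on_right:
        return "turn_right_then_forward"   # keep hugging the right wall
    if not wall_ahead:
        return "forward"                   # wall on the right, path clear
    return "turn_left"                     # boxed in: rotate left and retry

print(wall_follow_step(wall_on_right=True, wall_ahead=False))  # forward
```

Because each decision depends only on the current sensor readings, the robot needs no map of the maze, which is exactly why the approach requires no memory storage.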
We've implemented an autonomous vehicle equipped with infrared sensors. Our main task is to basically design a small scaled vehicle which will have the capability to travel between pre-defined starting and finishing points of a maze.
How to Build a Maze Solving Robot Using Arduino -- CircuitDigest
Learn how to make an Arduino-powered robot that can navigate mazes on its own using IR sensors and the "hand on the wall" algorithm.
This step-by-step guide will show you how to build your own maze-solving robot using Arduino UNO, three IR sensors, and basic components that you can easily find in your local electronics shop.
The document describes a proposed system for an autonomous robot that can solve mazes without using sensors by using image processing and the A* search algorithm. The robot would take a photo of the maze, use image processing to analyze the maze, apply the A* algorithm to find the shortest path, convert the path to an 8-chain code, and transmit the code to the robot via Bluetooth to allow it to traverse the maze autonomously. The system aims to provide a more efficient approach than existing robots that rely on sensors and training by allowing the robot to observe the full maze layout at once to intelligently determine the best route.
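The A* step in the pipeline above can be sketched on a small occupancy grid; the 0/1 grid encoding, 4-connected moves, and Manhattan heuristic here are illustrative assumptions, since the actual system works on a processed photograph of the maze:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 0/1 occupancy grid (0 = free, 1 = wall),
    4-connected moves, Manhattan-distance heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    seen = set()
    while frontier:
        _, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None   # goal unreachable

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (0, 2)))
```

The resulting cell sequence is what would then be converted to an 8-chain code and sent to the robot over Bluetooth.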
The document describes an autonomous maze solving robot project. The robot uses three ultrasonic sensors to detect the maze walls and two servo motors controlled by a microcontroller to navigate. The microcontroller receives distance readings from the sensors and instructs the motor driver which direction to move the motors to avoid obstacles as it solves the maze. Components include an Arduino, ultrasonic sensors, servo motors, batteries and other basic electronics. The robot is programmed to use an obstacle avoidance algorithm to autonomously solve mazes without human interference.
A robot that extracts a simple, non-branching route to the goal from a path or collection of paths, typically from an entrance to a goal, is known as a "maze solver robot".
This is a full report of my project in Level 3 Term 1. The project was basically a self-driven vehicle capable of localizing itself in a grid and planning a path between two nodes. It can avoid particular nodes and plan path between two allowed nodes. Flood Fill Algorithm will be used for finding the path between two allowed nodes. The vehicle is also capable of transferring blocks from one node to another. In fact, this vehicle is a prototype of a self-driven vehicle capable of transporting passengers and it can also be used in industries to transfer different items from one place to another.
The document describes a path following robot project created by engineering students. It uses IR sensors to detect a black path on a white surface and a PIC microcontroller to process sensor inputs and control motors to follow the path. It provides a block diagram of the robot's components and architecture. It also details the algorithm used by the microcontroller to determine motor movements based on sensor readings to navigate straight paths and turns.
The document describes how to build a maze follower robot using Arduino, IR sensors, and a motor driver. The robot uses 4 IR sensors - two to follow lines and two more to detect intersections and choose a path. It can identify straight paths, left turns, right turns, intersections, and dead ends. The robot follows a left-hand or right-hand wall tracking algorithm to navigate the maze and reverse direction when it reaches a dead end. Components include an Arduino, IR sensors, motor driver, battery, and wheels. The circuit is assembled on a breadboard and code is used to control motor direction and speed based on sensor input.
The document describes the design of an autonomous navigation robot that can avoid obstacles. An ATmega328P microcontroller is used to process signals from ultrasonic sensors and direct the robot's movement. When an obstacle is detected, the microcontroller determines the distance and redirects the robot by turning or reversing direction to avoid collisions. The robot's movement is controlled by the microcontroller sending signals to motors through a motor driver. The goal is for the robot to intelligently navigate unknown environments without needing remote control by detecting obstacles with sensors and maneuvering autonomously.
Goal The goal is to build a Maze-runner special edition of your .pdf -- aucmistry
Goal
The goal is to build a "Maze-runner special edition" of your Toy Robot. You must basically be
able to select from a list of available obstacle modules, and then get your robot to navigate
through this maze of obstacles. The various obstacle modules will be provided by your team
members.
What you will program
You will work from your Toy Robot 4 codebase and add to that.
Create a new obstacles.py module that generates obstacles.
It must keep the same functions (i.e. interface) as now, but you can change the implementation to
create a designed maze (i.e. you determine the positions of each obstacle) or you can bring in
some other way of creating a randomized maze.
Get other obstacle modules from your team members, and share yours with them.
Find a manual way to navigate through the various obstacle modules.
Implement a maze-runner module that must be able to solve all the obstacle modules provided by
the various team members
What you will learn
Working in a team and sharing code.
Figuring out how somebody else's code works.
Working with multiple modules providing the same interface.
Tip: Do not throw away the unit tests you have for your solution from Toy Robot 3; these tests are
still valid and must still succeed (allowing for updated tests in the various test .py files). Rather,
extend these tests to cover new functionality.
Instructions
Important: This exercise must be completed and submitted for review.
STEP 1 Implement new Obstacles module
Please use your solution for the Toy Robot 4 exercise as starting point for this problem.
First you need to implement your very own module, based on the obstacles.py module from the
previous exercise, but with a better approach to generating obstacles so that it creates a kind of
maze that is more difficult to navigate through.
First create a new package maze by creating it as a sub-directory.
Move your existing obstacles.py to that package, and update your existing code so that it imports
obstacles from the maze package.
Now add a new module to the maze package with the same functions as in obstacles.py.
Use a descriptive filename for your maze code, e.g. the_worlds_most_crazy_maze.py
Change the implementation of the functions in your new module to create the obstacles in an
interesting way.
First just create a few obstacles by hand and try that in your robot.py to see how it renders.
Evolve it using random obstacles, hard-coded obstacles (for a hand-designed maze), or even
interesting approaches like procedural generation.
Use your new maze in your robot's world, and see how it works.
You are welcome to try out different maze implementations.
For now, just change the necessary import statements to choose which one to use.
Run your robot using the turtle world, so that you can visually see the maze and move through it
manually.
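As a concrete illustration, a minimal randomized obstacles module could look like the sketch below. The function names get_obstacles and is_position_blocked are assumptions for illustration only; the real interface is whatever your own obstacles.py from Toy Robot 4 exposes, and you should keep those names instead.

```python
import random

# Hypothetical interface: stand-ins for the functions your existing
# obstacles.py module provides.

def get_obstacles(count=10, lo=-100, hi=100, seed=None):
    """Return a list of (x, y) obstacle positions for a randomized maze.

    Passing a seed makes the maze reproducible between runs.
    """
    rng = random.Random(seed)
    return [(rng.randint(lo, hi), rng.randint(lo, hi)) for _ in range(count)]

def is_position_blocked(obstacles, x, y, size=4):
    """True if (x, y) falls inside any obstacle's size x size square."""
    return any(ox <= x < ox + size and oy <= y < oy + size
               for ox, oy in obstacles)

obstacles = get_obstacles(count=3, seed=42)
print(is_position_blocked(obstacles, *obstacles[0]))  # True
```

Swapping this for a hand-designed maze just means returning a hard-coded list from get_obstacles; the rest of the interface stays the same, which is what lets team members exchange modules freely.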
STEP 2 Share a maze
Next you must share your maze implementation(s) with your team members. This is where a
tool like git is handy, using a repository everyone can access.
This document describes a final exam for an embedded systems lab course involving programming an Arduino robot. The objectives are for teams of two students to program a robot that can navigate a tape playing field to either knock down water bottles or hunt down the opposing team's robot using sensors. The document outlines the playing field design, provides flowcharts describing the code, explains the code and programming process, discusses code limitations, and includes photos and an appendix with a development diary.
This document summarizes a research paper on a shortest path follower robot. It describes the design of a line following robot that can detect the shortest path using IR sensors to follow a black line on a white surface. The robot uses an Arduino microcontroller connected to IR sensors and motors to determine the optimal path between a starting point and destination. It aims to solve the single source shortest path problem by identifying obstacles and navigating efficiently. The system architecture includes IR sensors to detect the line, motors to move the robot, and an Arduino board to process sensor readings and control the motors to follow the shortest route.
The document summarizes the work of a team designing a robot to navigate through an unknown maze efficiently. They tested different code for the robot to follow as well as modifications to its physical design. Their optimal solution involved code using a 3 bump threshold to trigger turns and adjustments that lowered the robot's center of gravity and extended its bumpers. Through testing, their robot averaged 56.7 seconds to complete mazes, demonstrating steady and reliable performance.
Wall follower autonomous robot development applying fuzzy incremental controllerrajabco
This paper presents the design of an autonomous robot as a basic development of an intelligent wheeled mobile robot for air duct or corridor cleaning. The robot navigation is based on wall following algorithm. The robot is controlled us- ing fuzzy incremental controller (FIC) and embedded in PIC18F4550 microcontroller. FIC guides the robot to move along a wall in a desired direction by maintaining a constant distance to the wall. Two ultrasonic sensors are installed in the left side of the robot to sense the wall distance. The signals from these sensors are fed to FIC that then used to de- termine the speed control of two DC motors. The robot movement is obtained through differentiating the speed of these two motors. The experimental results show that FIC is successfully controlling the robot to follow the wall as a guid- ance line and has good performance compare with PID controller.
This document describes the development of an autonomous mobile robot for wall following using a fuzzy incremental controller. Two ultrasonic sensors are used to sense the distance to the wall and provide input to the controller. The controller determines the speed of two DC motors to guide the robot along the wall. Experimental results showed the fuzzy controller successfully controlled the robot to follow the wall, performing better than a PID controller. The robot is intended for applications like cleaning air ducts or corridors by autonomously navigating while maintaining a set distance from the wall.
soulmaite review - Find Real AI soulmate reviewSoulmaite
Looking for an honest take on Soulmaite? This Soulmaite review covers everything you need to know—from features and pricing to how well it performs as a real AI soulmate. We share how users interact with adult chat features, AI girlfriend 18+ options, and nude AI chat experiences. Whether you're curious about AI roleplay porn or free AI NSFW chat with no sign-up, this review breaks it down clearly and informatively.
Exploring the advantages of on-premises Dell PowerEdge servers with AMD EPYC processors vs. the cloud for small to medium businesses’ AI workloads
AI initiatives can bring tremendous value to your business, but you need to support your new AI workloads effectively. That means choosing the best possible infrastructure for your needs—and many companies are finding that the cloud isn’t right for them. According to a recent Rackspace survey of IT executives, 69 percent of companies have moved some of their applications on-premises from the cloud, with half of those citing security and compliance as the reason and 44 percent citing cost.
On-premises solutions provide a number of advantages. With full control over your security infrastructure, you can be certain that all compliance requirements remain firmly in the hands of your IT team. Opting for on-premises also gives you the ability to design your infrastructure to the precise needs of that team and your new AI workloads. Depending on the workload, you may also see performance benefits, along with more predictable costs. As you start to build your next AI initiative, consider an on-premises solution utilizing AMD EPYC processor-powered Dell PowerEdge servers.
Your startup on AWS - How to architect and maintain a Lean and Mean account J...angelo60207
Prevent infrastructure costs from becoming a significant line item on your startup’s budget! Serial entrepreneur and software architect Angelo Mandato will share his experience with AWS Activate (startup credits from AWS) and knowledge on how to architect a lean and mean AWS account ideal for budget minded and bootstrapped startups. In this session you will learn how to manage a production ready AWS account capable of scaling as your startup grows for less than $100/month before credits. We will discuss AWS Budgets, Cost Explorer, architect priorities, and the importance of having flexible, optimized Infrastructure as Code. We will wrap everything up discussing opportunities where to save with AWS services such as S3, EC2, Load Balancers, Lambda Functions, RDS, and many others.
Jeremy Millul - A Talented Software DeveloperJeremy Millul
Jeremy Millul is a talented software developer based in NYC, known for leading impactful projects such as a Community Engagement Platform and a Hiking Trail Finder. Using React, MongoDB, and geolocation tools, Jeremy delivers intuitive applications that foster engagement and usability. A graduate of NYU’s Computer Science program, he brings creativity and technical expertise to every project, ensuring seamless user experiences and meaningful results in software development.
Presentation given at the LangChain community meetup London
https://lu.ma/9d5fntgj
Coveres
Agentic AI: Beyond the Buzz
Introduction to AI Agent and Agentic AI
Agent Use case and stats
Introduction to LangGraph
Build agent with LangGraph Studio V2
Neural representations have shown the potential to accelerate ray casting in a conventional ray-tracing-based rendering pipeline. We introduce a novel approach called Locally-Subdivided Neural Intersection Function (LSNIF) that replaces bottom-level BVHs used as traditional geometric representations with a neural network. Our method introduces a sparse hash grid encoding scheme incorporating geometry voxelization, a scene-agnostic training data collection, and a tailored loss function. It enables the network to output not only visibility but also hit-point information and material indices. LSNIF can be trained offline for a single object, allowing us to use LSNIF as a replacement for its corresponding BVH. With these designs, the network can handle hit-point queries from any arbitrary viewpoint, supporting all types of rays in the rendering pipeline. We demonstrate that LSNIF can render a variety of scenes, including real-world scenes designed for other path tracers, while achieving a memory footprint reduction of up to 106.2x compared to a compressed BVH.
https://arxiv.org/abs/2504.21627
Co-Constructing Explanations for AI Systems using ProvenancePaul Groth
Explanation is not a one off - it's a process where people and systems work together to gain understanding. This idea of co-constructing explanations or explanation by exploration is powerful way to frame the problem of explanation. In this talk, I discuss our first experiments with this approach for explaining complex AI systems by using provenance. Importantly, I discuss the difficulty of evaluation and discuss some of our first approaches to evaluating these systems at scale. Finally, I touch on the importance of explanation to the comprehensive evaluation of AI systems.
Scaling GenAI Inference From Prototype to Production: Real-World Lessons in S...Anish Kumar
Presented by: Anish Kumar
LinkedIn: https://www.linkedin.com/in/anishkumar/
This lightning talk dives into real-world GenAI projects that scaled from prototype to production using Databricks’ fully managed tools. Facing cost and time constraints, we leveraged four key Databricks features—Workflows, Model Serving, Serverless Compute, and Notebooks—to build an AI inference pipeline processing millions of documents (text and audiobooks).
This approach enables rapid experimentation, easy tuning of GenAI prompts and compute settings, seamless data iteration and efficient quality testing—allowing Data Scientists and Engineers to collaborate effectively. Learn how to design modular, parameterized notebooks that run concurrently, manage dependencies and accelerate AI-driven insights.
Whether you're optimizing AI inference, automating complex data workflows or architecting next-gen serverless AI systems, this session delivers actionable strategies to maximize performance while keeping costs low.
Top 25 AI Coding Agents for Vibe Coders to Use in 2025.pdfSOFTTECHHUB
I've tested over 50 AI coding tools in the past year, and I'm about to share the 25 that actually work. Not the ones with flashy marketing or VC backing – the ones that will make you code faster, smarter, and with way less frustration.
Domino IQ – Was Sie erwartet, erste Schritte und Anwendungsfällepanagenda
Webinar Recording: https://www.panagenda.com/webinars/domino-iq-was-sie-erwartet-erste-schritte-und-anwendungsfalle/
HCL Domino iQ Server – Vom Ideenportal zur implementierten Funktion. Entdecken Sie, was es ist, was es nicht ist, und erkunden Sie die Chancen und Herausforderungen, die es bietet.
Wichtige Erkenntnisse
- Was sind Large Language Models (LLMs) und wie stehen sie im Zusammenhang mit Domino iQ
- Wesentliche Voraussetzungen für die Bereitstellung des Domino iQ Servers
- Schritt-für-Schritt-Anleitung zur Einrichtung Ihres Domino iQ Servers
- Teilen und diskutieren Sie Gedanken und Ideen, um das Potenzial von Domino iQ zu maximieren
What is Oracle EPM A Guide to Oracle EPM Cloud Everything You Need to KnowSMACT Works
In today's fast-paced business landscape, financial planning and performance management demand powerful tools that deliver accurate insights. Oracle EPM (Enterprise Performance Management) stands as a leading solution for organizations seeking to transform their financial processes. This comprehensive guide explores what Oracle EPM is, its key benefits, and how partnering with the right Oracle EPM consulting team can maximize your investment.
Establish Visibility and Manage Risk in the Supply Chain with Anchore SBOMAnchore
Over 70% of any given software application consumes open source software (most likely not even from the original source) and only 15% of organizations feel confident in their risk management practices.
With the newly announced Anchore SBOM feature, teams can start safely consuming OSS while mitigating security and compliance risks. Learn how to import SBOMs in industry-standard formats (SPDX, CycloneDX, Syft), validate their integrity, and proactively address vulnerabilities within your software ecosystem.
In this talk, Elliott explores how developers can embrace AI not as a threat, but as a collaborative partner.
We’ll examine the shift from routine coding to creative leadership, highlighting the new developer superpowers of vision, integration, and innovation.
We'll touch on security, legacy code, and the future of democratized development.
Whether you're AI-curious or already a prompt engineering, this session will help you find your rhythm in the new dance of modern development.
Interested in leveling up your JavaScript skills? Join us for our Introduction to TypeScript workshop.
Learn how TypeScript can improve your code with dynamic typing, better tooling, and cleaner architecture. Whether you're a beginner or have some experience with JavaScript, this session will give you a solid foundation in TypeScript and how to integrate it into your projects.
Workshop content:
- What is TypeScript?
- What is the problem with JavaScript?
- Why TypeScript is the solution
- Coding demo
If You Use Databricks, You Definitely Need FMESafe Software
DataBricks makes it easy to use Apache Spark. It provides a platform with the potential to analyze and process huge volumes of data. Sounds awesome. The sales brochure reads as if it is a can-do-all data integration platform. Does it replace our beloved FME platform or does it provide opportunities for FME to shine? Challenge accepted
ELNL2025 - Unlocking the Power of Sensitivity Labels - A Comprehensive Guide....Jasper Oosterveld
Sensitivity labels, powered by Microsoft Purview Information Protection, serve as the foundation for classifying and protecting your sensitive data within Microsoft 365. Their importance extends beyond classification and play a crucial role in enforcing governance policies across your Microsoft 365 environment. Join me, a Data Security Consultant and Microsoft MVP, as I share practical tips and tricks to get the full potential of sensitivity labels. I discuss sensitive information types, automatic labeling, and seamless integration with Data Loss Prevention, Teams Premium, and Microsoft 365 Copilot.
Evaluation Challenges in Using Generative AI for Science & Technical ContentPaul Groth
Evaluation Challenges in Using Generative AI for Science & Technical Content.
Foundation Models show impressive results in a wide-range of tasks on scientific and legal content from information extraction to question answering and even literature synthesis. However, standard evaluation approaches (e.g. comparing to ground truth) often don't seem to work. Qualitatively the results look great but quantitive scores do not align with these observations. In this talk, I discuss the challenges we've face in our lab in evaluation. I then outline potential routes forward.
Evaluation Challenges in Using Generative AI for Science & Technical ContentPaul Groth
Ad
Maze Solver Robot using Arduino
1. INTRODUCTION
A maze is a complicated system of paths from entrance to exit. The maze-solving
problem involves determining the path of a mobile robot from its initial
position to its destination while travelling through an environment consisting
of obstacles. In addition, the robot must follow the best possible path among
the various possible paths present in the maze.
Maze solving, a seemingly minor challenge for the analytical minds of humans,
has generated enough curiosity and challenge for A.I. experts to make their
machines (robots) solve any given maze.
Applications of such autonomous vehicles range from simple tasks, like robots
employed in industries to carry goods through factories, offices, buildings and
other workplaces, to dangerous or difficult-to-reach jobs such as bomb sniffing
and finding humans in wreckage.
Some maze-solving methods are designed to be used inside the maze with no
prior knowledge of it, whereas others are designed for a computer program that
can see the whole maze at once. We used the former. Our mazes are simply
connected, i.e., without any closed loops.
The autonomous maze-solving robot we have built first searches from start to
end using the wall-follower algorithm (left-hand rule), then works out the
shortest possible path by eliminating dead ends, and takes that path on the
next run. It does all of this without any human assistance.
OBJECTIVE :
1. Design and build the robot using ultrasonic sensors.
2. Understand and implement the wall-follower algorithm.
3. Construct an Arduino-based program to solve the maze.
COMPONENTS REQUIRED
•Arduino UNO board
•Motor driver
•2 DC motors
•Breadboard
•3 ultrasonic sensors
•2 9V batteries
•Chassis
•2 tires and a ball caster
2. WALL FOLLOWING ALGORITHM
The wall follower, one of the best known rules for traversing mazes, is
also known as either left-hand rule or the right-hand rule. Whenever the robot
reaches a junction, it will sense for the opening walls and select its direction
giving the priority to the selected wall. Here, the selected wall is left! The
priority in left hand wall is in the following order: left, straight and right and if
all the sides are closed, then U-turn.
The robot uses three ultrasonic sensors attached to it, to determine the
distance of the walls in three directions: Front, Left and Right. If the distance
of the wall from the bot is less than the set threshold value in a particular
direction, then the robot assigns it as closed and similarly if it is greater than
the set threshold value, then it assigns it as open. It also uses the left and right
wall distances to align itself parallel to the wall, while moving forward.
FLOW CHART
(The flowchart from the slides, rendered as text.) From START, the robot
checks the left wall: if it is open, it takes a left turn. If the left wall is
closed, it checks the front wall and goes straight if that is open; otherwise
it checks the right wall and turns right if that is open. If all three sides
are closed, it makes a U-turn. After every turn it updates the array index and
stores the turn in the array, then calls the Reduce function to shorten the
path-storing array for any U-turns, etc. Once it has reached the end, it sets
the index of the array storing the shortened path back to the initial value
and traverses the maze again from the beginning. Now, at each node, before
making a turn, it verifies the turn against the final array and adheres to it
to get to the end in a shorter way. END.
3. As the bot traverses the maze, its path is stored in an array and is
simultaneously reduced by calling the function 'reduce' whenever it can be
shortened. For example, consider the following part of the maze: at the node
where it moved forward over right, this turn together with its two previous
turns (a U-turn and, before that, a left turn) can be shortened to 'R'. That
is, 'LUF = R', and the array is dynamically modified so that the shortened
path is stored in it. After reaching the end, the bot starts again from the
beginning, and this time, at each node, it checks the shortened array and
follows it to reach the end by a much shorter path.
In this way, while traversing the maze the bot remembers its path, shortens
it, and adheres to it, reaching the end by a much shorter route.
As is often said, the wall-follower algorithm is the amateur programmer's
favourite. This method works perfectly well for mazes whose start and end are
wall-connected, in other words for mazes that are simply connected, but it
gets stuck in loops otherwise. Many algorithms successfully overcome this
problem of getting stuck in loops, but most of them have one condition: that
the maze be known beforehand. Examples include the Flood Fill algorithm,
which is extensively used in 'Micromouse' competitions, the Pledge algorithm,
and Trémaux's algorithm.
4. CODE
#define fthold 12 // Threshold value in front direction
#define rthold 20 // Threshold value in right direction
#define lthold 20 // Threshold value in left direction
const int t = 1050; // Time allotted for taking a 90-degree turn at 9V.
int tfr = 2750; // Initial time for which it moves forward when it chooses forward over right.
int timefr; // Dynamically set time for which it moves forward when it chooses forward over right.
int tlbef = 440; // Time for which it moves forward before taking a left turn.
int tlaf = 1150; // Time for which it moves forward after taking a left turn.
int nf = 0; // Number of times it chooses straight over right.
int nlr = 0; // Number of times it takes a left turn.
bool found = false; // If true, the bot has reached the end!
int dir[100]; // Array for storing the path of the bot.
int i=-1; // For the indices of the dir array.
int j=0; // Implies the number of intersections bot passed through.
// Front US sensor.
const int trigPinf = 2;
const int echoPinf = 6;
// Right US sensor.
const int trigPinr = 8;
const int echoPinr = 5;
// Left US sensor.
const int trigPinl = 3;
const int echoPinl = 9;
// Booleans for recognising the walls. True if the respective sensor distance is less than the respective threshold value.
bool fsensor; // For the front US sensor.
bool rsensor; // For the right US sensor.
bool lsensor; // For the left US sensor.
// Sorts and returns the median value of a five element array.
float middleval(float arr[])
{
for(int p=0;p<4;p++)
{
for(int q=0;q<4;q++)
{
// Note: the code below contains both maze solving and shortest-path simplification.
if(arr[q]>arr[q+1])
{
float temp = arr[q];
arr[q] = arr[q+1];
arr[q+1] = temp;
}
}
}
return arr[2]; // Median value.
}
// Moves the bot in the forward direction
void gofront()
{
// Moves forward adjusting its path
float ldist1 = leftdist();
float lconst = ldist1;
while(ldist1<=5) // Should turn a little to its right
{
digitalWrite(10,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
delay(t/65);
ldist1 = leftdist();
if(abs(lconst - ldist1)>=0.8||(ldist1>=3.6)){break;}
}
float rdist1 = rightdist();
float rconst = rdist1;
while(rdist1<=5.4) // Should turn a little to its left
{
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,HIGH);
digitalWrite(13,LOW);
delay(t/65);
rdist1 = rightdist();
if(abs(rconst - rdist1)>=0.9){break;}
}
if(leftdist()>=7.2) // Will move a little to its left if it is too far from the left wall
{
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,HIGH);
digitalWrite(13,LOW);
delay(t/30);
}
digitalWrite(10,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,HIGH);
digitalWrite(13,LOW);
}
// Returns the dist of wall in front of it
float frontdist()
{
float gapf;float ticktockf;
digitalWrite(trigPinf,LOW);
delayMicroseconds(2);
digitalWrite(trigPinf,HIGH);
delayMicroseconds(10);
digitalWrite(trigPinf,LOW);
ticktockf = pulseIn(echoPinf,HIGH); // Echo time in microseconds (sound travels ~1 cm per 29 us).
gapf = ticktockf*0.0344/2; // Speed of sound is 0.0344 cm/us; halve for the round trip.
return gapf;
}
// Returns the distance of the wall to its right side
float rightdist()
{
float gapr;float ticktockr;
digitalWrite(trigPinr,LOW);
delayMicroseconds(2);
digitalWrite(trigPinr,HIGH);
delayMicroseconds(10);
digitalWrite(trigPinr,LOW);
ticktockr = pulseIn(echoPinr,HIGH);
gapr = ticktockr*0.0344/2;
return gapr;
}
// Returns the distance of the wall to its left side
float leftdist()
{
float gapl;float ticktockl;
digitalWrite(trigPinl,LOW);
delayMicroseconds(2);
digitalWrite(trigPinl,HIGH);
delayMicroseconds(10);
digitalWrite(trigPinl,LOW);
ticktockl = pulseIn(echoPinl,HIGH);
gapl = ticktockl*0.0344/2;
return gapl;
}
// Stops the bot by switching all four motor pins LOW. (The end of leftdist()
// and the start of stopit() were lost with the missing slide 7; they are
// reconstructed here from the pattern of the other sensor functions.)
void stopit()
{
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
}
// When it has to move forward according to the shortest path (at some intersection).
void frontturn()
{
for(int n=1;n<=8;n++)
{gofront();delay((timefr)/8);}
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
delay(1000);
}
// When it has to take a right turn according to the shortest path.
void rightturn()
{ stopit();delay(1000);
float prevfdist = frontdist();
//while( abs(frontdist()-prevfdist)<=(prevfdist/2)-1)
for(int n=1;n<=5;n++)
{gofront();delay(260);}
digitalWrite(10,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,HIGH);
delay(t);
// gofront();delay(2400);
float prevfrdist = frontdist();
while( abs(frontdist()-prevfrdist)<=18)
/* for(int n=1;n<=10;n++)*/
{gofront();delay(300);}
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
delay(1000);
}
void setup() // put your setup code here, to run once:
{
9. // US pins setup..
pinMode (trigPinf,OUTPUT);
pinMode (echoPinf,INPUT);
pinMode (trigPinr,OUTPUT);
pinMode (echoPinr,INPUT);
pinMode (trigPinl,OUTPUT);
pinMode (echoPinl,INPUT);
pinMode( 4,INPUT); // FOR THE IR SENSOR...
// Motor pins.
pinMode(10,OUTPUT);
pinMode(11,OUTPUT);
pinMode(12,OUTPUT);
pinMode(13,OUTPUT);
Serial.begin(9600); // Starting serial communication at 9600 bits per second.
// dir[0] = 0; // initial direction..?
}
void loop() // put your main code here, to run repeatedly
{
if(nlr==7)
{
found=true; // Reached the end.
for(int i=0;i<=2;i++){Serial.print(dir[i]);}
i=-1;j=0; nlr=0; // Back to start again..
// Stops the bot for 30 seconds after reaching the end.
digitalWrite(10,LOW);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
delay(30000);
}
float fdist; float rdist; float ldist; // Front, right, and left distances.
float fduration; float rduration; float lduration; // Front, right, and left echo travel times.
float fdur[5]; float rdur[5]; float ldur[5]; // Arrays storing five duration readings each; we take only the median value (after sorting), tolerating up to 40% bad readings.
float ldista[5];
// For the front US sensor..
for(int i=0;i<=4;i++)
{
digitalWrite(trigPinf,LOW); // Clearing the trigPin.
delayMicroseconds(5);
digitalWrite(trigPinf,HIGH); // Holding trigPin HIGH for 10 microseconds sends an 8-cycle ultrasonic burst.
delayMicroseconds(10);
digitalWrite(trigPinf,LOW);
fdur[i] = pulseIn(echoPinf,HIGH); // Returns the time for which the wave travelled.
}
fduration = middleval(fdur);
fdist = fduration*0.0344/2; // Distance of the wall in the forward direction.
/*Serial.print("frontdistance: ");
Serial.println(fdist);*/
// for the right US sensor...
for(int i=0;i<=4;i++)
{
digitalWrite(trigPinr,LOW);
delayMicroseconds(5);
digitalWrite(trigPinr,HIGH);
delayMicroseconds(10);
digitalWrite(trigPinr,LOW);
rdur[i] = pulseIn(echoPinr,HIGH);
}
rduration = middleval(rdur);
rdist = rduration*0.0344/2; // Distance of the wall to its right.
/* Serial.print("rightdistance: ");
Serial.println(rdist);*/
// for the left US sensor....
for(int i=0;i<=4;i++)
{
digitalWrite(trigPinl,LOW);
delayMicroseconds(5);
digitalWrite(trigPinl,HIGH);
delayMicroseconds(10);
digitalWrite(trigPinl,LOW);
ldur[i] = pulseIn(echoPinl,HIGH);
}
lduration = middleval(ldur);
ldist = lduration*0.0344/2; // Distance of the wall to its left side
/* Serial.print("leftdistance: ");
Serial.println(ldist);*/
if((fdist>=125)||(rdist>=150)||(ldist>=400)) {return;} // Discard obviously erroneous readings and go back to loop().
fsensor = false; rsensor = false; lsensor = false; // Resetting the wall booleans.
if(rdist<=rthold) rsensor = true;
if(ldist<=lthold) lsensor = true;
if(fdist<=fthold) fsensor = true;
// Left-hand wall-following algorithm!
// If left is closed-
if((lsensor==true))
{ // for a U-turn..
if((rsensor==true)&&(fsensor==true))
{ j=j+1;
i=i+1;
dir[i] = 3;
reduce(dir,i);
digitalWrite(10,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,HIGH);
delay(2*t);
}
// If Front is open..
else if(fsensor==false)
{
if((rsensor==false)&&(frontdist()>=40)) // If both front and right are open.
{
i = i+1;
j=j+1;
if((found==true)&&(dir[i]!=0)) // After reaching the end, checks the shortened path.
{
rightturn();return;
}
else
{
if(found==false){
dir[i] = 0; // moving forward over right...
reduce(dir,i);
}
/*Serial.print("for the jth turn ..");Serial.print(" =");Serial.print(j);
Serial.print(" the i value is ");Serial.print(i);Serial.print("and the dir
is ..");Serial.println(dir[i]);*/
timefr = tfr + 65*nf;
nf=nf+1;
stopit();delay(1000);
for(int g=1;g<=10;g++){gofront();delay(timefr/10);}
stopit();delay(1000);
}
}
else {gofront();delay(300);} // Else moving forward .. only front is open.
}
//for a right turn..
else
{
i = i+1;
j=j+1;
dir[i] = 2;
reduce(dir,i);
float prevfdist = frontdist();
while( abs(frontdist()-prevfdist)<=(prevfdist/2)-2)
{gofront();delay(300);if(frontdist()<=4.5){break;}}
digitalWrite(10,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,HIGH);
delay(t);
float prevfrdist = frontdist();
while( abs(frontdist()-prevfrdist)<=15.2){gofront();delay(300);if(frontdist()<=4.5){break;}}
}
}
else
{
//for a left turn..
i=i+1;
j=j+1;
if((found==true)&&(dir[i]!=1)){
if((dir[i]==2)&&(rightdist()>=10)){rightturn();return;}
else if((dir[i]== 0)&&(fsensor==false))
{frontturn();return;}
}
else{
dir[i]=1; // Left turn..
nlr=nlr+1;
reduce(dir,i); // Calling the reduce function to shorten the path dynamically, if the run is not yet complete.
{gofront(); delay(tlbef);}
digitalWrite(10,LOW); // takes a left turn..
digitalWrite(11,LOW);
digitalWrite(12,HIGH);
digitalWrite(13,LOW);
delay(2*t);
for(int n=1;n<=8;n++) { gofront();delay(tlaf/8);}
stopit();delay(1000);
}
}
delay(320);
}