Best Raspberry PI Zero Project For Home Automation Part 3

The document outlines various Raspberry Pi Zero projects for home automation, including setting up IoT devices, building a smart security camera, and creating a temperature monitoring system. It provides step-by-step instructions for each project, including hardware setup, software installation, and coding. The Raspberry Pi Zero is highlighted as an affordable and versatile option for DIY tech enthusiasts looking to explore home automation solutions.

Best Raspberry PI Zero Project For Home Automation Part 3

Index Of Contents
Best Raspberry PI Zero Project For Home Automation Part 3
Cayenne on Raspberry IoT Light Room
Introduction
Step 1: Cayenne drag and drop IoT project builder
Step 2: Install Adafruit PCA9685
Step 3: Wire the hardware
Step 4: First tests and final result
Schematics
Code
Play With Raspberry Pi ZERO and PHat
Step 1: Download the pHat Library
Install the library on Python 3
Install the library on Python 2
Step 2: Connect the pHat and try the code
Step 3: Try to make the pHat sparkle
Code
Read a Raspberry Pi Filesystem on Windows
Introduction
Install the Software
Copy the SD Card
Extract the Data
Save and Browse
Schematics
Raspberry Pi Zero + Repetier Server
Step 1: Open Putty (or any other SSH client) and connect to your Pi
Step 2: Open the server and you're done!
Smart Security Camera
Building the Housing
Cutting and Gluing the Housing
Drilling Lots of Holes
Paint!
Final Assembly
Mount Electronics
Code
Receiving Emails
Mounting the Camera
Show it Off
Schematics
Code
MQTT Alarm Control Panel for Home Assistant
Custom parts and enclosures
Schematics
Code
Raspberry Pi Automated Plant Watering with Website
Materials:
Wiring:
RPi Wiring:
Hardware Setup:
Software
GPIO Script
Flask Webserver
Run Website Automatically
Code
J.A.R.V.I.S
Just A Rather Very Intelligent System.
JARVIS Brain
JARVIS Things
JARVIS Mobile
JARVIS Amazon Alexa
JARVIS Web
Code
Home Hidroneumatic Controlled by ESP8266 + MQTT + Web App
Overview
Block Diagram
Install all software needed and connect all devices
Software details
Hardware details
Schematics
Code
Laser Shootin' Robot
Introduction
Parts Needed
Assembly
Motor Control
The Controller
Using the Robot
Custom parts and enclosures
Schematics
Code
PiClock: RGB LED Smart Clock Using Raspberry Pi Zero W
Introduction:
Sensors:
Schematics
Code
SNES Pi: Zero
1. Clearing out the inside of the cartridge.
2. Installing the new parts inside of the cartridge
3. Finishing touches
4. Glamor shots!
Raspberry Pi as a Robotic Arm Controller with Flick HAT
Putting Everything Together
Preparing the Pi
The Testing Stage
Code
Headless Google Assistant with Startup Audio
First, clone the project
Code
DIY Raspberry Pi Temperature System with Ubidots
Requirements
Wiring Setup
With resistor integrated - with grove connector
Without resistor integrated - without grove connector
Sensor Setup
Sending data to Ubidots for visualization
Now it is time to code!
Optional Steps: Rename the Device and Variables
Event Setup
Result
The rapidly increasing popularity of single-board computers has
ushered in a completely new era of personal computing. Whether
you are a budding programmer or a hobbyist with a penchant for
tinkering and DIY tech projects, a single-board computer is one of
the best things you can get for yourself. And even though there are
many to choose from, the Raspberry Pi series remains the most
popular of the single-board PCs.
If you're looking to buy your first Raspberry Pi, the ridiculously
cheap Raspberry Pi Zero is easily the best choice. Retailing for a
paltry $5, it comes with a 1 GHz single-core CPU and 512 MB of RAM.
In fact, chances are really high that you've already got one. But you
might be wondering: just what amazing stuff can you do with this
credit-card-sized thing?
Best Raspberry PI Zero Project For Home Automation Part 3
Cayenne on Raspberry IoT Light Room
Introduction
The starting point is a new room without light, a big white wardrobe,
and the desire to build an IoT device that I can control from outside
my house. For the IoT "flavour" I use Cayenne.
In this room I want to add a hidden light behind the wardrobe, using
the light reflected off the white ceiling. I have mounted 5 m of 5050
RGB LED strip on top of the wardrobe and connected a Raspberry Pi
running Cayenne to an Adafruit PCA9685, and from there to my simple
transistor board.
The Adafruit PCA9685 is a great shield that provides 16 PWM ports
over the I2C connection. YES! It's great!
With this board you can drive servo motors, and you can also dim an
LED. But if you want to drive a long LED strip, you also need a 12 V
DC power supply, while the rest of the system (the Raspberry Pi, the
PCA9685, and so on) runs on 5 V DC power. For this reason I've built
a simple transistor board so the 12 V supply can be used with all
the components.
Step 1: Cayenne drag and drop IoT project builder
Cayenne is a great partner for IoT projects. It supports many
platforms, and you can use its dashboard to build your project: you
can light an LED or adjust the temperature of your house from your
smartphone.
Cayenne is simple and free! Try Cayenne NOW and use it for your
project. The installation is automatic and very simple; see the
Cayenne docs. Take your Raspberry Pi and install the Raspbian distro
on it. Connect your Raspberry Pi to your local network and to the
internet. Go to Cayenne and sign up. On your iOS or Android
smartphone, install the Cayenne app and find your Raspberry Pi.
Install the Cayenne daemon on your Raspberry Pi. If the app doesn't
find your Raspberry Pi, type in the Raspberry Pi terminal:

wget https://cayenne.mydevices.com/dl/rpi_pa6vva5ic6.sh
sudo bash rpi_pa6vva5ic6.sh -v
Step 2: Install Adafruit PCA9685
After your Raspberry Pi reboots, you can see your device online.
Then you can add a new widget for the PCA9685.

Choose extensions, then choose PWM (see the photo below).
Now choose the name and the device, and select the slave
address. Then try to add the PCA9685 device.
If the installation doesn't work, try adding the I2C
configuration manually; follow this guide to enable I2C on your
Raspberry Pi. Afterwards you can see the slave address that you must
enter on the Cayenne installation page. In a terminal on the
Raspberry Pi, run:
sudo apt-get update
sudo apt-get install i2c-tools
sudo i2cdetect -y 1
Now you can see the address; it is usually 0x40. Now try to
add the PCA9685 device to your Cayenne dashboard.
Step 3: Wire the hardware
Now it's hardware time!
Connect the components. I've used Fritzing, but Fritzing doesn't yet
have Adafruit PCA9685 parts, so I've added a note to the diagram
instead.
When you connect the components, double-check the two 12 V
connections: if you connect a cable the wrong way, you can burn
everything, including the Raspberry Pi and the Adafruit PCA9685. The
12 V + (anode) connects only to the RGB LED strip. The - (cathode) of
the 12 V supply connects to your transistor shield or breadboard. The
LED strip's + is connected to the 12 V supply, and its three cathodes
(red, green, and blue) are connected to the three transistors.
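To check the wiring, a short Python sketch can drive the three transistor channels directly, without Cayenne. This is a minimal sketch, assuming the legacy Adafruit_PCA9685 Python library; the channel numbers (0, 1, 2) and the helper names are my assumptions and depend on which PWM outputs you wired the red, green, and blue transistors to.

```python
def color_to_ticks(value, resolution=4096):
    """Map an 8-bit colour value (0-255) onto the PCA9685's 12-bit duty cycle."""
    if not 0 <= value <= 255:
        raise ValueError("colour value must be 0-255")
    return round(value * (resolution - 1) / 255)

def set_rgb(pwm, red, green, blue, channels=(0, 1, 2)):
    """Write one duty cycle per colour; channel numbers depend on your wiring."""
    for channel, value in zip(channels, (red, green, blue)):
        pwm.set_pwm(channel, 0, color_to_ticks(value))

try:
    import Adafruit_PCA9685  # present only on the Pi

    pwm = Adafruit_PCA9685.PCA9685()  # default I2C address 0x40
    pwm.set_pwm_freq(1000)            # a high PWM frequency avoids visible flicker
    set_rgb(pwm, 255, 80, 0)          # a warm orange
except ImportError:
    pass  # not on the Pi; the helpers above still work
```
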
Step 4: First tests and final result

Now you can test the whole project. I have mounted it in my room,
and this is the final result. Great! A mood-lit room that I can
control from my smartphone or iPad anywhere in the world!
This is the power of the Internet of Things. This is the power of
Cayenne!
Schematics
Connect the hardware

Download Project on Fritzing


Code
First step: install the Cayenne agent
wget https://cayenne.mydevices.com/dl/rpi_pa6vva5ic6.sh
sudo bash rpi_pa6vva5ic6.sh -v
Second step: install the I2C tools
sudo apt-get update
sudo apt-get install i2c-tools
sudo i2cdetect -y 1
Play With Raspberry Pi ZERO and PHat

The Raspberry Pi Zero is the smallest version of the Raspberry Pi.
This raspberry is not the edible kind: it's a microcomputer, a
single-board computer with a CPU, RAM, storage, and all the usual
connections such as USB and HDMI.
I have also discovered a small 11x5 LED matrix by Pimoroni, the pHat.
This small, pretty LED matrix, combined with the little Raspberry Pi
Zero, makes a very charming object: a scrolling LED matrix that you
can program in Python.
I'll show you where to buy the Raspberry Pi, where to buy the pHat,
and how to install the library and the code to work with the pHat
matrix.
On Amazon you can buy:
1. the pHat (or the Unicorn HAT)
2. the Raspberry Pi
The Raspberry Pi Zero itself is available only from Pimoroni.
Step 1: Download the pHat Library

After the soldering step, you can connect the pHat to your Raspberry
Pi Zero.
In a terminal window, type:
sudo raspi-config
Enter the password if prompted (the default is raspberry), then:
1. Go to Advanced Options
2. Select I2C
3. Enable the I2C port
Install the library on Python 3
Open a Terminal and type:
sudo pip3 install scrollphat
or
sudo apt-get install python3-scrollphat
Install the library on Python 2
Open a Terminal and type:
sudo pip2 install scrollphat
or
sudo apt-get install python-scrollphat
Step 2: Connect the pHat and try the code
Download the three files onto your Raspberry Pi. The graph file
generates random lines on your pHat, tempCpu shows the CPU
temperature of your Raspberry Pi Zero, and meteoShare.py shows the
local weather conditions.
If you want to use the weather file, you must install the pyowm
library from the terminal:
sudo pip install pyowm
Then you must provide a valid API key to use with the pyowm library.
The weather service used in this resource is called OpenWeatherMap.
It's a completely free service with an easy-to-use API. You're going
to need your own account though, so click on
the link to go to the website: http://openweathermap.org
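A rough idea of what the weather script does, assuming the pyowm 2.x API (OWM, weather_at_place, get_weather); the helper names and the placeholder location are mine, and you need a real OpenWeatherMap API key for the fetch to work.

```python
def format_conditions(status, temp_c):
    """Compact weather string that fits a scrolling 11x5 display, e.g. 'Rain 18C'."""
    return "%s %dC" % (status.capitalize(), round(temp_c))

def local_conditions(api_key, place="Milan,it"):
    """Fetch the current weather with pyowm (2.x API assumed; needs a real key)."""
    import pyowm  # installed with: sudo pip install pyowm

    weather = pyowm.OWM(api_key).weather_at_place(place).get_weather()
    return format_conditions(weather.get_status(),
                             weather.get_temperature("celsius")["temp"])
```
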
Step 3: Try to make the pHat sparkle

In a terminal window, go to the folder containing the graph.py file
and type:
python graph.py
or
python3 cpuTemp.py
or
python meteoShare.py
See the result!
See other examples on GitHub
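A minimal sketch of what the CPU-temperature script likely does, assuming Pimoroni's scrollphat library (set_brightness, write_string, scroll); the parsing helper is my addition, built around the output format of vcgencmd measure_temp.

```python
def parse_vcgencmd_temp(raw):
    """Extract degrees Celsius from vcgencmd output such as "temp=48.3'C"."""
    return float(raw.strip().split("=")[1].rstrip("'C"))

try:
    import subprocess
    import time
    import scrollphat  # Pimoroni pHat library; present only on the Pi

    scrollphat.set_brightness(4)
    raw = subprocess.check_output(["vcgencmd", "measure_temp"]).decode()
    scrollphat.write_string(" CPU %.1fC" % parse_vcgencmd_temp(raw))
    for _ in range(100):  # scroll the message across the 11x5 matrix
        scrollphat.scroll()
        time.sleep(0.1)
except ImportError:
    pass  # not on the Pi; parse_vcgencmd_temp still works
```
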
Code
Pimoroni pHAT examples
https://github.com/pimoroni/scroll-phat/tree/master/examples
Download as zip
Read a Raspberry Pi Filesystem on Windows
Introduction
Ever find yourself in a situation where you want to copy a file from
your Raspberry Pi, only to find that all you have is the SD card with
the Raspbian OS on it? Well, I did, so here's a tutorial on how to
read the contents of a Raspbian SD image on Windows.
Normally you cannot mount a Raspbian filesystem on Windows using
the Windows 10 mount functions, since Raspbian uses a different
filesystem (ext4, to be specific). So we'll have to take another route.
Install the Software
You'll need to install:
win32diskimager for copying the SD image
7-zip for extracting the data
ext2read for browsing and reading the Linux FS
Copy the SD Card
Create an empty file and rename it raspbian.img.
Insert your SD card.
Use win32diskimager to read the SD card and direct it to the newly created file.
Press yes when prompted to overwrite the file.
After a couple of minutes you will have a copy of the Raspbian image
on your PC.
Extract the Data
Install 7-zip if you don't have it already.
Open the .img file with 7-zip by right-clicking it and selecting Open.
You'll see at least 2 files. In my case the .img system image has 3
files, since the filesystem has been expanded.
Copy the file 1.img, which contains the data of the Raspbian
partition.
Save and Browse
Now right-click on ext2read and run it as administrator.
Open the file 1.img with ext2read. You can now browse the
Raspberry Pi filesystem on Windows. If you want to save a directory
or a file, just right-click the selected item and save it to your
Windows machine.
That's all, folks.
Schematics
ext2explore
Use ext2explore to read the Raspbian Linux filesystem
Raspberry Pi Zero + Repetier Server

If you managed to get a Pi Zero, you've probably noticed it is crazy
tiny, and it works well for programs with lower CPU use, one of which
is Repetier-Server. Repetier-Server is a bridge between your
Repetier-Firmware-controlled 3D printer and your web browser. The
setup is easy; you will need the following to get started:
1) Raspberry Pi Zero
2) OTG USB HUB or OTG adapter and USB HUB
3) Power cable
4) USB Ethernet or WiFi adapter
5) SD card with Raspbian image
Step 1: Open Putty (or any other SSH client) and connect to your Pi
Enter the following commands to install the server:
1) wget http://download.repetier.com/files/server/debian-a... -O repetier_armel.deb
2) sudo dpkg -i repetier_armel.deb
3) sudo service RepetierServer start
It's done!
Step 2: Open the server and you're done!
By default the server address will be '[ipaddress]:3344'.
Smart Security Camera

Building the Housing


The housing consists of two components made from 1/4" (really 0.22"
- 0.23") medium-density fiberboard (MDF). A bottom portion holds
the camera and Raspberry Pi Zero in place, and a top portion
attaches to the mounting bracket. We drew out some basic designs
with a few holes to mount the components. Schematics of the upper
and lower portions of the housing with exact dimensions can be found
in the "Schematics" section, but you may want to vary your
dimensions based on the parts and hardware available to you.
Cutting and Gluing the Housing
Once we had our basic design, we created some paper templates to
cut the MDF board. Paper templates aren't needed to cut the MDF
board, but we found that it was easier to position the parts on the
board. They can also be saved for later and used to build another
camera if you make a mistake while cutting. The exact
measurements for this template can be found in the attachments
section below. Trace and cut out each piece, being sure to label
them. Then trace the template pieces onto the MDF board.
Using an electric jigsaw, cut out each piece of the housing. Be sure to
wear safety glasses, cut in a well ventilated area, and wear a
breathing mask. The fine dust generated is not good for you. All the
dust also tends to cover up the traced line and makes it hard to see
where to cut. We found it really helps to have a second person
vacuum up the dust as you cut. You can also use an air compressor
to blow it away if you have one.
Assemble the pieces together using wood glue according to the
schematics. Use clamps to help you hold them in place while the
wood dries.
Drilling Lots of Holes
Get ready to drill.
For the camera mounting holes, because there are four small holes
for mounting, you want to be very careful when measuring and
marking the hole locations on the front face of the housing. We found
the camera mounting holes form a 21.7 mm x 13.3 mm rectangle.
Line the actual camera module up with your marked holes to be sure
they are in the proper location.
After marking the holes, use a small drill bit to drill 4 pilot holes in
those locations. Then drill through with a 7/64" drill bit. In the video,
you'll see us drilling counterbores at these locations too for the screw
heads to fit in. But this isn't necessary if you bought long enough
M2.5 screws, unlike us.
The hole for the camera lens takes three steps because the camera
lens does not sit flush with the rest of the circuit board and needs a
place to fit in when mounted up against the housing.
First, drill a pilot hole exactly in the center of the two top camera
mounting holes. Then, flip the housing over and use a 1/2" drill to drill
a counterbore on the inside face just deep enough for the lens to sit
in. Go very slowly when drilling the counterbore because it's easy to
go straight through the housing. Practicing on a scrap piece is highly
recommended. Lastly, you can use the same pilot hole to drill all the
way through with a 5/16" drill bit.
Now we need to drill a hole for the micro-USB power cable to go
through. Since the upper housing will cover the sides of the lower
housing, we need to drill the same hole in the same location on both
portions of the housing.
Line the Raspberry Pi up in the lower portion of the housing with the
ribbon cable port at the back of the housing so that the cable won't
crease or bend too much. Then, mark the location of the micro-USB
power port on the side of the housing. On the side of the housing,
mark the vertical position of this hole as close to the inside bottom
surface of the housing as possible without allowing the 1/2" drill you
will use to hit the bottom of the housing; this is because the power
port on the Pi will sit very close to the bottom, and you don't want
your hole too high.
Mark a corresponding hole location on the upper housing, and make
sure they line up when you put them together. Next, clamp the upper
and lower housing together and drill straight through both pieces with
a small drill to form pilot holes. After double checking the holes are in
the right place, use the pilot holes to drill through with a 1/2" drill.

And now, finally, more holes! Line your metal mounting bracket up
with the upper portion of your housing and find a hole to use as your
pivot point. Make sure there is plenty of travel space as you pivot the
bracket around. Also make sure it is high enough to not interfere with
the sides of the bottom housing. Mark this location and drill a hole of
an appropriate size to match the hole in your bracket. In our case, we
used an 11/64" drill to fit a #8 screw.
Okay, these are actually the last holes you need to drill. We need to
be able to attach the upper housing to the lower housing. We chose
to place two hex nuts inside one wall of the lower housing, allowing
the outer housing to be screwed into the lower housing using these
nuts. Drill two holes on the side of the lower housing with a 5/16" drill
bit, making sure they are placed high enough that inserted screws will
not run into the Raspberry Pi.
Paint!
Now that all the holes are drilled, paint the housing with black spray-
paint (or whatever you want) to make it look nice.
Final Assembly
Almost there. The metal mounting bracket for the camera is attached
so that the angle of the camera can be easily adjusted. Slip a #8 x
3/4" screw through the 11/64" pivot hole in the upper housing from the
inside, and secure it in place with a #8 hex nut on the outside.
Then slip the metal bracket onto the screw on top of the hex nut, and
hold it in place with a #8 wing nut. The wing nut allows you to easily
loosen the bracket, adjust the angle, and re-tighten.
To mount the hex nuts into the holes of the bottom portion of the
housing, use a cotton swab (or a piece of paper if you don't have one)
to coat the inside surface of the holes. Then carefully slide the M4
hex nuts into the holes and allow them to dry. Be careful not to get
much glue onto the threads of the hex nuts, and wipe off any excess
glue.
The housing is now complete!
Mount Electronics
Using four M2.5 screws, screw the camera onto the front piece of the
housing. We found that M2.5 screws actually threaded into the holes
of the camera module slightly, so no tiny nuts were required. Then,
plug the ribbon cable of the camera module into the video port on the
Raspberry Pi.
Place the Raspberry Pi into the housing with the ribbon cable port
facing the back end of the housing. This allows the ribbon cable to
arc smoothly from the camera module across the housing to the
Raspberry Pi.
Being careful not to pinch or crease the ribbon cable, slide the upper
housing onto the lower housing.
Finally, screw the two portions of the housing together using two M4 x 20 mm screws, and
plug in the micro-USB cable.

Congratulations! You've put together your own security camera. Now


you're ready to make it a smart security camera.
Code
This project uses a Raspberry Pi Camera to stream video. Before
running the code, make sure to configure the Raspberry Pi camera
on your device.
Open the terminal and run
sudo raspi-config
Select Interface Options, then Pi Camera, and toggle it on. Press
Finish and exit.
You can verify that the camera works by running
raspistill -o image.jpg
which will save an image from the camera in your current directory.
You can open up the file inspector and view the image.
This project uses openCV to detect objects in the video feed. You can
install openCV by using the following tutorial. I used the Python 2.7
version of the tutorial.
The installation took almost 8 hours (!!) on my Raspberry Pi Zero, but
it would be considerably faster on a more powerful board like the
Raspberry Pi 3.
The tutorial will prompt you to create a virtual environment. Make
sure you are using the virtual environment by typing the following
commands
source ~/.profile
workon cv
Next, navigate to the repository directory
cd Smart-Security-Camera
and install the dependencies for the project
pip install -r requirements.txt
To get emails when objects are detected, you'll need to make a
couple modifications to the mail.py file.
Open mail.py with vim (vim mail.py), then press i to edit. Scroll
down to the following section
# Email you want to send the update from (only works with gmail)
fromEmail = '[email protected]'
fromEmailPassword = 'password1234'
# Email you want to send the update to
toEmail = '[email protected]'
and replace with your own email/credentials. The mail.py file logs into
a gmail SMTP server and sends an email with an image of the object
detected by the security camera.
Press esc then ZZ to save and exit.
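The relevant logic in mail.py can be sketched roughly as follows. This is a hedged reconstruction, not the project's exact code: the helper names build_alert and send_alert are mine, and with Gmail you would use an app password rather than your account password.

```python
import smtplib
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_alert(from_addr, to_addr, jpeg_bytes):
    """Assemble an alert email with the captured frame attached."""
    msg = MIMEMultipart()
    msg["Subject"] = "Security camera: object detected"
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg.attach(MIMEText("Your security camera detected movement."))
    msg.attach(MIMEImage(jpeg_bytes, _subtype="jpeg", name="capture.jpg"))
    return msg

def send_alert(msg, password):
    """Log in to Gmail's SMTP server and send; runs only with real credentials."""
    server = smtplib.SMTP("smtp.gmail.com", 587)
    server.starttls()
    server.login(msg["From"], password)
    server.sendmail(msg["From"], [msg["To"]], msg.as_string())
    server.quit()
```
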
You can also modify the main.py file to change some other properties.
email_update_interval = 600 # sends an email only once in this time interval
video_camera = VideoCamera(flip=True) # creates a camera object, flip vertically
object_classifier = cv2.CascadeClassifier("models/fullbody_recognition_model.xml") # an opencv classifier
Notably, you can use a different object detector by changing the
path "models/fullbody_recognition_model.xml" in the object_classifier
line to one of the other models in the models directory:
facial_recognition_model.xml
fullbody_recognition_model.xml
upperbody_recognition_model.xml
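Swapping detectors amounts to loading a different cascade file. As a small sketch (the model_path helper is my addition; the file names match the list above):

```python
import os

MODELS = ("facial", "fullbody", "upperbody")

def model_path(kind):
    """Path of one of the bundled Haar-cascade files listed above."""
    if kind not in MODELS:
        raise ValueError("unknown model: %r" % (kind,))
    return os.path.join("models", "%s_recognition_model.xml" % kind)

try:
    import cv2  # installed via the OpenCV tutorial; only on the Pi

    object_classifier = cv2.CascadeClassifier(model_path("upperbody"))
except ImportError:
    pass  # OpenCV not installed here; model_path still works
```
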
Run the program
python main.py
You can view a live stream by visiting the IP address of your Pi in a
browser on the same network. You can find the IP address of your
Raspberry Pi by typing ifconfig in the terminal and looking for the
inet address.
Visit <raspberrypi_ip>:5000 in your browser to view the stream.
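The stream on port 5000 is served as multipart/x-mixed-replace ("MJPEG") frames. A minimal, illustrative sketch of how such a stream is framed follows; it is not the project's exact code, and the camera object with a get_frame() method is hypothetical.

```python
BOUNDARY = b"--frame"

def mjpeg_chunk(jpeg_bytes):
    """Wrap one JPEG frame for a multipart/x-mixed-replace response."""
    return (BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n\r\n" + jpeg_bytes + b"\r\n")

def stream(camera):
    """Generator served with mimetype 'multipart/x-mixed-replace; boundary=frame'.

    `camera` is a hypothetical object whose get_frame() returns JPEG bytes.
    """
    while True:
        yield mjpeg_chunk(camera.get_frame())
```
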
Note: To view the live stream on a different network than your
Raspberry Pi, you can use ngrok to expose a local tunnel. Once
downloaded, run ngrok with ./ngrok http 5000 and visit one of the
generated links in your browser.
Note: The video stream will not start automatically on startup. To
start it automatically, you will need to run the program from your
/etc/rc.local file; see this video for more information about how to
configure that.
Receiving Emails
When receiving an email for the first time, you might get the following
notification from Google:
By default, Google blocks apps from using SMTP without permission.
We can solve this by clicking on the "allow less secure apps" link
and toggling the feature on. The next object detected will then send
an email.

Mounting the Camera


The security camera can be easily positioned on a shelf or counter
inside if you want to monitor your home while out of town. We set it
on a shelf in our living room and it worked relatively well with
normal lighting. Flipping the metal bracket 180 degrees will help
level the camera.

You can also mount the camera outside your house or by your front
door with some 3M outdoor mounting tape.
We stuck a large piece on the top of the metal bracket and mounted it
on an overhang beside our front door. The tape should be strong
enough to support the weight of the camera. You can plug the camera
in outside with an extension cord or route the power wire through the
hinge in your front door. We used the wing nut to tilt the camera down
toward where a person would stand by the front door.
Note: the object detection works better if the camera is
positioned at a lower angle (e.g. at the same level as the person
it is detecting)
Show it Off
That's it! You've finished making an automated smart security
camera. Now go beef up your home security system with this cool
little device!
Schematics
Housing Bottom Schematic
Dimensions for the MDF housing. The dimensions that you use may be different depending
on the hardware and MDF board thickness you have. Use this schematic as a general
guideline.

Housing Top Schematic


Dimensions for the MDF housing. The dimensions that you use may be different depending
on the hardware and MDF board thickness you have. Use this schematic as a general
guideline.
Code
Download as zip
MQTT Alarm Control Panel for Home Assistant
Overview
This project is an MQTT alarm control panel for use with Home
Assistant's manual alarm control panel component. It was built using
a Raspberry Pi 3 and a 7" touchscreen display running Android
Things.

Home Assistant
Home Assistant is an open source home automation platform with the
ability to integrate with many hardware components as well as offer
custom features for automation. Recently, contributor Colin O'Dell
added an MQTT manual control panel, which allows a custom alarm
panel to communicate with the manual alarm panel over MQTT.
I wanted an alarm panel that interfaces with Home Assistant's alarm
control panel component, gives me the functionality to control my
alarm, and can be mounted near the front door. Using Android Things
allowed me to build a rich user interface and also add additional
software features.
In my home I have several Z-Wave door sensors, a motion detector,
and siren hooked up to Home Assistant. With the manual alarm
control panel and some automation, I can activate the siren as well
as send a notification using IFTTT whenever the alarm is triggered.
From my control panel I can arm/disarm the alarm through the
interface as I enter or leave.
Hardware
Raspberry Pi (needs to be compatible with Android Things)
Touchscreen display, I used a 7" screen but any touch screen compatible with
Android Things would work.
Micro SD card to burn the Android Things image
Micro USB cable for power.
Piezo Buzzer (optional if you want button sounds)
Home Assistant Setup
You should be using Home Assistant (v0.50 or above), which adds the
new MQTT manual control panel, and have an MQTT component set up and
running. I won't go into detail about how to set up Home Assistant
or MQTT because the documentation provided by Home Assistant is
better than anything I could provide. Just be sure you have the
alarm control panel and MQTT running first, and also connect any
sensors you might need.
Raspberry Pi Setup
Make sure you properly set up the RPi3 with the 7" touchscreen
display. You won't need any special software setup if you use the
Raspberry Pi Foundation's 7" touchscreen, as it's compatible with
Android Things. Other compatible touchscreens may require
additional configuration for Android Things.
There are two ways to setup Android Things and also load the alarm
control panel application.
1) The first option is to download the latest build (zip file) from the
project's Github release section which includes Android Things
Preview 0.4.1 with the application already pre-installed. This option
also allows you to receive future OTA updates.
Unzip the file to get the 'iot_rpi3.img'.
Burn image to an SD card using a tool like Etcher.
Insert the SD card into RPi3 and boot.
2) The second option is to setup your RPi3 to use Android Things
0.4.1-devpreview for Raspberry Pi 3 and then build and install the
application manually using Android Studio.
Flash the Android Things 0.4.1-devpreview image for the Raspberry Pi 3 to an
SD card using a tool like Etcher.
Insert the SD card into the RPi3 and boot.
Clone the repository and compile the APK using Android Studio, then sideload
the APK file onto your device using the ADB tool.
Be sure to set up network access using either WiFi or Ethernet. If
you set up WiFi, be sure to unplug the Ethernet cable; at this time
Android Things can't use both.
# Use the adb tool to connect over ethernet to the device
adb connect Android.local
# Then set your your WiFi SSID and password
adb shell am startservice \
-n com.google.wifisetup/.WifiSetupService \
-a WifiSetupService.Connect \
-e ssid <Network_SSID> \
-e passphrase <Network_Passcode>
You probably also want to set the time and timezone of the device:
# Reboot ADB into root mode
$ adb root
# Set the date to 2017/12/31 12:00:00
$ adb shell date 123112002017.00
# Set the time zone to US Mountain Time
$ adb shell setprop persist.sys.timezone "America/Denver"
Now connect the buzzer as shown in the sample diagram from the
Android Things drivers repository. This part is optional if you want to
have sound feedback.
Alarm Application Setup
The complete code for the application running on Android Things can
be found in my GitHub repository:
https://github.com/thanksmister/androidthings-mqtt-alarm-panel.
When you first start the application, you will be asked to go to the
settings screen and enter your pin code. You also need to enter the
MQTT information that you configured in Home Assistant for your
MQTT service. Be sure to adjust the time intervals to match those
set in the Home Assistant MQTT alarm control panel.
The application should then connect to your MQTT broker and
display the current state of the alarm control panel in Home Assistant.
Whenever the state of the alarm control panel in Home Assistant
changes, the alarm control panel will reflect those changes via MQTT
communication. To deactivate the alarm, you will need to enter your
pin code within the time allowed.
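Under the hood, Home Assistant's manual MQTT alarm panel exchanges short string payloads over the topics you configure. As a rough Python sketch (using a paho-mqtt client; the topic name and the command_for helper are illustrative, not the app's actual Android code):

```python
# Payloads used by Home Assistant's manual MQTT alarm panel component:
# it publishes the states below and accepts the commands on the configured topics.
STATES = {"disarmed", "armed_home", "armed_away", "pending", "triggered"}
COMMANDS = {"Arm Home": "ARM_HOME", "Arm Away": "ARM_AWAY", "Disarm": "DISARM"}

def command_for(action):
    """Translate a UI action such as 'Arm Home' into its MQTT command payload."""
    if action not in COMMANDS:
        raise ValueError("unknown action: %r" % (action,))
    return COMMANDS[action]

def publish_command(client, topic, action):
    """Publish via a paho-mqtt client; the topic is whatever you configured in HA."""
    client.publish(topic, command_for(action))
```
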

To set the alarm just select the main icon on the alarm screen and
then select Arm Home or Arm Away options.
A small countdown will appear to indicate the time remaining before
the alarm is activated.

If you choose to get weather updates, enter your DarkSky API key
and current latitude and longitude in the weather setting screen. You
can get your lat/lon from maps.google.com by copying them from the
URL (they look like -34.6156624,-58.5035102).
To use an Instagram screensaver rather than the digital clock, turn
this feature on in the screen saver settings. You can load other
Instagram images by changing the Instagram profile name in the
settings.
Enclosure
I used an enclosure to give the alarm a small profile and also to make
it wall mountable: https://www.thingiverse.com/thing:1082431.
However, be aware that this case may require extra screws as well as
a longer flex cable. There are plenty of other case options for the RPi
3 / 7" Touchscreen option available. If you want to buy a case, the
SmartPi Touch case is a good option.

Notes
At this time there is an issue dimming the brightness of the backlight
for the display. So for now I have included a screen saver feature as
a short-term fix until the bug is addressed by the Android Things
development team. I will update the application once this feature is
working.
There have been multiple display issues using Android Things 0.5.0
and 0.5.1, therefore this application runs best under Android Things
0.4.1.
It's important that the hardware alarm panel settings match those used
in the Home Assistant alarm control panel component. Initially, the
hardware control panel uses the component's default settings.
One of the nice things about using Android Things is you can also
port your project to Android devices. I created an Android tablet
version of this same project that is available on Google Play and as a
downloadable APK file for side loading.
Google Play: https://play.google.com/store/apps/details?id=com.thanksmister.iot.mqtt.alarmpanel
Repository: https://github.com/thanksmister/android-mqtt-alarm-panel
Custom parts and enclosures
7in Portable Raspberry Pi Multi-Touch Tablet
3D printable file to enclose the 7" Raspberry Pi Touchscreen and Raspberry Pi 3
CAD file on thingiverse.com
Schematics
PWM Speaker Diagram (Buzzer)
Connecting your buzzer to the Raspberry Pi 3
Code
MQTT Alarm Panel Github Repository
This is a repository with the latest code and release APK and Android Things project files.
Download as zip
Raspberry Pi Automated Plant Watering with
Website
After reading about how well things can grow indoors, I started
thinking that maybe automation was my path to healthy plants. So I
decided to build the bare minimum - get a plant, a pump, and a water
sensor. When the water sensor says "no water here", use the pump
to put water there.

I also decided to run it all through a Raspberry Pi as an excuse to
interact with the RPi's GPIO.
Here's how I did it!
Materials:
Raspberry Pi 3
Soil Moisture Sensor
Flexible Water Line
5V Relay
3-6V Mini Micro Submersible Pump
TOLI 120pcs Multicolored Dupont Wire
5V Power Supply (any USB cable + USB wall charger)
Wiring:
The first thing I did was make my 5V power supply from a USB cable.
Using an old iPhone cable, I cut the iPhone side off and fished out the
red and black wires. I soldered some sturdier wires to these, and
plugged it into a wall adapter. Checking with a voltmeter, this gave me
5V output.
Now time for the GPIO.
RPi Wiring:

Water Sensor - plug the positive lead from the water sensor to pin 2,
and the negative lead to pin 6. Plug the signal wire (yellow) to pin 8.
Relay - Plug the positive lead from pin 7 to IN1 on the Relay Board.
Also connect Pin 2 to VCC, and Pin 5 to GND on the Relay board.
Pump - Connect your pump to a power source, and run the black ground
wire between slots B and C of relay module 1 (when the RPi sends a
LOW signal of 0V to pin 7, this closes the circuit, turning on the
pump).
This diagram should capture the correct GPIO so long as you are
using Raspberry Pi 3. Not shown is another power source to the RPi.
Hardware Setup:
Once the wiring has been completed, attach the flexible hose to the
pump (I used electrical tape), and drop it into a jar of water. Attach the
other end of the hose to your plant.
Now plug in all power sources (and make sure your Raspberry Pi is
running some version of an operating system, like this one here).
Software
Note: If you get the wiring exactly as described above, my code in the
next section will work with no modifications. There are two parts to
this setup. One file controls all the GPIO and circuit logic, and the
other runs a local web server.
All files: water.py web_plants.py main.html
GPIO Script
Let's start with the code for controlling the GPIO. This requires the
RPi.GPIO python library which can be installed on your Raspberry Pi
as follows:
$> python3.4 -m pip install RPi.GPIO
With that installed, you should be able to use the water.py script
found here. You can test this is working correctly by running an
interactive python session as follows:
$> python3.4
>>> import water
>>> water.get_status()
>>> water.pump_on()
This should print a statement about whether your sensor is wet or dry
(get_status()), and also turn on the pump for 1s. If these work as
expected, you're in good shape.
At this point you can also calibrate your water sensor. If your plant
status is incorrect, try turning the small screw (potentiometer) on the
sensor while it is in moist soil until the 2nd light comes on.
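While calibrating, it can help to take several readings in a row and check that the sensor output has settled. This is my own quick check, not part of the original project: stable() is pure logic, while poll() needs RPi.GPIO and the water.py script, so it only runs on the Pi.

```python
# Quick calibration sanity check (my addition, not from the original post).
def stable(readings):
    """True when every reading in the sample agrees (the sensor has settled)."""
    return len(set(readings)) == 1

def poll(samples=5, delay=1):
    """Print a few sensor readings; 1 means dry, 0 means wet in this project."""
    import time
    import water  # the water.py script from this project
    readings = []
    for _ in range(samples):
        readings.append(water.get_status())
        time.sleep(delay)
    print(readings, "stable" if stable(readings) else "noisy - adjust the potentiometer")
```

Run `poll()` with the sensor in moist soil; a noisy result suggests the potentiometer needs further adjustment.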
Flask Webserver
The next aspect of this project is to setup the web server. This code
can be found here in a file called web_plants.py. This python script
runs a web server enabling various actions from the script described
above.
You will need to keep web_plants.py in the same directory as
water.py described above. You will also need a subdirectory called
"templates" containing the html file here called main.html.
You will need to install flask, and psutil as follows:
$> python3.4 -m pip install flask $> python3.4 -m pip install psutil
Now run the following command to start your web server:
$> sudo python3.4 web_plants.py
If you navigate to the IP address of your RPi, you should see a web
dashboard something like this:
Try clicking the buttons to make sure everything works as expected! If
so, you're off to the races. Here's another great tutorial I followed on
Flask + GPIO.
Run Website Automatically
Finally, you probably want the website to auto-start when the RPi gets
turned on. This can be done using cron, which can register your web
server as a startup command.
To do so, type:
$> sudo crontab -e
This will bring up a text editor. Add a single line that reads (and make
sure to leave one empty line below):
@reboot cd <your path to web_plants>; sudo python3.4 web_plants.py
Now when you reboot your pi, it should auto start the server.
Code
water.py

# External module imports
import RPi.GPIO as GPIO
import datetime
import time

init = False

GPIO.setmode(GPIO.BOARD)  # use physical (BOARD) pin numbering

def get_last_watered():
    try:
        f = open("last_watered.txt", "r")
        return f.readline()
    except OSError:
        return "NEVER!"

def get_status(pin=8):
    GPIO.setup(pin, GPIO.IN)
    return GPIO.input(pin)

def init_output(pin):
    GPIO.setup(pin, GPIO.OUT)
    GPIO.output(pin, GPIO.LOW)
    GPIO.output(pin, GPIO.HIGH)

def auto_water(delay=5, pump_pin=7, water_sensor_pin=8):
    consecutive_water_count = 0
    init_output(pump_pin)
    print("Here we go! Press CTRL+C to exit")
    try:
        while consecutive_water_count < 10:
            time.sleep(delay)
            wet = get_status(pin=water_sensor_pin) == 0
            if not wet:
                if consecutive_water_count < 5:
                    pump_on(pump_pin, 1)
                consecutive_water_count += 1
            else:
                consecutive_water_count = 0
    except KeyboardInterrupt:
        # If CTRL+C is pressed, exit cleanly
        GPIO.cleanup()  # clean up all GPIO channels

def pump_on(pump_pin=7, delay=1):
    init_output(pump_pin)
    f = open("last_watered.txt", "w")
    f.write("Last watered {}".format(datetime.datetime.now()))
    f.close()
    GPIO.output(pump_pin, GPIO.LOW)  # relay input is active-low: LOW turns the pump on
    time.sleep(delay)  # was a hard-coded 1s; honor the delay argument
    GPIO.output(pump_pin, GPIO.HIGH)
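The /auto/water route in web_plants.py launches a background script named auto_water.py that isn't listed in the post. Below is a minimal sketch of my own for what it could contain, assuming it simply hands off to water.auto_water(); the should_pump() helper just restates the watering rule for clarity.

```python
# auto_water.py - hypothetical companion script (not shown in the original
# post); web_plants.py starts and stops it by name with os.system()/pkill.
def should_pump(wet, consecutive_count, limit=5):
    """Restates the auto_water() rule: pump only while dry and under the limit."""
    return (not wet) and consecutive_count < limit

if __name__ == "__main__":
    try:
        import water        # needs RPi.GPIO, so this only works on the Pi
        water.auto_water()  # defaults: check every 5 s, pump pin 7, sensor pin 8
    except ImportError:
        print("water.py / RPi.GPIO not available; run this on the Raspberry Pi")
```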

web_plants.py

from flask import Flask, render_template, redirect, url_for
import psutil
import datetime
import water
import os

app = Flask(__name__)

def template(title="HELLO!", text=""):
    now = datetime.datetime.now()
    timeString = now
    templateDate = {
        'title': title,
        'time': timeString,
        'text': text
    }
    return templateDate

@app.route("/")
def hello():
    templateData = template()
    return render_template('main.html', **templateData)

@app.route("/last_watered")
def check_last_watered():
    templateData = template(text=water.get_last_watered())
    return render_template('main.html', **templateData)

@app.route("/sensor")
def action():
    status = water.get_status()
    message = ""
    if (status == 1):
        message = "Water me please!"
    else:
        message = "I'm a happy plant"
    templateData = template(text=message)
    return render_template('main.html', **templateData)

@app.route("/water")
def action2():
    water.pump_on()
    templateData = template(text="Watered Once")
    return render_template('main.html', **templateData)

@app.route("/auto/water/<toggle>")
def auto_water(toggle):
    running = False
    if toggle == "ON":
        templateData = template(text="Auto Watering On")
        for process in psutil.process_iter():
            try:
                if process.cmdline()[1] == 'auto_water.py':
                    templateData = template(text="Already running")
                    running = True
            except Exception:
                pass
        if not running:
            os.system("python3.4 auto_water.py&")
    else:
        templateData = template(text="Auto Watering Off")
        os.system("pkill -f auto_water.py")
    return render_template('main.html', **templateData)

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=80, debug=True)

main.html

<!DOCTYPE html>
<html>
<head>
  <title>{{ title }}</title>
</head>
<body>
  <h1>PLANT HELPLINE</h1>
  <h2>The date and time on the server is: {{ time }}</h2>
  <h2>{{ text }}</h2>
  <a href="/auto/water/ON"><button>Turn ON Auto Watering</button></a>
  <a href="/auto/water/OFF"><button>Turn OFF Auto Watering</button></a>
  <a href="/sensor"><button>Check Soil Status</button></a> <br>
  <a href="/water"><button>Water Once</button></a>
  <a href="/last_watered"><button>Check Time Last Watered</button></a>
</body>
</html>
J.A.R.V.I.S
Just A Rather Very Intelligent System.

This project demonstrates the use of different technologies and their
integration to build an intelligent system that interacts with a human
and supports their day-to-day tasks. It is inspired by the AI bot
"JARVIS" from the movie "Iron Man", and was developed for the
Google I/O '17 challenge.
Currently, project version 0.30 has 5 main modules:
JARVIS Brain
JARVIS Things
JARVIS Mobile
JARVIS Amazon Alexa
JARVIS Web
JARVIS Brain
An NLTK- and scikit-learn-based NLP engine written in Python classifies
the input speech (as text) and processes it depending on the
classification. Currently, the trainer trains the engine to classify three
patterns:
welcome greetings
basic mathematics expression solving (+,-,*,/, square, squareroot, cube,
cuberoot)
commands to do tasks
If the input speech cannot be handled by one of the classified modules,
it is processed as an online web search using the DuckDuckGo search
API.
This engine is accessible via a Python service built with the Flask
framework.
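The actual Brain uses NLTK/scikit-learn classifiers; the pure-Python sketch below is my own simplification, only illustrating the three-way routing (greeting / math / command) with a web-search fallback. The keyword lists and the solve() helper are assumptions for illustration, not the project's real code.

```python
import re

GREETINGS = {"hello", "hi", "hey", "morning"}

def classify(text):
    """Tiny illustrative stand-in for the JARVIS Brain classifier."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    if words & GREETINGS:
        return "greeting"
    if re.search(r"\d+\s*[+\-*/]\s*\d+", text) or words & {"square", "cube", "squareroot", "cuberoot"}:
        return "math"
    if words & {"turn", "move", "switch", "start", "stop"}:
        return "command"
    return "web_search"  # fall through to an online search, as the Brain does

def solve(expr):
    """Evaluate a basic arithmetic expression such as '3 + 4 * 2'."""
    if not re.fullmatch(r"[0-9+\-*/(). ]+", expr):
        raise ValueError("not a basic arithmetic expression")
    return eval(expr)  # safe here: input restricted to digits and operators
```

For example, classify("turn on the lamp") routes to the command handler, while classify("what is 2 + 2") routes to math and solve("2 + 2") returns 4.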
JARVIS Things
An Android Things (running on RPi3) based interface and controlling
unit. It takes speech input from humans and sends the text version of
it to JARVIS Brain for processing; based on the response from the
Brain, it performs tasks. It can also take input from sensors and
trigger tasks directly. A connected USB mic captures the speech, and
Google Voice Search (added manually via an old APK) converts it into
text. This text is sent to the Brain using the Volley library. The JSON
response is parsed and processed to decide what the Things have to
do. If it is a command type, the respective task is performed, like
moving a rover or turning on the lamp. Otherwise, if it is a simple
reply, it is spoken out via the USB speaker connected to the RPi3
using the built-in Text-To-Speech. The MQTT protocol assists
JARVIS Mobile in controlling the Rover and the Lamp.
Rover control: Rover is based on Arduino UNO board and communicates with
JARVIS THINGS via RF module. Based on the command received from Brain,
different commands are transmitted to Rover to make it move forward,
backwards, left, right and stop. For the demo, all commands execute for 5
seconds only.
Lamp control: Lamp is based on ESP8266 wifi module. It is programmed to
control the relay which then controls the AC appliance i.e. the LAMP. MQTT
protocol is used for communication between the Thing and the Lamp.
Location Temperature: Added in ver 0.25, speaking "Temperature of New
Delhi" will fetch the temperature for New Delhi using the OpenWeatherMap API. If
not found, it falls back to a regular web search.
JARVIS Mobile
The Android application provides another user interface to
communicate with the Brain and perform tasks. Similar to JARVIS
Things, it uses the built-in Speech-To-Text and Text-To-Speech
libraries, and replies are read out loud on the mobile device itself.
However, to perform other tasks like moving the rover, it depends on
JARVIS Things to act on its behalf: MQTT is used to send instructions
to the Things, which then perform the tasks.
JARVIS Amazon Alexa
A basic Amazon Alexa skill integrated with JARVIS Brain via HTTP.
The Jarvis skill can be invoked on an Alexa-supported device like the
Amazon Echo using the invocation term "Jarvis". The device uses its
built-in Voice-to-Text and Text-to-Speech features to communicate
with the JARVIS Brain. During testing, it was found that Voice-to-Text
on Alexa was not as accurate as Google's, which resulted in error
responses from the Brain. The skill is not published publicly yet.
JARVIS Web
Similar to JARVIS Mobile, JARVIS Web gives another interface to
interact via the web. It uses 'webkitSpeechRecognition' for
Voice-to-Text conversion and 'SpeechSynthesisUtterance' for
Text-to-Speech. AJAX in its simplest form is used to communicate
with JARVIS Brain. JARVIS Web can do greetings, maths calculations,
fetch the temperature for a location, or do a web search. In the future,
it will be integrated with JARVIS Things to control other hardware
using MQTT.
Code
J.A.R.V.I.S.
Download as zip
Home Hidroneumatic Controlled by ESP8266 + MQTT
+ Web App

Overview
Motivated by the constant lack of water supply in my city, I was forced
to install a water tank feeding a hidroneumatic (hydropneumatic pump)
so the whole house has water at sufficient pressure.
I additionally needed a pump for rapid filling of the main tank at times
when the water arrives with very little pressure. This project allows
controlling:
Turn the hidroneumatic pump on/off.
Fill the water tank using a little pump, until the full sensor detects a full tank and stops the pump.
Fill the water tank with water from the street when the pressure allows, until the full sensor
detects a full tank and closes the solenoid valve.
When the water tank is close to empty, the empty sensor activates the buzzer alarm and
stops all pumps.
Deactivate all devices.
Turn the pumps room light on/off.
Detect movement, turn on the pumps room light, and turn it off after 15 seconds.
See all display info on the LCD display, or remotely in LCD Status on the web page.
All activities executed with help of :
3 solenoid valves (blue color)
Full sensor
Empty sensor
Water pump (red color)
Hidroneumatic Pump (red color)
Block Diagram
In the block diagram picture we can see the four main components of this
project:
1.-Clients
Pc or mobile connected with web browser
Pc connected with browser to Cayenne cloud
Cayenne mobile app for IOS or Android
Future project will include Apple HomeKit and Amazon Alexa Echo Dot (red color)
2.- Web Server and MQTT Broker running at Raspberry PI v3
Apache Web Server
Mosquitto MQTT Broker Server
Future project will include Homekit Accessory Server (red color)

3.- Control server running at ESP8266


Bridge Cayenne to MQTT Broker Server
State machine
Sensors control
LCD display all commands and messages
Future project will includes Bridge Alexa MQTT Broker Project (red color)
4.- Devices like pumps, valves, light, sensors and LCD display controlled
by ESP8266 + MCP23008

Install all software needed and connect all devices


Step 1 - Configure Raspberry PI v3
Make sure Raspberry PI was updated
sudo apt-get update
sudo apt-get upgrade
Check your ip address with the command
ifconfig
See ip address at eth0 or wlan0
Install Apache Web Server
Check this tutorial at Raspberry PI org.
Edit the file hidro.html and set your Raspberry PI's IP address on line 20.
Copy the file hidro.html and the script directory, included in the source code, to the
/var/www/html folder on the Raspberry PI.
Test the web server from a browser client at http://rpi_ip_address/hidro.html; you should see
this page:
Install Mosquitto MQTT Broker Server
cd /
wget http://repo.mosquitto.org/debian/mosquitto-repo.gpg.key
sudo apt-key add mosquitto-repo.gpg.key
cd /etc/apt/sources.list.d/
sudo wget
or
sudo wget
sudo apt-get update
sudo apt-get install mosquitto
sudo nano /etc/mosquitto/mosquitto.conf
# add these two lines for websocket support
listener 1884
protocol websockets
sudo systemctl start mosquitto
sudo systemctl enable mosquitto
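Once the broker is running, the websocket listener on port 1884 can be checked from any machine. This is my own addition, not from the original project: the "/mqtt" path is the Paho JavaScript client's default, and the hostname you pass to check_listener() should be your Raspberry PI's address.

```python
# Quick check that the websocket listener configured above is reachable
# (my addition; requires the paho-mqtt package when actually connecting).
def broker_url(host, port=1884):
    """Build the ws:// URL a browser-based MQTT client would connect to."""
    return "ws://{}:{}/mqtt".format(host, port)

def check_listener(host, port=1884):
    """Try a websocket MQTT connection (needs paho-mqtt and a live broker)."""
    import paho.mqtt.client as mqtt
    client = mqtt.Client(transport="websockets")  # paho-mqtt 1.x style
    client.connect(host, port)
    client.disconnect()
    print("websocket listener reachable at", broker_url(host, port))
```

For example, check_listener("raspberrypi.local") should succeed once the two mosquitto.conf lines above are in place.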
Step 2 - Configure your device and project at Cayenne Cloud.
Go to the https://mydevices.com/ site.
Sign in or Sign up if you don't have an account it is free.
Click at Add new.. and select Device/Widget

Click at Generic ESP8266 to create a device


You will see a page with all the information needed to connect a client or device to Cayenne Cloud.
Save the MQTT USERNAME, MQTT PASSWORD, CLIENT ID, MQTT SERVER
and MQTT PORT parameters, because you will need to update them in the
HidroNeumaticControl.ino file at lines 42, 43, 38, 44 and 45.
Don't close your browser; it is waiting for the ESP8266 to connect.
Step 3 - Programming ESP8266 and connect it to Cayenne Cloud
Edit the file HidroneumaticControl.ino and change your WiFi SSID and password at lines 35
and 36.
Change the Raspberry PI IP address (obtained in Step 1) at line 123.
Remove the // at line 32 to enable the Cayenne Cloud bridge.
Save all changes and upload HidroneumaticControl.ino to the ESP8266 using the
Arduino IDE.
When the upload is complete, press the reset button on your ESP8266 device; the waiting
Cayenne device page in your browser will move on to the Create App page. Your ESP8266
device is now successfully connected to Cayenne Cloud.
Step 4 - Create your device widgets at Cayenne Cloud Dashboard
Click Generic ESP8266 on the Cayenne Create App page.
At this point you will create all the widgets. To do that, press Add new.. and Device/Widget
again, scroll down to the Custom Widgets option and click it; you will see all types of
custom widgets.
Press the Button controller widget to create a new button to activate the
Hidroneumatic.
Name : Hidroneumatic
Device select "Generic ESP8266"
Data select "Data Actuator"
Unit "Digital 0/1"
Channel : 1 (special attention with this number)
Choose Icon : Tank
Press Add Widget and the widget appears at your DashBoard
Press the Button controller widget to create a new button to activate the pump
that fills the tank with water.
Name : Fill with pump
Device select "Generic ESP8266"
Data select "Data Actuator"
Unit "Digital 0/1"
Channel : 2 (special attention with this number)
Choose Icon : Motor
Press Add Widget and the widget appears at your DashBoard
Press the 2 State display widget to create a new status indicator showing that
the tank is being filled from the street supply.
Name : Fill with pump
Device select "Generic ESP8266"
Data select "Digital Sensor"
Unit "Digital 0/1"
Channel : 3 (special attention with this number)
Choose Icon : Digital 2 State
Press Add Widget and the widget appears at your DashBoard
Press the Button controller widget to create a new button to deactivate all
devices.
Name : Deactivate all
Device select "Generic ESP8266"
Data select "Data Actuator"
Unit "Digital 0/1"
Channel : 4 (special attention with this number)
Choose Icon : Toggle/Switch
Press Add Widget and the widget appears at your DashBoard
Press the Button controller widget to create a new Room light button to turn
the pumps room light on/off.
Name : Room light
Device select "Generic ESP8266"
Data select "Data Actuator"
Unit "Digital 0/1"
Channel : 5 (special attention with this number)
Choose Icon : Light
Press Add Widget and the widget appears at your DashBoard
Press the 2 State display widget to create a new motion detector for the pumps
room.
Name : Fill with pump
Device select "Generic ESP8266"
Data select "Motion"
Unit "Digital 0/1"
Channel : 6 (special attention with this number)
Choose Icon : Motion
Press Add Widget and the widget appears at your DashBoard
Step 5 - Create a Project "Hidroneumatic Control"
At this point you will create a Cayenne project, "Hidroneumatic Control". To
do that, press Add new.. and choose the Project option.
Now drag and drop all the created widgets from the left bar to the project dashboard to
create a new Cayenne browser project to control our devices.
Step 6 - Install Cayenne mobile client from Apple Store or Android
Store
After installing the Cayenne mobile app, log in, select Projects and then
Hidroneumatic Control, and go to the dashboard; you will see all the widgets.
Step 7 - The software is ready; now proceed with the hardware: the control board
Install the LCD I2C, the buzzer alarm and the PIR sensor on the plastic topbox with all
female connectors.
Based on the schematic diagram, make a control board with a breadboard and some
wrapping wire.
Put the relay board and control board inside the plastic box and organize all connections
based on the schematic.
Connect the buzzer alarm, LCD and PIR sensor from the topbox to the control board,
power on with the 12 V power supply, and you will see all the information on the LCD
display panel.
Software details
This project uses two files:
hidro.html : is a responsive web application client written in Bootstrap,
Jquery and using a Paho Javascript Client, an MQTT browser-based client
library written in Javascript that uses WebSockets to connect to an MQTT
Broker.
Must be installed at Apache Web Server together with script folder.
It is a publisher and subscriber to the MQTT Broker topic "HidroControl"; it
sends and receives messages.
Each On/Off button sends a message to the MQTT Broker, toggling between
the on and off states.
If the tank is empty, or begins filling and becomes full, you will see the status
at the Tank Status label.
If movement is detected by the PIR sensor, the pumps room light turns on
and you will see the status at the Room Light label.
If you want to connect remotely to the LCD, just click LCD Status
and you will receive the last two message lines displayed on the LCD.
HidroNeumaticControl.ino : the firmware for the ESP8266.
It has a state machine to control all devices; depending on the message
received from the MQTT Broker, it calls the corresponding function.
It is a publisher and subscriber to the MQTT Broker topic "HidroControl" to
send and receive messages.
It implements a Cayenne to MQTT Broker bridge.
It implements I2C communication with the LCD display and the MCP23008.
It turns the pumps room light on/off when the PIR sensor detects movement.
Hardware details
The brain of the control board is an ESP8266. I use an Adafruit HUZZAH
implementation, but you can use any other, for example the SparkFun ESP8266
Thing.
Through the I2C interface on pins 4 (SDA) and 5 (SCL), it communicates with I2C devices
like the LCD I2C display and the MCP23008.
GPIO 15 controls the buzzer alarm.
GPIO 12 reads the PIR sensor.
MCP23008 is an 8-Bit I/O Expander and controls the other devices and
sensors.
GPIO0 : Hidroneumatic pump
GPIO1: Water pump
GPIO2: Hidro solenoid valve
GPIO3: Tank solenoid valve
GPIO4: Water from street solenoid valve
GPIO5: Water tank full sensor
GPIO6: Pumps room Light
GPIO7: Water tank empty sensor
Use a 12 V / 30 A power supply, because of all the solenoid valves, and a DC-to-DC
converter to generate the 5 V needed by the rest of the circuit.
The pumps room light, water pump and hidroneumatic pump use 110 V
switched by the relays, so be careful when connecting them.
You can test the control board and MQTT Broker connection by loading the
web app client in a browser: press some buttons and watch the
messages sent and the status on the LCD display, checking that the relays are
activated as in the next table.
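Another way to exercise the broker before wiring everything is a small test publisher posting to the same "HidroControl" topic. This is my own addition: the payload string built by make_message() is a placeholder, since the exact message format is defined by hidro.html and the firmware, and the hostname is an assumption.

```python
# Test publisher for the "HidroControl" topic (my addition, not from the
# original project; the "device:ON/OFF" payload format is hypothetical).
def make_message(device, on):
    """Build a (topic, payload) pair for a test publish."""
    return ("HidroControl", "{}:{}".format(device, "ON" if on else "OFF"))

def send_test(host="raspberrypi.local"):
    """Publish one test message (needs paho-mqtt and a reachable broker)."""
    import paho.mqtt.publish as publish
    topic, payload = make_message("RoomLight", True)
    publish.single(topic, payload, hostname=host)
    print("sent", payload, "to", topic)
```

Running send_test() with your RPi's address should make the message appear on the LCD if the firmware subscribes to the same topic.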
Schematics
Schematic Control Board
Code
Hidroneumatic Control
Download as zip
Laser Shootin' Robot
Introduction
Ever wanted to make a robot? How about one that is controlled via
Bluetooth? Still not exciting enough for you? This robot can fire
lasers, basically becoming a mobile artillery piece that shoots light!
Now you can tease your cat, make a presentation more exciting, and
even shine it at people (just not at their eyes please)! This project will
document how I went about creating a robot capable of such epic fun!
Parts Needed
DFRobot generously sent me a couple of their 2WD MiniQ Robot
Chassis. These things are great! Each chassis came with two 50:1
geared motors, nice sturdy tires, and plenty of mounting holes. Next, I
got a Raspberry Pi Zero; it is small, yet very capable, thanks to its
onboard WiFi and Bluetooth. I also got an Arduino Nano and an
HC-05 for the controller. See the BoM for the full list of parts needed.

Assembly
I began by assembling the 2WD MiniQ chassis kit from DFRobot. I
slid the wheels onto the motor shafts, then inserted them into
brackets and attached them to the chassis. Finally, I added the metal
supports.

Now it was time to build the main board. The L293d motor driver got
soldered in place, along with wires running to the Raspberry Pi's
GPIO pins. Next, I soldered a connector for the battery, as that will
provide the main power. After the power source was added, I installed
a 5V regulator along with an NPN transistor. The regulator provides
the right voltage for the Pi and laser diode, and the transistor allows
the Pi to safely control the laser without blowing up.
Motor Control
This was actually one of the trickiest parts of this project. I had to
decide the way I should transmit information to drive the wheels. I
was faced with two options: tank or arcade drive. Tank drive operates
like a... tank, of course: pushing the stick left makes the right wheel go
forward and the left wheel go backward, turning the robot left. Here is
a table for movements:
Joystick Position | Left Wheel | Right Wheel
Left | -1 | 1
Right | 1 | -1
Up | 1 | 1
Down | -1 | -1
Neutral | 0 | 0
But that gives you very limited control. Plus, when working with a
large voltage, the robot tends to get zippy. A better option was to use
PWM to variably control the motor speeds. Now, it isn't as simple as
on an Arduino; there isn't any analogWrite(motor_pin, value) abstraction,
which can get frustrating. Luckily, the RPi.GPIO library has a PWM
class. To use it, I first made each motor pin into an output
with GPIO.setup(motor_pin, GPIO.OUT) . Then I made an object for each
motor pin with motor_pwm = GPIO.PWM(motor_pin, 100) . Make sure to change
the motor_pwm variable name for each object. Now that I had variable
motor speeds, I could focus on getting the values from the joystick
translated into motor values.
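The PWM setup described above can be sketched as follows. This is my own illustration: the pin number is an example value, and direction handling depends on how your L293D inputs are wired; speed_to_duty() is pure logic, while drive() only runs on the Pi.

```python
# Sketch of the RPi.GPIO PWM motor control described above (my own example;
# pin 11 is a placeholder, not a pin from the original post).
def speed_to_duty(value):
    """Map a signed motor value (-100..100) to (direction, duty cycle 0..100)."""
    value = max(-100, min(100, value))
    return ("forward" if value >= 0 else "reverse", abs(value))

def drive(pin, value):
    """Drive one motor pin at the given signed speed (runs on the Pi only)."""
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(pin, GPIO.OUT)
    pwm = GPIO.PWM(pin, 100)   # 100 Hz carrier, as in the article
    direction, duty = speed_to_duty(value)
    pwm.start(duty)            # duty cycle sets the speed
    return direction           # reversing is handled by the L293D inputs
```

For example, drive(11, 75) would run that motor at 75% duty, while negative values indicate the H-bridge should reverse the motor.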
The Controller

The basic analog joystick is made of two potentiometers that give a
10-bit value ranging from 0-1023 on the Arduino. I set up the Arduino
Nano to take a reading from each analog pin that the joystick was
connected to (in my case A1 and A2). After that, the code
maps x and y to values between -50 and 50, like this:
int x = map(A1_reading, 0, 1023, -50, 50);
int y = map(A2_reading, 0, 1023, -50, 50);
Then, I added a constrain call to capture any outliers:
x = constrain(x, -50, 50);
y = constrain(y, -50, 50);
Sparkfun recently did a tutorial on how to use an RC controller for DC
motors, which was perfect for what I needed. You can find that blog
post here. The way to determine the way each motor goes is pretty
simple. Just do:
int motor1 = x+y;
int motor2 = y-x;
And then:
motor1 = constrain(motor1, -100, 100);
motor2 = constrain(motor2, -100, 100);
...to ensure it stays within -100 and 100. And you may be asking,
"How do I make it come to a stop?" Well, that is where a deadzone
comes into play. By adding this as a global variable:
int deadzone = 10;
...and then:
if(abs(motor1) <= deadzone){
motor1 = 0;
}
if(abs(motor2) <= deadzone){
motor2 = 0;
}
...you can effectively stop a motor whenever you aren't pushing the
joystick. Lastly, I was wondering how to efficiently send this motor
data to the Raspberry Pi. It needed to be easily readable and have
good decoding support. Then this idea popped into my head: JSON! I
could send a string that conveys the motor data to the Pi, then the Pi
can decode it into variables, and lastly control the motors accordingly.
So I added these lines:
String json_string = "{\"motor\":[" + String(motor1)+","+String(motor2)+"]}";
Serial.println(json_string);
These lines send the motor data as a JSON string, where "motor" is the
key and an array containing the motor values is its value. There is
also an interrupt that will send "Fire" to the Pi if the button is pushed.
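On the Pi side, those packets can be decoded with the standard json module. The sketch below is my own (the actual robot code is in the project download); parse_packet() uses only the JSON format shown above, while the serial device path in listen() is an assumption based on the rfcomm setup later in this article.

```python
# Decoding the controller's packets on the Pi (my own sketch; the robot's
# real receive loop is in the downloadable project code).
import json

def parse_packet(line):
    """Return ("fire", None) or ("motors", (m1, m2)) for one serial line."""
    line = line.strip()
    if line == "Fire":
        return ("fire", None)
    m1, m2 = json.loads(line)["motor"]
    return ("motors", (m1, m2))

def listen(port_path="/dev/rfcomm0", baud=9600):
    """Read packets from the Bluetooth serial device (needs pyserial)."""
    import serial
    port = serial.Serial(port_path, baud)
    while True:
        kind, data = parse_packet(port.readline().decode())
        print(kind, data)
```

For instance, the string {"motor":[40,-20]} decodes to motor values 40 and -20, which can then be fed to the PWM objects described earlier.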
Now for the last part: having fun with it!
Using the Robot
Before I began to use this new machine of light beams, I had to
somehow get the HC-05 module talking to the Pi. Here is the
condensed version. First, power on the HC-05 and Raspberry Pi,
then click on the Bluetooth icon on the Raspberry Pi and pair your
HC-05. The password should be: "1234" without the quotes.
On the Raspberry Pi Zero W, enter these commands:
hcitool scan
This will scan for bluetooth devices and return their MAC addresses.
Now enter:
sudo rfcomm bind /dev/rfcomm0 mac_address
...where mac_address is the address of the HC-05. Download the
code for this project and transfer the Python files to a directory.
REMEMBER THAT DIRECTORY; you will need it soon. Now open up
the rc_car_main.py python script with a text editor, and where you
see BT_addr , enter in "/dev/rfcomm0". Also, enter in the motor pins for
the motors in:
car.config_motors(pin1, pin2, pin3, pin4, invert=False)
Run the script and see if the robot moves the proper way. If it goes
left when it should be going right, change invert=False to invert=True .
Have fun with your new laser firing bot!
Custom parts and enclosures
Battery Platform
Schematics
Robot Schematic

Controller Schematic
Code
Controller Code
Python Robot Class
Python Robot Main Script (run this one)
PiClock: RGB LED Smart Clock Using Raspberry
Pi Zero W

Introduction:
This project aims to create an RGB-matrix-based smart clock using
the RGB Matrix interface boards sold by Electrodragon and the
Raspberry Pi Zero W.
The software is Raspbian, and the board is interfaced directly in C.
The ultimate goal is to create an SDK for this board, build multiple
apps for the P10/P6 display, and finally create a nice 3D-printed case
for it.
The SDK will be capable of:
Creating your own App-Modules for running different apps on the RPi
Creating watch-faces for the board
Enabling connectivity with REST APIs like IFTTT and so on to interact with the
on board sensors
Sensors:
The project will host a number of sensors to which the "App
Modules" can interface:
LDR
Temperature sensor
DS1307 RTC clock
3d gesture sensor from Flick
Stay tuned for more from this project. The repo for this project is
at: https://github.com/narioinc/PiClock
A quick sneak peek at the progress made so far:
Schematics
Hzeller Adapter board ref design
Download as zip
Code
PiClock
Download as zip
Hzeller RGB Matrix Driver
Download as zip
SNES Pi: Zero

This project was inspired by the "Pi Cart: a Raspberry Pi Retro
Gaming Rig in an NES Cartridge". I thought it might be interesting to
see if everything could be crammed into the even smaller SNES
cartridge; there was just barely enough room!
This project uses the Raspberry Pi Zero W, chosen because of how
easy it is to emulate ROMs thanks to RetroPie. The Zero is also
pretty small, and powerful enough to run pretty much any pre-N64
game.
1. Clearing out the inside of the cartridge.
First thing is to open up the SNES cartridge using a 3.8mm
Screwdriver Security Bit such as this one on Amazon.
Next was finding a way to remove the plastic inside that would get in
the way of the new components. Originally I used a heat gun and
X-Acto knife, which initially worked, but I found that the cartridge was
very easily warped by the heat.
After the problem with the heat gun, I used a Dremel cut-off wheel and
sanding bit, which did work, although the end result was a bit
messy and it took quite a bit longer than I would like (this plastic also
smells quite a bit).
Finally at the recommendation of an employee at the UNR Innevation
Center I tried the drill press. This actually ended up working very well.
It is relatively fast and consistent after setting the maximum depth.
2. Installing the new parts inside of the cartridge
Now to get all of these parts into the cartridge. I used double stick
tape to hold the Raspberry Pi Zero, and a bit of hot glue for the other
components. To make the USB hub fit you have to remove the
housing, which comes off very easily when pulling the two halves
apart.
After a bit of trial and error I found that this was the best layout to get
everything to fit and to leave enough room for all of the cables to plug
in properly. By far the most difficult cable is the HDMI extension, it is
very thick and did not like to be bent much.
3. Finishing touches
To customize the SNES cartridge a bit I made a custom label.
Removing the original label is fairly easy using a hair dryer or heat
gun (on lowest setting). Once the label was removed I used a bit of
rubbing alcohol to clean up any remaining residue.
The print came out okay, but if I were to get more printed I would use
white backed stickers instead of clear as it is a bit darker than I
wanted.
4. Glamor shots!
There is a GroupGet currently running for some handmade SNES Pi:
Zeros, which you can find here:
https://groupgets.com/campaigns/350-snes-pi-zero
Raspberry Pi as a Robotic Arm Controller with
Flick HAT

This guide will allow you to control a robotic arm to pick up and move
objects without having to touch anything! We will be using a Flick
Board combined with a Pi to command a motorised robotic arm. This
should work with any Pi and Flick combo, provided the Flick board is
compatible with the Pi; likewise, the OS is not limited, so any Linux OS
that has a terminal or console will work.
So now to get into setting up the project.
Putting Everything Together
This project requires no soldering or electronic construction. The only
mildly difficult part is the build of the Robotic Arm for which there is a
guide that comes with the Arm and there are various tips and guides
online.
Parts for this project:
Raspberry Pi A+,B+ and above
An SD card with your choice of OS (I am using Raspbian Jessie Full)
Maplin Robotic Arm Buy Here
Flick Board compatible with your Pi
For initial preparation:
Monitor
Keyboard
Mouse
Ethernet cable
Use these diagrams to wire up the kit taking extra care with fitting the
Flick Board onto the GPIO pins.

Preparing the Pi
I suggest setting up SSH on your Pi for convenience by running:
sudo raspi-config
Choose interfacing options then SSH and enable SSH Server. Next
we need to go through the basic updates and installations:
sudo apt-get update
sudo apt-get install git
sudo apt-get install subversion
Run each of those and let them install. Then we need to get the Flick
Board software via:
curl -sSL https://pisupp.ly/flickcode | sudo bash
And to finish the installation section we need the various programs and
software which allow the Pi to talk to the Arm. I would suggest, if
possible, copying and pasting each block of code rather than typing
out each line separately.
Install PyUsb:
cd
mkdir pyusb
cd pyusb
git clone https://github.com/walac/pyusb.git .
sudo python setup.py install
cd
sudo rm -r pyusb/
cd
This will install a library that allows the Pi to communicate to the Arm
via USB.
Install RobotArmControl program:
cd
mkdir robotarmcontrol
cd robotarmcontrol
svn co http://projects.mattdyson.org/projects/robotarm/armcontrol .
The Testing Stage
Running this line should make the Arm test every motor and then
return to its starting position.
sudo python testRobotArm.py
Now to test the Flick Board from the demo program:
flick-demo
When this program is running you should be able to see inputs and
values as you move your hand over the Flick.

Now the program which allows control of the Arm:


flick-armcontrol
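Conceptually, flick-armcontrol works by mapping recognised gestures to arm motions. The sketch below only illustrates that dispatch idea; the gesture names and command strings are hypothetical placeholders, not taken from the actual program:

```python
# Hypothetical mapping from Flick gestures to robotic-arm commands.
# Both the gesture names and the command strings are illustrative,
# not the real identifiers used by flick-armcontrol.
GESTURE_COMMANDS = {
    "flick-west": "base-anticlockwise",
    "flick-east": "base-clockwise",
    "flick-north": "shoulder-up",
    "flick-south": "shoulder-down",
    "airwheel-cw": "grips-close",
    "airwheel-ccw": "grips-open",
}

def command_for(gesture):
    """Return the arm command for a recognised gesture, or None."""
    return GESTURE_COMMANDS.get(gesture)
```

In the real program, the command returned here would be translated into USB control transfers to the arm via PyUSB.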
Code
Download as zip
Headless Google Assistant with Startup Audio
Now, by starting Google Assistant as a service on boot, the unit can
be used headless. Also, I have added a cool startup audio and an
audio alert for wakeword detection.

Clone the project from github and follow the instructions in the
readme file. It should not take you more than 10-15 mins.
First, clone the project
git clone https://github.com/shivasiddharth/GassistPi
This is implemented in Python 2, so your existing Google
Assistant may not work. Please start by making a fresh copy
of the latest Raspbian.
INSTALL AUDIO CONFIG FILES
1. Update OS and Kernel
sudo apt-get update
sudo apt-get install raspberrypi-kernel
2. Restart Pi
3. Choose the audio configuration according to your setup. (Run the
commands till you get a .bak notification in the terminal.)
3.1. USB DAC users,
sudo chmod +x /home/pi/GassistPi/audio-drivers/USB-DAC/scripts/install-usb-dac.sh

sudo /home/pi/GassistPi/audio-drivers/USB-DAC/scripts/install-usb-dac.sh
3.2. AIY-HAT users,
sudo chmod +x /home/pi/GassistPi/audio-drivers/AIY-HAT/scripts/configure-driver.sh

sudo /home/pi/GassistPi/audio-drivers/AIY-HAT/scripts/configure-driver.sh
sudo chmod +x /home/pi/GassistPi/audio-drivers/AIY-HAT/scripts/install-alsa-config.sh
sudo /home/pi/GassistPi/audio-drivers/AIY-HAT/scripts/install-alsa-config.sh
3.3. USB MIC AND HDMI users,
sudo chmod +x /home/pi/GassistPi/audio-drivers/USB-MIC-HDMI/scripts/install-usb-mic-hdmi.sh
sudo /home/pi/GassistPi/audio-drivers/USB-MIC-HDMI/scripts/install-usb-mic-hdmi.sh
3.4. USB MIC AND AUDIO JACK users,
sudo chmod +x /home/pi/GassistPi/audio-drivers/USB-MIC-JACK/scripts/usb-mic-onboard-jack.sh

sudo /home/pi/GassistPi/audio-drivers/USB-MIC-JACK/scripts/usb-mic-onboard-jack.sh
3.5. CUSTOM VOICE HAT users,
sudo chmod +x /home/pi/GassistPi/audio-drivers/CUSTOM-VOICE-HAT/scripts/custom-voice-hat.sh

sudo /home/pi/GassistPi/audio-drivers/CUSTOM-VOICE-HAT/scripts/custom-voice-hat.sh
sudo chmod +x /home/pi/GassistPi/audio-drivers/CUSTOM-VOICE-HAT/scripts/install-i2s.sh
sudo /home/pi/GassistPi/audio-drivers/CUSTOM-VOICE-HAT/scripts/install-i2s.sh
Those using HDMI/onboard jack, make sure to force the audio
sudo raspi-config
Select advanced options, then audio, and choose to force audio.
Those using any other DACs or HATs: install the cards as per the
manufacturer's guide, and then you can try using the USB-DAC
config file after changing the hardware ids.
4. Restart Pi
5. Check the speaker using the following command
speaker-test -t wav
CONTINUE AFTER SETTING UP AUDIO
1. Download the credentials .json file
2. Place the .json file in the /home/pi directory
3. Rename it to assistant.json
4. Use the one-line installer for installing Google Assistant
4.1 Make the installers Executable
sudo chmod +x /home/pi/GassistPi/scripts/gassist-installer-pi3.sh
4.2 Execute the installers
sudo /home/pi/GassistPi/scripts/gassist-installer-pi3.sh
5. Copy the Google Assistant authentication link from the terminal and
authorize using your Google account
6. Copy the authorization code from the browser onto the terminal and
press enter
7. Move into the environment and test the Google Assistant
source env/bin/activate
google-assistant-demo
After verifying the working of assistant, close and exit the terminal
HEADLESS AUTOSTART ON BOOT SERVICE SETUP
Make the service installer executable
sudo chmod +x /home/pi/GassistPi/scripts/service-installer.sh
Run the service installer
sudo /home/pi/GassistPi/scripts/service-installer.sh
Enable the services
sudo systemctl enable gassistpi-ok-google.service
Start the service
sudo systemctl start gassistpi-ok-google.service
RESTART and ENJOY
VOICE CONTROL OF GPIOs and Pi Shutdown
The default GPIO and shutdown trigger word is "trigger" if you wish to
change the trigger word, you can replace the 'trigger'in the
main.py(src folder) code with your desired trigger word.Similarly, you
can define your own device names under the variable name var.The
number of GPIO pins declared should match the number of devices.
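The parsing described above can be sketched roughly as follows. The trigger word matches the default, but the device names and pin numbers here are made up for illustration and are not the actual GassistPi source:

```python
# Sketch of trigger-word GPIO parsing. The device names and BCM pin
# numbers below are hypothetical examples, not GassistPi's defaults.
TRIGGER = "trigger"

# One GPIO pin per device name; the two counts must match.
DEVICES = {"light": 5, "fan": 6, "heater": 13}

def parse_command(phrase):
    """Return (device, pin) when the recognised phrase contains the
    trigger word and a known device name, else None."""
    words = phrase.lower().split()
    if TRIGGER not in words:
        return None
    for device, pin in DEVICES.items():
        if device in words:
            return (device, pin)
    return None
```

A phrase such as "trigger the light" would match, while the same sentence without the trigger word is ignored.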
FOR NEOPIXEL INDICATOR
1. Replace the main.py in src folder with the main.py from Neopixel
Indicator Folder.
2. Reboot
3. Change the Pin numbers in the given sketch according to your
board and upload it.
4. Follow the circuit diagram given.
Now you have your Google Home Like Indicator
Code
Download as zip
DIY Raspberry Pi Temperature System with
Ubidots

A temperature monitoring system provides valuable insights in both
commercial and industrial environments to reduce inefficiencies and
maintain the quality of products. What if I told you that
you can monitor the temp of your self-built wine cellar or your family's
aquarium at home using the same device? Further, what if I told you
that the same device could be used to monitor air and liquid
temperatures at your factory too? The makers of our world
have made this possible and this guide is here to help kickstart your
own initiatives at home or on the shop floor.
This guide will be your tutorial for a simple DIY temperature
monitoring system that is also waterproof to boot. Using a Raspberry
Pi and Ubidots we'll show you how to connect your Pi and display in
real-time your temperature system's metrics. Using Ubidots, you can
also create emails or SMS events to ensure your "variable" (in this
case, the temperature) remains within a set of defined limits assigned
by you to ensure quality and efficiency of your system's conditions.
For this project we are going to use a 1-wire pre-wired and
waterproof version of the DS18B20 sensor. What is 1-wire? It's a
communication protocol that makes connecting your IoT sensors
simpler by aggregating all cabling into a single wire (...well, actually
it's three: two are the ground and power connections for energy, the
third being the 1-wire for data transmission).
IMPORTANT NOTE: The 1-Wire temperature sensor has different
versions for sale; one with a resistor integrated into the sensor and
the other without. When purchasing or setting up your hardware, best
to make sure your devices and sensors are compatible prior to
moving further in this tutorial.
Requirements
Raspberry Pi 3 Model (Already configured)
OneWire Temperature Sensor - DS18B20
Ubidots account - Educational License - Business License
IMPORTANT NOTE: This guide assumes your Raspberry Pi has
been configured and is connected to the Internet. If not, you can
quickly do so using this quick start guide from the Raspberry Pi
Foundation.
Wiring Setup
As previously mentioned, the OneWire temperature sensor is sold
in different versions containing resistors. For this tutorial, we will
illustrate both versions, with and without a resistor. No matter which
you choose for your system, make sure to double-check that all
connections are properly made based on the below diagrams and photos.
With resistor integrated - with grove connector
Please follow the table below to make the right connections for your
OneWire temperature sensor with resistor.
TIP: The Arduberry is a new Kickstarter campaign which brings a
simple and inexpensive way to use Arduino shields with the Raspberry
Pi. It is an easy way to start connecting your
Grove sensors using an Arduino Grove shield. For more information,
please refer to the campaign.
Without resistor integrated - without grove connector
The resistor in this setup is used as a pull-up for the data-line, and
should be connected between the data wire and the power wire. This
ensures that the data line is at a defined logic level, and limits
interference from electrical noise if our pin was left floating.
Use a 4.7kΩ (or 10kΩ) resistor and follow the diagram below to make
the correct connections. Note that the pins connected in the
Raspberry Pi are the same used in the table above:
Sensor Setup
With your Raspberry Pi connected to the internet, verify the IP address assigned
to the board and access it using ssh from your computer's terminal:
ssh pi@{IP_Address_assigned}
If you haven't already configured the credentials of your Raspberry Pi,
note that you will have to use the default credentials provided:
UserName: pi
Password: raspberry
When your Pi is configured and connected correctly, the user of your
terminal becomes listed as: pi@raspberrypi
Now let's upgrade some packages and install pip, Python's package manager.
Copy and paste the below commands into your terminal and press "enter" after
each to run the commands.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install python-pip python-dev build-essential
Then, install the Requests library, a popular Python library that simplifies
making HTTP requests. Copy and paste the below command into your terminal
and press "enter" to run it.
$ pip install requests
The Raspberry Pi comes equipped with a range of drivers for
interfacing. In this case, to be able to load the 1-Wire sensor's driver
on the GPIO pins, we have to use the two drivers below. These
drivers are stored as loadable modules, and the modprobe command
is used to load them into the Linux kernel when required.
Run the commands below:
$ sudo modprobe w1-gpio
$ sudo modprobe w1-therm
Now, we need to change the directory to our 1-Wire device folder and list the
devices in order to ensure that our sensor has loaded correctly. Copy and paste
the below commands into your terminal and press "enter" after each to run the
commands.
$ cd /sys/bus/w1/devices/
$ ls
At this moment your sensor has already been assembled and
connected, and should be listed as a series of numbers and letters. In
our case, the device is registered as 28-00000830fa90, but yours will
be a different series of letters and numbers, so replace our serial
number with your own and run the command.
$ cd 28-00000830fa90
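If you prefer not to hard-code the serial number, you can also discover it programmatically: DS18B20 sensors always appear under /sys/bus/w1/devices/ with the family-code prefix 28-. A small sketch (the helper names here are ours, not part of the tutorial's code):

```python
import os

def ds18b20_paths(device_names):
    """Filter a list of /sys/bus/w1/devices entries down to DS18B20
    sensors (serial numbers start with the family code '28-') and
    return their w1_slave file paths."""
    return ['/sys/bus/w1/devices/%s/w1_slave' % name
            for name in device_names if name.startswith('28-')]

def discover():
    """List the 1-wire device directory on a real Pi and return the
    w1_slave paths of every DS18B20 found."""
    return ds18b20_paths(os.listdir('/sys/bus/w1/devices/'))
```

On a Pi with one sensor attached, discover() would return a single-element list containing your sensor's w1_slave path.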
The sensor periodically writes to the w1_slave file. To read your temp
sensor, please run the command below:
$ cat w1_slave
This command will show you two lines of text, with the output t=
showing the temperature in degrees Celsius. Please note that a
decimal point should be placed after the first two digits (this is
handled in the final code, do not worry); for example, the
temperature reading we've received is 29.500 degrees Celsius.
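To make that conversion concrete, here is a small sketch of how the t= value can be parsed into Celsius (the helper name is ours; the final script below does the same thing inline):

```python
def parse_w1_slave(lines):
    """Extract degrees Celsius from the two lines of a w1_slave read.
    The first line ends in YES when the CRC check passed; the raw
    value after 't=' on the second line is thousandths of a degree."""
    if not lines[0].strip().endswith('YES'):
        return None          # CRC failed; caller should retry
    pos = lines[1].find('t=')
    if pos == -1:
        return None
    return float(lines[1][pos + 2:]) / 1000.0
```

Feeding it the two lines shown above would yield 29.5.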

Now that you are able to take temperature readings, it is time to post
them to Ubidots!
Sending data to Ubidots for visualization
Now it is time to code!
Create and run a Python script in your computer's terminal:
$ nano onewire_temp_ubidots.py
Then paste and save the below code to your terminal:
import os
import time
import requests

os.system('modprobe w1-gpio')
os.system('modprobe w1-therm')

temp_sensor = '/sys/bus/w1/devices/28-00000830fa90/w1_slave'

def temp_raw():
    f = open(temp_sensor, 'r')
    lines = f.readlines()
    f.close()
    return lines

def read_temp():
    lines = temp_raw()
    while lines[0].strip()[-3:] != 'YES':
        time.sleep(0.2)
        lines = temp_raw()
    temp_output = lines[1].find('t=')
    if temp_output != -1:
        temp_string = lines[1].strip()[temp_output+2:]
        temp_c = float(temp_string) / 1000.0
        temp_f = temp_c * 9.0 / 5.0 + 32.0
        payload = {'temp_celsius': temp_c, 'temp_fahrenheit': temp_f}
        return payload

while True:
    r = requests.post('http://things.ubidots.com/api/v1.6/devices/raspberry/?token={Assign_your_Ubidots_Token}', data=read_temp())
    print('Posting temperatures in Ubidots')
    print(read_temp())
    time.sleep(10)
Make sure to replace the serial number 28-00000830fa90 with yours, and
assign your Ubidots account token in the request URL. If you don't
know how to get your Ubidots Token, please reference the article
below for help:
Find your TOKEN from your Ubidots account
Completed code terminal window:
Now let's test the script. Paste and run the below script in your
computer's terminal.
python onewire_temp_ubidots.py
If it is working properly, you will see a new device in your
Ubidots account with two variables: temp_celsius and temp_fahrenheit
Optional Steps: Rename the Device and Variables
The names of the variables created are the same as the API labels,
which are the IDs used by the API. Their display names can be
changed, however, so it is recommended to rename your
devices and variables to make them friendlier to your own nomenclature.
To learn how to rename your variables, see below:
How to adjust the Device name and Variable name
You can also add and adjust the units of each variable from your list
of options:
As you can see below, we've assigned different units to each
variable, and also assigned friendlier names to fit our project's
nomenclature. This is highly recommended for users seeking
deployments of 100s of devices.
Event Setup
An event (or alert) is any action triggered when data fulfills or
exceeds a design rule. For example, an email or SMS message can
be sent anytime a sensor stops sending data or a temperature
exceeds a maximum or minimum threshold.
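The threshold logic behind such an event can be sketched as a simple check. The limits below are hypothetical, and in practice Ubidots evaluates the rule server-side for you; this only illustrates what an event rule expresses:

```python
# Hypothetical alert thresholds; Ubidots evaluates event rules
# server-side, this sketch only illustrates the logic involved.
TEMP_MIN_C = 10.0
TEMP_MAX_C = 30.0

def check_event(temp_c):
    """Return an alert string when the reading leaves the allowed
    band, or None while the temperature is within limits."""
    if temp_c > TEMP_MAX_C:
        return 'ALERT: %.1f C above maximum %.1f C' % (temp_c, TEMP_MAX_C)
    if temp_c < TEMP_MIN_C:
        return 'ALERT: %.1f C below minimum %.1f C' % (temp_c, TEMP_MIN_C)
    return None
```

In the Ubidots UI you would attach an email or SMS action to the same condition instead of writing code.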
Result
In just a few minutes you've built an easy DIY temperature
monitoring system. Now place your sensors where needed and
start tracking temperatures from your device today!
To Be Continued To Part 4 ...
