This device uses an Atmel ATSAMG55J19A-MU (ARM Cortex-M4, 120 MHz) CPU, an Atmel ATWINC1500B 2.4 GHz WiFi module, and a Cypress CYBL10563-68FNXI (ARM Cortex-M0, 48 MHz) Bluetooth 4.1/BLE module. It also carries a Micron N25Q032 4 MB flash memory, plus LED indicators, buttons, and a buzzer.
The document discusses how biometric sensor data from wearable devices can be used to create new experiences and senses: navigating a maze with brain waves, controlling lights with motion, touching lights that respond to heartbeats, and experiencing synesthesia through a color sheet that produces sounds. It advocates using technology and the maker movement to hack and change the world, sharing accessible hardware and programming skills so that everyone can become a maker.
Data analytics with Hadoop Hive on multiple data centers - Hirotaka Niisato
This document discusses GMO Internet's data analytics system for analyzing social game data from over 500 game titles across multiple data centers in Japan and the US. It summarizes the system's architecture, which uses Hadoop/Hive to process log data from game servers into hourly, daily, weekly, and monthly reports on key performance indicators. The system partitions and stores large volumes of data across multiple NameNodes, and processes over 6 million blocks and 44,000 jobs per day to generate conversion counts and other analytics for A/B testing.
Android and OpenNI - NUI Application Treasure Hunter Robot - Hirotaka Niisato
This document describes a treasure-hunting robot game controlled by brain waves. The robot uses a depth sensor with the OpenNI/NITE libraries to recognize its position and orientation. Brain waves are read with a NeuroSky MindWave sensor and mapped to robot actions: higher readings turn the robot left, mid-range readings move it forward, and lower readings turn it right. The project's source code is released online so others can replicate and extend it.
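The slides do not show the mapping itself; the sketch below illustrates the idea in Node.js, where the attention thresholds (70/40) and the action names are illustrative assumptions, not the project's actual values.

```js
// Minimal sketch of the brain-wave-to-action mapping described above.
// ASSUMPTIONS: the thresholds (70/40) and sendable action names are
// illustrative, not the project's real values. MindWave "attention"
// readings range from 0 to 100.
function actionForAttention(attention) {
  if (attention >= 70) return 'turn_left';   // higher brain waves: turn left
  if (attention >= 40) return 'forward';     // middle range: move forward
  return 'turn_right';                       // lower range: turn right
}

// Example: feed a stream of attention readings to the mapper.
[85, 55, 20].forEach(function (value) {
  console.log(value, '->', actionForAttention(value));
});
```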
This document describes an "Auto Chasing Turtle" robot project that uses a Kinect sensor to detect human faces and autonomously follow them. The robot consists of a KONDO Animal 01 base controlled by a BeagleBoard-xM running openFrameworks with the ofxDroidKinect addon. The Kinect's RGB camera is used to detect faces and its depth data to measure the distance to the target: the robot rotates until it finds a face, turns toward it, then closes or holds its distance using the depth readings. An iPad can be used to watch the robot working in real time.
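For illustration only, here is a sketch of that face-following decision logic in Node-style JavaScript (the actual project used openFrameworks in C++); the frame width, pixel tolerances, and target distance are assumed values.

```js
// Sketch of the face-following control flow described above.
// ASSUMPTIONS: FRAME_WIDTH, the +/-50 px tolerance, and TARGET_MM are
// illustrative values, not taken from the project.
var FRAME_WIDTH = 640;   // Kinect RGB frame width in pixels
var TARGET_MM = 1000;    // desired distance to the face in millimeters

function decide(face) {
  if (!face) return 'rotate';               // no face yet: keep searching
  var offset = face.x - FRAME_WIDTH / 2;    // horizontal offset of the face
  if (offset < -50) return 'turn_left';
  if (offset > 50) return 'turn_right';
  if (face.depthMm > TARGET_MM + 100) return 'forward';   // too far away
  if (face.depthMm < TARGET_MM - 100) return 'backward';  // too close
  return 'stop';                            // centered at the right distance
}

console.log(decide(null));                        // 'rotate'
console.log(decide({ x: 500, depthMm: 1500 }));   // 'turn_right'
```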
12. Other development with HVC-C (bonus)
・Let's Play with Intel Edison!!
・The commands for talking to the HVC-C side are written in the SDK source, so building things exactly as described there looks like it should just work
・Connecting and communicating with NodeJS and Noble is confirmed working (then I accidentally rm'd the source… orz); a minimal sketch follows below
・It can also be used for development on macOS after a brew install.
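Since the original code was lost to that stray rm, the following is only a minimal reconstruction of the NodeJS/Noble scan-and-connect step. The 'OMRON' local-name prefix and the service/characteristic UUIDs are placeholders; the real values and the command format come from the HVC-C SDK mentioned above.

```js
// Minimal Node.js/Noble sketch of connecting to an HVC-C over BLE.
// ASSUMPTIONS: the 'OMRON' name prefix and the UUIDs below are
// placeholders; replace them with the values documented in the HVC-C SDK.
var noble = require('noble');

var HVC_SERVICE_UUID = '35100001d13a11e49f08a03299ad3860'; // placeholder
var HVC_TX_CHAR_UUID = '35100002d13a11e49f08a03299ad3860'; // placeholder

noble.on('stateChange', function (state) {
  if (state === 'poweredOn') {
    noble.startScanning();   // scan for nearby BLE peripherals
  } else {
    noble.stopScanning();
  }
});

noble.on('discover', function (peripheral) {
  var name = peripheral.advertisement.localName || '';
  if (name.indexOf('OMRON') !== 0) return;  // not an HVC-C, keep scanning
  noble.stopScanning();

  peripheral.connect(function (err) {
    if (err) throw err;
    peripheral.discoverSomeServicesAndCharacteristics(
      [HVC_SERVICE_UUID], [HVC_TX_CHAR_UUID],
      function (err, services, characteristics) {
        if (err) throw err;
        // From here, write command buffers built per the SDK's command
        // spec, e.g. characteristics[0].write(cmd, true, callback).
        console.log('connected to', name);
      });
  });
});
```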