Tuesday, June 30, 2015

Foxconn-Sponsored Robot Takes Third Place in U.S. Competition

Ref: http://www.chinatimes.com/newspapers/20150609000186-260204#.VXuF7_e2QMg.facebook

The results of the robotics competition held by the U.S. Defense Advanced Research Projects Agency (DARPA) are in: "Chimp", a robot backed by the three heavyweights Hon Hai/Foxconn (2317), Amazon.com, and Google, took third place and a US$500,000 prize.
Robots are without doubt a rising star of the tech industry. To set the trend, the DARPA-led "Robotics Challenge" took place on June 5 and 6 and drew attention from all quarters; even corporate leaders bullish on the future of robotics opened their wallets to sponsor teams.
To judge the robots, DARPA required contestants beforehand to build machines capable of coping with disasters and completing assigned tasks, in the hope that robots can eventually take over missions unsuitable for humans.
Under the competition design, each robot had to complete eight tests, including cutting a hole in a wall, climbing stairs, and traversing uneven ground; in addition, to simulate an emergency disaster environment, communication between the teams and their robots was cut off for more than 30 seconds at a time.
When the results came out yesterday, the "Tartan Rescue" team from Carnegie Mellon University, sponsored by Foxconn, Amazon, and Google, saw its robot "Chimp" complete all the tasks in 55 minutes 15 seconds with a total score of 8 points, taking third place.
Tartan Rescue members Tony Stentz, Eric Meyhofer, and David Stager are all reportedly researchers on Uber's self-driving-car project, while sponsor Google is itself a developer of driverless cars and Foxconn has also shown considerable interest in the technology; whether that connection prompted the sponsorship is unknown. (Commercial Times)

Sunday, May 10, 2015

What Do IP67/IP68 Mean? The IP Dust/Water Protection Rating Table at a Glance

Ref: https://tw.tech.yahoo.com/news/ip67-ip68-%E4%BB%A3%E8%A1%A8%E4%BB%80%E9%BA%BC-%E6%AC%A1%E7%9C%8B%E6%87%82-ip-024400392.html

Ingress Protection Rating (International Protection Marking)
The International Protection Marking (IEC 60529), also known as the Ingress Protection Rating or IP Code, and sometimes loosely called a "waterproof rating" or "dustproof rating", defines how well mechanical and electrical equipment is protected against the intrusion of solid foreign objects (including body parts such as fingers, as well as dust and grit), the ingress of liquids, and accidental contact.
For example, a socket rated IP22 is protected against the insertion of fingers without being damaged, and will not become hazardous from dripping water when mounted vertically or nearly vertically. IP22 or IP2X is typically the minimum design requirement for indoor sockets. (Portions adapted from Wikipedia.)
A phone rated IP67 is completely dust-tight and can be placed in water up to 1 m deep.
The first digit describes protection against solid particles, i.e. the dust rating.
The second digit describes protection against liquid ingress, i.e. the water rating.
So the Silicon Power (SP) Armor A65M, rated IP67, has the highest dust rating: completely dust-tight.

Its water rating allows it to be immersed in water for 30 minutes.
The Sony Xperia Z3, which reaches the top rating of IP68, can additionally withstand water pressure and remain unaffected even after prolonged immersion.
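
As a quick illustration of how an IP code is read (a minimal sketch; the level descriptions are abbreviated from the IEC 60529 tables and only a few levels are listed):

// Sketch: decode an IP rating string such as "IP67".
// Only a few levels are shown; see the full IEC 60529 tables for the rest.
var dustLevels = {
    '0': 'no protection',
    '5': 'dust protected (limited ingress)',
    '6': 'dust tight',
    'X': 'not rated'
};
var waterLevels = {
    '0': 'no protection',
    '4': 'splashing water',
    '7': 'immersion up to 1 m for 30 minutes',
    '8': 'continuous immersion deeper than 1 m (manufacturer specified)',
    'X': 'not rated'
};

function decodeIP(code)   // e.g. decodeIP('IP67')
{
    var dust = code.charAt(2);
    var water = code.charAt(3);
    return 'Dust: ' + (dustLevels[dust] || 'level ' + dust) +
           '; Water: ' + (waterLevels[water] || 'level ' + water);
}

console.log(decodeIP('IP67'));
// Dust: dust tight; Water: immersion up to 1 m for 30 minutes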

Note, however, that the rated dust and water protection applies to a device that has not been dropped; if the product is dropped and water gets in because the chassis has been deformed, that falls outside the scope of the rating.
In addition, the techniques used to achieve dust and water resistance vary between manufacturers.

Some companies waterproof their devices with an internal water-repellent coating, usually an oil-based film; washing with soap or detergent can gradually strip this coating away, so rinsing a waterproof phone under running water is fine, but scrubbing it with detergent is not.

Seawater is also corrosive, which is why all waterproof testing is done with tap water; if the phone falls into the sea, you can only hope for the best.
Summary
A waterproof phone can resist plain water, but not soapy water and not seawater; and a waterproof phone that has been dropped should not go back into the water.
Class dismissed.

Saturday, May 9, 2015

NTU Discovery: Gold-Coated Onion Skin Makes Good Artificial Muscle

Ref: http://www.bnext.com.tw/ext_rss/view/id/688253

According to The Verge, researchers at National Taiwan University have found that gold-plated onion epidermis is a good material for artificial muscle.
The onion's distinctive cell structure lets this new artificial muscle bend while contracting and still stay soft. The researchers had previously tried polymers, but found that onion skin is a cheap alternative available in nature; earlier designs generally could not bend and contract at the same time.
The researchers freeze-dried the water out of the onion epidermal cells, soaked the skin in dilute acid to make it more elastic, then coated it with gold and attached electrodes. Like real muscle, it can stretch, and it responds to electric current.
The "muscle" is arranged in a tweezer-like structure. The upper layer of the epidermis expands at low voltage and contracts at high voltage, but the material and structure cannot yet bear much weight: a pair of these "muscles" can only pick up a cotton ball, and the researchers are working on improving this.
If this new muscle can be put into production, it could bring breakthroughs for lifelike robots, especially ones that interact closely with people. Today's robot muscles can individually perform simple contraction, extension, or rotation, but cannot do all of these at once the way real muscle can.
For example, the Air Muscle used in the Shadow Dexterous Hand contracts by 37% of its length when filled with compressed air and stays soft, but it can only pull or contract in one direction; it can rotate, but only with external help. Compared with the onion epidermis, however, it can withstand much greater pressure and load.
Source: ifanr / Han Yu

Sunday, April 19, 2015

Raspberry Pi : DIY Phone

http://goo.gl/jvmcGg

Just noting this down; I'll try it when I have time.

Now you can build your own DIY smartphone using Raspberry Pi

TyFone
Image Credit: Tyler Spadgenske

Walking into a store or going online to buy a smartphone is, like, so totally 2014. Why not just build your own?
Developer Tyler Spadgenske has been tinkering away at a basic DIY smartphone that runs on Raspberry Pi for more than a year now, and he’s posted an updated set of directions over at Instructables. His version builds on a previous iteration created by developer Dave Hunt.
According to the Raspberry Pi Foundation, the so-called “TyPhone” can “take photos (and send them to Dropbox or another device), send texts, and manage its own battery level, as well as placing and taking calls. Tyler wrote his own OS in Python, 3D-printed a rather smart enclosure, and now has a phone he’s built from the bottom up – hardware and software both.”
So, head on over to Instructables for the step-by-step directions and list of supplies, and get your Maker on. Or, if that seems too daunting, at least check out this video:


Raspberry Pi : raspivid


raspivid is one of the programs in this repository:

 https://github.com/raspberrypi/userland


userland/host_applications/linux/apps/raspicam


Usage: refer to this document:

http://www.raspberrypi.org/wp-content/uploads/2013/07/RaspiCam-Documentation.pdf
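
An illustrative invocation (options taken from the documentation above) that records 10 seconds of 720p H.264 video at 30 fps:

$ raspivid -o video.h264 -t 10000 -w 1280 -h 720 -fps 30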





ARM Cortex-A Speed



Many people assume newer processors will be faster, or that a 64-bit processor will provide a performance boost compared to 32-bit processors, but the reality can be quite different, so I've decided to have a look at ARM Cortex-A cores using the ARMv7 (32-bit) and ARMv8 (64-bit) architectures, and see what kind of integer performance you can expect from each at a given frequency. To do so, I've simply used the DMIPS/MHz (Dhrystone MIPS per megahertz) values listed on Wikipedia.
[Chart: DMIPS/MHz for each ARM Cortex-A core; vertical scale: DMIPS/MHz]

The Dhrystone benchmark has no floating-point operations, so it's a pure integer benchmark. I'm only looking at the ARM core here; once integrated into an SoC, other parameters like memory bandwidth, amount of cache, GPU, etc. will greatly affect overall system performance. The figures above are per MHz, so they do not mean, for example, that a Cortex-A5 processor will be slower than a Cortex-A7 processor, as can be seen by comparing the Amlogic S805 (4x Cortex-A5) and the Broadcom BCM2836 (4x Cortex-A7): the Amlogic processor is about 40% faster thanks to its higher clock speed.
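To make that per-MHz versus absolute-performance point concrete, here is a back-of-the-envelope estimate (a sketch; the DMIPS/MHz numbers are rounded Wikipedia figures and the clock speeds are those the two SoCs commonly ship at):

// Rough single-core integer throughput: DMIPS/MHz * clock (MHz).
var cortexA5 = { dmipsPerMHz: 1.6, clockMHz: 1500 };   // Amlogic S805
var cortexA7 = { dmipsPerMHz: 1.9, clockMHz: 900 };    // Broadcom BCM2836

var a5Dmips = cortexA5.dmipsPerMHz * cortexA5.clockMHz;  // 2400 DMIPS
var a7Dmips = cortexA7.dmipsPerMHz * cortexA7.clockMHz;  // 1710 DMIPS

console.log((a5Dmips / a7Dmips).toFixed(2));  // ~1.40, i.e. the A5-based SoC is ~40% faster
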
With that in mind, it can be seen that you should not expect all recent Cortex-A53 processors to outperform existing Cortex-A15 and A17 processors, and in some cases even Cortex-A9 processors; the real performance benefit of the 64-bit cores only starts to show with Cortex-A57, and especially Cortex-A72 cores, which in some cases could be twice as fast as Cortex-A15 cores. The red zone on top of some bars represents the possible performance variation due to different implementations of the cores.
ARMv8 also brings other improvements, such as additional cryptographic extensions, an increase in the number of SIMD/floating-point and general-purpose registers, and more, as briefly explained in that article. All of these should also deliver benefits, provided the firmware and applications support them.


Read more: http://www.cnx-software.com/2015/04/09/relative-performance-of-arm-cortex-a-32-bit-and-64-bit-cores/#ixzz3XliGcof7

Arduino beacon - iBeacon

http://evothings.com/diy-arduino-beacons/


Noting this down to try when I have time.

DIY Arduino Beacons as an alternative to iBeacons

Mikael Kindborg | Tutorials
Apple created a lot of interest in the IoT scene with the introduction of iBeacons. However, Apple placed restrictions such that generic BLE devices could not be used as iBeacons. In this tutorial we will show you how to have fun making a mobile app and creating a custom implementation similar to iBeacons, based on the Arduino microcontroller using standard BLE hardware.

In this tutorial, we will create a mobile app for Android and iOS, that uses an Arduino compatible board with a BLE shield to create a beacon. This can be thought of as a Do-It-Yourself version of Apple’s iBeacon technology – which is proprietary and restricts the way you can scan for beacons.
The reason we are using the Arduino for the beacons is that it can be easily programmed and that it is a cool tinker-friendly piece of technology that you can evolve far beyond the limits of iBeacons. The foundation of the iBeacon technology is the use of a small BLE (Bluetooth Low Energy) device that periodically advertises a UUID (Universally Unique Identifier) – however we will use the BLE name which is accessible across all mobile devices.
Arduino-compatible boards with built-in BLE will also work fine, such as the RedBearLab Blend Micro board, the RFduino or the LightBlue Bean.

The example app – Beacons for relaxation

The mobile application we have built for this tutorial is meant for use in a location where we want to give people time to relax and experience calm, for example a museum, an airport, a hospital or a public place such as a park. When approaching a beacon, the app will display a page that suggests a method for relaxation.
The beacon itself (the Arduino board) could be used as is, or be placed inside an object or display case that signifies the existence of the beacon, or be visually hidden. BLE devices have a range of up to 30 meters depending on the surroundings. You can use as many beacons as you like in your application; here we will use three of them to display the various relaxation states in the app.
Screenshots from the app:

[App screenshots: relaxing-places-screen1 through relaxing-places-screen4]
Below is a photo of an Arduino Uno we used as one of our beacons, with a RedBearLab BLE Shield mounted on a SparkFun prototyping board:
The LightBlue Bean is a small Arduino device with integrated BLE that we also used as a DIY beacon:

How to make the Arduino work as a Beacon

When a beacon is sending out signals, it uses the BLE advertising mode. The device repeatedly sends out an advertised name that you can set in your Arduino code. Additionally, you get the signal strength (RSSI = Received Signal Strength Indicator), which can be used to determine which beacon is the closest one.
As you walk around with your mobile device, another beacon will eventually become the closest one; the app detects this and switches to the information associated with that beacon. As you will see below, it is remarkably simple to program this kind of mobile application using Evothings Studio.
The Arduino sketch will in essence do just one thing: set the name of the BLE shield. A limitation of the BLE name is that it is restricted to 10 characters, but with careful naming that is enough to make the beacon IDs unique within your project setup.
It is important to note that you will have to set a unique BLE name onto every Arduino board you want to use as a beacon – if you would use the same name, there would be no way to tell the difference between them.
The Arduino sketch for the RedBearLab BLE Shield and RedBearLab Blend Micro is shown below (file ArduinoBeacon.ino):
// Arduino code for example Arduino BLE Beacon.
// Evothings AB, 2014

// Include BLE files.
#include <SPI.h>
#include <boards.h>
#include <RBL_nRF8001.h>
#include <services.h>

// This function is called only once, at reset.
void setup()
{
    // Enable serial debug.
    Serial.begin(9600);
    Serial.println("Arduino Beacon example started");
    Serial.println("Serial rate set to 9600");

    // Set a custom BLE name for the beacon.
    // Note that each Arduino should be given a unique name!
    ble_set_name("BEACON1");

    // Initialize BLE library.
    ble_begin();

    Serial.println("Beacon activated");
}

// This function is called continuously, after setup() completes.
void loop()
{
    // Process BLE events.
    ble_do_events();
}

How the mobile application works

The mobile app that monitors beacons is developed in HTML5 and JavaScript using Evothings Studio. During the development process we used the Evothings Client app to utilize the HyperReload technology to effortlessly debug and develop our app.
When development is finished, there are several ways you can share your app. You can host the app on a web server so that visitors of the location where we have placed the beacons can easily access it. You could also build a native app and publish it on the app stores using exactly the same code.
The app continuously scans for advertising BLE devices and determines which of them belong to our application, then determines which beacon is closest and displays the HTML content associated with that beacon. The individual information pages are found in the file index.html. Each page is defined within its own div tag, which can be dynamically shown or hidden. If no beacons are in range, a default information page is shown.
Monitoring beacons and selecting which page to show is done in JavaScript code, found in the file app.js. The beacon-to-page mappings are defined as follows:
// Mapping of beacon names to page ids.
app.beaconPages =
{
    'BEACON1':'page-feet',
    'BEACON2':'page-shoulders',
    'BEACON3':'page-face'
}
Note that the names of your beacons must match the names used as keys in the above dictionary. Here is the code that gets called each time a BLE device advertisement is received by the app (this happens continuously):
app.deviceFound = function(deviceInfo)
{
    // Have we found one of our beacons?
    if (app.beaconPages[deviceInfo.name] && deviceInfo.rssi < 0)
    {
        // Update signal strength for beacon.
        app.beaconRSSI[deviceInfo.name] =
        {
            rssi: deviceInfo.rssi,
            timestamp: Date.now()
        }
    }
}
Logic for selecting the closest beacon is found in the timer function app.runSelectPageTimer, which gets called at regular intervals.
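The actual implementation is in the project files on GitHub; as a rough sketch of what such a timer function could look like (the showPage helper, the 'page-default' id and the 5-second timeout are illustrative assumptions, not taken from the tutorial):
// Illustrative sketch only - not the actual code in app.js.
// Called at regular intervals to show the page of the closest beacon.
app.runSelectPageTimer = function()
{
    var closestName = null;
    var strongestRSSI = -Infinity;
    var now = Date.now();

    // Look at the beacons we have heard from recently.
    for (var name in app.beaconRSSI)
    {
        var entry = app.beaconRSSI[name];

        // Ignore readings older than a few seconds (assumed timeout).
        if (now - entry.timestamp > 5000) { continue; }

        // RSSI is negative; a higher (less negative) value means closer.
        if (entry.rssi > strongestRSSI)
        {
            strongestRSSI = entry.rssi;
            closestName = name;
        }
    }

    // Show the page mapped to the closest beacon, or a default page.
    app.showPage(closestName ? app.beaconPages[closestName] : 'page-default');
}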
In total, the app has the following code files:
  • index.html – main page, contains div tags for info pages
  • app.js – the JavaScript code for the app, included in index.html
  • page.css – style sheet definitions
  • ArduinoBeacon – folder with the ArduinoBeacon.ino file
There are also images used in the application. You will find all the project files on GitHub.

Running the app using Evothings Studio

  • To run the app, first download Evothings Studio.
  • Then install and start the Evothings Client app on your mobile device(s).
  • Download the code for the app from GitHub into a folder on your computer.
  • Drag the file index.html into Evothings Workbench.
  • Connect from the Evothings Client app to the Workbench.
  • Press RUN in the Workbench window, right next to the example.
  • Remember to configure your Arduinos with the proper names for the app to work!

Running the app from a web server

To share the app with others, you can host it on a web server. Do as follows:
  • Download the code for the app from GitHub and put it on a web server.
  • Ask your users to install and start the Evothings Client app on their mobile device(s).
  • Connect from the Evothings Client by entering the address of the web server (such as http://myserver.com/mybeaconapp) and tap CONNECT to start the application.
  • Remember to configure your Arduinos with the proper names for the app to work!

Use any BLE device as a beacon

You should be able to use almost any BLE device for this project. Being able to set the device name is a requirement for more serious projects – just enter the name in the dictionary app.beaconPages as shown above. You can use the app BLE Scan that comes as an example included with Evothings Studio to determine the names of your BLE devices.
The advantage of using an Arduino as a beacon is that you can change its advertising name. You can also change the name of the LightBlue Bean by connecting to it with the Bean Loader app, clicking on the name, and editing it (no coding required). Other devices may have other procedures for setting the name, and for some devices the name cannot be changed at all.

Where to go from here

This tutorial introduced the concept of “Do-It-Yourself” beacons to allow you to use any BLE device as a beacon without using Apple’s proprietary iBeacon technology.
However, Evothings Client also supports Apple's iBeacons by including the Cordova iBeacon plugin. If you wish to explore iBeacon technology, Evothings Studio makes it easy to get started. Check out the iBeacon Scan example app to get going.
You are always welcome to drop in on the Evothings Forum, to discuss technology, applications, ask questions, and share experiences.

Friday, April 10, 2015

Raspberry Pi : LCD

http://www.wvshare.com/product/3.5inch-RPi-LCD-A.htm


http://www.waveshare.net/wiki/3.5inch_RPi_LCD_(A)

sudo raspi-config
Make sure the following is selected: Enable Boot to Desktop/Scratch -> Desktop Log in as user 'pi' at the graphical desktop
# To run the 3.5inch RPi LCD (A):
 
sudo ./LCD35-show
 

Raspberry Pi : Kernel image

Ref: https://www.raspberrypi.org/documentation/linux/kernel/building.md

Build on a PC using cross compilation:

1. Install the toolchain

$ git clone https://github.com/raspberrypi/tools

Then add the directory:

tools/arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian/bin 

to your PATH.

2. Get the kernel source

$ git clone --depth=1 https://github.com/raspberrypi/linux

3. Insert the SD card; with lsblk you should see something like:

sdb
   sdb1
   sdb2

4. Build the kernel

$ make ARCH=arm CROSS_COMPILE=arm-linux-gnueabihf- menuconfig

In menuconfig, open the following menus:
Device Drivers
Network device support
Wireless LAN

$ make ARCH=arm CROSS_COMPILE=arm-linux-gnueabihf- 

If the compilation fails with a libstdc++.so.6 problem, install:

sudo apt-get install libc6-i386 lib32z1 lib32stdc++6


5. Create the directories mnt/fat32 and mnt/ext4, then mount the SD card partitions:
    
$ sudo mount /dev/sdb1 mnt/fat32
$ sudo mount /dev/sdb2 mnt/ext4

6. Install the kernel


$ sudo cp mnt/fat32/kernel.img mnt/fat32/kernel-backup.img
$ sudo cp arch/arm/boot/Image mnt/fat32/kernel.img

7. Install the modules

$ make ARCH=arm CROSS_COMPILE=arm-linux-gnueabihf- INSTALL_MOD_PATH=mnt/ext4 modules
$ sudo make ARCH=arm CROSS_COMPILE=arm-linux-gnueabihf- INSTALL_MOD_PATH=mnt/ext4 modules_install

8. Unmount the SD card

$ sudo umount mnt/fat32
$ sudo umount mnt/ext4
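
Afterwards, put the SD card back into the Raspberry Pi and boot it; you can confirm which kernel is running with, for example:

$ uname -a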


Wednesday, April 8, 2015

Sunday, April 5, 2015

Raspberry Pi : Broadcom VideoCore

Broadcom VideoCore

http://www.broadcom.com/docs/support/videocore/VideoCoreIV-AG100-R.pdf


Raspberry Pi Camera module introduction

http://elinux.org/Rpi_Camera_Module



  • Sensor type: OmniVision OV5647 Color CMOS QSXGA (5-megapixel)

"flat flex" cable (FFC, 1mm pitch, 15 conductor, type B contacts, Molex 21039-0843,

tiny connector (Hirose DF30 series, part DF30FC-24DP-0.4V)

  • Video: 1080p at 30 fps with codec H.264 (AVC)
  • Modifications

OV5647

Announced in February 2010
- 5 megapixels
- Low-light sensitivity (680 mV/lux-sec)
- 720p 60 fps or 1080p 30 fps

Product features: http://www.ovt.com/uploads/parts/OV5647.pdf