Magical Experience with Beacons

One month ago, on the 27th of March, my friend passed me a box of Estimote Proximity Beacons. That day marked the beginning of my journey towards a greater understanding of beacons and IoT.

Since the day I joined the travel industry, I have been thinking of providing a fun travel experience with beacon technology. When I joined the Changi Airport team in 2015, I proposed to my manager the possibility of applying beacons in the airport. The idea was rejected. Now, I finally have the chance to build something with these little Estimote Proximity Beacons.

estimote-beacons.png
We forcefully opened up the beacons and replaced the batteries.

Claiming Beacons

Every Estimote beacon is shipped with a unique ID which we can modify. By default, the beacon ID is in the iBeacon format and consists of three values: UUID, Major, and Minor.

The three values are hierarchical. The purpose of the UUID is to distinguish our beacons from all other beacons in the network. The Major and Minor values then allow us to label individual beacons with finer granularity.

ibeacon-format
An example of how a chain of retail shops will deploy and label their beacons. (Source: Estimote Developer Docs)

The iBeacon ID can be changed; one way is to use the Estimote app. Since I wasn’t the original owner of the beacons, my first step was to claim them using the app. After successfully claiming the beacons, I could then retrieve their detailed info and modify it.

claiming-beacons-and-changing-broadcasting-power.png
Claiming a beacon and modifying its info, such as its range (by default ~3.5m).

Google Beacon Platform

After configuring our beacons, we can proceed to claim ownership of them on the Google Beacon Registry. There is a mobile app from Google called Beacon Tools to help us register our beacons on the registry. There is also a very interesting Coffee with a Googler episode interviewing Peter Lewis about the steps of beacon registration.

google-beacon-registry.png
Peter shares about Google Beacon Registry and Google Beacon Platform. (Source: YouTube)

After that, we can associate a lot of information with our beacons. The recommended way to do so is through the Google Beacon Dashboard. There is a very simple tutorial that guides us through using the Google Beacon Dashboard to associate attachments with our beacons.

attachments
My beacon project, Icy Marshmallow, and the attachments of a beacon in the project.

Read Attachments

I’m using the Nearby Messages API to retrieve the attachments from the beacons. I built a small Android app (properly configured following the recommended steps) with the code shown below to achieve this.

package gclprojects.icymarshmallow;

...
import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.messages.Message;
import com.google.android.gms.nearby.messages.MessageListener;
import com.google.android.gms.nearby.messages.Strategy;
import com.google.android.gms.nearby.messages.SubscribeOptions;

public class MainActivity extends AppCompatActivity 
        implements GoogleApiClient.ConnectionCallbacks,
        GoogleApiClient.OnConnectionFailedListener {
    
    private GoogleApiClient mGoogleApiClient;
    private MessageListener mMessageListener;
    ...

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ...

        mGoogleApiClient = new GoogleApiClient.Builder(this)
                .addApi(Nearby.MESSAGES_API)
                .addConnectionCallbacks(this)
                .enableAutoManage(this, this)
                .build();

        mMessageListener = new MessageListener() {
            @Override
            public void onFound(final Message message) {
                // Called when a new message is found.
                // Use message.getType().toString() to read the attachment Type
                // Use new String(message.getContent()) to read the attachment Value
            }
        };
    }

    @Override
    protected void onStart() {
        super.onStart();
        mGoogleApiClient.connect();
    }

    @Override
    protected void onStop() {
        super.onStop();
        mGoogleApiClient.disconnect();
    }

    @Override
    public void onConnected(@Nullable Bundle bundle) {
        if (mGoogleApiClient != null && mGoogleApiClient.isConnected()) {
            subscribe();
        }
    }

    ...

    private void subscribe() {
        SubscribeOptions options = new SubscribeOptions.Builder()
                .setStrategy(Strategy.BLE_ONLY)
                .build();

        Nearby.Messages.subscribe(mGoogleApiClient, mMessageListener, options);
    }
}

With the code above, when a beacon is detected by the mobile app, the onFound method is called for each attachment associated with the beacon. If we print the message variable to the log, we shall see something like the following.

Message{namespace='icy-marshmallow', type='string', content=[29 bytes]}

As shown above, the content of the attachment arrives as raw bytes (the Value is stored base64 encoded in the registry). So to read it, we just need to use new String(message.getContent()).

In the subscribe method, since we are only interested in messages attached to BLE (Bluetooth Low Energy) beacons, we use Strategy.BLE_ONLY.

Problem #1: Unsubscribe Method

When the app is running and another app comes into the foreground, we also need to stop subscribing to messages from the beacons. Otherwise, when we navigate back to the app, messages can no longer be received even if we trigger the subscribe method again.

So, I added the following code.

@Override
protected void onPause() {
    if (mGoogleApiClient != null && mGoogleApiClient.isConnected()) {
        unsubscribe();
    }

    super.onPause();
}

@Override
protected void onResume() {
    if (mGoogleApiClient != null && mGoogleApiClient.isConnected()) {
        subscribe();
    }

    super.onResume();
}

private void unsubscribe() {
    Nearby.Messages.unsubscribe(mGoogleApiClient, mMessageListener);
}

Problem #2: Stop Receiving Messages After a Few Minutes

Another problem I noticed is that messages stop being “found” after one to two minutes. However, if I restart the mobile app, the messages are detected again for another one or two minutes.

To solve this issue, I use a simple timer which checks how long it has been since the app last detected a beacon. If it has been more than one minute, the timer performs an unsubscribe-then-subscribe-again action. This keeps the mobile app receiving messages from the beacons. It also solves the problem of the mobile app revisiting the beacons.
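Below is a minimal sketch of that watchdog timer, assuming a mLastFoundTime field that gets updated inside onFound() whenever a message arrives (the field name and the intervals are my own choices, not from the original app).

// Sketch only: mLastFoundTime is assumed to be refreshed inside onFound().
private final Handler mWatchdogHandler = new Handler();
private long mLastFoundTime = System.currentTimeMillis();

private final Runnable mWatchdog = new Runnable() {
    @Override
    public void run() {
        // If no beacon has been found for more than a minute, renew the subscription.
        boolean quietForTooLong = System.currentTimeMillis() - mLastFoundTime > 60 * 1000;
        if (quietForTooLong && mGoogleApiClient != null && mGoogleApiClient.isConnected()) {
            unsubscribe();
            subscribe();
        }
        mWatchdogHandler.postDelayed(this, 30 * 1000); // check again in 30 seconds
    }
};

The timer can then be started in onResume() with mWatchdogHandler.post(mWatchdog) and stopped in onPause() with mWatchdogHandler.removeCallbacks(mWatchdog).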

Problem #3: Geo-Location

This is not a real problem if we don’t need the geo-location of the beacons. However, if we do need to know where a beacon detection happened, one simple way is to use the LocationManager, which provides periodic updates of the phone’s geographical location.

package gclprojects.icymarshmallow;

...
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;

public class MainActivity extends AppCompatActivity 
        implements GoogleApiClient.ConnectionCallbacks,
        GoogleApiClient.OnConnectionFailedListener {
    LocationManager locationManager;
    ...

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ...

        LocationListener locationListener = new LocationListener() {
            public void onLocationChanged(Location location) {
                // Record down the latitude and longitude of the mobile
            }

            ...
        };

        locationManager = (LocationManager) this.getSystemService(Context.LOCATION_SERVICE);
    
        int permissionCheck = ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION);
        if (permissionCheck == PackageManager.PERMISSION_GRANTED) {
            locationManager.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 0, 0, locationListener);

            ...
        }
    }
}

Writing Data to Firebase

This step is optional unless the data collected needs to be stored for future use.

I use the following code to write the beacon data to the Firebase database.

package gclprojects.icymarshmallow;

...
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

public class MainActivity extends AppCompatActivity 
        implements GoogleApiClient.ConnectionCallbacks,
        GoogleApiClient.OnConnectionFailedListener {
    private DatabaseReference mDatabase;
    ...

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        ...
        
        mDatabase = FirebaseDatabase.getInstance().getReference();

        mMessageListener = new MessageListener() {
            @Override
            public void onFound(final Message message) {
                ...

                Beacon beaconInfo = new Beacon(...);

                Format formatter = new SimpleDateFormat("yyyy-MM-dd-HH:mm:ss");
                mDatabase.child("Person A")
                        .child(formatter.format(new Date()))
                        .setValue(beaconInfo);
            }
        };
    }
}

...

public class Beacon { ... }

firebase.png
Successfully recorded the data from beacons in my Firebase database!
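For completeness, the Beacon class referred to above can be a plain data holder. Below is a minimal sketch; the field names are my assumption of what is worth recording, and Firebase needs the public no-argument constructor and getters to serialize the object.

// Sketch only: field names are assumptions, not the actual model of the app.
public class Beacon {
    private String namespace;
    private String type;
    private String value;
    private double latitude;
    private double longitude;

    public Beacon() {
        // Required by Firebase for deserialization
    }

    public Beacon(String namespace, String type, String value,
                  double latitude, double longitude) {
        this.namespace = namespace;
        this.type = type;
        this.value = value;
        this.latitude = latitude;
        this.longitude = longitude;
    }

    public String getNamespace() { return namespace; }
    public String getType() { return type; }
    public String getValue() { return value; }
    public double getLatitude() { return latitude; }
    public double getLongitude() { return longitude; }
}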

To integrate our Android app with Firebase, our friendly Android Studio comes with a tool called the Firebase Assistant which helps us connect to Firebase. The assistant also comes with a short getting-started tutorial that shows us how to configure and add the Realtime Database to our mobile app.

beacon-in-changi-airport.png
Spot the beacon. =)

Installing Beacons in Changi Airport

Installing beacons in our Changi Airport has always been one of my dreams to enhance the experience of the millions of travellers flying in and out of the airport. In fact, Amsterdam is already making use of beacon technology to build a powerful beacon network that gives people a better experience as they walk around the city. So why can’t we do the same in our friendly Changi Airport? =)

Burger and Cheese

xamarin-cognitive-services-android-voicetext

As a web developer, I don’t have many chances to play with mobile app projects. So rather than limit myself to just one field, I love to explore other technologies, especially mobile app development.

Burger Project: My First Xamarin App

Last month, I attended a Xamarin talk at Microsoft Singapore office with my colleague. The talk was about authentication and authorization with social networks such as Facebook and Twitter via Azure App Service: Mobile App.

Ben Ishiyama-Levy is talking about how Xamarin and Microsoft Azure works together.

The speaker was Ben Ishiyama-Levy, a Xamarin evangelist. His talk inspired me to further explore how I could retrieve user info from social networks after authenticating the user.

Because I am a geek first and I really wanted to find out more, I continued to read about this topic. With help from my colleague, I developed a simple Xamarin.Android app to demonstrate authentication and retrieval of the logged-in user’s info.

The demo app is called Burger and it can be found on my Github repository: https://github.com/goh-chunlin/Burger.

Challenges in Burger Project

Retrieving user’s info from social network.

In the Burger project, the first big challenge was to understand how Azure App Service: Mobile App works in Xamarin. Luckily, with the material and tutorial given in Ben’s Xamarin talk, I was able to get a quick start on this.

My colleague also shared another tutorial about getting an authenticated user’s personal details on the Universal Windows Platform (UWP). It helped me a lot in understanding how a mobile app and Azure App Service can work together.

My second challenge in this project was to understand the Facebook Graph API. I still remember spending quite some time finding out why I could not retrieve the friend list of a logged-in Facebook user. With the introduction of Facebook Graph API 2.0, access to a user’s friends list via /me/friends is limited to just friends using the same app. Hence, after reading a few other online tutorials, I was finally able to get another subset of a user’s friends via /me/taggable_friends.

In this project, it is also the first time I have applied Reflection in a personal project. It helps me easily get the corresponding social network login class while keeping the code neat and organized.

microsoftdeveloperday
Microsoft Developer Day at NUS, Singapore in May 2016

Cheese Project: When Google Speech Meets MS LUIS on Android

A few months ago, I was fortunate to represent my company at Microsoft Developer Day 2016 at the National University of Singapore (NUS).

That day was the first time Microsoft CEO Satya Nadella came to Singapore. It was also my first time learning about the powerful Cognitive Services and LUIS (Language Understanding Intelligent Service) in Microsoft Azure, in Riza’s talk.

presentation
Riza’s presentation about Microsoft Cognitive APIs during Microsoft Developer Day.

Challenges in Cheese Project

Every day, it takes about one hour for me to reach home from the office. Hence, I only have two to three hours every night to work on personal projects and learning. During weekends, when people are having fun out there, I spend time researching exciting new technologies.

There are many advanced topics in LUIS. I still remember that while I was learning how LUIS works, my friend was playing Rise of the Tomb Raider beside me. So while he was there going phew-phew-phew, I was doing data training on the LUIS web interface.

luis
Microsoft LUIS (Language Understanding Intelligent Service) and Intents

Currently, I have only worked on some simple intents, such as returning the current date and time, as well as understanding which language I want to translate to.

My first idea for the Cheese project was to build an Android app such that if I say “Please translate blah-blah to xxx language”, the app will understand and do the translation accordingly. This can be done quite easily with the help of both LUIS and Google Translate.
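For illustration, here is a rough sketch of how the app can ask LUIS what the user wants, assuming the LUIS v2.0 REST endpoint format; LUIS_APP_ID and LUIS_KEY are placeholders, and the network call should of course run off the main thread.

// Sketch only, assuming the LUIS v2.0 REST endpoint format.
// LUIS_APP_ID and LUIS_KEY are placeholders for the real app ID and subscription key.
private String detectIntent(String utterance) throws IOException, JSONException {
    String endpoint = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/"
            + LUIS_APP_ID
            + "?subscription-key=" + LUIS_KEY
            + "&q=" + URLEncoder.encode(utterance, "UTF-8");

    HttpURLConnection connection = (HttpURLConnection) new URL(endpoint).openConnection();
    BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
    StringBuilder response = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        response.append(line);
    }
    reader.close();

    // The JSON response contains the top scoring intent and any entities,
    // e.g. a "TranslateText" intent plus the target language entity.
    JSONObject result = new JSONObject(response.toString());
    return result.getJSONObject("topScoringIntent").getString("intent");
}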

After showing this app to my colleagues, we realized one problem: it is too troublesome for users to keep saying “Please translate blah-blah to xxx language” every time they need to translate something. Hence, I recently changed the app to use a GUI for language selection. This, however, reduces the role played by LUIS in this project.

voicetext
VoiceText provides a range of speakers and voices with emotions!

To make the project even more fun, I implemented the VoiceText Web API from Japan in the Android app. The cool thing about this TTS (Text-To-Speech) API is that it allows developers to specify the mood and characteristics of the voice. The challenge, of course, is reading the API documentation, which is written in Japanese. =P
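To give an idea of what calling it looks like, here is a sketch based on my reading of the VoiceText Web API documentation (the endpoint, speaker and emotion parameter names are as documented there, while VOICETEXT_API_KEY is a placeholder). The API key goes in as the username of HTTP Basic authentication and the response body is the synthesized audio.

// Sketch only: VOICETEXT_API_KEY is a placeholder; endpoint and parameters
// follow my reading of the VoiceText Web API documentation.
private byte[] synthesizeSpeech(String text) throws IOException {
    HttpURLConnection connection =
            (HttpURLConnection) new URL("https://api.voicetext.jp/v1/tts").openConnection();
    connection.setRequestMethod("POST");
    connection.setDoOutput(true);

    // The API key is used as the username of HTTP Basic authentication.
    String credentials = Base64.encodeToString((VOICETEXT_API_KEY + ":").getBytes("UTF-8"), Base64.NO_WRAP);
    connection.setRequestProperty("Authorization", "Basic " + credentials);

    // "speaker" and "emotion" are what make this TTS API fun to play with.
    String body = "text=" + URLEncoder.encode(text, "UTF-8")
            + "&speaker=hikari"
            + "&emotion=happiness";
    connection.getOutputStream().write(body.getBytes("UTF-8"));

    // The response body is the synthesized audio, ready to be saved and played.
    ByteArrayOutputStream audio = new ByteArrayOutputStream();
    InputStream in = connection.getInputStream();
    byte[] buffer = new byte[4096];
    int read;
    while ((read = in.read(buffer)) != -1) {
        audio.write(buffer, 0, read);
    }
    in.close();
    return audio.toByteArray();
}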

Oh ya, this is the link to my Cheese repository on Github: https://github.com/goh-chunlin/Cheese. I will continue to work on this project while exploring more about LUIS. Stay tuned.

languagelist    googlespeech    SuccessfullyTranslated.png

After-Work Personal Projects

There are still many more things in mobile app development for me to learn. Even though I often feel exhausted after a long work day, working on new and exciting technologies helps me get energized again in the evening.

I’m not as hardworking as my friends who are willing to sacrifice their sleep for their hobby projects and learning, hence the progress of my personal project development is kind of slow. Oh well, at least now I have my little app to help me talk to people when I travel to Hong Kong and Japan next year!

The Little Problem when Consuming .NET Web Service via kSOAP

In one of our projects, we have a .NET web service. On the client side, we have phones running Android which will consume the web service. Hence, the first option that came to my teammate’s mind was to use kSOAP2, a lightweight SOAP client library for Android.

Problem

When we first tried out our Android app, we realized that it worked fine when the app was calling a web method with no parameters. However, when the app called another web method with one or more parameters, the web service complained that the values it received were always null.

My colleague discovered this issue around 5:40pm, 20 minutes before I could leave the office. I always hope to enjoy the sunset on my way back from the office, so I decided to help him settle that little problem within those 20 minutes. Otherwise, I would miss the sunset for the day.

Just a monkey watching the sunset

First Check: Parameter Name

According to an article on CodeProject about kSOAP and .NET web services, the name and type of the PropertyInfo must be the same as the web method’s parameter name and type.

My colleague had kept the parameter names the same, and the type was always string. So this was not the cause.

Second Check: dotNet Flag

I am not sure why some people say that the dotNet flag doesn’t need to be set to true (or should be commented out). It later turned out that even when we didn’t touch the dotNet flag at all, the app still worked. So we just kept the following line in our code.

envelope.dotNet = true;

Third Check: Namespace

I finally came across this discussion on Stack Overflow which suggested that the problem we were facing was probably caused by the namespace. DanneJaha, the author of the post, commented that if a trailing “/” was used in our web service, we needed to have the same in our application as well.

After we added the trailing “/”, our application worked when calling web methods with parameters! Yahoo!
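For reference, here is roughly what our kSOAP call looks like after the fix. The namespace, method name and service URL below are placeholders, but note the trailing “/” in the namespace (and hence in the SOAP action), and that the property name must match the .NET parameter name exactly.

// Sketch only: NAMESPACE, METHOD_NAME and SERVICE_URL are placeholders.
private static final String NAMESPACE = "http://tempuri.org/";      // trailing "/" matters
private static final String METHOD_NAME = "GetCustomer";
private static final String SOAP_ACTION = NAMESPACE + METHOD_NAME;  // http://tempuri.org/GetCustomer
private static final String SERVICE_URL = "https://example.com/MyWebService.asmx";

private SoapObject callGetCustomer(String customerId) throws Exception {
    SoapObject request = new SoapObject(NAMESPACE, METHOD_NAME);
    request.addProperty("customerId", customerId);   // must match the web method parameter name

    SoapSerializationEnvelope envelope = new SoapSerializationEnvelope(SoapEnvelope.VER11);
    envelope.dotNet = true;
    envelope.setOutputSoapObject(request);

    HttpTransportSE transport = new HttpTransportSE(SERVICE_URL);
    transport.call(SOAP_ACTION, envelope);
    return (SoapObject) envelope.getResponse();
}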

If this doesn’t help in your case, note that, according to other comments on Stack Overflow, the namespace, SOAP action, etc. are case-sensitive. So, please check that too.

30-Minute Crash Course on kSOAP

I left the office around 6:10pm, so I still had a chance to enjoy the sunset.

Anyway, this was actually a good opportunity for me to learn a bit about kSOAP and how an Android app communicates with a .NET web service. It’s fun to get involved in projects you don’t usually work on. =)

There are more to learn though.

First Draft of Android Wear App

As a software engineer, it’s always fun to try out new technologies and tools. It all started with a question from my colleagues, “Is it possible to build an app for Samsung Gear?” The answer is yes, but how? I had never built an app for Android Wear and I didn’t even own a smart watch. Hence, I decided to give it a try.

First View

I built the app using Android Studio.

I like the launch animation of the Android Wear emulator. Below are some screenshots.

Android Wear emulator launched!

In fact, I could not get the emulator to run in the first place. When I tried to launch the Android Wear Square emulator, it threw the following error messages in Android Studio.

emulator: ERROR: x86 emulation currently requires hardware acceleration! 
Please ensure Intel HAXM is properly installed and usable. 
CPU acceleration status: HAX kernel module is not installed!

Hence, what needed to be done was just installing the Intel x86 Emulator Accelerator (HAXM installer), found under the Extras folder in the Android SDK Manager.

After the download is done, we also need to run the installer located in the Hardware_Accelerated_Execution_Manager folder. Once HAXM is installed on the computer, the Android Wear emulator can be launched!

There is a very good discussion on Stack Overflow about this issue; please check it out if you are interested in knowing more.

The Intel Hardware Accelerated Execution Manager Setup

First Draft

Currently, I haven’t finished my first app for Android Wear, Gym Challenge. However, there are already some screenshots available which demonstrate the look and feel of the application on the Android Wear Square emulator.

Gym Challenge homepage

One of the features in Gym Challenge is to keep track of the user’s time spent on gym.

Meanwhile, I still have no idea how to try out the functionality to get the heart rate of the user without connecting a real smart watch. So, things like SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE); have never been tested in my app. =P
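For the record, below is a sketch of how the heart rate reading could be done on a real watch using the standard SensorManager API (it requires the BODY_SENSORS permission and, as mentioned, I have not been able to test it).

// Sketch only: untested, requires the BODY_SENSORS permission in the manifest.
SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor heartRateSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE);

SensorEventListener heartRateListener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float heartRateInBpm = event.values[0]; // latest heart rate reading
        // ... update the watch UI with the reading
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};

mSensorManager.registerListener(heartRateListener, heartRateSensor, SensorManager.SENSOR_DELAY_NORMAL);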

Anyway, this is just the beginning of my Android Wear app development journey. Feel free to comment or correct me if you spot anything wrong in this post. Thanks! =)

Fixed Issues in Entertainment Connect (Android)

After my Android app, Entertainment Connect, was completed at the beginning of this month, I tested it on a tablet. It turned out that the VideoView did not scale according to the screen size. That was when I started to learn about layout weight.

Fix the Layout: Make the Video Display Nicely

Previously, I set a fixed value for the height of the VideoView, which caused the video to appear very small on the bigger screen of a tablet.

<VideoView
    android:id="@+id/videoView2"
    android:layout_width="fill_parent"
    android:layout_height="200dp"
    android:paddingTop="50dp"
    android:keepScreenOn="true" />

It turns out that both the VideoView and the ListView are placed in a LinearLayout with vertical orientation. So, I just need to assign nonzero layout weights to both of them, and the video now looks very nice on an Android tablet.
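The actual change was made in the layout XML, but to illustrate the idea in code: give each view a height of 0 and a nonzero weight so that the LinearLayout divides the vertical space between them. The 2:1 split and the videoView/listView references below are just examples, not the app’s real values.

// Sketch only: the real fix was in the layout XML; weights here are examples.
LinearLayout.LayoutParams videoParams = new LinearLayout.LayoutParams(
        LinearLayout.LayoutParams.MATCH_PARENT, 0, 2f);  // VideoView gets 2 parts
LinearLayout.LayoutParams listParams = new LinearLayout.LayoutParams(
        LinearLayout.LayoutParams.MATCH_PARENT, 0, 1f);  // ListView gets 1 part

videoView.setLayoutParams(videoParams);
listView.setLayoutParams(listParams);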

Entertainment Connect on Galaxy Tab 4

Log Out and Go Back

Entertainment Connect retrieves media files from the user’s OneDrive. Hence, there is a need to handle user login and logout correctly. Android devices all have a Back button, and in the first version of Entertainment Connect, a user could log out from the app and then press the Back button to view the media list again. To prevent that, I added the following two lines to the logout method.

intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); // To clean up all activities
finish();

FLAG_ACTIVITY_CLEAR_TOP helps to clear the activities above the launched activity in the back stack and thus prevents the user from going back to the Player Activity after logging out. There is an interesting discussion on Stack Overflow regarding this.

I call finish() because the current activity (the Player Activity) is done and should be closed after logout. Interestingly, without calling it, the user could still go back to the Player Activity after logging out even though FLAG_ACTIVITY_CLEAR_TOP was used.
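Putting it together, a sketch of the full logout method might look like the following (LoginActivity is a hypothetical name for the screen the user lands on after logging out).

// Sketch only: LoginActivity is a hypothetical target activity.
private void logout() {
    // ... sign the user out of OneDrive here

    Intent intent = new Intent(this, LoginActivity.class);
    intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); // To clean up the activities on top
    startActivity(intent);

    finish(); // Close the Player Activity so Back cannot return to it
}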

Don’t agree with my methods? Feel free to correct me on Github: https://github.com/goh-chunlin/EntertainmentConnectAndroid.

Available on Play Store

Entertainment Connect is now available on the Google Play Store: https://play.google.com/store/apps/details?id=gclproject.onesong&hl=en. It’s free, so please download it now!

Install Entertainment Connect on your Android devices now!

Entertainment Connected to Android

GCL Project + Android + OneDrive

It has been two months since I completed Entertainment Connect for the Windows 8 platform. Entertainment Connect is an application that plays MP3 and MP4 media files stored in your Microsoft OneDrive storage.

Soon after I completed the application for Windows, I found out that more and more of my family and friends were buying Android phones. Thus, I decided to build another version of Entertainment Connect for Android.

Entertainment Connect is now available on Android devices!

Today, I would like to share what I learnt while developing my first personal Android app that makes use of the Microsoft Live SDK for Android.

New IDE: Android Studio

I have been using Eclipse for Android app development at work, and coding with Eclipse is not easy. Luckily, Google just released Android Studio, an official IDE built specifically for Android with a much more powerful GUI designer. Since Google encourages developers to migrate to Android Studio, I decided to try it out.

Android Studio with the login page of Entertainment Connect.

Working with Microsoft Live SDK

Yesterday, I received a notification from the Live SDK GitHub repository saying that the team was going to support and migrate to Android Studio. Finally. When I started this project, the Live SDK only supported Eclipse ADT.

It is very easy to include the Live SDK in a project in Android Studio. Firstly, I need to download the Live SDK; downloading the whole project as a ZIP is enough. The project comes with some useful samples which teach us how to use the SDK properly.

Secondly, I need to add a new module under Project Structure.

Add new module in Project Structure.

Thirdly, I just choose the “Import Existing Project” option which will import the Eclipse project (Live SDK) as a module.

Import existing Eclipse project as module.

Finally, to make my application able to use the Live SDK, I need to introduce a module dependency to my app module, as shown in the screenshot below.

Introduce module dependency between app and src (Live SDK).

That’s all. If you would like to know more about adding an SDK in Android Studio, please check out a post on Stack Overflow about importing the Facebook SDK.

Can It Be More Complicated?

When I built Entertainment Connect for Windows 8 using WinJS, to create a media player I basically just used the following code.

var playerContainer = document.getElementById('playerContainer');
videoPlayer = document.createElement('video');
videoPlayer.id = videoStaticUrl;
videoPlayer.controls = "controls";
var videoSource = document.createElement('source');
videoSource.src = videoUrl;
videoSource.type = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
videoPlayer.appendChild(videoSource);
playerContainer.appendChild(videoPlayer);

With these few lines of code, I can already get a working media player with all the controls, such as play, pause, progress bar, etc.

However, this is not the case in Android app development. I am using VideoView, so I also need to build my own play/pause functions and progress bar.
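As a rough sketch of what that involves (the view and field names below are illustrative, not the actual code of the app), a hand-rolled play/pause toggle and progress updater can look like this:

// Sketch only: view names are illustrative.
private void togglePlayback(VideoView videoPlayer, ImageButton playPauseButton) {
    if (videoPlayer.isPlaying()) {
        videoPlayer.pause();
        playPauseButton.setImageResource(android.R.drawable.ic_media_play);
    } else {
        videoPlayer.start();
        playPauseButton.setImageResource(android.R.drawable.ic_media_pause);
    }
}

// Update a SeekBar once a second while the video is playing.
private void trackProgress(final VideoView videoPlayer, final SeekBar seekBar) {
    final Handler handler = new Handler();
    handler.post(new Runnable() {
        @Override
        public void run() {
            if (videoPlayer.isPlaying()) {
                seekBar.setMax(videoPlayer.getDuration());
                seekBar.setProgress(videoPlayer.getCurrentPosition());
            }
            handler.postDelayed(this, 1000);
        }
    });
}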

Also, I realized there was a bug when I switched from playing a video file to an audio file in the VideoView: the last frame of the previous video stayed on screen even though the audio had already started playing. Hence, I added the following few lines of code to reset the background of the VideoView so that the image of the previous video is “erased”.

videoPlayer.setBackgroundColor(Color.TRANSPARENT);
if (availableMedia.get(position).getmMediaFileName().toLowerCase().endsWith(".mp3")) {
    videoPlayer.setBackgroundColor(Color.BLACK);
}

Loading media thumbnails from OneDrive is also a headache in Android.

In the Windows 8 app, after adding the items returned from the Live SDK to a collection, I can easily bind the items to a template. After that, the thumbnails are automatically shown on the screen.

<!-- Template of the list items to show available music/videos -->
<div id="mediumListIconTextTemplate" data-win-control="WinJS.Binding.Template" style="display: none">
    <div class="mediumListIconTextItem">
        <img onerror="this.src='/images/default-video-preview.png';" class="mediumListIconTextItem-Image" data-win-bind="src : picture" />
        <div class="mediumListIconTextItem-Detail">
            <h4 data-win-bind="innerText: name"></h4>
            <h6 data-win-bind="innerText: duration"></h6>
        </div>
    </div>
</div>

In Android, I have to create a background worker to retrieve each thumbnail with the following code. Sometimes when I scroll the list, the thumbnail is not updated immediately. I also need to use some tricks to make sure the correct images are displayed in the list view.

URL thumbnailUrl = new URL(imageView.getTag().toString());
HttpsURLConnection imageConnection = (HttpsURLConnection) thumbnailUrl.openConnection();
imageConnection.setDoInput(true);
imageConnection.connect();
InputStream inputStreamOfImage = imageConnection.getInputStream();
return BitmapFactory.decodeStream(inputStreamOfImage);
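For context, the code above runs inside a background worker. A sketch of such a worker, including the “trick” of checking the ImageView tag again before setting the bitmap so a recycled row doesn’t show the wrong thumbnail, could look like this (the class and variable names are illustrative):

// Sketch only: class and variable names are illustrative, not from the app.
private static class ThumbnailTask extends AsyncTask<String, Void, Bitmap> {
    private final WeakReference<ImageView> imageViewReference;
    private final String url;

    ThumbnailTask(ImageView imageView) {
        this.imageViewReference = new WeakReference<>(imageView);
        this.url = imageView.getTag().toString();
    }

    @Override
    protected Bitmap doInBackground(String... params) {
        try {
            URL thumbnailUrl = new URL(url);
            HttpsURLConnection imageConnection = (HttpsURLConnection) thumbnailUrl.openConnection();
            imageConnection.setDoInput(true);
            imageConnection.connect();
            return BitmapFactory.decodeStream(imageConnection.getInputStream());
        } catch (IOException e) {
            return null;
        }
    }

    @Override
    protected void onPostExecute(Bitmap bitmap) {
        ImageView imageView = imageViewReference.get();
        // Only update if this ImageView has not been recycled for another row.
        if (bitmap != null && imageView != null && url.equals(imageView.getTag())) {
            imageView.setImageBitmap(bitmap);
        }
    }
}

It can then be started from the adapter’s getView() with new ThumbnailTask(imageView).execute();.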

Building Android App Is Fun

Yes, it is fun. However, it’s slower than Windows 8 app development. It’s just too bad that not many of my friends actually go to the Windows Store to download desktop apps, so I have no choice but to build an Android version of my app as well.

I will try to publish Entertainment Connect to Google Play soon, after I have fixed my debit card issue. Currently, I still encounter problems paying the developer registration fee with Google Wallet. Oh well.

Meanwhile, feel free to read the rest of the Entertainment Connect (Android) code on GitHub: https://github.com/goh-chunlin/EntertainmentConnectAndroid.

Entertainment Connect (Android) GitHub Banner