To help others jump on the mobile app revolution bandwagon, I have released yet another ebook, and you can get it free for the next 5 days. I actually launched this ebook a while back, but I have now published it to Amazon and made it available for the Kindle, something I hadn’t done before but had received a tremendous number of requests for. So, now that it’s available for the Kindle and you can get it for free, what’s stopping you? Head over to Amazon and grab your free copy of Android App Dev 101: Create Your Own Android Apps for Fun and Profit before it’s too late!
A while back, a friend of mine asked me to help him build an app that used computer vision to detect things like traffic lights & road signs. The idea was to use his Raspberry Pi and Pi Cam as a dashcam in his car. He wanted the app to identify red traffic lights and play some sort of sound letting him know the light was red and that he needed to stop. Once the light turned green, it would play another sound letting him know he could proceed. He also wanted the app to do several other things, all of which were based on color. Normally, I would recommend using something such as a Haar classifier or a dedicated algorithm for detecting such objects. But I also wanted to see for myself whether it would be possible to detect objects based entirely on their color. The results were actually very good, and the speed of the app outperformed some of the best algorithms available.
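To make the color-only idea concrete, here’s a minimal sketch (the thresholds and function names are my own illustrations, not code from his app): treat a frame as containing a red light when enough of its pixels fall inside a “red” RGB range.

```python
import numpy as np

def color_region_fraction(frame, lower_rgb, upper_rgb):
    """Fraction of pixels whose RGB values fall inside the inclusive range.

    frame: H x W x 3 uint8 array in RGB order.
    """
    lower = np.array(lower_rgb, dtype=np.uint8)
    upper = np.array(upper_rgb, dtype=np.uint8)
    in_range = np.all((frame >= lower) & (frame <= upper), axis=2)
    return in_range.mean()

# Illustrative thresholds; in practice the values would be tuned
# to the camera using a filter tool like the one described below.
RED_LOWER, RED_UPPER = (150, 0, 0), (255, 80, 80)

def looks_like_red_light(frame, min_fraction=0.01):
    """True if at least min_fraction of the frame is 'red enough'."""
    return color_region_fraction(frame, RED_LOWER, RED_UPPER) >= min_fraction
```

Part of why pure color checks run so fast is visible here: it’s a single vectorized comparison over the frame, with no sliding windows or feature extraction.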
In order to know the RGB values of the objects he wanted to detect, I wrote a simple tool that uses Python and OpenCV to filter out all colors except the ones I am interested in, leaving me with the RGB values I needed to plug into the app. Now, I know there are plenty of tools out there that can identify the RGB values of objects in images. But, being the geek that I am, I opted to write my own. Plus, this was just another opportunity to do some work with computer vision. Since I enjoyed playing around with this tool and found it helpful for my friend’s dashcam project, I thought I would share it in case anyone else might also find it useful or fun.
A while back I posted an article about an Android app I wrote that allows you to perform real-time vehicle diagnostics using Google Glass and an OBD-II adapter. At the end of that article, I promised I would share the source code once I completed it. Unfortunately, I have lost the final source code for that project. Earlier today, I received an email from a reader asking for an explanation of how to read data from the OBD-II adapter. Since I’ve received several other emails asking this same question, I thought I would turn my reply into a post to share with anyone else who might be interested in doing this.
BTW, I did manage to find an earlier version of that source code in one of my backups, and I will share a link to it at the end of this article. Just be warned, though: the code is very messy (there’s a lot of commented-out debugging stuff in there), it isn’t documented, and it doesn’t include everything mentioned in the video at the link above. It also only supports reading RPM & speed information; I had started adding support for MPG, but that wasn’t finished at the time this backup was made. Still, this code and the following explanation should be enough to get you started with creating your own Android app that performs real-time vehicle diagnostics using your Google Glass.
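For those mainly asking how the reading works: with an ELM327-style adapter you open a Bluetooth or serial stream, write an ASCII command such as `010C\r` (mode 01, PID 0C = engine RPM), and read back a hex string like `41 0C 1A F8`; the first two bytes echo the mode and PID, and the rest are data bytes you plug into the standard SAE J1979 formulas. Here’s a sketch of the parsing side in Python rather than the app’s Java (function names are mine, and the Bluetooth transport is adapter-specific and omitted):

```python
def parse_obd_response(response):
    """Parse an ELM327-style ASCII response such as '41 0C 1A F8'.

    Returns (pid, data_bytes); raises ValueError on anything unexpected.
    """
    parts = response.strip().split()
    if len(parts) < 3 or parts[0] != "41":   # '41' = reply to a mode-01 request
        raise ValueError("not a mode-01 response: %r" % response)
    pid = int(parts[1], 16)
    data = [int(p, 16) for p in parts[2:]]
    return pid, data

def rpm_from_bytes(data):
    # SAE J1979, PID 0C: RPM = ((A * 256) + B) / 4
    return ((data[0] << 8) + data[1]) / 4.0

def speed_from_bytes(data):
    # SAE J1979, PID 0D: vehicle speed is a single byte, in km/h
    return data[0]
```

So a response of `41 0C 1A F8` works out to (0x1A * 256 + 0xF8) / 4 = 1726 RPM. Speed is even simpler: `41 0D 3C` is just 0x3C = 60 km/h.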
In yesterday’s article, I showed you how to create a Python script that broadcasts a system’s vitals (disk space, memory, & CPU) out onto the network. As promised, today I will show you how to create a Python script that listens for these broadcasts and displays the results. I use this script as part of my home automation system and display the results on a touchscreen TFT that’s exposed out the front of the enclosure housing the main brain of my home automation system (see pictures below). But you can do all kinds of other cool things with the results, such as selecting Raspberry Pis with unused resources to act as slaves in a distributed computing application (which I will explain in another article). But for now, let’s get started.
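The listening side boils down to a UDP socket bound to the broadcaster’s port; everything else is presentation. A minimal sketch, assuming the broadcaster sends a small JSON payload (the port number and field names here are my own illustrations):

```python
import json
import socket

PORT = 50000  # must match the port the broadcaster sends to

def receive_vitals(sock):
    """Block until one vitals datagram arrives; return (sender_ip, vitals_dict)."""
    data, addr = sock.recvfrom(4096)
    return addr[0], json.loads(data.decode("utf-8"))

def listen(handle, port=PORT):
    """Receive broadcasts forever, passing each sender IP and decoded dict to handle()."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # "" = accept datagrams on all interfaces
    while True:
        ip, vitals = receive_vitals(sock)
        handle(ip, vitals)

# Example display handler; a real one would draw to the TFT instead.
def show(ip, vitals):
    print("%-15s disk %s%%  mem %s%%  cpu %s%%" % (
        ip, vitals.get("disk"), vitals.get("mem"), vitals.get("cpu")))
```

Because each Pi announces itself, the listener never needs a hard-coded inventory; a Pi that stops broadcasting simply drops off the display.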
In case you haven’t read any of my other home automation articles, I use a lot of Raspberry Pis in my home automation system. Keeping track of them used to be a burden, but not anymore. As mentioned in yesterday’s article, I have Raspberry Pis installed in some pretty crazy places. Even though all of the Pis in my house have dedicated functions, some aren’t used as extensively as others. For example, I have Raspberry Pis installed inside the walls of several rooms in my house, acting as built-in media centers. When those Pis aren’t serving media, I don’t want them sitting there doing nothing. So, I make them contribute their unused resources, giving a little extra horsepower to the Pis that have heavier lifting to do (such as the ones that handle face & speech recognition).
In order to keep up with which Pis are online and how much free disk space, memory, and CPU they have available to share with the rest of the house, I wrote a small Python app that runs every time a Raspberry Pi starts up and broadcasts its existence and system vitals every 10 seconds (configurable). I detect those broadcasts and display the results on a touchscreen TFT monitor (which I will explain in my next article). For now, I want to share the Python script that broadcasts a Pi’s system vitals to any listening system on the network.
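Here is a standard-library-only sketch of the broadcasting side (the port, field names, and choice of vitals are my own; a real script would also pull memory numbers from /proc/meminfo or psutil):

```python
import json
import os
import shutil
import socket
import time

PORT = 50000        # any listener on the LAN binds this port
INTERVAL = 10       # seconds between broadcasts (configurable)

def gather_vitals():
    """Collect a few system vitals using only the standard library."""
    usage = shutil.disk_usage("/")
    return {
        "host": socket.gethostname(),
        "disk": round(100.0 * usage.free / usage.total, 1),  # % disk free
        "cpu": os.getloadavg()[0],                           # 1-minute load average
        "ts": time.time(),
    }

def broadcast_forever(addr="255.255.255.255", port=PORT, interval=INTERVAL):
    """Send the vitals as JSON to the LAN broadcast address every `interval` seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow broadcast sends
    while True:
        sock.sendto(json.dumps(gather_vitals()).encode("utf-8"), (addr, port))
        time.sleep(interval)
```

To have it run at boot as described, you could launch it from /etc/rc.local or a systemd unit on each Pi.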