Python and Sensor Networks at Aclima

Thursday, April 02 2015

After a few weeks at Aclima, I feel like I’ve got my bearings enough to start talking about our challenges and what we’re doing with Python. It’s always interesting to hear case studies, so I’ll toss ours on to the pile.

A brief overview of Aclima

The best way to gain insight into how a system operates or responds to stimuli is to measure and observe it over time. By looking at historical data, we can find interesting interactions, correlations, and inspiration for future research. Sometimes, such efforts lead to sweeping changes in theories, policies, and even public opinion; the drastic reductions in cigarette, lead, and asbestos use are a few convenient examples.

The environment around us has all sorts of systems that we don’t understand particularly well. For example, the consequences of pollution on a global (and local) scale are still not particularly well understood. At an even more localized level, any number of things can influence the conditions inside of a building (and the health of its inhabitants).

As interesting as it is to measure data for the sake of measuring, the data itself isn’t particularly valuable without analysis and insight. Fortunately, Aclima handles the whole process, from designing and deploying the sensors to collecting the data and performing analysis. The end product is actionable data and recommendations, along with an ongoing conversation with deployment partners to change the way we imagine and manage our buildings, communities, and cities.

Where Python comes into play

It would be possible to build the backend for our sensor networks on any number of programming languages. However, Python has snagged a huge chunk of mindshare in the engineering and science community. It is also an incredibly capable general-purpose language. Consequently, our infrastructure and research teams both make extensive use of Python.

Some of the critical things we use Python for include (but aren’t limited to):

  • Ingesting data points at tight intervals from sensors all around the world.
  • Automated analysis and tagging of incoming data.
  • Querying, charting, and analyzing the data that we have gathered.
  • Deploy/config management (Ansible).
  • Various internal tools.
  • Scientific computing and data analysis (SciPy and Pandas are incredibly useful!).
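As a small illustration of that last point, here is a sketch of the kind of time-series work Pandas makes easy. The column name, readings, and timestamps are all hypothetical, not real Aclima data:

```python
import pandas as pd

# Hypothetical one-minute CO2 readings from a single sensor.
readings = pd.DataFrame(
    {"co2_ppm": [412.0, 415.5, 420.1, 418.2, 416.9, 421.3]},
    index=pd.date_range("2015-04-02 09:00", periods=6, freq="min"),
)

# Resample to five-minute means -- a typical first step when smoothing
# noisy sensor data before charting or deeper analysis.
five_min = readings["co2_ppm"].resample("5min").mean()
print(five_min)
```

The same pattern scales up nicely: swap the toy DataFrame for a query against real storage, and the resampling, aggregation, and charting code stays essentially the same.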

Challenges for the backend team

Since my position is on the backend team, I am best equipped to share some of the things we’ll be working on in the near term:

  • Reliably handling inbound sensor data from thousands (or millions) of sensors around the world.
  • Reducing the chance of data loss due to outages or minor disruptions in the backend.
  • Building tools for our team and our customers to review and analyze the data.
  • Working with interesting technologies such as Cassandra, Docker, Flask, and more.
  • Incorporating machine learning.
  • Automating. Everything.
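To make the first of those bullets concrete, here is a minimal sketch of what an HTTP ingestion endpoint might look like in Flask. The route, field names, and response codes are illustrative assumptions, not our actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/readings", methods=["POST"])
def ingest_reading():
    """Accept a single JSON sensor reading and acknowledge it."""
    payload = request.get_json(force=True)
    if "sensor_id" not in payload or "value" not in payload:
        return jsonify({"error": "sensor_id and value are required"}), 400
    # In a real deployment, the reading would be validated and written to
    # durable storage (e.g. Cassandra) before acknowledging.
    return jsonify({"status": "accepted", "sensor_id": payload["sensor_id"]}), 202
```

In practice, reliability at thousands (or millions) of sensors means this endpoint would hand readings off to a durable queue rather than writing synchronously, which is exactly where the "reducing the chance of data loss" work comes in.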

If any of this sounds interesting, we’re hiring!

If building large, highly resilient, highly scalable sensor networks sounds interesting to you, check out our Backend Software Engineer position. We are planning to grow our team substantially this year. We also have a number of other openings on the frontend, device, and research teams.

Progress update on python-gotalk

Friday, February 06 2015

As covered in a previous post, I’ve been tinkering with a Python implementation of the fledgling Gotalk. Since this has been fun to play with, I figured it’d be worth sharing where python-gotalk is, and what has happened with it in the last two weeks.

Upstream gotalk progress ...

read more

python-fedex 1.1.0 released

Thursday, February 05 2015

I am pleased (and somewhat embarrassed) to release python-fedex 1.1.0! Pleased in that these changes have been patiently waiting for PyPI for a few years now, embarrassed in that I’ve let the project sit since I stopped using it more than five years ago. Let’s re-visit ...

read more

Why you should donate to the Django fellowship program

Friday, January 23 2015

Cheerleading/peer-pressuring the masses to pitch in for the Django Fellowship program.

read more

Let’s play: python-gotalk

Friday, January 23 2015

A recent HackerNews post announced Gotalk, a simple bidirectional protocol. I can imagine your collective eyeballs rolling. "Oh great, yet another half-baked way for… things to talk to one another". But keep following along, maybe you’ll see something you like. Here are some highlights:

  • By Rasmus Andersson - You may ...
read more

python-colormath 2.1.0 released

Sunday, January 11 2015

python-colormath 2.1.0 has landed, bringing with it some excellent new features and bug fixes. See the release notes for a more detailed look at the changes.

The headlining feature is the replacement of our hardcoded conversion tables with NetworkX-based resolution of color conversions (courtesy of Michael Mauderer). Color ...

read more

Networked, multi-container image crawling with Docker and fig

Saturday, January 10 2015

An example of a networked, multi-container image crawler using Docker and fig.

read more

python-colormath 2.0 released!

Saturday, May 03 2014

python-colormath was started back in 2008, when I was an undergraduate at Clemson University (Go Tigers!). While there are a good number of people out there making use of the module effectively, there were a lot of things I wanted to do differently in an eventual 2.0 release. There ...

read more

Fabric task for notifying New Relic of a code deploy

Monday, February 11 2013

A brief example Fabric task for notifying New Relic of code deploys.

read more

Switched to Pelican

Sunday, February 10 2013

For the last four years, my blog has been powered by Django. As I have become busier and busier, I have stopped wanting to hassle with keeping the server and the application up to date.

After a weekend of tinkering and conversions, I’m now ...

read more