Python

Deploying clortho with ansible and venv

I have started using Ansible to manage the few services that I still have running around here. I want to deploy clortho to a user on the system I use for serving up movies to my Roku players. I need to copy the source, set up a venv with the dependencies (aiohttp), and set up a systemd unit to make sure it is started at boot time.

As of Python 3.3 the standard library includes support for venv, and Python 3.4 added default installation of pip, so now the only support you need on a target system is the core Python 3.4 build. Everything else can be done inside a venv.
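Since venv is in the standard library, the environment can even be created from Python itself, which is handy when driving the setup from a script instead of the shell. A minimal sketch (the `clortho-env` path is just an example; Ansible would template the real one):

```python
import venv

# Build a venv with pip bootstrapped into it (requires Python 3.4+).
# "clortho-env" is an example path, not the deployment's actual location.
builder = venv.EnvBuilder(with_pip=True)
builder.create("clortho-env")
```

After that, `clortho-env/bin/pip install aiohttp` pulls in the one dependency, and the systemd unit can point at `clortho-env/bin/python`.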

clortho - A simple key/value server

clortho is a very simple key/value server written using Python 3.4 and aiohttp.

The key namespace is per-client IP address, and there is no authentication. It supports direct connections from clients, as well as the X-Forwarded-For header for use with proxies. I shouldn't have to mention that running this on an open network isn't a good idea.

Get a key value by requesting http://server/keystore/<key>:

curl http://server/keystore/louis

If the key has been set it will return a 200 status with the value. If the key is not set it will return a 404 status and an error message.
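From Python the same request is only a few lines with urllib. The `keystore_url` and `get_key` helpers here are hypothetical names for this sketch, not part of clortho itself:

```python
import urllib.request
import urllib.error

def keystore_url(server, key):
    """Build the keystore URL for a key on the given server."""
    return "http://%s/keystore/%s" % (server, key)

def get_key(server, key):
    """Return the key's value, or None if the server answers 404."""
    try:
        with urllib.request.urlopen(keystore_url(server, key)) as r:
            return r.read().decode("utf-8")
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return None
        raise
```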

Automatic Backup of Files to S3 and Glacier

Automatic backups are important, especially when it comes to irreplaceable data like family photos. I have used s3cmd to maintain my website’s static files for a while now, and it was simple to use it to push my 100GB+ archive of photos over to S3. But I needed an automated way to update it with any new photos that my wife or I may take. The sync protocol really isn’t what you want – there should be no need to re-examine all the files that have already been archived. You really only want to copy over new ones added since the last update.
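The incremental pass can be as simple as walking the archive and keeping only files modified after the previous run's timestamp. The function name below is illustrative, and the actual upload step is left out since it depends on the S3 client in use:

```python
import os

def new_files_since(root, last_run):
    """Yield paths under root whose mtime is newer than last_run (epoch seconds)."""
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_run:
                yield path
```

Each run records its own start time; the next run passes that as `last_run` and only uploads what `new_files_since()` yields, so the 100GB of already-archived photos never get re-examined byte by byte.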

Reliable Wireless Temperatures

Freezer Temperatures

The temperature sensors that I use to drive the freezer graph are in my garage, which doesn't have an easy way to run wires to the server room. I have a WRT54GL running DD-WRT configured as a bridge to the rest of my network. The problem with this is that the connection isn't always reliable. I used to have a simple script that read the temps and fed them to my main MySQL server, but the connection would frequently drop and it would lose the temperature data.
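One way to keep a flaky link from losing data is to queue readings locally and only discard them once the server has accepted them. This sketch uses a caller-supplied send function; the class and function names are illustrative, not the actual script:

```python
import collections

class ReadingBuffer(object):
    """Hold readings until a send function confirms delivery."""
    def __init__(self, send):
        self.send = send                    # callable(reading) -> True on success
        self.pending = collections.deque()

    def add(self, reading):
        self.pending.append(reading)
        self.flush()

    def flush(self):
        # Deliver queued readings in order; stop at the first failure
        while self.pending:
            if not self.send(self.pending[0]):
                break                       # link is down; keep readings queued
            self.pending.popleft()
```

When the wireless bridge drops, readings simply pile up in `pending` and are delivered in order on the next successful `flush()`.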

New MovieLandmarks Update

Movie Landmarks is back online. I think this is the 5th iteration of the project that I originally started back in 2006. It started out as a PHP app and morphed into a Python WSGI application, always backed by a MySQL database with lots of interactive features. For this redesign I've dropped all of that extra stuff and simplified things.

I threw out the database and replaced it with a couple of Python dictionaries: one holding the landmark information and another with the movie data. I use Python to create the JSON and HTML files used for the site. This only needs to be run when I add new landmarks or movies.
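Regenerating the site data then amounts to dumping those dictionaries with the json module. The landmark fields below are made up for illustration; only the overall shape matters:

```python
import json

# Example data shaped like the landmark dictionary (fields are illustrative)
LANDMARKS = {
    "golden_gate": {"name": "Golden Gate Bridge", "lat": 37.8199, "lon": -122.4783},
}

def write_json(path, data):
    """Write a dictionary out as pretty-printed JSON for the site."""
    with open(path, "w") as f:
        json.dump(data, f, indent=2, sort_keys=True)
```

A static page's JavaScript can then fetch the generated JSON, with no database or server-side code involved at request time.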

Local time for mutt email display

I use mutt as my email client. Something that has been bugging me recently is that when reading a message it displays the original Date: header in the sender's timezone. Since I work with people in several different zones I am constantly doing timezone math when looking at these. So I decided to fix that with a bit of Python.

One of mutt's features is that you can feed every email you view through a filter by using the display_filter setting. So I created a small Python app using the email module to parse the message, grab the original date, and add a new header named X-Date: that has my local time in it. It looks like this:
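A minimal sketch of such a filter (not necessarily the original script), using the stdlib email.utils helpers to convert the Date: header to local time:

```python
#!/usr/bin/env python3
import sys
import email
from email.utils import parsedate_to_datetime, format_datetime

def add_local_date(raw):
    """Add an X-Date: header carrying the Date: converted to local time."""
    msg = email.message_from_string(raw)
    orig = msg.get("Date")
    if orig:
        dt = parsedate_to_datetime(orig)
        # astimezone() with no argument converts to the local timezone
        msg["X-Date"] = format_datetime(dt.astimezone())
    return msg.as_string()

if __name__ == "__main__":
    sys.stdout.write(add_local_date(sys.stdin.read()))
```

mutt then runs it on every message with something like `set display_filter="~/bin/xdate.py"` in .muttrc (the path is an example).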

tidy_html plugin for rawdog

Requires the python-tidy package on Fedora. It cleans up the HTML, preventing broken elements from spilling over into adjacent postings. The code was lifted from feedparser.py and dropped into a plugin for rawdog, since I couldn't find an easy way to get mx.Tidy installed.

# rawdog plugin to tidy up html output using python-tidy module
# Brian C. Lane <bcl@brianlane.com>
#
from tidy import parseString
import rawdoglib.plugins

def tidy_html(config, box, baseurl, inline):
    data = box.value
    # tidy works on byte strings; remember whether to convert back
    is_unicode = isinstance(data, unicode)
    if is_unicode:
        data = data.encode('utf-8')

    data = str(parseString(data, output_xhtml=1, numeric_entities=1, wrap=0))

    if is_unicode:
        data = unicode(data, 'utf-8')

    # Keep only the content between <body ...> and </body>
    if '<body' in data:
        data = data.split('<body', 1)[1]
        if '>' in data:
            data = data.split('>', 1)[1]
    if '</body' in data:
        data = data.split('</body', 1)[0]

    box.value = data.strip()

rawdoglib.plugins.attach_hook("clean_html", tidy_html)

Newseum Page Grabber Script

Newseum archives the front pages of over 500 newspapers from all around the world. If you know the ID of the papers you want to see, you can use this simple Python program to download the JPEGs of the papers' front pages to your local system. Edit the CITIES list to set the IDs of the papers to be grabbed.

#!/usr/bin/env python
"""
    Quick Newseum Frontpage Grabber script
    Copyright 2009 by Brian C. Lane
    Imp Software
    All Rights Reserved

    Modify CITIES list below to add the city designators (as seen in the
    URLS at http://www.newseum.org/todaysfrontpages/default.asp)
"""
import urllib2
import re
import os
import urlparse

# Add more cities here
CITIES = [ "AL_AS", "AL_MA",   ]

NEWSEUM_URL="http://www.newseum.org/todaysfrontpages/hr.asp?fpVname=%s"
NEWSEUM_IMG="http://www.newseum.org"

def fetchNewseumImage(city):
    """
    Fetch the image for a city
    """
    print "Parsing the page for %s" % (city)
    page = urllib2.urlopen(NEWSEUM_URL % city).read()

    # Quick and dirty grep for the image name
    match = re.search('<img class="tfp_lrg_img" src="(.*)" alt=', page)
    if match:
        img_url = NEWSEUM_IMG + os.path.abspath(match.group(1))
        print "Saving the image for %s" % (city)
        image = urllib2.urlopen(img_url).read()
        with open(os.path.basename(match.group(1)), "wb") as f:
            f.write(image)

def main():
    """
    Main code goes here
    """
    for city in CITIES:
        fetchNewseumImage(city)

if __name__ == '__main__':
    main()

The source is also hosted on GitHub.

Hygrosens Python Library

Hygrosens manufactures a number of sensors for measuring temperature, humidity, light level, and pressure. Their devices use a common serial data format for a wide variety of sensors, including 1-wire sensors from Dallas Semiconductor. This library reads the output from Hygrosens devices and passes each reading to a callback function as a dictionary. I have included an example that outputs the readings in human-readable format, and another that stores the readings in a MySQL database.
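The callback interface works roughly like this sketch. The `type:id:value` line format below is made up for illustration only; the real library parses the Hygrosens serial protocol instead:

```python
def process_readings(lines, callback):
    """Parse sensor output lines and hand each reading to callback as a dict.

    The 'type:id:value' format here is illustrative, not the actual
    Hygrosens wire format.
    """
    for line in lines:
        sensor_type, sensor_id, value = line.strip().split(":")
        callback({"type": sensor_type, "id": sensor_id, "value": float(value)})

def show(reading):
    """Example callback: print a reading in human-readable form."""
    print("%(type)s %(id)s = %(value).1f" % reading)
```

The MySQL example works the same way, with a callback that issues an INSERT for each reading instead of printing it.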