Category Archives: Geek

Django Unit Tests Against Unmanaged Databases

A Django project I’m working on defines two databases in its config: The standard/default internal db as well as a remote legacy read-only database belonging to my organization. Models for the read-only db were generated by inspectdb, and naturally have managed = False in their Meta class, which prevents Django from attempting any form of migration on them.

Unfortunately, that also prevents the Django test runner from trying to create a schema mirror of it during test runs. But what if you want to stub out some sample data from the read-only database into a fixture that can be loaded and accessed during unit tests? You’ll need to do the following:

  • Tell Django to create the second test database locally rather than on the remote host
  • Disable any routers you have that route queries for certain models through the remote db
  • Tell Django to override the managed = False attribute in the Meta class during the test run

Putting that all together turned out to be a bit tricky, but it’s not bad once you understand how and why you need to take these steps. Because you’ll need to override a few settings during test runs only, it makes sense to create a separate test_settings.py to keep everything together:

import sys

from project.local_settings import *
from django.test.runner import DiscoverRunner


class UnManagedModelTestRunner(DiscoverRunner):
    '''
    Test runner that automatically makes all unmanaged models in your Django
    project managed for the duration of the test run.
    Many thanks to the Caktus Group: http://bit.ly/1N8TcHW
    '''

    def setup_test_environment(self, *args, **kwargs):
        from django.db.models.loading import get_models
        self.unmanaged_models = [m for m in get_models() if not m._meta.managed]
        for m in self.unmanaged_models:
            m._meta.managed = True
        super(UnManagedModelTestRunner, self).setup_test_environment(*args, **kwargs)

    def teardown_test_environment(self, *args, **kwargs):
        super(UnManagedModelTestRunner, self).teardown_test_environment(*args, **kwargs)
        # reset unmanaged models
        for m in self.unmanaged_models:
            m._meta.managed = False

# Since we can't create a test db on the read-only host, and we
# want our test dbs created with postgres rather than the default, override
# some of the global db settings, only to be in effect when "test" is present
# in the command line arguments:

if 'test' in sys.argv or 'test_coverage' in sys.argv:  # Covers regular testing and django-coverage

    DATABASES['default']['ENGINE'] = 'django.db.backends.postgresql_psycopg2'
    DATABASES['default']['HOST'] = '127.0.0.1'
    DATABASES['default']['USER'] = 'username'
    DATABASES['default']['PASSWORD'] = 'secret'

    DATABASES['tmi']['ENGINE'] = 'django.db.backends.postgresql_psycopg2'
    DATABASES['tmi']['HOST'] = '127.0.0.1'
    DATABASES['tmi']['USER'] = 'username'
    DATABASES['tmi']['PASSWORD'] = 'secret'


# The custom routers we're using to route certain ORM queries
# to the remote host conflict with our overridden db settings.
# Set DATABASE_ROUTERS to an empty list to return to the defaults
# during the test run.

DATABASE_ROUTERS = []

# Set Django's test runner to the custom class defined above
TEST_RUNNER = 'project.test_settings.UnManagedModelTestRunner'
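
One caveat about the runner above: django.db.models.loading was removed in Django 1.9, so on newer versions the same model lookup in setup_test_environment goes through the app registry instead:

from django.apps import apps

self.unmanaged_models = [m for m in apps.get_models() if not m._meta.managed]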

With that in place, you can now run your tests with:

./manage.py test --settings=project.test_settings

… leaving settings untouched during normal site operations. You can now serialize some data from your read-only host and load it as a fixture in your tests:

from django.test import TestCase


class DirappTests(TestCase):

    # Load test data into both dbs:
    fixtures = ['auth_group.json', 'sample_people.json']

    ...

    def test_stub_data(self):
        # Guarantees that our sample data is being loaded in the test suite
        person = Foo.objects.get(id=7000533)
        self.assertEqual(person.first_name, "Quillen")
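
As for generating a fixture like sample_people.json in the first place, dumpdata pointed at the second database alias does the trick (the app label here is a placeholder; Foo is the inspectdb-generated model from the test above):

./manage.py dumpdata legacyapp.Foo --database=tmi --indent=2 > legacyapp/fixtures/sample_people.json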

Please Don’t Text Me

In the olden days, a typical worker’s desk had an “inbox” and an “outbox.” The “inbox” represented things that needed to be dealt with, and the “outbox” represented things that were done. When email came along, its designers wisely emulated this metaphor.

Your email inbox represents everything you haven’t dealt with yet, but that needs to be. While managing your email, you’re engaged in an ongoing process of deleting things you don’t need to ever see again, or archiving things that have been dealt with but need to be kept for reference. If it doesn’t need to be dealt with, it has no excuse to exist in your inbox. At the end of every day, what’s left in your inbox is the (hopefully very small) list of things you haven’t gotten around to. But you know they’ll still be there tomorrow – they won’t be lost. Your inbox is, in essence, the most important on-going to-do list you’ve got.

Text messaging apps have no such concept. When a text is new, you get an alert. But the moment you glance at it, there is no mechanism for separating it out from all of the thousands of other texts piled up in your app – it becomes part of the noise. There is no way to know what in your text app needs responding to and what does not.

Therefore, when you send me a text, I have two choices:

  1. Drop everything and respond right now so your message doesn’t get forgotten
  2. Add your message to the “mental stack” of things that need to be dealt with later.

Most of the time, when a new text rolls in, I’m not able to deal with it right now. Ipso facto, most of the time, when a new text rolls in, it’s bound to get forgotten – I’ll never see it again. Unless I add it to my mental stack, i.e. unless I incur a cognitive burden.

Case in point: A few days ago a text rolled in while I was on a long bike ride, asking for information I didn’t have access to at the time. When I arrived home six hours later, that text was the absolute last thing on my mind. It was gone, virtually forgotten. There was nothing to remind me that it ever existed. If it had been an email, the fact of it existing in my inbox would have ensured that it got the response it deserved. The sender had simply chosen the wrong tool for the job.

Because of this reality, when you send me a text, you are putting a burden on me. You are saying, “Drop what you’re doing and respond to me right now, regardless of whether it’s convenient for you, lest this communication be forgotten.”

When you email me, you’re saying “Please respond to this when the timing is convenient for you.” With email, I have the luxury of being able to delay my response a day or two if needed. There is no cognitive burden – I don’t have to remember to respond. I’ll know to respond later, because your message is there in my inbox.

So on what occasions is a text more appropriate than email?

  • We’re arranging details about something that’s happening now or in a few hours
  • You just want to say hello or share something simple that doesn’t demand an immediate response
  • Computer is on fire.

If you’re planning something that is not happening today, please don’t text. If you’re communicating important information that needs real typing to work out, please don’t text. If you’re communicating information I might want to be able to refer to later, please don’t text.

I know there’s a lot of talk about how “email is dead” and “email belongs to the old,” and about how some young people actually prefer text over email. I say it’s not about youth – it’s about respecting people’s time, regardless of age (and everyone is busy!). Also: Claims about the death of email are grossly exaggerated – for me and millions of others, email is still the centerpiece of online communication.

I’m not asking you never to text me. I’m asking you to ask yourself whether what you have to say rises to the level of deserving a time-stealing text.

Taming a Mammoth Music Collection

Whether you’re talking about LPs or MP3s, people have really different ideas about what constitutes “the ultimate music collection.” For some, it means a process of endless refinement, boiling down a set of music to the purest essentials: all signal, no noise. For others, it’s an archival process (“Why have one Bix Beiderbecke CD when you could have 23?”).

It’s possible to have the best of both worlds: Maintain a large collection so you have access to everything, but create a playback system so you only end up hearing what you truly love.

I’ve been an eMusic subscriber for nearly a decade. I’ve spent a good deal of my spare time over the past four years digitizing my entire record collection, followed by my entire CD collection, followed by the large CD collections of six record-collecting friends (one of which alone was basically the Musical Library of Alexandria). All told, I’ve managed to amass a collection of ~120,000 tracks spanning ~9,100 albums, mostly in lossless format, and all with high-quality album art.

The accreted set now weighs around 2.25 terabytes – large enough to have “special needs.” Over the past four years of building the collection, I’ve picked up a few tips. Thought I’d share some of the most useful bits here, in case anyone finds them helpful.

Love it or hate it, iTunes has enough traction to be considered the “default” music player for almost everyone, so I’m going on the assumption that it’s your player too. If you use something else, power to you! Everything below assumes you use iTunes 11 or 12.

This guide is split up into four major sections:

  • Remote Control (Playback techniques)
  • Miscellaneous iTunes Tips (Rare B-sides)
  • Digitization Notes
  • Building a Server

Continue reading

Displaying Django User Messages with Angular.js

Django’s Messages framework is an elegant workhorse, and I’ve never built a Django site that didn’t use it for displaying success/failure/info messages to users after certain actions are taken (like logging in successfully or adding an item to a cart).

But wouldn’t it be cool if you could use that functionality client-side, delivering user messages to be processed as JSON data rather than statically outputting messages to generated HTML? On a recent project, I needed to do this because Varnish caching doesn’t let you mark page fragments as non-cacheable, so statically generated messages were not an option. But there are all sorts of reasons you might want to handle Django Messages client-side.

Here’s how to accomplish the job in a really lightweight way, without the need for a full-blown REST API app like Django Rest Framework or Tastypie, and with Angular.js (which is, IMO, the best of the current crop of JavaScript application frameworks).
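
The heart of the server side is tiny. Here’s a minimal sketch of the view (the view name and JSON keys are my own illustrative choices, not a prescribed API):

import json

from django.contrib.messages import get_messages
from django.http import HttpResponse


def messages_json(request):
    # Iterating the messages storage marks each message as read, so a
    # given message is delivered to the client exactly once
    payload = [
        {'level': m.level, 'tags': m.tags, 'text': str(m.message)}
        for m in get_messages(request)
    ]
    return HttpResponse(json.dumps(payload), content_type='application/json')

Wire that up to a URL, and an Angular service can fetch it after each page load or route change and hand the array off for rendering.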
Continue reading

DIY.org

diy.org is one of the best sites I’ve ever encountered for keeping kids stimulated and engaged in the real world. Beautifully designed and engineered, it breaks real-world maker skills into more than a hundred categories. When kids accomplish three tasks in a category, they get a virtual badge (you can purchase a real version of the badge for $5). This is the site I wish I had thought to build, dangit.

No idea what their monetization strategy is, but huge applause to the engineers and designers behind the project.

Miles (@Milezinator) is spending his Christmas break on a mad DIY badge quest (a blissful escape from Minecraft for us!).

django-allauth: Retrieve First/Last Names from FB, Twitter, Google

Of the several libraries/packages available for setting up social network logins for Django projects, I currently find django-allauth the most complete, with the best docs and the most active development. Doesn’t hurt that the lead dev on the project is super friendly and responsive on StackOverflow!

But not everything about it is intuitive. After wiring up Twitter, Facebook and Google as login providers, I found that first and last names were not being retrieved from the remote services when an account was successfully created. I also, frustratingly, could find only the most oblique references online to how to accomplish this.

There are a couple of ways to go about it – you can either receive and handle the allauth.account.signals.user_signed_up signal that allauth emits on success, or set up allauth.socialaccount.adapter.DefaultSocialAccountAdapter, which is also unfortunately barely documented.
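
For completeness, the adapter route would look roughly like this; a minimal sketch assuming a project/adapters.py module (populate_user and the SOCIALACCOUNT_ADAPTER setting are allauth’s own; the module path and class name are just placeholders):

# project/adapters.py
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter


class NamePopulatingAdapter(DefaultSocialAccountAdapter):

    def populate_user(self, request, sociallogin, data):
        # Let allauth perform its default population first
        user = super(NamePopulatingAdapter, self).populate_user(request, sociallogin, data)
        # sociallogin.account.extra_data holds the same provider metadata
        # used in the signal handler below
        return user

# In settings.py:
# SOCIALACCOUNT_ADAPTER = 'project.adapters.NamePopulatingAdapter'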

I decided to go the signals route. The key to making this work is in intercepting the sociallogin parameter your signal handler will receive when an account is successfully created. I then installed a breakpoint with import pdb; pdb.set_trace() to inspect the contents of sociallogin. Once I had access to those goodies, I was able to post-populate the corresponding User objects in the database.

This sample code grabs First/Last names from Twitter, Facebook or Google; season to taste:


# When account is created via social, fire django-allauth signal to populate Django User record.
from allauth.account.signals import user_signed_up
from django.dispatch import receiver

@receiver(user_signed_up)
def user_signed_up_(request, user, sociallogin=None, **kwargs):
    '''
    When a social account is created successfully and this signal is received,
    django-allauth passes in the sociallogin param, giving access to metadata on the remote account, e.g.:

    sociallogin.account.provider  # e.g. 'twitter' 
    sociallogin.account.get_avatar_url()
    sociallogin.account.get_profile_url()
    sociallogin.account.extra_data['screen_name']

    See the socialaccount_socialaccount table for more in the 'extra_data' field.
    '''

    if sociallogin:
        # Extract first / last names from social nets and store on User record
        if sociallogin.account.provider == 'twitter':
            # Twitter sends a single 'name' string; split it once, guarding
            # against single-word names so the indexing can't blow up
            name_parts = sociallogin.account.extra_data['name'].split(None, 1)
            user.first_name = name_parts[0] if name_parts else ''
            user.last_name = name_parts[1] if len(name_parts) > 1 else ''

        elif sociallogin.account.provider == 'facebook':
            user.first_name = sociallogin.account.extra_data['first_name']
            user.last_name = sociallogin.account.extra_data['last_name']

        elif sociallogin.account.provider == 'google':
            user.first_name = sociallogin.account.extra_data['given_name']
            user.last_name = sociallogin.account.extra_data['family_name']

        user.save()

Force WordPress Auto-Update

I was excited to try out the new auto-update feature in WordPress 3.7.1. But the first attempt failed, since I had an old .svn directory sitting around. Deleted that, then waited… days. Communicated with one of the core devs on Twitter, who said that this first rollout was intentionally slow, to get things up to speed.

Finally got tired of waiting and decided to take matters into my own hands. Didn’t want to click the Update button – that would be cheating. Discovered there’s a new function, wp_maybe_auto_update(), that triggers the process that’s supposed to run via wp-cron.

So to trigger it from the command line, all you have to do is to create a small script in your WP root directory that bootstraps WP core and calls the function:

<?php
  // request-update.php
  require( dirname(__FILE__) . '/wp-load.php' );
  wp_maybe_auto_update();
?>

With that in place, run:

php ./request-update.php

Wait a few seconds, and your site should be updated. Check the Updates page in your Dashboard and confirm that you received the update email, and Bob’s your uncle.

Miles’ Minecraft Channel

Over the past few months, my 10-yr-old son (now 11!) has been producing his own YouTube video podcast series. Nearly every morning before school, he’s in the office with a microphone and QuickTime’s Screen Capture feature, narrating a Minecraft how-to or walk-through sequence of some kind. He’s becoming a real pro.

Now that he’s developed a solid set of videos, he asked me for a bit of help promoting his channel. He’d love to have more subscribers, if you or your kids are into Minecraft. Here’s the channel link.

I much prefer his tips on creative build techniques, like the “Epic sandfall” embedded below. I’m not nearly as into the PvP mode game tours, but as long as it’s clean and non-violent, I’m OK with it.

Vimeo vs. YouTube for GoPro Footage

Owning a GoPro camera is a total blast, but having to deal with the ultra-high-def footage and non-standard frame rates it generates forces you to think of details you might not have had to think about before. And beyond that, of course you want to show off all that pixel clarity. Watching one of your clips on the desktop is a jaw-dropping experience; watching it again after it’s been uploaded to the web is comparatively disappointing. But hosting the original files on your own server isn’t a very nice option either.

After spending the day with a GoPro on my head at the Santa Cruz Boardwalk yesterday, I tried uploading one of the clips to both YouTube and Vimeo; you can check them both out below for the sake of comparison (try both of them full-screen).

Here’s the Vimeo version:

Double Shot, Santa Cruz Boardwalk w/GoPro Helmet Cam from Scot Hacker on Vimeo.

Since Vimeo is known for having the highest quality, it’s no surprise that the Vimeo version has less pixelation and more retained detail. But I’ve got seven clips to upload, and have to “wait for my week to reset” before I can upload more high-def footage, unless I spring for the “Plus” version at $10/month. Otherwise I have to wait for Wednesday to roll around if I want it free.

And here’s the YouTube version:

I don’t mind paying for services that provide quality, but $10/month is kind of steep for me, given how seldom I’ll need this ability. Hrmm, what to do.

Intro to Python Programming

For the past week, I’ve been mulling an offer to write another book (“Intro to Python Programming” for Penguin Press). It’s been more than a decade since my last book, and I’d really enjoy the opportunity, but I’m trying to get better about saying “no” to things, to stop pushing myself so hard all the time, and to start having “margins of life” to enjoy like normal folk.

After a lot of thought, I just politely declined; can’t face the prospect of six months of deadlines and lost weekends. At one point, an offer like this would have seemed like part of living a juicy life; now it feels like just another thing that distracts from it. Hoping it was the right decision.