Of the several libraries/packages available for setting up social network logins for Django projects, I currently find django-allauth the most complete, with the best docs and the most active development. Doesn’t hurt that the lead dev on the project is super friendly and responsive on StackOverflow!
But not everything about it is intuitive. After wiring up Twitter, Facebook and Google as login providers, I found that first and last names were not being retrieved from the remote services when an account was successfully created. I also, frustratingly, could find only the most oblique references online to how to accomplish this.
There are a couple of ways to go about it – you can either receive and handle the
allauth.account.signals.user_signed_up signal that allauth emits on success, or set up
allauth.socialaccount.adapter.DefaultSocialAccountAdapter, which is also unfortunately barely documented.
I decided to go the signals route. The key to making this work is in intercepting the
sociallogin parameter your signal handler will receive when an account is successfully created. I then installed a breakpoint with
import pdb; pdb.set_trace() to inspect the contents of
sociallogin. Once I had access to those goodies, I was able to post-populate the corresponding User objects in the database.
This sample code grabs First/Last names from Twitter, Facebook or Google; season to taste:
# When an account is created via a social provider, django-allauth fires the
# user_signed_up signal; receive it to populate the Django User record.
from allauth.account.signals import user_signed_up
from django.dispatch import receiver

@receiver(user_signed_up)
def user_signed_up_(request, user, sociallogin=None, **kwargs):
    """
    When a social account is created successfully and this signal is received,
    django-allauth passes in the sociallogin param, giving access to metadata
    on the remote account, e.g.:

        sociallogin.account.provider  # e.g. 'twitter'

    See the 'extra_data' field in the socialaccount_socialaccount table for more.
    """
    # Extract first / last names from the social nets and store them on the User record
    if sociallogin:
        if sociallogin.account.provider == 'twitter':
            name = sociallogin.account.extra_data['name']
            user.first_name = name.split()[0]
            user.last_name = name.split()[-1]
        elif sociallogin.account.provider == 'facebook':
            user.first_name = sociallogin.account.extra_data['first_name']
            user.last_name = sociallogin.account.extra_data['last_name']
        elif sociallogin.account.provider == 'google':
            user.first_name = sociallogin.account.extra_data['given_name']
            user.last_name = sociallogin.account.extra_data['family_name']
        user.save()
I was excited to try out the new auto-update feature in WordPress 3.7.1. But the first attempt failed, since I had an old
.svn directory sitting around. Deleted that, then waited… days. Communicated with one of the core devs on Twitter, who said that this first rollout was intentionally slow, to get things up to speed.
Finally got tired of waiting and decided to take matters into my own hands. Didn’t want to click the Update button – that would be cheating. Discovered there’s a new function call:
wp_maybe_auto_update() that triggers the process that’s supposed to run via wp-cron.
So to trigger it from the command line, all you have to do is to create a small script in your WP root directory that bootstraps WP core and calls the function:
require( dirname(__FILE__) . '/wp-load.php' );
wp_maybe_auto_update();
With that in place, run the script with the command-line PHP interpreter (substitute whatever you named the file; update-now.php here is just an example):
php update-now.php
Wait a few seconds, and your site should be updated. Check the Updates page in your Dashboard and confirm that you received the update email, and Bob’s your uncle.
Over the past few months, my 10-yr-old son (now 11!) has been producing his own YouTube video podcast series. Nearly every morning before school, he’s in the office with a microphone and QuickTime’s Screen Capture feature, narrating a Minecraft how-to or walk-through sequence of some kind. He’s becoming a real pro.
Now that he’s developed a solid set of videos, he asked me for a bit of help promoting his channel. He’d love to have more subscribers, if you or your kids are into Minecraft. Here’s the channel link.
I much prefer his tips on creative build techniques, like the “Epic sandfall” embedded below. I’m not nearly as into the PvP mode game tours, but as long as it’s clean and non-violent, I’m OK with it.
Owning a GoPro camera is a total blast, but having to deal with the ultra-high-def footage and non-standard frame rates it generates forces you to think of details you might not have had to think about before. And beyond that, of course you want to show off all that pixel clarity. Watching one of your clips on the desktop is a jaw-dropping experience; watching it again after it’s been uploaded to the web is comparatively disappointing. But hosting the original files on your own server isn’t a very nice option either.
After spending the day with a GoPro on my head at the Santa Cruz Boardwalk yesterday, I tried uploading one of the clips to both YouTube and Vimeo; you can check them both out below for the sake of comparison (try both of them full-screen).
Here’s the Vimeo version:
Double Shot, Santa Cruz Boardwalk w/GoPro Helmet Cam from Scot Hacker on Vimeo.
Since Vimeo is known for having the highest quality, it’s no surprise that the Vimeo version has less pixelation and more retained detail. But I’ve got seven clips to upload, and have to “wait for my week to reset” before I can upload more high-def footage, unless I spring for the “Plus” version at $10/month; otherwise I have to wait for Wednesday to roll around.
And here’s the YouTube version:
I don’t mind paying for services that provide quality, but $10/month is kind of steep for me, given how seldom I’ll need this ability. Hrmm, what to do.
For the past week, I’ve been mulling an offer to write another book (“Intro to Python Programming” for Penguin Press). It’s been more than a decade since my last book, and I’d really enjoy the opportunity, but am trying to get better about saying “no” to things, stop pushing myself so hard all the time, and to start having “margins of life” to enjoy like normal folk.
After a lot of thought, I just politely declined; can’t face the prospect of six months of deadlines and lost weekends. At one point, an offer like this would have seemed like part of living a juicy life; now it feels like just another thing that distracts from it. Hoping it was the right decision.
Even the passwords we once considered “strong” have become almost trivially easy for sophisticated crackers to break. With massive arrays of fast processors, lightning-fast graphics cards, and extremely sophisticated cracking techniques, hackers are making mincemeat out of stolen password databases and openly trading them on the black market.
Most “average user” passwords now fall so easily that some in the security community feel the username/password mechanism itself must be traded in for something entirely different, like biometrics. But until that time comes, you need to be doing everything you can to make your passwords as secure as possible.
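To see concretely why typical passwords fall so fast, it helps to look at the brute-force search space. This back-of-the-envelope sketch (my own illustration, not taken from any particular cracking tool) computes the bits of entropy in a randomly chosen password:

```python
import math

def entropy_bits(length, charset_size):
    """Bits of entropy for a password chosen uniformly at random
    from a character set of the given size."""
    return length * math.log2(charset_size)

# 8 lowercase letters vs. 12 characters drawn from upper/lower/digits:
print(round(entropy_bits(8, 26)))    # 38 bits -- quick work for modern rigs
print(round(entropy_bits(12, 62)))   # 71 bits -- a vastly larger search space
```

Every added bit doubles the search space, which is why length and character variety matter more than any single “clever” substitution.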
The purpose of this piece is not to scare you, but to give you the tools you need to stay safe.
There’s a ton of advice floating around out there on what makes for a good password, how to create memorable (and easy-to-type) passwords, and how to keep track of lots of different passwords. Unfortunately, a lot of that advice is written by geeks for geeks, while the people who need the advice most are “regular” (non-geek) users. If you’re a geek, chances are you’re already doing most of this stuff. This article is an attempt to summarize the best password-hygiene advice out there for your parents, bosses, aunts and uncles, and non-geek friends.
There’s a twist at the end, plus an explanation of the graphic above, so please read all the way through.
I’m awed almost daily by the simplicity and elegance of Angular.js. By eliminating all of the DOM access syntax we’ve come to take for granted in jQuery and friends, and by giving any element on the page a live, two-way data binding relationship with your business logic, Angular lets you create anything from simple widgets to full-on Single Page Applications with the fewest lines of code possible.
I recently created a live GPA calculator as part of a large SPA I’m working on in my day job, but have boiled it down to its bare essence for this widget demo. Try changing the dropdown options here and watch the GPA calculation change in real-time:
View html | View script
In this example, we assume that a student’s current course load comes in over the wire with course names and units. We iterate over the course set and, for all courses being taken for a letter grade, multiply the numeric weight of a predicted grade by the number of units. Those scores get added up, then divided by the total number of units. When a new grade estimate is selected from a dropdown, we need to recalculate the whole aggregate. Let’s step through it.
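The calculation described above can be sketched in plain Python (the real widget is Angular/JavaScript; the grade weights and course structure here are illustrative assumptions):

```python
# Standard 4-point grade weights (assumed for illustration).
GRADE_WEIGHTS = {'A': 4.0, 'B': 3.0, 'C': 2.0, 'D': 1.0, 'F': 0.0}

def predicted_gpa(courses):
    """Weighted GPA over letter-graded courses only:
    sum(grade weight * units) / total letter-graded units."""
    graded = [c for c in courses if c['letter_graded']]
    total_units = sum(c['units'] for c in graded)
    if total_units == 0:
        return 0.0
    points = sum(GRADE_WEIGHTS[c['grade']] * c['units'] for c in graded)
    return points / total_units

courses = [
    {'units': 4, 'grade': 'A', 'letter_graded': True},
    {'units': 3, 'grade': 'B', 'letter_graded': True},
    {'units': 2, 'grade': 'A', 'letter_graded': False},  # pass/no-pass: excluded
]
print(round(predicted_gpa(courses), 2))  # 3.57
```

In the Angular version, the same function simply re-runs whenever a dropdown's model value changes, which is what makes the recalculation feel live.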
So everyone’s going apeshit over the impending death of Google Reader. Can we keep a bit of perspective on this please?
- We loved and used RSS before Google Reader, and we’ll continue to love and use RSS long after it’s gone.
- Google Reader is just another RSS client. OK, its community integration features were unique, but as a pure client, there always have been, and will always continue to be, lots of far superior alternatives.
- This has nothing to do with “the death of open standards.” Nothing is happening to the RSS standard, for godssake.
- What do you expect from free software? A lifetime commitment?
I’ll grant that the big problem here is that Reader has become the default backing store for other clients. In fact, my favorite RSS client by far, Reeder, uses Google Reader as a storage and sync mechanism. Hopefully, Reeder will act quickly to enable other aggregators to fill that role, or to let us add feeds independently of a central aggregator. If it doesn’t, I’ll find one that does. Because, after all, that’s what all RSS aggregators did before Reader existed.
It’s not that big of a loss. RSS lives.
Thank God they spared Orkut.
Update: Reeder has already stated that they’ll live on after the death of Reader.
Here are 50+ Reader replacements either working now or on the horizon.
One of two mystifying downgrades that come with OS X Lion / Mountain Lion* is the fact that all traces of hex values have been removed from DigitalColor Meter. For hundreds of thousands of web developers, obtaining hex values is the only reason to open the app, and I suspect that web developers are the bundled app’s main users. This one is a total head-scratcher. Hopefully the change is a bug, not a feature, and it’ll be back someday.
Meanwhile, if you’re looking for a workable free replacement, check out this simple Colors app. Not quite as elegant, but gets the job done just fine.
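In the meantime, the app still shows decimal RGB values, and converting those to a hex triplet by hand is trivial. A quick sketch in Python (just an illustration of the conversion, not tied to either app):

```python
def rgb_to_hex(r, g, b):
    """Convert 0-255 RGB components to a CSS-style hex triplet."""
    return '#{:02x}{:02x}{:02x}'.format(r, g, b)

print(rgb_to_hex(255, 102, 0))   # #ff6600
print(rgb_to_hex(0, 128, 255))   # #0080ff
```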
* The other mystifying downgrade in Mountain Lion is the “snooze” feature in iCal alerts. You used to be able to set a snooze to be re-reminded a few hours later, or the day before, or whatever you like. Now your only option is to acknowledge the alert and dismiss it, so the daily GTD workflow for bazillions of users is completely broken. What went through the heads of the designers who removed this critical feature is anyone’s guess.
This is primarily a guide for administrators of cPanel hosting systems, though tech-savvy cPanel users with shell access will be able to use this technique as well.
Users of webmail systems like GMail, Yahoo, etc. are accustomed to having a “Mark as Spam” button in the interface. Clicking the button tells the server that the selected message is spam, to prevent similar messages from showing up in the inbox again. So how can administrators of standard cPanel-based hosting systems provide similar functionality?