Archive for the ‘Web Design’ Category

I started using Sass (actually, scssphp, since I'm a PHP troglodyte) and I'm loving it. It makes CSS much cleaner; just having variables makes everything easier. The only catch is whether it would work with the non-standard CSS I use with my CSS parser for behaviors. The good news is that it does, with one caveat.
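As a trivial illustration of the kind of cleanup variables buy you (the names and values here are made up, not from my actual stylesheet):

```scss
$accent: #8b0000;
$gutter: 12px;

h2 { color: $accent; }
blockquote {
  border-left: 3px solid $accent;
  padding-left: $gutter;
}
```

Change the color once at the top and every rule that uses it follows along.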

Continue reading ‘Using parsecss with Sass’ »

I wanted to use the CSS pseudo-element :before to manipulate input elements. I had <input class="before-test"/>, and I wanted to use CSS to add a label before the element on small screens only (on a larger screen I would have a table with the label as the <th> element). So

  .before-test:before { content: "label: "; }

ought to work, right?


:before, despite its name, doesn't insert its content before the element specified. It inserts the content as the first child of the element specified. In other words, it works like jQuery's prepend, not like before. See Smashing Magazine's article for more details.

So the correct way to do this is:

<div class="before-test"><input/></div>

The selector then targets the enclosing element rather than the input.
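With the wrapper in place, the rule from above works unchanged: the generated content becomes the first child of the div, which puts it visually before the input (the breakpoint here is illustrative):

```css
/* show the label only on small screens, per the design above */
@media (max-width: 480px) {
  .before-test:before { content: "label: "; }
}
```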

Continue reading ‘Understanding :before and :after’ »

I've been asked to help update my daughter's school website (it's not terrible now, just dated and hard to update), and they have some official "branding" with specific colors and fonts. The colors are easy (though I had to explain to the PR person that just because we have Pantone colors doesn't mean I can get the website colors to match on everyone's monitors) but custom fonts are more complicated. Luckily, Paul Irish has worked out all the cross-browser bugs and FontSquirrel does all the work for you. I don't have anything to add.

Finally got the courage to bite the bullet and switch the domain registrar from 1&1 to Nearly Free Speech. I had been forwarding the domain to the new host for 5 months, but I was afraid there was something I would miss in the transition, or that I would lose an important file. Transferring the domain means closing the account with 1&1 and deleting everything, so it's irrevocable.

Transferring a domain involves two steps, first releasing it from 1&1, then telling NFS to take over. Releasing it took a while, and since I had private registration I had to call 1&1 customer support to get it finally done. The correct steps are, as far as I can tell:

  1. Make sure the NFS website correctly mirrors the original website, since you can't control exactly when the switch will happen and you hate to go down.
  2. On the 1&1 control panel, select Domains then select the check box for the correct domain.
  3. Select the Contact menu above the list of domains, select Private/Public Registration, and make the domain public. If you want to keep it private even for the week or so of the transfer, skip this step, but call customer service to have one of the administrators approve the transfer. NFS calls this inability to transfer a privately registered domain "Extortion by Proxy".
  4. Under Contacts, select Show Domain Contact Details and make sure those are right. If not, you'll never get the email to approve the transfer.
  5. Click the Info button and select Unlock Domain and copy the Auth code; that's the domain password to allow it to be transferred.

Now on the NFS side, log into your account and go to {yourusername}/domains/transfer?init=1. They will give you a to-do list before transferring, but you've already done all of that except the name-server part, which is easier to do after transferring the domain. Enter the domain name and follow the instructions; they're straightforward. You'll need the Auth code from above.

Wait a week. At some point, you'll get an email asking you to approve the transfer. Do so.

Change the nameservers (use "Set up DNS and name servers automatically").

It all seems to work well, and I'm now completely running on my new host!

I use Bing for the search box on this site, and it's worked well: simple API, no need to create a custom search engine as with Google. Unfortunately, Microsoft is losing almost half a million dollars an hour on Bing, and they want me to make up the difference. Well, not me alone, but they are going to start charging for using their web services. Fortunately, they are (as of now) providing a free tier of up to 5,000 queries a month, which is far more than I need.

So I have to sign up for Azure Marketplace (Azure is Microsoft's cloud service), subscribe to the Bing Web Search API, and create an application key. Then I need to convert my old requests into the new format. Luckily, Microsoft provides a migration guide (as a Word document!), and that includes sample code in PHP. The biggest difference is the need for HTTP authentication. The code from Microsoft works, as long as I leave out the proxy line in the context parameters (I guess they only tested their code on local servers) and file_get_contents works on URLs, which is enabled on my service with Nearly Free Speech. I imagine setting the header similarly with cURL would also work.

The other big difference is that they no longer return the total number of results if not all of them were returned. Now they return a parameter __next (note the two underscores) that contains the URL for getting more results if they are available. Since I'm only showing a limited list, I just need to test for the existence of that parameter to indicate that more results are available.
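The two changes can be sketched like this. The endpoint URL and the d/results/__next field names are my reading of the migration guide, and the helper names are my own, so treat this as a sketch rather than the finished code:

```php
<?php
// Hypothetical helpers for the new Bing API.

// HTTP Basic auth: the Azure account key is both username and password.
function bing_auth_header($accountKey) {
    return 'Authorization: Basic ' . base64_encode($accountKey . ':' . $accountKey);
}

// file_get_contents() accepts a stream context carrying the auth header
// (no proxy line!); requires allow_url_fopen.
function bing_query($query, $accountKey) {
    $url = ''
         . rawurlencode("'" . $query . "'") . '&$format=json';
    $context = stream_context_create(array(
        'http' => array('header' => bing_auth_header($accountKey))
    ));
    return json_decode(file_get_contents($url, false, $context), true);
}

// __next (two underscores) exists only when more results are available.
function bing_has_more($response) {
    return isset($response['d']['__next']);
}
```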

So the updated code is:

Continue reading ‘New Bing API’ »

OK, I'm convinced. While SexyBookmarks looked cool, and allowed me to put all those "Share or Tweet or Like This" buttons on each post, they have started to get to me. They're too cute, not terribly useful (I'm not that egotistical about spreading my words) and clearly are slowing things down. And the privacy issues are starting to weird me out. So I'm taking them off, both the social media buttons and the Sitemeter javascript. I'll keep the static Sitemeter image, because I am that egotistical that I want to have an idea about the numbers. I can use Apache's logs to get the referring sites if I'm curious.

As I wrote, I'm using Amazon S3 to store files that are too expensive to keep on my web server, with the plan of having frequently-updated files on the server and relatively constant stuff on S3. The address of my S3 server is stored in the global variable $_SERVER['CDN'].

So to include a file, I would do:

$filename = '/toinclude.php';
if (file_exists($_SERVER['DOCUMENT_ROOT'].$filename)){
  $filename = $_SERVER['DOCUMENT_ROOT'].$filename;
} else {
  $filename = $_SERVER['CDN'].$filename;
}
include ($filename);

Which I use often enough to want to generalize it into a class.
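As a sketch of what such a class might look like (the class name is mine; only file_exists and the two $_SERVER entries come from the snippet above):

```php
<?php
// Sketch: resolve a filename to the local copy if it exists,
// falling back to the CDN copy otherwise.
class S3Include {
    private $root;  // e.g. $_SERVER['DOCUMENT_ROOT']
    private $cdn;   // e.g. $_SERVER['CDN']

    public function __construct($root, $cdn) {
        $this->root = $root;
        $this->cdn = $cdn;
    }

    // Returns the local path if the file exists, the CDN URL otherwise.
    public function resolve($filename) {
        if (file_exists($this->root . $filename)) {
            return $this->root . $filename;
        }
        return $this->cdn . $filename;
    }

    // Note: include of a remote URL needs allow_url_include (a separate
    // setting from allow_url_fopen), so this may need to fall back to
    // file_get_contents plus caching on a restrictive host.
    public function load($filename) {
        include $this->resolve($filename);
    }
}
```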

Continue reading ‘Using S3 files in PHP’ »

Since I started using Nearly Free Speech 3 months ago, I've been very pleased. Getting them to set up access with a private key was straightforward, and the email support person was prompt, helpful and friendly. The only downsides are the safe mode restrictions, which I have been easily able to work around, and the expensive storage ($1/MB/month), which would add up quickly with all the icons and fonts I'm serving with the webservices.

So I put them onto Amazon's S3 with the intent of using that like a Content Delivery Network (though it isn't really one unless I pay for CloudFront as well): static, large files should come transparently from S3 while the dynamic site runs on the web server.

I do this with a bit of .htaccess hackery. It's harder to create or modify files on S3, so files that are in active development are on the webserver. I want to serve those files if they exist; only if the desired files do not exist do I want to get them from S3. Unfortunately, Nearly Free Speech does not support mod_proxy, so the redirecting is not transparent (and we can't do things that require same-origin security). But for images and the like, this works:

SetEnvIf Request_URI . CDN=
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} ^/(images|inc|fonts)/
RewriteRule . %{ENV:CDN}%{REQUEST_URI}

Line 1 creates a variable named CDN. The directive SetEnv would be more natural, but URL rewriting is done before SetEnv runs. The newer SetEnvIf runs early enough for the variable to be used for rewriting, but it's conditional, so we use a dummy condition: Request_URI ., which means "if the requested URI matches any character".

Line 4 tests whether the requested file exists on the server. Only if it does not exist (!-f) is the next line tested, which is whether the file is in any of the CDN directories.

If it is to be redirected, the new URL is created by concatenating the CDN variable with the requested URI, which does not contain the protocol or hostname. Thus a request for /images/silk/add.png has a REQUEST_URI of /images/silk/add.png, and the rewritten URL is the S3 address with that path appended.

The advantage of setting a variable in the .htaccess (aside from having the "magic constant" at the top of the file) is that it is passed to the PHP code as $_SERVER['CDN']. So the name of the S3 server is written in just one place, with no need to change multiple files if it changes.

Download the code.


Download the WP Audio Player Standalone.

So the rabbi asked me to add the ability to play the audio on the taamim page (basically, a long list of short MP3's) directly on the page, rather than clicking the link to open a new page. No problem, right? We're living in an HTML5 world, so I should be able to do:

  $('<audio>').attr({src: this.href, controls: 'controls'}).insertBefore(this);

And everything ought to work: browsers that can't handle <audio> elements get nothing, modern browsers get a modern audio player. Nice progressive enhancement.

But of course, it's not that easy. Webkit (Chrome, Safari) supports MP3 playing, but Firefox does not (and won't), and Internet Explorer only does for IE9 and up, and I have to support Windows XP and IE8 (source; consistent with my experimentation). I don't like the <embed>ed players, so I'll go with Flash. I like the player that Language Log uses, and viewing the source tells me that's WPAudioplayer, which has a standalone version that requires just two files, the 11-kb swf file and a 12-kb javascript file.

To use it, include the javascript with a <script> element and initialize the player with AudioPlayer.setup('/path/to/player.swf', {width: 100}); where 100 is the desired width of the player in pixels (it's constant for every player on the page and it's a mandatory option). Then, each player is implemented by replacing an existing element, identified by id: AudioPlayer.embed(id, {soundFile: '/path/to/soundfile.mp3'});.

Of course, iOS won't run Flash, so I still need to use the <audio> element there. So I need to detect if the audio element works, and if not, insert the Flash substitute. Browsers that can't handle either get a blank spot.

Putting it together into a plugin:

(function($) {

var uid = 0;
var init = function (swf, width){
	AudioPlayer.setup(swf, {width: width});
	init = $.noop;
};

$.fn.inline_mp3 = function(swf){
	return this.each(function(){
		var id = 'audioplayer_'+(uid++);
		var player = $('<audio>').attr({
			src: this.href,
			controls: 'controls',
			id: id
		}).insertBefore(this);
		// audio.canPlayType test
		if (!(player[0].canPlayType && player[0].canPlayType('audio/mpeg;').replace(/no/, ''))){
			init(swf, player.width());
			AudioPlayer.embed(id, {soundFile: this.href});
		}
	});
};

})(jQuery);
It uses a unique number to assign an id to each element, and lazy-initializes the Flash player. The player should be styled with a given width (since IE8 doesn't have a default <audio> size):

audio {
	width: 80px;
	display: inline-block;
}
And use it with a selector that picks up the MP3 links, something like:

  $('a[href$=".mp3"]').inline_mp3('/path/to/player.swf');

There are lots of other packages of HTML5/Flash-fallback audio players, but this is small and easy enough for me to understand.

I've used 1&1 since I started mucking about on the web; they had a cheap plan for $3/month with one domain name and a simple LAMP stack. But the price has been going up (now $5/month; still comparatively cheap!), I'm starting to chafe at the limitations (no SSH shell access, proprietary 404 pages, no languages besides PHP), and I didn't want to shell out for my own virtual server (Rackspace goes as low as $11/month, but I'm really cheap). So I was overjoyed when I found Nearly Free Speech.

Continue reading ‘Nearly Free Speech’ »