Tuesday 29 September 2009

Website finally gets Google Analytics

I really hate it when web sites make you wait to read their content because the page is waiting for an ad server or a Google Analytics server to respond before displaying an ad or tracking a visitor. The site owner may view these tools as important to their site, but as a user I do not. I came to the site on the 'promise' of certain content, not to exercise the site's implementation... and if the response takes too long (10 seconds or so) I find another web site to get the content from.

This is why I haven't (until now) implemented Google Analytics on my site. A user comes to my site for content they are interested in, not to enable me to track my user population. So what's changed?

I recently finished writing my open source javascript boot scripts (more on that in a later post), which load other resources after the web page has been fully rendered. One of the scripts they call loads Google Analytics in the background without affecting the user's experience of the page content. If Google's response is slow or fails, the user is never aware of it. I consider this a great improvement to the user experience: it respects the user's intentions whilst still meeting the site owner's management needs without alienating users.
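
The boot scripts themselves aren't published yet, so here is only a minimal sketch of the idea, assuming the traditional ga.js/_gat API and a placeholder tracker id ('UA-XXXXXX-X') rather than my actual code:

    // Defer Google Analytics until the page has finished rendering.
    function loadAnalytics() {
        var done = false;
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.src = ('https:' === document.location.protocol ? 'https://ssl' : 'http://www')
                 + '.google-analytics.com/ga.js';
        // Record the page view once ga.js arrives; if Google is slow or down,
        // the page has already rendered and the user never notices.
        ga.onload = ga.onreadystatechange = function () {
            if (!done && (!this.readyState ||
                          this.readyState === 'loaded' ||
                          this.readyState === 'complete')) {
                done = true;
                if (window._gat) {
                    window._gat._getTracker('UA-XXXXXX-X')._trackPageview(); // placeholder tracker id
                }
            }
        };
        document.getElementsByTagName('head')[0].appendChild(ga);
    }

    // Wait for the window load event so the download never competes with rendering.
    if (window.addEventListener) {
        window.addEventListener('load', loadAnalytics, false);
    } else if (window.attachEvent) {
        window.attachEvent('onload', loadAnalytics);
    }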

Friday 18 September 2009

Optimised Webpage Delivery and New Cool Tools

I'm really pleased. Spending more time finding solutions for front-end issues has certainly been worth it. I have been trying to minimise network load by deflating resources, minifying css and js, and easily managing resource version numbers so that lengthy expiry dates can be used. There are certainly some great solutions out there... so this is what I did:


Deflating resource sizes: For the php files I used ob_start("ob_gzhandler");. For the css (and future js) I used Minify, which worked for me when other solutions would not. All fully automated.


Minifying resources: This is quite labour intensive. To do it, I manually replaced all @import statements in the css files with the css from the imported files. Then I ran YUI Compressor to minify the css files.


Versioning resources: Let's face it, doing this by hand is tedious and, more importantly, error-prone. Using resources' 'natural' names and managing them in a version control system is the way to go. But using lengthy expiry dates while letting users pick up your latest resource changes requires a way to uniquely identify those resources through their URLs. I found the article Automatically Version Your CSS and JavaScript Files at Particle Tree and implemented it.


So what is the outcome of this effort? For my homepage browsers were doing the following:

Browser Type    HTTP Connections    Total Size
non-ie          8                   43KB
ie8             10                  46KB
ie6/7           11                  46KB

(ie8 and ie6/7 have conditional comments to load additional css rules) and afterwards:

Browser Type    HTTP Connections    Total Size    Reduction
non-ie          4                   8KB           81%
ie8             5                   8.5KB         81%
ie6/7           6                   8.5KB         81%

That's faster page delivery and a snappy response (Host Provider permitting).


Steve Souders announced a couple of new tools today. I think you're going to like this one... SpriteMe.

Thursday 17 September 2009

Optimisations Thwarted by Host Provider

Last night I spent several hours implementing different ways to gzip the css and js resources for my website. Solutions included modifying .htaccess, using a php script, combining a php script with .htaccess and mod_rewrite, and gzipping on the development client with an AddType in .htaccess. The solutions ranged from simply deflating on the server all the way to php scripts that combined, minimised, cached and deflated - just move your files over to the web server and the rest is taken care of.

There are some really great solutions out there, but none worked for me when it came to deflating. It seems gzipped items, whilst delivered to the browser, arrive empty. It's definitely down to my host provider's configuration, although they say you can gzip php files, so maybe there's a way.

Over the next few weeks I'll try some different ideas and see if I can get these resources deflated. Until then I'll stick with minification using YUI Compressor on the development client.

Thursday 10 September 2009

Loading JS Dynamically and Unobtrusively

For a while I've wanted to implement unobtrusive javascript to support Steve Souders' ideas for high performance web sites. For me, using unobtrusive js to load _all_ the page's js after the page has loaded is a logical extension of the philosophy of progressive enhancement: first create the HTML page, second add CSS to make it visually pleasing, and finally add the javascript to knock their socks off!! There are many more reasons to create a web page this way, but that's for another post. Others refer to this dynamic loading of js as 'progressive rendering' (obviously only if the js modifies the user interaction).

The architecture I've been creating for a general solution, one that can be used across different customer sites without changes, is to have a small amount of inline js in each page (preferably as the last tag before the closing body tag). When DOMContentLoaded fires, it loads a small external js file, which in turn loads all the other js the developer wants plus any additional CSS files that are only used for the js enhancements to a page. In the web page the developer defines which js and css resources are needed using a couple of simple apis provided by the inline script: ct.resources.loadJS() and ct.resources.loadCSS().
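
The boot script isn't released yet, so the following is only a minimal sketch of the shape of that api. It collapses the two-stage design (inline js plus external boot file) into one self-executing function, reduces IE support to a crude onload fallback, and skips the error handling and ordering logic of the real thing:

    var ct = ct || {};
    ct.resources = (function () {
        var jsQueue = [], cssQueue = [];

        function injectJS(url) {
            var s = document.createElement('script');
            s.type = 'text/javascript';
            s.src = url;
            document.getElementsByTagName('head')[0].appendChild(s);
        }

        function injectCSS(url) {
            var l = document.createElement('link');
            l.rel = 'stylesheet';
            l.type = 'text/css';
            l.href = url;
            document.getElementsByTagName('head')[0].appendChild(l);
        }

        // Runs once the page has rendered; everything queued so far is then
        // fetched in the background without blocking the initial render.
        function flush() {
            for (var i = 0; i < cssQueue.length; i++) { injectCSS(cssQueue[i]); }
            for (var j = 0; j < jsQueue.length; j++) { injectJS(jsQueue[j]); }
        }

        if (document.addEventListener) {
            document.addEventListener('DOMContentLoaded', flush, false);
        } else if (window.attachEvent) {
            window.attachEvent('onload', flush); // crude fallback for older IE
        }

        return {
            loadJS: function (url) { jsQueue.push(url); },
            loadCSS: function (url) { cssQueue.push(url); }
        };
    }());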

So far I've been able to keep the inline js portion down to about 1200 bytes compacted (gzipping the HTML page should reduce the network load to about 700 bytes for the inline js), so it's pretty light. It also means that all the js and additional css is loaded _after_ the page has rendered, keeping pages light and responsive from the user's perspective.

Having researched loading scripts dynamically, and referring to Steve Souders' latest book, Even Faster Web Sites, I decided to base the external file's code on Steve's solutions with some modifications. However, modifying someone else's code to address issues others have hit with similar techniques, as well as optimising it and making it ready for production use, always (from years of painful experience) seems to take longer than first thought. Hopefully I'll have it ready in the next day or so, and then I'll be able to add Google Analytics and client libraries such as jQuery and Dojo to my pages without affecting initial page rendering responsiveness.
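
To give an idea of the intended usage (the file names here are placeholders, not my actual resources), a page would declare its deferred resources just before the closing body tag, and none of these requests would start until after the page has rendered:

    ct.resources.loadCSS('/css/enhancements.css');  // styles only needed once the js runs
    ct.resources.loadJS('/js/jquery.min.js');       // a client library such as jQuery
    ct.resources.loadJS('/js/analytics-boot.js');   // the deferred Google Analytics loader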

Wednesday 9 September 2009

Accessibility shown up with NVDA

I downloaded the latest version of NVDA, a free and open source screen reader for the Microsoft Windows operating system, and ran my homepage through it. This is something I've done in the past for other web sites I've had a hand in building, and it brought back to me how hard accessibility is.

Sure, I've got skip links on my pages, and I'd turned off all styling to 'read' my homepage. But if you close your eyes and run a web page through a screen reader, you quickly find it's very different from 'reading' an un-styled page. Every time a page is read, all the skip links are read out one after another. The user can use keys to follow links and jump around the page, but if the user has a motor impairment and clicking links or pressing keys is difficult, the accessible usability of the skip links is significantly reduced.

Listening to the links being read out means 'remembering' the links read out. There is automatically an increased cognitive load on the user's memory, and if a user's cognitive ability is slightly impaired then your 'accessible' website is not what you thought it was. A web site is there for a reason: to allow a user to buy something, find some information, etc. It's not 'to allow a user to buy something, find some information, etc. so long as that user is not disabled'.

So, taking the position that a website should be usable by everyone, what can be done to recover the accessibility of our 'accessible' website?

  1. Don't crowd all the skip links at the top. Place the skip link immediately before the item to skip. Doing this has the added benefit that the user 'knows without listening' they are not skipping over other items of the site they might want to go to.

  2. Place a 'Jump to' link (maybe links) at the end of an item, pointing to other important items such as the navigation, page top, or page index.

  3. Move the placement of items in the html source. Should the navigation bar always be read before the content of the page the user just told you they wanted to read? What about the header/banner of the site? These placements are important when visually accessing the site and can still be achieved with a little css.


Why bother? Disabled users have the collective spending power of tens of billions of dollars in the US alone. If a user is visiting your site it is because they want access to what you are offering. Why put up a sign that says 'if you can't reach the counter we don't want your custom'?

I've got some work to do on my own site. I used a modified version of the 960 Grid System, which works really well, and then added some accessibility features. The site (and yours) would be far more accessible if the placement of un-styled items in the HTML source were considered first, with some CSS then used to move those items into visually pleasing positions.