Showing posts with label Google. Show all posts

Thursday, 22 April 2010

Google Buzz Button IE8 Error

Recently Google introduced their Buzz buttons to "Help people post your content on Google Buzz". Share buttons are not a new idea, but Google Buzz is new and kinda shiny. Here's a quick intro if you're not up to speed yet:



And here's a Buzz button:   

A few days ago I implemented the Buzz share buttons on a website at the client's request, and it all seemed to be working great. Then this morning I found that some pages were failing to load in Internet Explorer 8.

I pointed IE8 at the development site and confirmed there was a serious problem. The error I received was:

Webpage error details
User Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1) ; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30)
Timestamp: Thu, 22 Apr 2010 13:33:25 UTC
Message: HTML Parsing Error: Unable to modify the parent container element before the child element is closed (KB927917) Line: 0 Char: 0 Code: 0


I disabled the Buzz button to confirm it was part of the problem (it was), then checked my implementation against the documentation and other sites. Everything seemed to be OK. After a long period of debugging and chasing false leads, it turned out that Internet Explorer was executing the Google Buzz button's javascript too early - even when it was in the page footer.

The Solution:
Once again jQuery came to the rescue - I replaced this line:
<script type="text/javascript" src="http://www.google.com/buzz/api/button.js"></script>

with this line:
<script type="text/javascript">jQuery(document).ready(function(){jQuery.getScript('http://www.google.com/buzz/api/button.js');});</script>
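If jQuery isn't already on the page, the same deferred-loading idea can be sketched in plain JavaScript. The loadScriptDeferred helper and the window.onload wrapper are my own naming, not part of the Buzz API - treat this as a sketch, not a drop-in:

```javascript
// Sketch of the same fix without jQuery: delay fetching button.js until
// the page has finished parsing, so IE8 never tries to modify a parent
// container before its child element is closed (KB927917).
function loadScriptDeferred(src) {
  if (typeof document === 'undefined') {
    return src; // no DOM available (e.g. when testing the helper standalone)
  }
  var s = document.createElement('script');
  s.type = 'text/javascript';
  s.src = src;
  document.body.appendChild(s);
  return src;
}

if (typeof window !== 'undefined') {
  window.onload = function () {
    loadScriptDeferred('http://www.google.com/buzz/api/button.js');
  };
}
```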

Tuesday, 22 December 2009

Updating Google Map Marker's z-index

Lately I've been working on a web application that uses Google's Maps API. It's been an interesting and engaging project.

One of the limitations of the current Maps API is that the z-index of a marker cannot be changed after it has been created. The client requested that the selected marker "popped to the front" as some markers obscured others in certain map areas depending on zoom and closeness of coordinates. This was a reasonable request and would enhance the UI, but was not so easy to implement.


Mike Williams gives a good introduction to this issue, and details how to set the z-index of a marker when it is created with addOverlay(), in his Google Maps API Tutorial. Having read this, I attempted to re-create each marker when it was clicked while keeping track of the topmost z-index. I had some success, but the z-index results were unpredictable and it was definitely an inefficient way to produce the desired effect.

I decided to browse the DOM and see if I could find a better way to do this. I found a guide to Undocumented Google API features which seems to be mostly out of date, but contained the very important details of how to calculate a Marker's default z-index:

Use marker.setZIndex(Math.round(marker.getLatitude()*-100000)) to get a moved marker to overlap correctly.

Even though setZIndex() and getLatitude() are not valid methods in the current API, it's easy to understand the calculation.

In my application I was already using a unique icon for each marker so that they displayed sequential letters (A,B,C...) and had added an index property to the marker object. I was able to leverage this with a bit of jQuery magic to find each icon in the DOM and alter the CSS z-index value. Since the default z-index is something like -108619296, I created a function to toggle the z-index between normal and front positions by multiplying it by -1.

// find the selected marker's icon in the DOM and flip its z-index sign
var icon = $("#mapbox div div div img[src='/images/markers/"+marker.index+".png']");
var zidx = icon.css('z-index');
icon.css('z-index', zidx * -1);


Just to make sure that no other marker was still in the top position, I looped through my array of markers and reset the z-index with this function.

function reset_zorder(marker) {
  // restore this marker's default z-index, derived from its latitude
  $("#mapbox div div div img[src='/images/markers/"+marker.index+".png']")
    .css('z-index', Math.round(marker.getPoint().lat()*-100000));
}
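The surrounding reset pass can be sketched like this. The markers array, the .index property and the latitude value are from my application as described above; here the jQuery call is replaced with a plain callback so the arithmetic stands alone:

```javascript
// The default z-index for a marker, per the calculation quoted above.
function default_zindex(lat) {
  return Math.round(lat * -100000);
}

// Reset pass: put every marker back at its default stacking position
// before popping the selected one to the front.
function reset_all(markers, setZIndex) {
  for (var i = 0; i < markers.length; i++) {
    // in the real page this callback would be the jQuery .css() call
    setZIndex(markers[i].index, default_zindex(markers[i].lat));
  }
}
```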


Obviously none of this is a copy+paste solution, but it should give anybody needing to manipulate Google Maps marker z-indexes a good example to work from.

Thursday, 9 July 2009

Tracking Email Clicks in Analytics

Many of us are using Google Analytics to measure the usual web metrics, but most of us are only using a small portion of this tool's functionality.

It's common practice to send registered users an email to confirm account activity, to keep them up to date via a newsletter, or to try and encourage return activity. In many cases we are not really measuring how effective these mailings are or how they impact on our website traffic.

At first glance it looks like a tricky problem: mail client applications will generally not pass a referrer, and browser mail will be recorded as one of the hundreds of mail domains in use. Link Tagging is the simple solution, although there are a few options depending on how deep you want to go.

Source and Medium

By appending utm_source and utm_medium parameters to your links you can easily track how many visits are directly attributable to your mailings and see them in the All Traffic Sources report.

Here's an example of how your links should look:
http://www.yoursite.com/somepage.html?utm_source=Newsletter&utm_medium=email

Setting the utm_source value will replace any referrer value as the Traffic Source, so random browser domains will be consolidated under one value, along with any email clicks with no referrer that would usually be classed as "(direct)". This is the only required parameter of this type; any other utm_xxxx fields used in conjunction with utm_source are optional.

Using utm_medium=email is recommended, especially if you are using more than one utm_source value in different email types (e.g. Newsletter, AdminEmail, ReferAFriend) so that you can easily filter the results on the All Traffic Sources report.

Campaigns

Specifying a utm_campaign value can help group your links in a more meaningful way. This could be a sub group of your source categories (e.g. utm_campaign=200907 to identify this is the monthly Newsletter for July 2009) or you could use a campaign like utm_campaign=Winter-Sale across many sources (email, banner, CPC, etc). It all depends on what you want or need to measure. Whatever you choose, any utm_campaign values tracked will be displayed on the Traffic Sources->Campaigns report.

Content and Terms

These options are less common but still useful. Setting the utm_content parameter could help identify if text or html emails are getting more clicks. Alternatively you could track the comparative success of different creatives from the same campaign. utm_content values tracked will be displayed on the Traffic Sources->Ad Versions report.

I've included the utm_term here just for completeness. It's usually used to identify search terms or keywords purchased. utm_term values tracked will be displayed on the Traffic Sources->Keywords report.
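A small helper can keep these parameters consistent across mailings. The addUtmParams name is hypothetical, and the values are just the examples from above:

```javascript
// Append Analytics link-tagging parameters to a landing-page URL.
// Only utm_source is required; medium, campaign, content and term are optional.
function addUtmParams(url, params) {
  var pairs = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      pairs.push('utm_' + key + '=' + encodeURIComponent(params[key]));
    }
  }
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return pairs.length ? url + sep + pairs.join('&') : url;
}
```

For example, addUtmParams('http://www.yoursite.com/somepage.html', {source: 'Newsletter', medium: 'email'}) reproduces the tagged link shown earlier.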

Handy Hint

Even if you're already tracking your email clicks with another solution, it's probably worth adding these parameters (or at least some of them). As long as they are passed to the landing page, it doesn't matter whether you add them to the pre- or post-tracking URL. You may need some special tweaks so that your tracking solution passes utm_xxxx parameters on to the destination URL.

Sunday, 17 May 2009

RSS: Learn To Burn...

If you are regularly publishing any kind of content online, then you are probably also providing an RSS feed. (If you've been living in a cave since 2005 and have never heard of it, then you should read this introduction to RSS.)

RSS is a tricky thing to measure: requests are not tracked like normal webstats, and are commonly anonymous or made via a proxy. The frequency of requests is dependent on the user's feed reader and could be daily, weekly, hourly or even every minute (or anything in between). This is why it's important to have a good grasp of how much bandwidth and processing resource your RSS feeds are using.

RSS Caching

The easiest way to offset the processing cost is to cache your feed. Depending on your site's publishing schedule and implementation, the caching period and method used will be different. The basic idea is to dump your feed into a file and serve that.

On each request check if the cache file exists and if it is younger than 15 minutes. If not then build the feed and dump it into the cache file, ready for the next request. Depending on the frequency of requests, this can reduce your feed building resource cost considerably.
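The serve-or-rebuild decision reduces to a few lines. The post doesn't name a server platform, so this is just a sketch in JavaScript with the file I/O abstracted away; the 15-minute window is the example figure from above:

```javascript
// Is the cached feed file still fresh enough to serve as-is?
// mtimeMs: the cache file's last-modified time; nowMs: current time.
function isCacheFresh(mtimeMs, nowMs, maxAgeMinutes) {
  return (nowMs - mtimeMs) < maxAgeMinutes * 60 * 1000;
}

// Serve-or-rebuild flow, with the cache state and feed builder injected
// so the decision logic is independent of any particular platform.
function serveFeed(cache, buildFeed, nowMs, maxAgeMinutes) {
  if (cache.exists && isCacheFresh(cache.mtimeMs, nowMs, maxAgeMinutes)) {
    return cache.contents; // cheap path: serve the dumped file
  }
  var feed = buildFeed();  // expensive path: rebuild the feed
  cache.exists = true;     // ...and dump it for the next request
  cache.mtimeMs = nowMs;
  cache.contents = feed;
  return feed;
}
```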

Introducing FeedBurner

FeedBurner has been providing RSS feed management tools since 2004. By October 2007, they reportedly hosted over a million feeds for 584,832 publishers. In June 2007, FeedBurner was acquired by Google Inc., and shortly after two of their popular PRO services (MyBrand and TotalStats) were made free to all users. By August 2008, Google had completed migrating FeedBurner into its group of services.

FeedBurner works very well with the major blog publishing sites, but it's also worth investigating if your site is standalone.

The initial payoff of using FeedBurner is that they can help you get a handle on the size of your subscription base, and will cache and serve your feed, thereby absorbing much of the processing and bandwidth costs.

There has been quite a bit of discussion about the accuracy of subscriber stats provided by the FeedBurner service. As stated earlier in this article, RSS stats are problematic due to the plethora of clients and the complications of anonymity and proxy services. Having said that, the service offered is a lot better than no stats, and in my opinion the benefits outweigh the cost many times over.

Don't Lose Your Audience

One important tip when integrating FeedBurner is to make sure that your subscribers still subscribe to your site's feed URL and are redirected to your FeedBurner URL. This way, if you ever decide to drop the FeedBurner service, you won't leave your subscribers stranded with a defunct FeedBurner URL. Google has been quite open about this issue, if you know where to look.

If you are redirecting traffic you need to make one small change to your FeedBurner options to make this work properly - but it's not that easy to find... Click on the Optimize tab for your feed, and then BrowserFriendly in the Services menu. At the bottom of the form, in the Content Options section, there is a link with the text "Use your redirected feed URL on your BrowserFriendly landing page". Click on that and then enter your site's feed URL.

This change should result in most subscribers using your site's URL. However, this still doesn't seem to work correctly with Firefox's Live Bookmarks. I haven't found a decent workaround for this yet, or even much evidence that it is an issue, but for me it never works, so be aware. Even the "ClearFeed" landing page is somewhat confusing when Live Bookmarks are used, which is a concern considering Firefox's popularity.

FeedBurner Pros vs Cons

Pros
  • Free stats/caching service
  • Reliable infrastructure
  • Simple to use

Cons
  • Stats are tied to a single Google login
  • Subscription stats allegedly fluctuate
  • Some Firefox Live Bookmarks issues

Tuesday, 5 May 2009

Common Sense SEO Tips

Search Engine Optimisation (SEO) is often thought of as a mystical dark art. Many folks are making a good living off giving advice on how to increase your ranking in the major search engines, some of them even know what they're talking about. In many cases the "tricks" involved are common sense and can be implemented without too much trouble. The clever part is recognising what is the best approach for each particular site.

Here are my best simple tips for getting your site found via search engines.

Search Term Targeting
Think about what terms people would type into a search engine if they wanted to find your site. It's better to get 10 visitors who will be interested in your content than 100 visitors who will immediately leave and never return. Compile a list of terms you would expect your site to be well ranked for and target those terms. Make sure these terms appear in your page's body text. Keep the list in a spreadsheet and record your Search Engine ranking so you can measure improvement.

Good Content
Make sure you have a decent chunk of crawlable text. Webcrawlers don't index text if it's just in an image. Make sure the content includes the terms you want to be found for. If your product or service is regional then include those details on every page (the page footer does nicely), then you'll be more likely to be found when people search for "kitten jugglers London SW15".

Page Title
The content of the page's <title> tag is taken as a description of the page content. This is a key index for your pages. Make sure the title is clear but succinct. Limit it to 5-10 words, including your company name. Don't use the same title on multiple pages or they may be grouped as one page in search engine results.

Well Formed Links
Never ever use "click here" links. Link text is one of the best indexing opportunities you have; don't squander it. The text in links to your pages is treated in a similar way to page titles, but links to your pages are aggregated.

Meta Tags
Do use the keywords and description meta tags. Keep them clear and simple. The keywords tag is said to be less used of late due to heavy abuse, but Yahoo claims to still support it. The description tag should provide a concise explanation of your page's content.

<meta name="description" content="An introduction to the nocturnal habits of hedgehogs of United Kingdom">
<meta name="keywords" content="hedgehog United Kingdom UK Erinaceinae nocturnal insectivore furze-pig">


If your website includes multiple languages or translated content, or is not in English, you should also consider using the language tag.
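For example (the exact value here is illustrative):

```html
<meta http-equiv="content-language" content="en-GB">
```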

Seeding
If your site is brand new and hasn't been indexed yet, you can get the ball rolling by adding it to the Open Directory Project. Search engines are an incestuous bunch feeding off each other and this is a good entry point as it's used by AltaVista, A9, AOL, Ask, Clusty, Gigablast, Google, Lycos, MSN, and Yahoo.

Webmaster Tools
Google's Webmaster Tools can tell you if anything is going wrong on your site that could be affecting your search ranking. You can read about Webmaster Tools in an earlier post.

Wednesday, 15 April 2009

Analytics Debunks Charlatan

Yesterday I overheard a friend's phone call with a salesman for an internet listing service. She runs a modest business; her website is her only advertising, and it brings in enough customers to keep her booked weeks in advance.

The listing service was claiming that they could increase traffic to her site because they specialised in listing companies in her specialised field. I suggested that she should think carefully before throwing her money their way, since her website is fairly well optimised for search engines.

As it turned out, the listing service had offered her a one month free trial (which had just expired) and had been allegedly sending traffic her way already. I decided to spend a few minutes helping her evaluate the trial.

The first step was to look at their website. Sadly their homepage failed to load as most of the content was blocked by ABP - not a good start. Next we found her listing on their site, mostly content pasted from her homepage, although her business name was spelt incorrectly (twice). By this stage I was feeling underwhelmed.

So we decided to check out the traffic they've been sending to her site. Google Analytics had been in place for some time so we could easily measure the impact. The first thing we did was check the Traffic Sources report. Indeed there were 27 visits in the last month, although they never peaked higher than 2 per day and the bounce rate seemed pretty high to me.

I suggested we check where these visitors were coming from and see if we could find out a little more about them, so we set up a Custom Segment where Source contains the listing site's domain. We could see that almost all of the traffic came from London, except for 3 visits from Australia, coincidentally where the business was based. Digging further into the New vs. Returning Visitors report showed that all but one of the London visits came from the same person returning every day or so to generate traffic.

In my opinion this kind of listing service is a waste of money if you have followed the most basic SEO principles. Needless to say, my friend will not be engaging their services.

Wednesday, 18 March 2009

Ad Blocker Detection

Late last week I mentioned Adblock Plus in my list of Essential Add-on Tools for Firefox. Here's the description again, just in case you missed it:

Adblock Plus allows you to browse without ads. It works very well and is incredibly popular because of that. Users love it; advertisers and websites dependent on ad revenue hate it. As long as flash banners keep soaking up CPU cycles, I'll keep using it.
https://addons.mozilla.org/en-US/firefox/addon/1865


Just about all websites rely on advertising revenue to some degree. The problem is, how do you measure the number of potential impressions you're losing to ABP?

A couple of weeks ago I cooked up a very simple solution that seems to perform quite well. In this example I'll track it with Google Analytics, but you could easily rework it to your own system.


ABP blocks items based on a list of exception rules. This detection script relies on a javascript file specifically named to be blocked by these rules. According to this script you have ABP enabled on this page (assuming you're viewing this on something that supports javascript).

Here's the code:

<script type="text/javascript">var hasABP='ABP';</script>
<script type="text/javascript" src="/js/advertising.adserver.bannerad.js"></script>
<script type="text/javascript">pageTracker._setVar(hasABP);</script>


The contents of the file /js/advertising.adserver.bannerad.js are:

var hasABP = '';

So after running this for a few hours, you should spot ABP users as 'ABP' in the Google Analytics User Defined section. In theory you could get some false positives, but the rate reported for me was so low that I think it's performing well.

Once you have some data collected, you can set up a Custom Segment in Analytics' Advanced Segments and try to estimate the lost impressions based on page views.
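The arithmetic behind that estimate is simple enough. The function name and inputs here are my own, not an Analytics API:

```javascript
// Estimate how many ad impressions ABP users are blocking, assuming
// blocked visitors browse roughly like everyone else.
function estimateLostImpressions(abpVisits, totalVisits, totalPageviews) {
  if (totalVisits === 0) return 0;
  var abpShare = abpVisits / totalVisits;        // fraction of visits running ABP
  return Math.round(totalPageviews * abpShare);  // pageviews that showed no ads
}
```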

Monday, 16 March 2009

User Segmentation in Google Analytics

You're probably using Google's free webstats product Analytics on at least one site. And to be honest, unless you're an anti-Google-domination zealot, there's little reason not to. It's easy to install and the tools are pretty good.

It's a shame there's no simple hourly traffic graph, and the Advanced Segments produce some fairly questionable results, but they are still in Beta and they do clearly state the "report is based on sampled data."

Aside from that, it's very easy to set it up, give the marketing guys "user" access and forget about it - which is what many of us do. This is where we fail. There is a lot of useful segmentation data Google Analytics collects that can give us technical insights.

Web-stats are so much more than unique visitors and page views. Analytics can give you a good grasp of your customers' browser version, operating system, screen resolution, java support, Flash version and even connection speed.

Geolocation stats can help you decide if your global reach warrants employing a Content Delivery Network.

You can even create your own segmentation using User Defined values. I like to use it to identify users as Anonymous, LoggedIn and Admins, but you can pretty much set any value you like using this statement after the usual tracking script:

<script type="text/javascript">pageTracker._setVar('SegmentX');</script>

As long as you don't go crazy inserting a unique id per user (remember this is a segmentation tool), you should be able to get some useful data back.
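For the Anonymous/LoggedIn/Admin split I mentioned, the value passed to _setVar could be chosen like this. The segmentFor helper and the user object's shape are my own sketch:

```javascript
// Decide which User Defined segment label to report for the current user.
function segmentFor(user) {
  if (!user) return 'Anonymous';      // no session: anonymous visitor
  if (user.isAdmin) return 'Admin';   // staff traffic, worth filtering out
  return 'LoggedIn';                  // everyone else with an account
}

// In the page, after the usual tracking script:
//   pageTracker._setVar(segmentFor(currentUser));
```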

Wednesday, 11 March 2009

What you don't know CAN hurt you

I'd be very surprised if you didn't know that Google offers a range of web products beyond their search engine. You've probably got a Gmail account, will have seen Google Adwords and may even already use Google's Calendar, Docs, RSS Reader, Analytics or Ad Manager. Even Blogger.com is part of the Google family now.

There is one application that nobody seems to talk about much. It's not as sexy as some of the other apps, but it can have great value for devs like us. I'm talking about Webmaster Tools.

Webmaster Tools can tell you about the things going wrong on your site that you never dreamed were happening. The functionality is organised into Diagnostics, Statistics, Links, Sitemaps and Tools.

  • Diagnostics will detail any errors and problems encountered by Google's web and mobile crawlers while accessing pages on your site. It also gives a Content Analysis that identifies potential problems with site metadata, such as title and meta description information.

  • Statistics include the Top search queries returning pages from your site and which of them were clicked, 'What Googlebot sees', Crawl Stats including the current PageRank for pages on your site, Index Stats and Subscriber Stats based on products such as iGoogle, Google Reader, or Orkut.

  • Links details the pages on your site that have external links, internal links, and which links on your site have been identified as candidates for appearing directly in Google search results.

  • Sitemaps allow you to submit and manage sitemaps.

  • Tools allow you to Generate and Analyze a robots.txt file, Remove URLs from Google's indexes, Enhance 404 pages, or install the Webmaster Tools gadget on iGoogle.

Together these tools make a cohesive suite that can highlight a whole slew of problems to tackle that you never knew you had. This might sound like a nightmare, but realistically you need to know which pages on your site have broken links, or HTTP errors, or missing title tags, or even 404s.

I've only discovered one gotcha to date, which is that Google Analytics data seems to be intertwined with the results. If you have been a bit clever with renamed pageviews, you could get some extra errors reported.

Webmaster Tools takes minutes to set up, and after a day or so will probably give you a raft of "measurable wins" to chase. Most of the problems highlighted are easy fixes, but depending on the traffic on your site can have a significant impact. At worst, even if you get no problems reported, it's worth the minimal set up time to get a clean bill of health from Google's crawlers.