Showing posts with label measurable.

Monday, 27 April 2009

Always get a Baseline first

As developers, we often spend time optimising, tweaking or redesigning to increase performance. It's fairly easy to measure performance gains when optimising code or systems, but it's a lot harder to gauge the effectiveness of user interface changes.
In many cases the UI choices we make are subjective at best, and in some cases our design decisions can actually make things worse, not better. There are some interesting anecdotes about this in this recent post.
To make sure we are making the right decisions, rather than just a bunch of assumptions, we need to measure the effectiveness of the current implementation so that we can compare it with the improved version.
Depending on what you're changing, this could affect conversions, page hits or data collection rates. You'll need to decide the best way to measure the effectiveness of the changes, but without data to measure against, you'll never really know if you're doing it wrong.
It seems like a simple idea, but it applies to all optimisation and is often overlooked. A simple rule to follow is to ask "How will we know if this works?" before you implement a change.
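To make the idea concrete, here is a minimal sketch of a before/after comparison. All of the numbers and names here are illustrative assumptions, not data from any real site — the point is simply that you record the baseline metric before shipping the change.

```python
# Hypothetical example: comparing a baseline conversion rate against the
# rate measured after a UI change. Figures are made up for illustration.

def conversion_rate(conversions, visits):
    """Fraction of visits that converted (0.0 when there were no visits)."""
    return conversions / visits if visits else 0.0

# Baseline measured BEFORE the change.
baseline = conversion_rate(conversions=120, visits=4000)

# Same metric measured AFTER the change, over a comparable period.
candidate = conversion_rate(conversions=150, visits=4100)

relative_change = (candidate - baseline) / baseline
print(f"baseline {baseline:.2%}, after change {candidate:.2%}, "
      f"relative change {relative_change:+.1%}")
```

Without the first measurement, the second one tells you nothing — which is the whole point of getting a baseline first.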
Wednesday, 15 April 2009
Analytics Debunks Charlatan
Yesterday I overheard a friend's phone call with a salesman from an internet listing service. She runs a modest business, with her website as her only advertising, and it brings in enough customers to keep her booked weeks in advance.
The listing service claimed they could increase traffic to her site because they specialised in listing companies in her field. I suggested she think carefully before throwing her money their way, since her website is already fairly well optimised for search engines.
As it turned out, the listing service had offered her a one month free trial (which had just expired) and had been allegedly sending traffic her way already. I decided to spend a few minutes helping her evaluate the trial.
The first step was to look at their website. Sadly their homepage failed to load, as most of the content was blocked by Adblock Plus - not a good start. Next we found her listing on their site: mostly content pasted from her homepage, although her business name was spelt incorrectly (twice). By this stage I was feeling underwhelmed.
So we decided to check out the traffic they'd been sending to her site. Google Analytics had been in place for some time, so we could easily measure the impact. The first thing we did was check the Traffic Sources report. Indeed, there had been 27 visits in the last month, although they never peaked higher than 2 per day, and the bounce rate seemed pretty high to me.
I suggested we check where these visitors were coming from and see if we could find out a little more about them, so we set up a Custom Segment where Source contains the listing site's domain. We could see that almost all of the traffic came from London, except for 3 visits from Australia - coincidentally, where the business was based. Digging further into the New vs. Returning Visitors report showed that all but one of the London visits came from the same person, returning every day or so to generate traffic.
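The same segmentation check can be sketched by hand on exported visit data. This is a hypothetical illustration only: the field names, domain and visitor IDs below are assumptions, not the actual Analytics export format or the real data.

```python
# Hypothetical sketch: segment exported visit records by referrer domain,
# then count visits per distinct visitor in that segment.
from collections import Counter

visits = [
    # (referrer, city, visitor_id) -- illustrative records only
    ("listing-site.example", "London", "v1"),
    ("listing-site.example", "London", "v1"),
    ("listing-site.example", "London", "v1"),
    ("listing-site.example", "Sydney", "v2"),
    ("google.com", "Melbourne", "v3"),
]

# Segment: only visits referred by the listing site's domain.
segment = [v for v in visits if "listing-site.example" in v[0]]

# How many visits did each distinct visitor in the segment account for?
per_visitor = Counter(visitor_id for _, _, visitor_id in segment)
print(f"{len(segment)} referred visits from "
      f"{len(per_visitor)} distinct visitors: {dict(per_visitor)}")
```

A handful of visits concentrated on one repeat visitor, as here, looks very different from genuine new traffic — which is exactly what the report revealed.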
In my opinion this kind of listing service is a waste of money if you have followed the most basic SEO principles. Needless to say, my friend will not be engaging their services.
Tuesday, 10 March 2009
So, what is it that you do again?
Being a developer is a mixed bag. Generally we like to create, invent or fix things, but we often end up spending a lot of time maintaining, optimising, fire-fighting, planning, evaluating or discussing. Although these are all engaging tasks too, they are commonly not what drives us, and worse, they are pretty much unmeasurable.
So what's the big problem with that? In a nutshell, most managerial types don't get what we do, and when performance review time comes around it's hard to shine when most of your work can't have a valid metric applied to it. This is easy to understand when you compare measuring a dev or sysadmin role against measuring a role with a monthly sales target. "99% server uptime" doesn't cut it anymore.
Obviously meeting deadlines for project milestones is measurable and carries significant weight, but in reality many startups have a more fluid development path, and many of us are not working on projects where we control, or are responsible for, the milestones.
So what can we do about it? In short, not a lot. Try to set as many realistic, measurable goals as you can, and keep a log of anything ad hoc you do that looks like a measurable win, so you'll be ready when review time comes along (or when you want to push for a pay rise).
Since the daily trials and task lists of most of us are out of our control, I thought I'd start this blog to suggest ideas over the coming weeks that can help you define your role in a tangible way, or maybe just have a "look what I did" moment. Hopefully they will also be "measurable wins", or at least interesting to read.