It should be pretty obvious that we think Google Analytics is neat. Neat as in essential to your business! It qualifies as one of those technologies that is clever, cool and seriously useful. The inherent risk of using clever, cool and useful tech is that one can get awfully lost in the clever and cool without gaining any value.
For example, I have been guilty of spending many an interesting hour building and rebuilding custom reports - a great new feature in the GA beta interface. I had to catch myself though - were these reports relevant? Were they useful? Were they aligned with our clients' Key Performance Indicators?
I undertook an exercise to rationalise the custom segments and custom reports we had available to our GA account.
The general profiles and filters that we set up in the last blog post are exactly that - general. Sufficiently generic so as to be useful to 90% of GA users...probably. Anyway - custom reports and custom segments should be developed to align with business Key Performance Indicators.
Think about it: Why would a board member be so interested in the bounce rate on the website broken down by OS, browser type and browser version? Would a developer really be 'bovvered' by relative marketing campaign keyword performance? Probably not. So, as we created profiles to capture filtered data, we can also create profiles to present data.
Let's say we created a developer-specific profile - this doesn't mean other users can't view the data, but it is a profile focused on the presentation of data that a developer would want to see. Any report - even a custom report - can be added to the dashboard for the profile that you are currently viewing:
And here is the report added at the bottom of the dashboard:
Remove the 'clutter' for the given profile by hitting the X in the top right of each report overview in the dashboard.
Now you can have a dashboard for each business function tuned to specific needs, custom segments and reports that are relevant and useful!
Monday, 17 November 2008
Google Analytics baseline setup
This is going to be a longer post where I'll explain what Moneyspyder does for clients to help them attain what we consider to be the bare minimum Google Analytics setup.
GA (Google Analytics) is really easy to get up and running but it's not so obvious what should be done to start getting the best out of it (see a previous post on why quality data is essential).
Default profile
Having initially set up GA you are the proud owner of one single, unfiltered, raw profile. This profile will capture all the data sent to it by your website traffic.
Notice the two links highlighted above - these two little chaps are going to help a lot: Profiles and Filters.
First of all, let's get our raw profile sorted out:
- e-commerce tracking
- search tracking
- filter internal traffic
- lower case data
Standard Profile Edits
First of all, click 'edit' next to your standard profile:
Then click 'edit' again in the top right:
Now, if your site is a transactional site you'll want to track the e-commerce data generated by your site, so click the 'Yes, an e-commerce Site' radio button.
Likewise, if your site has internal search you'll want to track it so click on the 'Do Track Site Search' radio button:
Having clicked the relevant controls, be sure to tell Google what currency you're using and what search parameters to look out for. Say your site uses a parameter 'q' on the URL for searches:
http://www.yoursite.com?q=widgets
Then type 'q' in the Query Parameter box.
Now 'Save Changes'.
Standard Filters
Now head back to the Analytics Settings page for your site. Click on 'Filter Manager' (bottom right), and then click 'Add Filter' (top right):
Here is a filter to turn all data into lower case:
Notice that the filter name is totally up to you. Choose custom filter, lowercase, Request URI and add your single profile to the 'Selected Website Profiles' list. Now 'Save Changes' and all your data will be lower case. This results in nice consistent case in your data - it shouldn't be necessary, but say users search for 'Widgets' and 'widgets' - with this filter you'll see the data for the two search terms combined, which is more useful.
Now, one more standard filter:
This filter will exclude your internal traffic from your data. Obviously you'll need to change the regular expression to match your IP address range. You can exclude multiple ranges or single addresses - if you can build the reg exp (perhaps using Rubular which is a neat tool) you can filter the addresses out.
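For example (the addresses here are made up - substitute your own), a pattern to exclude the single address 192.168.1.5 and the range 192.168.1.20 to 192.168.1.29 might look like this:

^192\.168\.1\.(5|2[0-9])$

Note the escaped dots: an unescaped '.' matches any character in a regular expression, so test your pattern before trusting it.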
Caveat
We normally apply these two standard filters to all profiles but when you start building more advanced custom filters you will need to think carefully as to whether you want to apply these filters to your standard profile or another specific profile. Filtered data can never be retrieved. Once it's filtered, it's gone. If you apply a filter by mistake, you can create holes in your data. This is to be avoided so be careful!
Now, time for some
Advanced filters and extra profiles
We suggest starting with 4 extra profiles covering the basic customer segments:
- New
- Returning
- Organic
- Paid
As you will see, the filters that can be applied can get quite 'fruity'. Depending on your Key Performance Indicators you may choose to add or remove filters that are better aligned with your website strategy so as to measure your tactical goals more accurately.
Let's set up the profiles first of all:
Click 'Add Website Profile' in the bottom left of the Analytics Settings page for your site in GA.
Choosing to add a new profile for an existing domain will create another profile that will receive data from your website. No new GA tracking codes or scripts are required. Slightly off topic though: make sure your test, staging and production sites all have separate GA accounts - keep the data clean!
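If all three environments share the same page templates, one way to keep the data separate is to pick the tracking account from the hostname. A rough sketch using the standard ga.js tracker (assumes ga.js is already loaded on the page; the UA- numbers and hostnames are placeholders - use the ID from each of your own GA accounts):

// Placeholder account IDs - substitute the ID from each separate GA account
var accounts = {
  'www.yoursite.com':     'UA-1111111-1', // production
  'staging.yoursite.com': 'UA-2222222-1', // staging
  'test.yoursite.com':    'UA-3333333-1'  // test
};
var accountId = accounts[document.location.hostname];
if (accountId) { // unknown hosts simply aren't tracked
  var pageTracker = _gat._getTracker(accountId);
  pageTracker._trackPageview();
}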
Give it an appropriate name of course...
Now repeat for the other profiles and you'll end up here:
Don't be concerned by the lack of data in the reports - this will change in a few hours. New profiles, even though they may be using an existing URL, are new profiles. They do not have data; they are not copies of existing profiles.
Remember the standard filters and profile tweaks - apply them to these new profiles. Make them all track site search if your main unfiltered profile tracks search, and likewise for e-commerce.
Now we need to filter the data going in to these profiles. We shall set up the New Visitors filter first and apply it to the New Visitors Profile.
Go into 'Filter Manager' and create a new filter:
You'll notice some similarity with the 'Lowercase' filter. It's a custom filter again. This time we 'include' traffic based on 'Visitor Type' being 'new'. We apply this filter to the New Visitors profile ONLY!!!!
Now let's do the same for returning visitors:
Cool! Now we have an unfiltered profile (albeit lightly filtered for cleaning purposes), and two further profiles containing data pertaining only to New Visitors and Returning Visitors. Already we have better visibility of our two main customer types and what they are doing on our site.
Now the home stretch (for now) where we can create another two custom filters to capture two more really useful segments - organic traffic and paid traffic:
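As a sketch (double-check the field values against your own traffic sources): the organic filter is another custom Include filter, this time on the 'Campaign Medium' field with the pattern organic, applied to the Organic profile only; the paid filter is the same but with a pattern such as cpc|ppc, applied to the Paid profile only.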
Take a moment to browse through the filter fields available - awesome huh? You're probably thinking already how you could use some of these fields for future profiles and filters. I am!
Now you should end up with a list of filters like this:
Take a moment to double-check your profiles to make sure you have the right filters on the right profiles - always worth performing a gross error check at this point because now you have to wait for the data to come flying in!
Couple these basic segments (filtered profiles) with the new GA custom segments and reports and you start to see how powerful clean data can be.
Friday, 31 October 2008
Downtime is bad, right?
Not always, no. This week we did a pretty large update to www.beautifulpure.com involving some lengthy database work. While that work is happening, of course, transactions cannot take place on the site.
So, we do this work in the dead of night and plan the downtime. With appropriate planning, testing of both the deployment and the new functionality, and faultless procedural execution, we still met our 99.9% uptime SLA.
Overall for this year we are running at 99.91%, which roughly equates to 6.5 hours of downtime across all clients for the whole year so far.
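(A quick sanity check on that figure: 100% - 99.91% is 0.09%, the year to the end of October is roughly 7,300 hours, and 0.09% of 7,300 hours is about 6.5 hours.)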
The next part of this analysis will take into account planned vs unplanned downtime and, for the unplanned downtime, what time of day it occurred.
Watch this space.
Monday, 27 October 2008
Stop Press!
Catalogue|e-business magazine reported today that Snow Valley has partnered with Site Confidence to provide monitoring for their clients' sites.
Woo hoo...
Moneyspyder have done this for over two years. For free. Coupled with Pingdom as a backup. Engine Yard run their own in-house server monitoring in addition. For free.
Using these services means that Moneyspyder can honestly claim 99.9% uptime or better for all sites for as long as we have been running.
Saturday, 4 October 2008
Why Formula 1 is boring and what we can gain from it.
You may have seen the recent Grand Prix on television....You did? And you fell asleep? Thought so. Dull huh? But what does it have to do with Moneyspyder and why?
There are so many reasons, but there is one main reason we can learn from and make use of: the precision with which the teams race now pretty much ensures they know what the result will be, bar any accidents or freak weather. With 20 or so cars flying around a track at over 200mph, how can they be so certain of the outcome? These are pretty complex pieces of machinery after all! It'd be great if we could be so certain about the outcome of our businesses, huh?
Thing is, the science of Formula 1 has been applied to the measurement of key performance indicators on racing cars, enabling the teams to know to the nearest 10th of a second how well the car is performing against expectations set by a mathematical model of what should be possible.
Sounding familiar yet? Think about an internet retail site: what sort of bounce rate, conversion rate, daily unique visitors, email open rate, basket abandonment rate, etc. is expected? How does current site performance measure against the expectation? How are these expectations set? In a similar way to Formula 1 cars, funnily enough.
By now, you should be saying 'yeah yeah, I have Google Analytics. So what?'. If you aren't saying this please call Moneyspyder now! So you do have Google Analytics (GA)? Are you able to measure your site performance to an adequate level of precision? Are you confident in the accuracy of your data?
Recently we've seen GA data spanning a broad spectrum of precision and quality. To get any value from GA it is fundamentally important to have clean and precise data on which to base our expectations and indeed any subsequent measurement.
For example, customers arriving on your (all important) landing pages from marketing emails will probably arrive with a URL payload:
http://yoursite.com/?email=abc123
Now, each email is likely to have a different value associated with it. How is this going to manifest itself in your GA reports? Your landing page usage measurement will be totally useless if you measure each unique request based on unique URLs.
http://yoursite.com/?email=abc123
http://yoursite.com/?email=abc124
http://yoursite.com/?email=abc125
http://yoursite.com/?email=abc126
Nightmare!
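There are a couple of easy ways to keep this clean. You can list email in the 'Exclude URL Query Parameters' box in the profile settings so GA ignores the parameter altogether, or you can record a 'virtual' page name from the landing page itself. A sketch of the latter with the standard ga.js tracker (the account ID and the /landing/email path are placeholders for illustration):

// Roll all the email landing variants up into one logical page name
var pageTracker = _gat._getTracker('UA-1111111-1'); // placeholder account ID
pageTracker._trackPageview('/landing/email');       // made-up virtual path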
'Course, you don't have to be on the Moneyspyder platform to get quality GA data...but it helps! The quality of your analytics data will define the quality of your analysis and how much trust you can place in it. Treat your online business to a bit of Formula 1-style accuracy - give us a call.
Friday, 3 October 2008
Congratulations Terry
Congratulations to Terry Jones and everyone at PSL on the launch of their new website - http://www.idirectdebit.co.uk/.
Another quality Mephisto driven website launched by Moneyspyder!
Wednesday, 27 August 2008
Test test test!
Why do we (royal we) spend so much time arguing over the minute details on websites, pages, page sections and even at the level of individual images? We (again, all of us) must have spent hours in the past, to-ing and fro-ing over the meeting room table offering (forcing!) opinions based on feel and subjective judgement...sometimes even based on experience!
It takes a lot of time though. Not everyone comes up with the right answer every time. What is the right answer though? Well, logically, it's what your customers/users say is the right answer. Your customers will tell you the right answer by voting with their hard cash. Thing is, investing in change can be risky and expensive.
Bringing an end to the long meeting room wrangles, mitigating risk and justifying ROI beforehand is actually easier than you might think...mostly ;-)
Moneyspyder has invested serious amounts of time tailoring our unique Rails based e-commerce platform to cater for A/B split and multivariate testing using the Google Website Optimiser coupled with our in house analytics capabilities and Google Analytics.
We have run multiple experiments using the A/B, multivariate and multivariate A/B models (yup, all different models). We have tuned the platform to enable rapid test rollout and enhanced reporting for all our clients. This is a key feature of the platform - enabling all clients on the platform to benefit from enhancements made to the core 'flexshop' product.
So, now, rather than chew over contentious details ad nauseam, we take the option that saves time and money and gets the right result every time...it's not always the result you expect, but it is the answer given directly by your customers, SO LISTEN TO THEM!!!
All together now, 'TEST TEST TEST'!
Happy testing ;-)
Tuesday, 19 August 2008
Moneyspyder launches a new site for Babyetc
We are delighted to announce the launch of a new site - Baby Etc - feel free to check out the details in our Press Release.
Friday, 25 July 2008
Moneyspyder uptime in '08 so far
The data is in.
We've just finished collating our uptime data up to the end of July and it looks pretty good.
So far we have avoided unscheduled downtime. On average we are running all our clients at 99.9% uptime with only scheduled downtime accounting for the 0.1%.
How do we do this?
The application of rigorous software engineering practices and procedures is a must, of course. Even with the best architecture and technology infrastructure, without solid procedural control over development, testing and deployment, one's business - and therefore one's clients - will be screwed...badly.
We happen to have first-rate technology (Ruby on Rails) and infrastructure (Engine Yard, Brightbox and Slicehost) in place also. Cost effective, smart, reliable, fast and secure. What more could you ask for?
Couple these elements with a cool, clever and passionate team and the prospects for maintaining a solid service are looking very good!
Friday, 13 June 2008
Why EngineYard?
We get frequent questions about our hosting strategy:
- Why not host yourself?
- Why host in the US?
- What's so good about EngineYard?
A good interview and a great indication as to why we work so closely with EngineYard.
Thursday, 12 June 2008
Improving e-commerce articles by Dr. Mike Baxter
Dr. Mike Baxter is the Moneyspyder Director of Customer Experience. Catalogue and e-Business magazine recently published a series of Mike's articles on improving e-commerce. We are delighted to announce that these articles are now available on the Moneyspyder website.
Thursday, 29 May 2008
NewRelic
We recently signed up through EngineYard for the NewRelic beta programme. What a revelation!
We are no longer digging through log files for the important (but hard-to-find) bottleneck and benchmarking data. We now have a clear picture of normal and good application behaviour.
Some highlights:
- easy install - minutes
- real time data
- responsive support
As we like to act on data, this app rules - now we have the data and can operate on our apps with greater precision than we thought possible.
I seriously recommend NewRelic if you want to know your Rails app inside and out.
Credit has to go to both partners in this venture - EngineYard has always delivered kick ass hosting and support for Moneyspyder and now EngineYard and NewRelic together help us deliver world class performing and scaling Rails applications to our clients.
Monday, 5 May 2008
Moneyspyder - who, what? Why are you so good?
Yes, what is the big thing about Moneyspyder?
Well...
First of all, we are using Ruby on Rails as the core of our technology offering. It's still considered new and therefore cool! Yes, cool for technical folk but the end result is certainly considered cool by our clients. I'll explain:
A core strength of Rails is its support for rapid development and an Agile approach - ho hum, that old spiel again. Yes but with a twist. We enable our clients to play to our strengths you see. We have a core set of Internet retail site components on which we can build custom specialist functionality very quickly and to a very high quality.
What's at the core then?
- Users
- Products
- Variants
- Categories
- Brands
- Baskets
- Orders
- Payments
- Fulfillment
- Promotions
- Search
- Content Management
- Split Testing
- Analytics
- Gifting
Yup, I think that just about covers what would be considered 'standard' in a web shop. (Do let me know if you think of any more.) Thing is, not all of these components are always going to be required. We can add and remove these components through quick and simple configuration. Not coding - just a config tweak and kazam! Gift wrapping is available. Less time focusing on technology integration - more time focused on client sites performing better month on month with new functionality.
Each component listed above is built as a Rails Engine - a plugin module if you like. Now Rails allows us to drop in a new component really quickly, easily and safely...oh, and we can choose the version we want to use (thanks to Piston)....it could be one of our components or a 3rd party plugin. We pick the one we are happy using and it's pinned at that version until we say otherwise. In summary, our modularity gives us speed, strength, flexibility and reliability. We like these and we know our clients like them too. They tell us. This is certainly a case of pragmatic technology choices doing the required job with little or no techy bling for the sake of techy bling.
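As a rough illustration of the workflow (the repository URL and plugin path here are invented), typical Piston usage looks like:

piston import http://svn.example.com/plugins/promotions vendor/plugins/promotions
piston update vendor/plugins/promotions

piston import brings the component in pinned at the current revision; piston update pulls in upstream changes only when we choose to take them.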
Analytics, testing and gathering data is a core feature of the Moneyspyder philosophy. We make sure our framework captures as much data as possible so that we can say with certainty what customers are doing and when. For example, we can see when promotions work. We know the right types of promotions to setup on our clients' sites. We also know that promotional messaging is key to promotion performance. Our promotions module will let customers know how much more they have to spend to secure a 10% discount or free delivery. We call this Promotion Proximity Warning - PPW. It works - we see a clear uplift in both conversion and AOV when PPW is turned on.
Customers think searching is cool - data says so, customers say so. We make sure our search solution fits hand in glove with all our searchable components to maximise conversion on clients' sites. Products, brands, categories, static pages - all indexed and searchable. Our search is not indiscriminate though - it honours archived or pre-live entities. If your new product range is not set to go live just yet, it remains safe from appearing in search results until you say so. Search uses internal tagging to relate products, brands and categories - really handy for automatic, intelligent population of related products, because trawling through 1000 products and relating them really should be a job for a smart machine. We consider internal SEO to be just as important as external SEO - especially considering that searching customers are 3x more likely to purchase than casual browsers.
Well, that's a start. Drop us a line if you want to know more about us...
Thursday, 17 April 2008
Beautifulpure gets a makeover - extreme edition
Beautifulpure got a major facelift yesterday:
- New design - nice fresh new look
- Moved from Ferret to Sphinx for search
- Upgraded back end admin
- Lightbox to Thickbox
- No more Prototype/Scriptaculous - only jRails
- Under the hood top secret stuff ;-)
- New email/newsletter signup functionality
- Improved support for Internet Explorer 6.0
A cool site just gets cooler and better. We're happy, Marc (Beautifulpure) is happy. Take a look!
Friday, 4 April 2008
jRails
Standard Web 2.0 coolness done easily and nicely
As Rails users we are well familiar with script.aculo.us and the Web 2.0 coolness that is pretty much built in to Rails.
Standard Web 2.0 coolness done better
To be frank, our js framework of choice is actually jQuery. We prefer the lighter weight, power and general betterness.
Show stopper?
Thing is, we've found a few negative points of using both jQuery and script.aculo.us together:
- Heavier pages due to more libraries - slightly mitigated through packing and caching
- The two not working nicely together - having to use noConflict (var J = jQuery.noConflict();) is a minor inconvenience really...
- Mixing frameworks is just plain messy
All is not lost
So, we've elected to use jRails. We now use only one js framework resulting in lighter, cleaner and nicer Web 2.0 apps. No changes are required to most .rjs scripts and you still get pretty much all the functionality required from script.aculo.us.
Minor gotcha
We did find a slight omission however. Where script.aculo.us Ajax calls use loading to prescribe Ajax loading behaviour, jQuery uses beforeSend, and this is not supported in the current options_for_ajax. We have been in touch with Aaron and hopefully a fix will be posted...Till then, here's a patch - inside options_for_ajax, add:

js_options['beforeSend'] = "function(xhr) {xhr.setRequestHeader('Accept', 'text/javascript'); #{options.delete(:loading)};}"
Of course, if you're using Piston, you'll be fine.
p.s.
More news soon on Moneyspyder's newest clients and work...!
Tuesday, 29 January 2008
We grow!
We're really happy to announce a new weapon in our armoury: Rowan Cox has joined Moneyspyder. Rowan is essentially a Java guru but has taken to Rails like a duck to water.
Working on one of our skunk works projects (4 to roll out during Q1 '08) at the minute, most of the fruits of Rowan's labours will remain under wraps for now but Rowan will be making major contributions to our core e-commerce engine as well.
Welcome Rowan!
Sunday, 20 January 2008
How was your '07 uptime?
Moneyspyder went live with our first clients in June last year. We have maintained our approach of gathering data with regard to client uptime using Site Confidence.
Having returned from my Winter Sun break I can now publish some findings from our data.
Our average uptime across all clients was 99.75% - this equates to 3.6 minutes per day on average of downtime. Our worst weekly uptime was 97.82% (best was obviously 100%!!)
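(The arithmetic: 100% - 99.75% leaves 0.25% downtime, and 0.25% of the 1,440 minutes in a day is 3.6 minutes.)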
Now, consider the reasons for outages here; typically a large Rails deployment will essentially take down a site for a short period of time. If we take out the periods where deployments were scheduled we see slightly different figures:
Average uptime soars to 99.91% - fractionally more than one minute per day of downtime (mostly scheduled infrastructure work) with a worst uptime of 99.42%.
As you will have seen on our website we gather data, schedule updates and roll out changes based on data on a monthly rolling cycle. Thus, identifying deployments is easy and not just because we maintain deployment audit trails. Business forecasting and scheduling gets easier with the monthly cycle. All those large marketing campaigns leading up to busy periods are 100% safe - any downtime is worked around the main retail peaks including a change freeze if necessary and certainly scheduled out of hours otherwise.
What of the other outages? We've seen a mixture of:
- scheduled infrastructure maintenance (scheduled outside of business hours)
- post deployment issues requiring patches (something we strive to eliminate)
- upstream connectivity issues (new pipes now in use)
- load balancer outages (resolved by Engine Yard)
- mongrels (software component) restarting due to memory leakage (also resolved through new deployment process)
So, I think we have a good basis to build on in terms of reliability. '08 is looking pretty neat already - nearly at the end of Jan and so far we are exceeding last year's averages (99.97%).