What makes the Index credible
Publication of the latest FT Bowen Craggs Index inevitably prompts requests for an explanation of its provenance and worth, David Bowen says.
The fourth Financial Times Bowen Craggs Index of corporate website effectiveness is published today (21 April). You can see it on FT.com or in more detail in a special area of our site. Both versions have two articles by us looking at the overall findings and pulling out themes.
Whether your company is in the list or not, we intend the Index to be useful. If your site is included, it shows areas where you are shining (cue praise from the boss) and lagging (you are clearly under-resourced…).
If it is not included, the list is a useful guide to best practice: download the Excel sheet from our site, and sort it according to the metric that interests you. (We are also happy to carry out a review of your site using the same methodology, which additionally will give you access to the interactive Index database we are launching.)
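Sorting the downloaded sheet by a single metric is straightforward in any spreadsheet tool, but as a minimal sketch, here is the same idea in plain Python. The company names, metric names and scores below are invented for illustration; the real spreadsheet's column headings may differ.

```python
# Hypothetical Index rows -- all names and scores here are made up.
rows = [
    {"company": "Acme Corp", "serving_investors": 28, "overall": 180},
    {"company": "Globex", "serving_investors": 31, "overall": 175},
    {"company": "Initech", "serving_investors": 25, "overall": 160},
]

def rank_by(rows, metric, top_n=None):
    """Return rows sorted by the chosen metric, highest first."""
    ranked = sorted(rows, key=lambda r: r[metric], reverse=True)
    return ranked[:top_n] if top_n else ranked

# Rank by the metric that interests you, e.g. service to investors.
for r in rank_by(rows, "serving_investors"):
    print(r["company"], r["serving_investors"])
```

The same pattern works for any column: change the `metric` argument rather than re-sorting the whole sheet by hand.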
We are often asked how we put the Index together and why it is so widely accepted as the most credible ranking of its kind. The following set of FAQs gives some answers.
How are the companies selected?
In June each year the Financial Times publishes its Global 500, ranking the world’s largest quoted companies by market capitalisation. We use this as the basis of our constituency for the following year, taking the 25 biggest companies from each of the US, Europe and Rest of the world (including Russia). There is a turnover of about a dozen companies each year. Our aim is not to say ‘these are the best websites’, but to provide a detailed picture of the world of corporate websites with a mass of best practice – the new database will keep reviews from previous years, so the resource will get ever more substantial.
What is the timetable for reviews?
We do the great bulk of the work from January to March. We ask companies in advance if they are planning updates, and structure the review schedule to get the most up-to-date analysis possible. In future, the database will be refreshed on a rolling basis, but there will still be a once-a-year publication to show the state of the corporate web.
Who does the reviews?
We have six people working on the Index, either full or part time. They all have a background in the web and business, and most have been working on the Index since it started in 2007.
What is the methodology?
Our starting point is that the Index has to be credible – for us that means a checkbox approach cannot be used, because companies vary so much in what they can and should be doing online. We have a set of metrics, which are divided into sub-metrics, and for each of these we ask a set of questions. These are derived from our full-scale benchmarking technique, but have been refined to provide a strong basis for comparison without needing such exhaustive research.
Some of the questions are broad (Is the look and feel appropriate? Does the site tackle relevant controversial issues?) – here analysts have to make a judgement based on their knowledge of the company, industry and the web. Others are quite specific (How well are CSR management and measuring systems explained? What formats are provided for quarterly results?). Here we use a ‘proxy’ system as a way of streamlining the analysis. If CSR (corporate social responsibility) systems are well explained, we infer that the company provides a sophisticated service for CSR/SRI (socially responsible investment) professionals. If multiple formats are provided for results, the company is putting effort into making life easy for investors and analysts. It is important when making these judgements to know what other companies are doing – the scale of the Index means we can do this. We do not, incidentally, use a proxy system in our full benchmarks.
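The metric → sub-metric → question hierarchy described above can be sketched as a small data structure. This is purely illustrative: the metric names, sub-metric names and question scores below are invented, and the actual Index methodology weights analyst judgement in ways a simple sum cannot capture.

```python
# Hypothetical scoring hierarchy: metrics contain sub-metrics, and each
# sub-metric holds the scores awarded to its set of questions.
# All names and numbers are invented for illustration.
metrics = {
    "serving_society": {
        "csr_systems": [3, 4],            # e.g. CSR management/measuring
        "controversial_issues": [2],      # e.g. relevant issues tackled?
    },
    "serving_investors": {
        "results_formats": [4, 4],        # e.g. formats for quarterly results
        "analyst_tools": [3],
    },
}

def metric_score(sub_metrics):
    """Sum question scores within each sub-metric, then across them."""
    return sum(sum(scores) for scores in sub_metrics.values())

totals = {name: metric_score(subs) for name, subs in metrics.items()}
print(totals)
```

The point of the structure is that comparison happens at every level: a company can be strong on one sub-metric and weak on another within the same metric, and the roll-up makes that visible.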
Are the scores strictly comparable from year to year?
Almost. We adjust the questions to take account of changes in the world of the web. The big change this year is that we have woven several proxy social media-related questions into the metrics. The aim is to see how well companies are integrating the web and social media (for example, Is there social media integration on the home page?) and also whether they are making active use of it (Does the company use social media for case studies, reputation management, conversation?).
As a one-off change this year, we have stopped providing customer scores where the site is clearly not intended to serve customers, and where there is a clear alternative. While for the Index we look at the corporate sites of China Mobile, Coca-Cola, McDonald’s, PepsiCo, Volkswagen and Wal-Mart, they all have customer-facing sites with much more obvious URLs. We include a non-distorting score (i.e. reflecting the other metrics), so that the totals are realistic. This is a technique we use in detailed benchmarks; for example, when covering non-quoted companies.
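One plausible way to make a substitute score "non-distorting" is to give the missing metric the company's average performance, as a fraction of the maximum, across its other metrics. The sketch below assumes that approach, with invented scores and maxima; the article does not specify the actual formula used.

```python
# Hypothetical non-distorting substitution: when the customers metric is
# not applicable, substitute the company's average fraction-of-maximum
# across its other metrics, scaled to the customers maximum.
# All scores and maxima below are invented for illustration.
scores = {"construction": 22, "society": 18, "investors": 27}
maxima = {"construction": 32, "society": 24, "investors": 32, "customers": 24}

avg_fraction = sum(scores[m] / maxima[m] for m in scores) / len(scores)
customers_substitute = avg_fraction * maxima["customers"]
print(customers_substitute)
```

A substitution along these lines keeps the company's total in line with its performance elsewhere, rather than penalising it for a metric that was never meant to apply.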
How long do the reviews take?
It depends on the complexity of the site, but on average a review from scratch takes about 12 hours, including checking. Where a site has been analysed previously, we go through it carefully for changes. This year the process has taken rather longer, because we have put everything into the online database (which has about 400,000 words of documentation in it).
Do you not have conflicts of interest?
We have carried out consultancy and benchmarking work for about 20 companies in the Index. Most do well in the rankings. This is not because we favour them, but because they are also the companies that take the web most seriously, and so are more likely to employ us. We separate consultancy from benchmarking but, more importantly, we could not possibly mark clients up (or down). Even if we wanted to, we would soon be found out – especially once all our underlying research is available on the database. We also know that web managers do not like to be over-scored – they know if a score feels right, and want a credible judgement they can support internally.
The companies we have worked for include: Abbott Laboratories, BP, British American Tobacco, Chevron, Cisco Systems, Coca-Cola, Eni, GlaxoSmithKline, HSBC, Nestlé, Novartis, Philip Morris International, Procter & Gamble, Roche, Samsung, Shell, Siemens, Total, Unilever, Vodafone.
If you would like to know more about the new FT Index database, please contact Dan Drury.
First published on 21 April, 2010