Wednesday, December 30, 2009
This occurs when resources (web pages to you and me) are moved or renamed, but links or bookmarks still point to the old URL.
Usually a dreaded error comes up that says "Error 404, resource not found" or something similar - unless you let Google take control of this and automatically suggest useful alternatives from its search index.
So what can you do about this? You obviously can't have exactly the same website structure and page naming convention from one generation of your website to the next (and some evidently can't even manage this from one week to the next).
My recommendation to handle this is to set up site redirects correctly, so that you give the user what they want. Now some websites build 'custom 404' pages that are more palatable than the generic & bland page I've already described. These will politely tell you that the page doesn't exist and usually give you a link back to the homepage of the site.
For example, here's Amazon's:
One additional step you can also take is to ensure that permanently moved content has a 301 HTTP server redirect set up (and don't use 'meta refresh' code in your web page).
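To illustrate, here's a minimal sketch of a permanent-redirect lookup. The old and new paths are entirely hypothetical, and in practice you would normally configure 301s in your web server's rewrite rules rather than in application code:

```python
# Hypothetical mapping of moved/renamed pages to their new homes
REDIRECT_MAP = {
    "/old-products.html": "/products/",
    "/xmas-offers.html": "/offers/seasonal/",
}

def respond(path):
    """Return an (HTTP status, Location) pair for a requested path."""
    if path in REDIRECT_MAP:
        # 301 tells browsers AND search engine spiders that the move is
        # permanent, so bookmarks and link equity follow the content
        return (301, REDIRECT_MAP[path])
    return (404, None)  # fall through to your (hopefully custom) 404 page

print(respond("/old-products.html"))  # (301, '/products/')
print(respond("/missing.html"))       # (404, None)
```

The same lookup-table idea applies whether the redirects live in an .htaccess file, an nginx config or a CMS plugin.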
Note: This hint is courtesy of the great O'Reilly book 'The Art of SEO', which I am currently reading for review.
Tuesday, December 29, 2009
And so this is Christmas
And what have you done
Another year over
And a new one just begun
And what a year! (Phew!)... Our company (http://www.idealinterface.co.uk/) has grown in both headcount and the number of projects we're working on; we've been busy pitching for new work and have also started some in-house projects that should reach fruition soon.
But what about 2010 and beyond, what does that hold in store?
Seth Godin recently asked a bunch of eminent thinkers and movers "what matters now?" and the results have just been published into this very interesting (and free) ebook:
Monday, December 28, 2009
There are two main ways to ensure that your site doesn't appear in search engines:
- Use a lot of underhand Search Engine Optimisation techniques
This 'blackhat' activity will get your website delisted quickly, and there's a chance your URL may never reappear in the listings!
- Tell the search engine spiders not to visit
What? Yes, you can tell all the major search engines that you don't want all or part of your site spidered & indexed. It's called a robots.txt file and it sits in the root directory of your website.
Note: This isn't exactly a new thing done by Google; robots.txt was devised in 1994... two years before Google was even incorporated.
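To see how little is involved, here's a quick sketch using Python's standard-library robots.txt parser to confirm that a two-line file shuts well-behaved spiders out of an entire site (example.com is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

robots_txt = [
    "User-agent: *",   # applies to every well-behaved crawler
    "Disallow: /",     # ...and bars them from the whole site
]

rp = RobotFileParser()
rp.parse(robots_txt)

# No compliant spider may fetch any page on the site
print(rp.can_fetch("Googlebot", "http://example.com/any-page.html"))  # False
```

Swap "Disallow: /" for specific paths (e.g. "Disallow: /premium/") to block only part of a site.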
So if Mr Murdoch has been threatening to delist his properties from every search engine (apart from those he wants to do a deal with), why hasn't he just got his tech people to make this simple change and instantly get removed?
Note: Google even took some time earlier in the year to point out how to do this.
Sunday, December 27, 2009
Or to put it another way, what's the opportunity cost of not having a conversation with your customers? (And by a conversation... I actually mean a continuous dialogue not a one-way monologue that pushes information at them).
We're now living in a world of Attention Economics, where the attention of a customer is easily lost and so very hard to regain (especially if your business is commoditised). So basically a lost conversation is a lost customer, and with the cost of customer retention usually far cheaper than new customer acquisition... even the financial benefits are obvious.
Saturday, December 26, 2009
However, although it's easy to attack a person's personal brand, it's even easier for a badvocate to attack your company, regardless of whether they have a legitimate grievance or not.
So what can an organisation do in advance to protect itself from this activity?
1. Firstly, register all likely domains that your organisation needs, and as many of the ones you don't want used against you. These include all popular TLDs (top level domains), common misspellings (especially if your domain is even slightly difficult to spell or obscure) and obvious insulting ones such as mycompanysucks.com, etc.
2. Find out what the general public and customers are saying about you. Search for yourself on search engines weekly and set up Google Alerts and Twitter monitoring services (e.g. http://www.tweetscan.com/) on your company and brand terms.
3. Claim the first 10 results of Search Engine Results Pages - especially in Google which still has the lion's share of the search market - for each major term.
There are a number of ways to do this such as:
- registering your company on Linkedin.com (here is the profile of Ideal Interface as an example) and Naymz.com
- using sub-domains for your different online services / offerings (e.g. Yahoo)
4. If you haven't already... start and maintain a blog.
This will not only act as a useful resource of information for prospective & existing customers, but will help your Search Engine Optimisation efforts around your key brand terms.
Friday, December 25, 2009
For an example of the power and speed of Social Media, take a look at this TED presentation from Alexis Ohanian (incorrectly named in the video) on a Greenpeace campaign against Japanese whalers.
Thursday, December 24, 2009
Note: I subsequently found out via Wikipedia that the movie was made by the Museum of Media History, originally produced in 2004 and updated in 2005.
We have witnessed a crumbling newspaper business model, once based upon near-monopolistic retail and classified adverts. And those 'golden days' just aren't likely to make a comeback (ever). Well, in 2009 this animation took another step towards becoming reality.
Advertising revenues along with readership numbers have dropped further, and the profits & staff of those papers have gone with them. The only difference worth pointing out is that the battle this year wasn't taken up by the New York Times - as suggested in EPIC - but by News Corp (owners of the Wall Street Journal).
And now even some of those papers that state their readership is rising aren't selling more newspapers. Apparently since April 1 2009 there have been new auditing rules in the USA that have made it easier for newspapers to count a reader as a paying customer... by counting both their print and digital subscriptions to the same person as two readers! (The Huffington Post explains further here)
But the decentralising of information from newspapers and their mainstream media owners, to millions of the general populace with the ability to publish for free, is just the tip of the iceberg.
What we are really hinting at here is the decentralising of power... and for some that is an opportunity (those prepared to understand and embrace digital democratisation) whilst to others it is a threat (old media owners and those who used the media to control and restrict thought & opinion).
Wednesday, December 23, 2009
However this article from Frederic Filoux attempts to counter some of the anti-Google feeling amongst newspaper publishers.
It's a fairly hard-hitting article that explains how newspapers at first rushed to embrace Google but now want their share of the search engine's revenues after their own business models failed:
What I don’t like is the duplicity shown by legions of publishers: pouring huge sums into SEO/SEM to develop questionable audiences on the one hand, while at the same time whining about copyright violation and fair use abuses simply because this Styrofoam audience is not monetized enough.
Monday, December 21, 2009
Wednesday, December 16, 2009
I've previously covered the topics of Brandjacking and Google's Sidewiki tool on brand sites. However I have recently considered whether the use of Sidewiki on a site could actually be considered Brandjacking...
Firstly, for those who don't have the Google Toolbar installed on their favourite browser, I realise that you won't know what I mean... so here's a video to explain Sidewiki's functions:
So how could Sidewiki be Brandjacking?
Well, it now allows anyone to comment on any page on your site and therefore to say what they like (yes, I'm aware there is a complaints policy, but I'm not aware of any company that has used it successfully yet). It is therefore scary for those brands that don't facilitate user-generated comments and discussions, but it's also a problem for those that do!
As Jeff Jarvis pointed out back in September about this:
So now in the Sidewiki, there’s a parallel discussion going on, separate from [commenting functionality]. There’s no opportunity to respond in threads. I have no control over the content associated with my site essentially on my site. What has been added? Each of those people could have and normally would have commented right here.
It therefore creates a separate channel for alternative conversations to take place on a website. However this is an argument that's not restricted to Sidewiki. Other services such as FriendFeed have removed conversations to entirely different sites for some while now (with little complaint), something Matt Cutts from Google is happy to point out.
Hopefully some clever person is already working out how to include Sidewiki comments into common blogging platforms (perhaps even Google - via their Sidewiki API) or commenting services such as Disqus. This would make finding, tracking & responding easier for everyone, including those that allowed comments on their site... and may even encourage a few more to do so.
So... is it Brandjacking?
In my opinion it's not... but what it may be... is a catalyst in making more brand site owners aware that they don't entirely own the online channel, that their advocates and badvocates now have a chance to say what they like, and that perhaps it's time to join in!
Tuesday, December 15, 2009
Monday, December 14, 2009
Struan Bartlett from NewsNow.co.uk stated:
“We have worked extremely hard to seek clarification from the NLA and its solicitors on the legal basis for either NewsNow or our customers requiring a licence. I am sorry to say that the NLA has not substantiated the legal basis for its licence. Indeed in our view its arguments do not hold water. We believe that other organisations who privately agree with our position have reluctantly signed the NLA agreement under pressure. However, we are not in a position on our own to fund an extremely costly legal case on behalf of an entire industry.
So... rather than fight a costly legal battle against the NLA, they will imminently remove the links from their paid-for part of their site.
The full news release is here.
1. Who or what is associated with the name?
It may sound good to name someone after a famous person, place or event - but times change
(As the people who named their baby boys "Tiger" have subsequently found out)
2. What does the name sound like out of context?
Ask yourself the question "if you saw it written down, would you know it was a name?"
(Famous examples such as Scout Willis and Peaches Geldof spring to mind)
3. Does the name only suit a certain point in life?
Something that sounds great for a baby, may not necessarily be good for an adult.
(e.g. How many board directors do you know that are called "Poppy"?)
You may even consider what the name means in other languages!
(Note: a UK translation agency now provides a service that does exactly this)
And the same needs to be said for companies who name themselves and their products.
The most obvious recommendation I can give is to do an online search first. As well as seeing if another company has a similar name, you can also see what other search results come up as well. Something could come up that can damage your future brand, especially if the results are particularly unsavoury. However if your term produces a Googlewhack (no search results for those exact words) then you have either found something completely obscure or search heaven. This article from Marty Wintraub at Search Engine Land covers a lot of other useful pointers (including watching out for Search Engine Optimisation and social media naming tricks).
Thursday, December 10, 2009
These days it's not enough to measure just the cost of each campaign and work out the average revenue per email... You have to have a good idea of the total worth of these accounts and try to understand the customer buying profile(s).
To take the stress out of calculating this yourself, the Direct Marketing Association (DMA) have done this work for you. They found that, on average, an email address is valued at £9.11.
However, it should be remembered that:
- This research was done in late 2006 / early 2007, so you need to adjust this figure according to the rate of inflation over time
- This is only an approximate average from a number of companies in different sectors, therefore it is probably a good thing to do your own modelling and mathematics
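As a back-of-envelope sketch of the inflation adjustment in the first point, here's how you might compound the £9.11 figure forward. The 3% annual rate is purely an assumption - substitute the real inflation figures for the years in question:

```python
def adjust_for_inflation(value, annual_rate, years):
    """Compound a monetary value forward by `years` at `annual_rate`."""
    return value * (1 + annual_rate) ** years

value_2007 = 9.11                    # DMA's average value of an email address
value_2009 = adjust_for_inflation(value_2007, 0.03, 2)  # assumed 3% p.a.
print(round(value_2009, 2))          # 9.66
```

The same function lets you model your own per-sector figure instead of the DMA's cross-industry average.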
Tuesday, December 8, 2009
In fact, a recent report from Aberdeen Group highlights that many companies are still not assessing their website performance from the user's perspective and that less than a third of those surveyed have the support of their C-level executives for any website optimisation initiative.
The report also highlights the areas of concern that site owners have about their web offering, with most stating their applications as their principal worry:
Monday, December 7, 2009
Why? Well recently Matt Cutts, a Principal Engineer at Google, mentioned that Google could well be factoring page response time into its organic search rankings in the future.
The impact of this change (ranking based upon how quickly a site comes back with a page to the user or the search engine spider) is that popular sites with slow pages could....
- Find themselves pushed down the page in Google, etc. or even disappear from the first page entirely
- Have to spend more on their SEO efforts or step-up their pay-per-click advertising to ensure they get prominence on the top of search results*
- Have to spend more of their development and testing efforts on Volume & Performance efforts - in turn adding to project timescales
- Bring hosting Service Level Agreements further into the spotlight
(*You try explaining to your boss why you're having to spend loads more on your Marketing budget because your site slipped down the results for key search terms)
So... with page performance becoming a more important factor, are you already measuring your page response times and comparing them against those of your competitors?
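If you already have average response-time figures, the comparison can be as simple as the sketch below. The site names and timings are invented for illustration, and the 2-second threshold is one commonly cited shopper expectation:

```python
# Invented average response times (in seconds) for illustration
measured = {
    "yoursite.example": 3.2,
    "competitor-a.example": 1.4,
    "competitor-b.example": 2.1,
}

THRESHOLD = 2.0  # seconds - a commonly cited shopper expectation

# Flag every site whose average response time breaches the threshold
slow = sorted(site for site, secs in measured.items() if secs > THRESHOLD)
print(slow)  # ['competitor-b.example', 'yoursite.example']
```

Running this weekly against real measurements gives you an early warning before the rankings (or your visitors) punish a slow site.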
Thursday, December 3, 2009
In this modern age of digital communications, it is refreshing to see some companies bucking the trend and refusing to put up a website. Russell and Bromley is one such company that has thought better of offering any online messaging, customer contact information or a store locator to help customers actually find their High Street shops... let alone putting up images of their products or (heaven forbid) selling stuff online.
No, instead they have decided to block user access to http://www.russellandbromley.
This has also obviously been the case for some while as they have even managed to get their main URL de-listed from Google.
So having seen this error message displayed for over a week, I rang up the company and asked them if they knew this was the case.... only to be told by the person answering that "Yes, we were aware and we will be putting up a website some time in the new year"
How disappointing... they are obviously doing so well in ignoring Christmas and the holiday sale period online, I'm surprised they have time to work on this new site at all.
Wednesday, December 2, 2009
There has been a continual rise in awareness of the term SEO among most businesses, and it is now not just the domain of the marketing/technology hybrid (so often labelled a "geek") that it was several years ago. Now it is a defined market sector in the Internet industry and one that requires its own skill-set and experience, has its own terminology and mainly tries to keep to best-practice rules & guidelines. Overall the main purpose of the SEO industry is to ensure its clients get maximum and ethical benefit from their sites within the organic (main) results on web search engines (Google, Yahoo, Bing, etc.).
A year or so ago, stories emerged of the SEO marketplace dying out, partly fuelled by the opinion that as all sites became standardised there would be no difference between them.
This viewpoint, in my opinion, is greatly exaggerated or incorrect.
- As new junior talent comes into the web development industry (and others leave), and as the search engines change the algorithms their engines are based on - which affects the placement of results - there is always lots to learn and improve upon.
- Not all web agencies are created equal - some still don't develop to standards, or test, test and test again
- Different sites require different approaches. Some need instant recognition at the top of one or two specific keyword searches, others need a more general or volume-based approach.
- Search habits change, either as vocabulary and names come to prominence, or as users get more proficient and specialised in their search terms.
So what is your company doing about its constantly-evolving SEO requirements and how are you going to find the right SEO agency to help you with this in 2010?
The 'First Click Free' functionality in Google allowed visitors access to view the first page of premium content on such sites, but subsequent pages were blocked. This blockage could be overcome by constantly going back into Google News and clicking on the next premium story, meaning that all premium content could be read for free. But this has now been changed.
Now Google has announced that it is allowing publishers to change this rule: after a visitor has viewed 5 pages of a paid-content site in a day, the First Click Free rule stops applying. This means that visitors on the 6th click will be faced with the newspaper site's registration/payment screen.
More information from Josh Cohen at Google is available here.
So... is this a knee jerk to the recently-announced News Corp / Microsoft discussions? Is it a way to placate Rupert Murdoch following his relentless insulting of Google?
Errr... no! Apparently Google has been planning this change for some time, although it has also been more than aware of the battle that has been brewing for just as long!
Tuesday, December 1, 2009
I got sent the chapters by the publishers who were after a quote for the notes on the cover. So hopefully my comments are there when the book arrives on bookshelves (if there are any physical bookstores left in 2010)
Anyhow, this book explains the "who, what and how" of this art-form.
Note: I've called it this, rather than a science, as the: subjects, measurement and quality are completely subjective.
It has therefore got me thinking about blogging best practice, and I think it's time to mention not the good things I find about blogging... but the mistakes that are made so often.
- Writing badly
Yes, I know I'm guilty of it sometimes (I use an iPhone blogging app, so I don't always get the chance to correct my spelling before I post), but this has to be a major reason why I stop reading blogs. Not everyone is a great writer... so if you're not, take time to read your entire post again before you submit it. If you can wait a bit beforehand, even better.
- Blogging in anger
Submitting a post just to vent your spleen is the equivalent of drunk-dialling. However unlike a midnight call to an ex-lover... your post can be up on the web for a very long time and you can quickly regret doing it. Always remember that everyone on the web can read your post, not just you. In particular, if you mention a person or company in a particularly nasty way.... consider NOT doing this and then consider it again! (I'll not even venture into the legal nightmare of defamation and libel for more vitriolic posts)
- Not replying to comments
So someone has taken the time to read your blog and even write something in response. Now, assuming they are not a spammer or a mad person... what reason or right have you got to ignore them? Blogging is a platform for a dialogue, not a monologue. Even a "thanks for the comment" response goes a long way to building a relationship with your audience, so whether you are a company or a personal blogger... you have a brand to consider.
- Intermittent blogging
Yes, I know that writing a daily blog can be a huge investment in time, so if you can't maintain a daily blog... don't do it! (Consider writing a weekly blog instead.) It's understandable that we take holidays or time off, but if you know this is likely, why not write a post or two in advance and schedule them to go live while you're away - most blogging packages offer this functionality these days.
Note: I've yet to see a headline of a dead blogger continuing to post thanks to this feature, but I bet it has happened!
- Writing posts that are too short or too long
If you wanted something fluffy that goes "meow", you wouldn't buy a dog and try to train it to do so, would you? (You'll be disappointed, or on TV/YouTube very quickly.) So don't blog if you are only going to post a few words (use Twitter) or a huge amount (write a book/ebook). Statistics from back in 2006 from ProBlogger showed that the average time visitors spend reading a blog post is 96 seconds. Now, these times have potentially decreased as people have got used to Twitter's 'microblogging' format, but the general consensus is still to keep your posts between 200 and 500 words.
Note: If you have a much longer piece, consider breaking this up into two or more articles
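For those who like to automate such checks, here's a trivial sketch that classifies a draft against that rough 200-500 word guideline (splitting on whitespace is crude but fine for a sanity check):

```python
def post_length_verdict(text, low=200, high=500):
    """Classify a draft post against a rough word-count guideline."""
    words = len(text.split())
    if words < low:
        return "too short - consider a tweet instead"
    if words > high:
        return "too long - consider splitting it into two posts"
    return "about right"

draft = "word " * 350              # stands in for a 350-word draft
print(post_length_verdict(draft))  # about right
```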
So... does anyone have any more?
Thursday, November 26, 2009
To quote Mark Sigal from O'Reilly:
Analog (old) media is all about managing scarcity by controlling distribution, whereas
Digital (new) media... content, in tandem with un-tethered distribution and pretty good search/retrieval functions, operates in complete disregard for the old media-based pricing models that preceded it
And as we know, when they meet.... the results are very disruptive!
It's therefore an analog vs. digital fight we are in the midst of. Up until now, digital has won out over the protectionist activities of 20th Century analog-based business models.
The hard-fought battlegrounds have been:
- music
A clear win to digital, where iTunes and the MP3 format were the tank and gunpowder
- small ads
Another clear win to digital, with a soldier called Craig and the mighty eBay the victors
- radio
A draw, where radio has been allowed to live (but limps along in its damaged vehicle, with a licence that could expire at any time in the future)
But... the war is getting closer and closer to the centre-piece of the analog media's territory... television. So the analogs have drawn their line of battle along the far-reaching fields of newspaper control.
It is here they have decided to stand and attack back, to preserve what they have and fight one last and (possibly) long battle... or risk losing everything (AKA: mainly the value of their shareholdings in their companies).
Tuesday, November 24, 2009
I've recently mentioned the Forrester report that states that 80% of Internet users would not bother to pay for newspaper content online. Yet the belief at News Corp is that there is still a model to be had from sticking up a pay wall and hiding your content from Google (and charging Microsoft for the privilege of displaying your content).
Perhaps this is the "rewriting the economics of newspaper" that James Harding, Editor of The Times talked about last week? Here he talked about charging a fee for a 24 hour view of the newspaper online (figures of around £1 have been mentioned, but obviously an annual subscription would be less than this).
But is there a possible financial model to be had from de-listing from Google? Well... Bill Tancer at Hitwise has done this work, albeit taking just one News Corp newspaper as an example... The Wall Street Journal.
His findings are that although WSJ.com gets over 15% of its traffic from normal search (which quite possibly isn't that valuable to Mr Murdoch, as these are predominantly brand searches, not searches for content), it gets 11% of its traffic from Google News. Now that must surely contribute 11% of all landing page advertising inventory, plus any successive pages that the visitor then moves on to?
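The arithmetic behind that argument can be sketched as follows - the monthly-visit figure is purely hypothetical, and only the 11% share comes from the Hitwise data:

```python
monthly_visits = 20_000_000   # hypothetical WSJ.com traffic figure
google_news_share = 0.11      # share of visits arriving via Google News

# All else being equal, that share of landing-page ad impressions is at stake
at_risk = monthly_visits * google_news_share
print(f"{at_risk:,.0f} visits per month at risk")
```

Plug in real traffic and revenue-per-visit figures and you have a rough lower bound on what de-listing would cost.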
So.... unless Bing can provide this sort of replacement revenue to News Corp.... how are they hoping to make more money (not less) from de-listing from Google and going to a Bing-only search engine model?
Obviously the answer is that they are hoping to make further money from their paywall subscriptions. Although the WSJ may need to seriously consider whether its readers will be happy about paying for their paper as well as having to deal with intrusive/distracting in-page advertising.
And now the UK paywall model for local content is about to be put to the test. Today, news has surfaced that local newspaper publisher Johnston Press will start an experiment to charge for access to its weekly websites from next week, although they have yet to announce they are de-listing their content from Google.
This is exactly what a German investigation unearthed when looking at what the effect on Google.de would be if most of the country’s publishers delisted their content from it.
The results found:
five percent of the top 10 (Google search) results came from the news organisations - and this is with publishers co-operating with Google.
It is therefore likely that the effect on Google would be minimal and the content gap would simply be filled by other news sources. This would be especially true if it was only News Corp that made its content unavailable to the whole web, putting it behind a paywall or making it available only via Bing.com.
An article from Travolution caught my eye recently.
Apparently there is a view that some online travel marketing spend is moving away from PPC (pay per click) advertising on the search giants to metasearch travel sites (e.g. Kayak).
According to Wouter Blok from European hotel site http://www.easytobook.com/, he has seen conversions from such sites quadruple.
However, is this conversion rate standard for the rest of the industry or has Mr Blok just not used PPC correctly?
Monday, November 23, 2009
However, this battle to make content on newspaper sites such as The Times and The Sun chargeable looks like it's dragging in other media players... well, it has to, or else the existence of free news elsewhere will mean that most people simply won't visit news sites if it costs them.
First it was The Telegraph group that Murdoch indicated that he was in discussions with, when he told a Telegraph journalist (who surprisingly didn't report this slip) what he was up to.
Note: As Alan Greenslade points out, I'm sure it's more than a little anti-competitive to have discussions with your opposition, as well as somewhat foolhardy to admit to it for all regulatory bodies to hear.
Now it looks like Rupert is talking with Microsoft in the hope that Steve Ballmer will pay him money for his content if he removes it from Google listings.
As so many web commentators have pointed out... it is extremely easy to de-list all News Corp content from Google, by sticking a small file containing a single line of code on each website. But surprisingly, despite calling Google names (he's obviously run out of sticks & stones this year, perhaps after losing so much on MySpace)... this instant change hasn't been made. Perhaps News Corp, even temporarily, needs Google!
Yes, soon Bing could contain News Corp's lovely content. This could increase its share of the search engine market, giving it potentially more tabloid news to display in local searches. I can't wait!
Friday, November 20, 2009
For too long PR and other media relations people have needlessly spammed journalists and (more recently) bloggers with their irrelevant releases.
Emails titled "For Immediate Release" constantly fall into our inboxes in the hope that we will write about the latest product or service. But most are sent with little actual targeting of their subject or audience and are quickly dismissed and deleted.
But with the release of the Dow Jones Media Relationship Manager (http://www.dowjones.com/), this could be about to change.
This tool apparently understands what journalists and bloggers are covering and enables them to be contacted with relevant and personalised messages.
One does wonder whether this is an automated or a human process, how often the index is updated, and how bloggers & journalists can be added to or removed from the list... or else it may end up as yet another spamming tool whose output automatically lands in the deleted folder, saving a lot of us the effort of putting it there ourselves!
Thursday, November 19, 2009
And who do they blame for their troubles? Well.... Google of course (my particular favourite is Robert Thomson, editor of The Wall Street Journal, calling search sites such as Google “tapeworms.” - I jest-ye-not!).
It's all their fault for, errr....
However, can the newspapers learn from Google? Can they actually create and maintain a local or hyper-local news search service that would rival and better the search giants?
Tuesday, November 17, 2009
Now... in my experience this actually takes some explaining - defining what you actually mean and how you're going to measure it - before you are likely to get agreement from the client.
Firstly, it's important to understand that all things on the Internet are definitely not equal. Connection speed (bandwidth), latency, the browser you are using and the speed of your device all contribute to the differences between one user's experience and another's.
Q: So, what is an acceptable page download time?
A: This depends on who you ask
For a long time I have used the words of Jakob Nielsen, the web's foremost usability expert, who tackled this subject several years ago. He gives the timescale for user attention as between 1 and 10 seconds; after that, user flow is broken and users tend to leave the site:
More recently, two seconds has been given by Akamai as the new average of an online shopper’s expectation for a web page to load:
(I guess you would expect this from a company who provide fast Internet delivery services!)
However, there is no doubt in my mind that users' expectations of page download times are gradually rising. So just because bandwidth speeds are increasing, that is no reason for site owners to increase page sizes accordingly (or to serve complex or badly-written client-side code that takes ages to render in the browser).
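If you want to start measuring, a stopwatch around a page fetch is all it takes. In the sketch below the fetch is simulated with time.sleep so the example is self-contained - swap in urllib.request.urlopen(url).read() to time a real page:

```python
import time

def fetch_page_simulated(delay=0.05):
    """Stand-in for a real HTTP fetch; sleeps to mimic network + server time."""
    time.sleep(delay)
    return "<html>...</html>"

start = time.perf_counter()
fetch_page_simulated()
elapsed = time.perf_counter() - start
print(f"page responded in {elapsed:.2f}s")
```

Note this only measures the raw response time; the user's perceived load time also includes rendering, images and client-side code.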
Monday, November 16, 2009
The report makes fascinating reading, not only with some great insight (e.g. how Waterstones provides contextual search within a specific product category - which I find particularly useful) but also that it still misses out some of the major online high street players (e.g. River Island).
Sunday, November 8, 2009
According to a recent report from Kelkoo, £8.9bn will be spent online this Christmas - that's 20 pence in every £1 spent!
In a report by the Centre for Retail Research on behalf of Kelkoo, online shopping is anticipated to grow by 24% on last year!
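A quick sanity check of those figures: if £8.9bn online is 20 pence in every pound, total Christmas retail spend works out at roughly £44.5bn:

```python
online_spend_bn = 8.9   # Kelkoo's forecast online Christmas spend (£bn)
online_share = 0.20     # 20 pence in every pound

total_spend_bn = online_spend_bn / online_share
print(round(total_spend_bn, 1))  # 44.5
```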
Thursday, November 5, 2009
This is an enterprise product search service that is hosted by them.
It is similar to Mercado, in that it has:
- Faceted navigation
- Business rules for prioritisation
- Spellchecker, synonyms & recommendations
- An XML API for exporting results back into your own website
It even has a built-in shopping cart if required, and pricing is based on the number of products/SKUs and the number of searches.
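The vendor's actual XML format isn't documented here, so the sketch below parses a purely hypothetical results payload, just to show how the standard library can pull such results back into your own pages:

```python
import xml.etree.ElementTree as ET

# Entirely hypothetical payload - the real API's schema may well differ
payload = """
<results total="2">
  <product sku="A100"><name>Red Shoes</name><price>49.99</price></product>
  <product sku="B200"><name>Blue Shoes</name><price>59.99</price></product>
</results>
"""

root = ET.fromstring(payload)
products = [
    (p.get("sku"), p.findtext("name"), float(p.findtext("price")))
    for p in root.findall("product")
]
print(products)  # [('A100', 'Red Shoes', 49.99), ('B200', 'Blue Shoes', 59.99)]
```

From there the tuples can be rendered into your own templates, keeping the search service entirely behind the scenes.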
Wednesday, November 4, 2009
- How many golf balls can fit in a school bus?
- You have to get from point A to point B. You don’t know if you can get there. What would you do?
- How much should you charge to wash all the windows in Seattle?
There are various versions of these questions around the web, such as:
Tuesday, November 3, 2009
That's the fee that the Newspaper Licensing Agency want to start charging NewsNow... (for now).
"I don't think it is being unduly greedy to suggest that some of this comes back to us on behalf of the organisations creating the content"
states Andrew Hughes, the NLA's Commercial Director (note the use of "unduly" there)
But what reciprocal value does the NLA place on the links to its member sites? If only this value could be quantified...in reference to:
"NLA's commercial director"
- http://econsultancy.com/blog/4902-the-nla-explains-why-it-is-going-after-the-news-aggregators (view on Google Sidewiki)
Monday, November 2, 2009
Newspaper Club is a service due to launch early in 2010 with the aim of helping people make their own newspapers.
In a digital age, where some newspapers are closing and most are losing money, it's perhaps surprising to see a print-based proposition being launched. However, what makes this service different is the product... a limited run (from five to five thousand copies) tabloid-style paper in a generic dozen-page format... but printed on existing commercial presses.
Could this be the final manifestation of the hyperlocal newspaper?
Quite possibly; the Cabinet Office has already tasked them with producing a publication for just one East London postcode.
- Newspaper Club Helping people to make their own newspapers (view on Google Sidewiki)
Friday, October 30, 2009
Struan Bartlett has given us further insight into the threats made by the major (National and Regional) UK newspapers.
In his new open Q&A session (http://www.newsnow.co.uk/)
Perhaps now is the right time to go to court to resolve this... I just wonder if Google will get dragged into the fight, as there is every indication that the newspaper groups don't want it involved (maybe because it would have a bigger and better legal team?)
"They claim copyright law as a legal justification."
- NewsNow: NewsNow.co.uk Free Linking Q&A (view on Google Sidewiki)
Wednesday, October 28, 2009
Journalists should already be familiar with George Orwell's six rules of writing (if not, I suggest you take a look now). However, Mr Orwell had [unluckily] never heard of the Internet, nor had he ever tried to read text off a back-lit electronic screen rather than the printed page he was used to.
I've been lucky enough to work with some great online copywriters, digital editors and web information architects. They have made my life so much easier and made the client (and me in the process) look very professional. And so, whilst I don't regard myself as having a fraction of their skills (or patience), I have learnt a thing or two from them. I'm therefore going to extend George Orwell's rules by two more, to make them 'eight rules for writing for the web'.
- Structure your content the way people want to read it, not the way you think you need to show it - a user-centred copy approach, if you will. So.... out go huge, lengthy paragraphs and in come bulleted lists and other such devices to make your content more scannable and consumable.
To put it another way, think of your end-user as an eater of words rather than a reader of them.
Arrange your content into small manageable bite-sized chunks that are easy to digest!
(Can you tell I once worked for an international food company?)
- Make sure you have copy standards and stick to them! I have lost count of the number of times I have seen a company refer to itself differently across its own corporate website for no reason:
- Mega Corp
- Mega Corp inc.
- the company
and the list goes on!
It really doesn't take much to keep a central dictionary of common terms & titles, and it's now even less effort to put this document up on your company intranet as a permanent reference.
I hope George Orwell approves of my additions.....
Tuesday, October 27, 2009
I’ve recently used and blogged about Google’s Sidewiki and the risks/benefits it creates for brands and their websites. It’s both an opportunity and a concern to allow visitors to write whatever they like (within reason – or the comment gets deleted) on your homepage and beyond.
However, if you are the site owner looking to post about your own site, but worried that your own introduction and information will not get the prominence you want, Google has thoughtfully allowed you to write a special entry that will stay as the top entry for the page in Sidewiki.
In fact, this article is such an entry for this website. So… welcome one & all and please feel free to comment on both this blog and in the sidewiki.
Well, according to an article last week in Forbes magazine, if you're one of the legion of bloggers who pull up companies & brands for their misdeeds, then you're not an advocate... you're a badvocate.
Now, bloggers having a go at companies for bad, mad or stupid actions isn't particularly new, but the time for ignoring badvocates has surely passed. It's no longer a case of sweeping the issue under the carpet and hoping the message goes away (or doesn't get any worse). Social media is now pervasive and won't just go away overnight, so neither will your badvocates.
Monday, October 26, 2009
However, even if he didn't say it, it's worth crediting it to him anyway.... particularly in the online world, where the zeitgeist, the hopes & fears and the social drivers of society can be quickly gauged by seeing what people look for in search engines.
However, when putting some work together for a potential client, we have been discussing whether search habits differ across the Internet, and for me the answer has to be "Yes". For example...
- Different countries/regions have different languages. Even translated, these lose a little something along the way (http://www.imdb.com/title/tt0335266/)
- Trends, thoughts and influences differ from place to place over time, and although there may be global memes, these are no doubt subtly different at a local level.
- Different search engines are used across the world (e.g. Naver.com is popular in Korea and Baidu.com is bigger than Google.cn); these have different indexing algorithms and therefore differing search results.
But accepting that these differences exist, the next question should be "are the ways that people use search engines different from place to place?" By this, I mean:
- Do users in different countries go to search engines for different reasons, at different points in their consideration/buying cycle?
- Do search engine visitors from different markets/locales trust the results in varying ways?
- Does the number of results pages viewed differ depending on the language of the user?
All food for thought and possibly a much bigger topic than just one blog post.......
Friday, October 23, 2009
Using hashtags on Twitter (putting # before a keyword to mark your subject, allowing others to follow it) is extremely useful.
For example following the stream of collective opinion on last night's BBC Question Time was both amusing and insightful. (#bbcqt)
However, today, when searching for subsequent views on the programme, there is now a whole host of unrelated sales messages using the same hashtag. Obviously realising that those following this subject on Twitter may be influential decision makers, spam accounts have decided to invade it for gain. So my advice is:
- Organisers- create a topical hashtag for your important event, but expect to use another for your next one
- Followers - follow a hashtag, but don't expect the same quality of tweets in the morning
- Twitter - please find some way of stopping this or see the decline in the usefulness of hashtags
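Until Twitter does act, a crude client-side filter is about the best a follower can do. A sketch of the idea in Python (the spam patterns below are made up for illustration; real spam adapts fast, so keyword lists alone won't hold for long):

```python
import re

# Illustrative patterns only; a real filter would also need
# account-level signals (age, follower ratio, posting rate, etc.)
SPAM_PATTERNS = [r"buy now", r"free ipod", r"click here", r"\bcheap\b"]

def looks_like_spam(tweet):
    """Flag a tweet if it matches any known spam pattern."""
    text = tweet.lower()
    return any(re.search(p, text) for p in SPAM_PATTERNS)

def clean_stream(tweets):
    """Drop flagged tweets from a hashtag stream."""
    return [t for t in tweets if not looks_like_spam(t)]

stream = [
    "Great point from the panel tonight #bbcqt",
    "FREE iPod if you click here!! #bbcqt",
]
kept = clean_stream(stream)
```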
"Sorry! Bing Twitter Search is not available in this locale." is about as useful as a chocolate teapot to anyone outside the United States who wishes to take advantage of the Twitter search facility Microsoft launched recently.
Has Bing just indexed US-based Twitter accounts or does it only want US-based people to see the results?
(Maybe it's the latter, if it has only sold advertising inventory to US clients)
"Sorry! Bing Twitter Search is not available in this locale."
- Bing Twitter (view on Google Sidewiki)
Thursday, October 22, 2009
Today sees Google (which has closely followed Bing) announce that it is going to index Twitter feeds and add them to its search results.
Now, ignoring the huge complexities of storage and spidering this creates for the search giants, let's just consider what it means....
Yes, your tweets are now likely to be found when someone searches for your name. So... depending upon the speed of the indexing, searches for content will also bring up conversations and the life-streaming activity of the entire Twitterati.
Have Google and Bing just made Twitter more important to a person's (and company's) online presence? Quite possibly.....
At first, it seems like a daft decision... to sell your original & new clothing on eBay or Amazon when you already have a decent, fully featured eCommerce website.
Well, that's what UK department store Debenhams (www.debenhams.co.uk) has done, and on both sites at the same time!
However, taking a closer look at the detail, this might not be as silly a decision as it first looks.
Firstly, Debenhams is not selling its entire product catalogue on either site. On Amazon it has only 1,000 selected products available now (increasing to about 2,000 by next week), including its more up-market 'Designers at Debenhams' range. On auction site eBay, however, it is replicating the products from its outlet web store.
The competition will be keeping a close eye on whether this initiative works for the now-profitable department store....
Links are the very essence of the Internet (it would not be the World Wide Web without them). So it's with some amazement that I read Struan Bartlett's open letter to a bunch of newspaper groups.
Why? Well it would seem the Managing Director of NewsNow has been threatened with legal action if his company (amongst others) doesn't stop linking to newspaper sites or accept 'controls'.
So, let me get this right.... the newspaper industry is in a complete tail-spin, aggregators drive traffic to newspaper sites, and the newspapers now want to stop this activity?
Wouldn't they be better-off finding out how to retain whatever custom they could rather than going after those who are actually providing them with readers and search engine optimisation assistance?
As a huge aggregator of newspaper content, did Google get one of these threats? I bet not!
Wednesday, October 21, 2009
Consequently I have been asked to write the 'opinion' piece for Social Media within the document and here is what I wrote.....
Social media is online functionality that supports the human need for social interaction. The Internet has transformed from a series of one-to-one monologues into numerous dialogues amongst crowds of individuals, enabling greater and more concentrated communication and opinion.
The lesson for all brands to appreciate is that they are already being talked about, and that opinions are already circulating about their offers and service. The challenge is therefore to understand those conversations, be part of them and influence them over time.
Tuesday, October 20, 2009
An initiative by the National Policing Improvement Agency has placed maps online that show UK crime statistics.
As well as being able to compare overall crime rates, visitors to the site can view figures for: burglary, robbery, violence, vehicle crime and anti-social behaviour.
However, the warning from the Police Federation (the police 'union') that this could help criminals find specific crime hotspots looks unfounded, as the service has been repeatedly unavailable today.
The reason given: "Due to very high popularity users may experience temporary intermittent issues accessing this site".
Either the Home Office never anticipated this level of traffic in the first week of launch (strange, given the curiosity around such a subject) or gangs of ne'er-do-wells are using the service to plan their next set of heists, muggings and anti-social behaviour online.....
Friday, October 16, 2009
In yet another aggressive move against its competition in the "sell everything we possibly can online" market (BTW: who else actually is in this market right now?), Amazon has announced it no longer has a minimum spend to qualify for its Super Saver Delivery option.
This basic delivery option on www.amazon.co.uk (usually advertised on the site as '3 - 5 working days') is incredibly popular and, in my experience, nearly always arrives in only a couple of days.
Obviously someone at Amazon has done the maths and worked out that they will gain additional revenue without adding too much to their costs. This benefit should come either from incremental basket value or from up-selling shoppers to premium-rate delivery options.
This could be a blatant move to steal market share at a loss, but it could harm Amazon in the longer term if it subsequently raised the minimum spend back to £5 or higher.
Here's hoping the people with the spreadsheets have their assumptions correct and visitors won't make more frequent but lower value purchases...
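The spreadsheet in question is easy to caricature. Here is a toy Python model of exactly that downside scenario, where total revenue stays flat but shoppers split purchases into more, smaller orders now that there's no minimum spend (every number below is an assumption for illustration, not Amazon's actual economics):

```python
def profit_delta(orders_before, orders_after, revenue, margin_rate,
                 delivery_cost_per_order):
    """Change in profit when revenue is flat but the order count rises,
    so the retailer absorbs the free-delivery cost more often."""
    before = revenue * margin_rate - orders_before * delivery_cost_per_order
    after = revenue * margin_rate - orders_after * delivery_cost_per_order
    return after - before

# 1,000 orders become 1,500 smaller ones on the same £20,000 of revenue,
# with a 20% margin and £1.50 of absorbed delivery cost per order:
delta = profit_delta(1000, 1500, 20_000, 0.20, 1.50)
print(delta)  # -750.0: each extra order eats £1.50 of margin
```

The bet only pays off if incremental baskets and premium-delivery upgrades outweigh that per-order delivery drag.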
I'll not go into too much detail, but this paper starts in a tone that many old hacks will be concerned about:
"Journalists are truth-tellers. But I think most of us have been lying to ourselves."
He covers the inconvenient fact that journalism has no business model if it cannot provide something that people value or that enriches their lives... and perhaps the discussion should stop there?
Thursday, October 15, 2009
I’ve been using Google’s Sidewiki in the last week or so and have even used it to post to my blog. Suffice to say, I’ve found it useful for tracking my own comments about a site, and very interesting for reading the comments of others.
Most notable of these is the outcry about Seth Godin’s ‘Brands in Public’.
A service created for brands to see, in one place, what is being said about them online, by aggregating content from the usual social media suspects across the web: Twitter, blogs, feeds, etc. However, others have openly criticised this service as brandjacking. And where have these comments and criticisms been posted? Well… on the Google Sidewiki attached to the site, of course.
Now let’s get one thing clear… I view transparency and openness as key brand attributes, and there can be nothing more open than allowing criticism of your brand on your own website. That is, of course, until your CEO understands that anyone can now say what they like and others can read it (with the Google Sidewiki toolbar feature enabled).
Yes, there are ways to complain about certain content, and Google has (surprisingly) been very clear about the process… although I have yet to hear personally of a successful appeal.
You can try and explain to your CEO that you’ve never really had control of what people say about your brand. You can highlight that thanks to social media customers and potential customers are able to discuss, share and complain about your products & services (or just your company as a whole). You can point them to examples where companies have made the situation worse by either ignoring the comments or by trying to get them removed.
However, you may find it’s a case of NIMBY (not in my back yard), or more accurately, “Not on my homepage”.
Monday, October 12, 2009
We're really proud at Ideal interface to have developed and launched the new website for All About Brands (http://www.aabplc.com/). This site uses the Kentico Content Management System (CMS) and is built with some great features, including a blog and Twitter feeds, and is optimised for search engines (SEO).
We think it shows AAB to be a professional branding company that leverages the digital medium.
As well as creating their guidelines for social media use, Intel has also published them on its main website.
This transparent act serves not just employees of the company, but anyone participating in any form of social media on an Intel site (e.g. their blogs, forums, etc.)
Thursday, October 8, 2009
But yesterday Google announced their proposal for a new standard for making AJAX crawlable.
These particularly technical recommendations hope to free up the content within AJAX-based sites, but they need agreement from web server developers and search engines... a tall order indeed!
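The core of the proposal is a URL convention: content addressed by a `#!` fragment gets a crawler-fetchable twin, with the fragment moved into an `_escaped_fragment_` query parameter. A Python sketch of that mapping (simplified; the full proposal also covers encoding edge cases and how the server should respond):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map an AJAX '#!' URL to its crawler-friendly equivalent, e.g.
    http://example.com/page#!state
      -> http://example.com/page?_escaped_fragment_=state
    """
    if "#!" not in url:
        return url  # nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=')}"
```

A server that opts in then serves the crawler a static HTML snapshot of the AJAX state at the `_escaped_fragment_` URL, which is exactly the part that needs web developers and search engines to agree.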
Wednesday, October 7, 2009
Today a letter from Amazon founder Jeff Bezos has been posted for visitors to its UK site, informing them that it's now available and that you can download 250,000 books as well as UK & international newspapers.
You have to buy the 'USA & International version' from amazon.com. Luckily, the $279 price of the basic (Kindle 2) version has been slashed by $60 since July. At the current exchange rate, this makes it an attractive £175.60!
Note: They do seem to be out of stock right now....
Oh, and there's no word yet on when a similar international version of the flashier Kindle DX will be available.
Tuesday, October 6, 2009
Well, unsurprisingly, one newspaper boss thinks it is, but unfortunately readers and other journalists think otherwise.
Now, as many people know, hiding news content from readers unless they pay for it is known as using a paywall, and Mr Rupert Murdoch (some chap who owns a company called News Corp, a company that took a $3.4 billion net loss in 2008, down from net income of $5.4 billion in 2007) thinks that this will make him some money. Well... he's correct.
So... how will they actually charge for this service? One way would be a subscription service, such as The Economist's website; the other is micro-payments (e.g. via a service such as http://bitcents.com/).
Putting up a paywall for the New York Times or The (London) Times WILL make some money. But will it kill off his online readership (and therefore his small but constant advertising revenue) in the process? Quite possibly!
There are a lot of industry observers, such as myself, who think this will fail, most notably Steve Outing, who shows statistical evidence from the USA that newspaper execs "remain delusional about charging for online content".
But it's not just observers who think this is a bad idea [bad joke: it's also Observers].
Back in August, Emily Bell from the Guardian said:
No – we are not contemplating a pay wall, nor as far as I’m concerned would we ever….they are a stupid idea in that they restrict audiences for largely replicable content. Murdoch no doubt will find this out – even rudimentary maths suggests he will struggle with a completely free model to meet advertising revenue levels across the NI offerings.
And even Google CEO Eric Schmidt says that it is unlikely that a paywall model will work, because news content is now so ubiquitous across the web.
Time will tell.....
Thursday, October 1, 2009
Also known as domain squatting, this is when a URL is obtained with the aim of pretending to be a brand:
Although Google is doing nothing illegal by letting people purchase existing brand trademarks (see my previous post on the ongoing LV vs. Google battle), some people are passing off fakes as reputable products in their pay-per-click adverts.
This is the rather complex process where someone registers a domain name the instant its current ownership expires. As it is now possible to use an 'Add Grace Period' for domain registrations (where, within a period of 5 days, the domain can be dropped and a full refund received), registrars don't even need to purchase domains to use them.
Now, this wouldn't be too much of a problem, but:
1. Google takes a few days to recognise that the page has changed and re-index it correctly
2. The fraudulent registrar instantly puts up a new site with lots of adverts (e.g. Google AdSense)
3. This now happens on a grand scale
Brandjacking now happens in a lot of different market sectors including the ones my company has been working in:
Only last week, Nucleus released a study which found that in the last year 80% of the surveyed travel businesses suffered brand hijacking, up from 67% in 2008.
MarkMonitor in its Spring 2009 report found that:
Brand abuse is increasing, but more important than the sheer volume is the increased sophistication and the opportunistic nature of brandjackers
And it's not just domains and marketing efforts that are subject to brandjacking. Earlier this year on Twitter, an account called @exxonmobilcorp was set up, and someone named Janet, who claimed to work for Exxon, answered questions about ExxonMobil.
It turned out she was not an employee of the company.
So, what are you doing to avoid being brandjacked?
Wednesday, September 30, 2009
According to a study by the IAB, PwC and the World Advertising Research Centre, online advertising now has a 23.5% market share, making it the biggest ad medium.
Unsurprisingly, mainstream media is sticking its head in the sand on this one. Lindsey Clay from the TV marketing body Thinkbox says:
"it’s interesting but meaningless to sweep all the money spent on every aspect of online marketing into one big figure and celebrate it."
And apparently the world is still flat as well..... See the larger article in New Media Age.
Tuesday, September 29, 2009
But it isn't just the latest and greatest technology; it's good old integration between systems that seems to be the biggest stumbling block. In an eConsultancy report from earlier this year, considerable technical integration issues were raised by lots of respondents:
A focus on other business priorities is preventing organisations from successfully tying up their online and offline data, according to half of the companies surveyed. It is alarming that so many companies are not prioritising an integrated approach to CRM.
Or to use another statistic from the same report...
Only a fifth of companies (20%) say that they are definitely able to link data from the online channel with back office systems
Friday, September 25, 2009
“Google has not committed a trademark infringement by allowing advertisers to select, in AdWords, keywords corresponding to trademarks.”
Those words from the Advocate General at the European Court of Justice are just another round in the ongoing legal battle between LVMH and Google. Although the final decision is not due until later in 2010, this does position the search giant favourably. It looks set to reverse the French courts' decision of 2005, in which the Louis Vuitton owner successfully claimed that Google had acted illegally by allowing competitor companies (or those that make fake copies) to buy keywords such as "Louis Vuitton" for Pay Per Click advertising.
But during 2009, click fraud rates (the percentage of clicks on your adverts that are not from genuine customers) have dropped! In fact, they seem to have declined by about 25% over the last reported 6 months, to a figure of 12.7%.
What does click fraud mean for the average company that uses Pay Per Click advertising?
It means you're likely to spend more money for less real traffic!
But why does this happen?
Well, there has apparently been an increase in the use of click farms. These are groups of people paid to click on your adverts by:
1. Site owners who want to make more money from their advertising inventory
2. Your competitors, who want you to spend more money (or the same money for fewer real visitors/customers)
But why would click fraud rates drop in a recession? Are the less scrupulous companies also feeling the pinch, and therefore less able to afford click farm rates?
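Whatever the cause, the "more money for less real traffic" point above is simple arithmetic: if a fraction of your paid clicks are fraudulent, your cost per genuine click rises accordingly. A quick Python check (the £0.50 CPC is an illustrative assumption; the 12.7% rate is the figure reported above):

```python
def effective_cpc(nominal_cpc, fraud_rate):
    """Cost per *genuine* click when fraud_rate of clicks are fake."""
    return nominal_cpc / (1 - fraud_rate)

# At the reported 12.7% fraud rate, a £0.50 nominal CPC becomes:
cost = effective_cpc(0.50, 0.127)
print(round(cost, 3))  # about £0.573 per genuine click
```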
(Note: it does still use them when you buy their sealed yellow box to spider your internal network and it "sometimes" uses the 'Description' keyword for the text you see beneath the URL in its search results)
Here's the full article and video explaining:
Thursday, September 24, 2009
Now, what gets me annoyed is not that advertising people expect us to believe birds can carry PCs through the central business district, but that the chap who gets all this equipment delivered can moan that he doesn't have his keyboard.
Did he not think of going into the next-door office to grab a USB keyboard? Or, better still, using the mouse?
It is after all just a presentation he's giving....
- Some people work longer hours
- Some people put more effort into their work (or are seen to be doing so)
- But some people innovate (new products, new markets, new technologies)
It is still amazing that, with low customer interaction costs via channels such as online, companies are still trying to work longer and with more effort, and failing to innovate (if you can actually call using the Internet innovative these days).
However, this report also mentions that a third of respondents fail to actually monitor their cost to serve, thus not even measuring the cost of each customer interaction.
Wednesday, September 23, 2009
- It assists their overall Search Engine Optimisation (SEO) efforts
- It facilitates viewing/downloading only part of the report (useful if the user is on a low bandwidth connection)
- It contributes to a more environmentally friendly company profile
The report for 2009, which was released this month, found that almost half (48.7%) of those surveyed had a full HTML version of their site, and according to them:
"HTML is the only way to make full use of the Internet’s potential."
HTML has gradually increased in usage, compared to just PDFs or images (usually in JPG format). And surprisingly, it is UK companies that have led the way, with 64% of the FTSE 100 found to use HTML.
It is also interesting to note that in a piece of research focusing on Corporate Responsibility done by Radley Yeldar earlier this year, they found only 9% of the companies they surveyed (mainly FTSE 100 and FTSE 250 firms) used just HTML for their reports, with 47% using HTML and PDF combined.
Even US regulations now encourage company reporting in formats such as HTML, rather than just providing PDFs of the report pages. However, using HTML is just the start of what you can achieve with your online company report. Animations and even audio & video (whilst remaining accessible to as many users as possible, so all sound media must have a transcript) add interactivity and help in the communication of company information to shareholders, employees, journalists and others.
So where's the future going for company reports? Well, depending upon who you ask, you get different answers....
- Those with a CEO & board who are professionally media-trained would say you should put as much (relevant) video into your online report as possible
- Those who are standards-based would say you should make extensive use of XBRL (eXtensible Business Reporting Language), or at least apply sensible tagging to relevant content
- And those who are on a budget will wait and see what works or becomes legislation
Monday, September 21, 2009
One of the greatest advantages of online is the level of accountability it gives marketing people over their budgets. It can also be provided in near real-time with the analytics tools available in the market. And with this accountability comes the obvious conclusion that online also provides the ability to conduct multivariate testing, whether on a web page, an e-mail, Pay Per Click copy or an online advert.
Yet there are still many companies out there that invest massively online and do not conduct even the simplest of A/B tests.
But A/B testing is nothing new for marketers. Direct marketing has embraced this for decades, and if you want to read up more about it then I recommend you get Commonsense Direct and Digital Marketing from one of the Direct Marketing gurus – Drayton Bird. The “Direct” version was first published in 1982 and was, and probably still is, on the recommended reading list for the CIM qualifications; I first came across this book back in the early 1990s.
It is still relevant today, 27 years on (in fact probably even more so). So, despite the wealth of evidence that supports testing, why do some marketers still not do it?
It is not as if all marketers have the insight to know exactly which web content will maximise response or sales: a survey by Maxymiser proved this in May, when only 21 out of 452 marketers correctly identified the best-performing web content from an A/B/C/D test.
So what are the reasons for not doing this?
I think it comes down to one of the following reasons:
1) Resources – many think that simple A/B testing will increase the time needed to develop the web page, e-mail or online advert. But the reality is that if you start simple, A/B testing can easily be accommodated within current processes. The change could be as simple as a colour, an ordering, a headline on an e-mail, or the time of day (or actual day) an e-mail is sent. How many times have you sat there and presented two or more options for final sign-off? Why not test both?
2) Building the site for me and not the audience – despite the fact that you build a website or design an e-mail for your target audience, I can guarantee that in meeting rooms and offices across the world, websites and e-mails are being signed off or changed on the “gut feel” of the decision maker, for no better reason than that it is what they like most. The fact that they may not fit or understand their target audience is an irrelevance, as long as it makes them happy. If you are in this position: bad luck, and good luck with the job search!
3) Can’t build the business case – perhaps you are stuck with the conundrum of how to prove that testing is worthwhile without knowing the results in advance, and you go around in circles because you don’t want to make a prediction in case the test fails. Well, consider the lost opportunity if you don’t test. Look at elements that you know are not performing: perhaps a high drop-off page on a website, or copy for your PPC campaigns, or the fact that your competitors send e-mails out at different times from you. If you look hard enough, there is certain to be some element of your online marketing that can be improved; use a test to prove it, and that will help prove that testing works.
4) Marketers not knowing what real marketing is – coming from an engineering background, I admit I have an axe to grind, but unfortunately I have come across too many marketing people who talk about “creative” and not enough about the numbers. There would be a lot more marketers in the boardrooms of the UK if they got to grips with the numbers side of the business; unfortunately this presents an image problem for Marketing that still has not been addressed.
In the current environment, it is easy to say that you are so stretched and have so many priorities for “business as usual”, but my challenge is that this is exactly the sort of thing you must be doing right now to prove your worth and help your business through these tough times.
So if people in your organisation see multivariate testing as the equivalent of “How do you eat an elephant?”, the answer is quite simple: “One bite at a time”.
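And the first bite really is small. Here is a Python sketch of the two pieces a basic A/B test needs: deterministic bucketing (so a visitor always sees the same variant) and a conversion-rate comparison. The variant names and numbers are illustrative assumptions:

```python
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    """Hash the user id so the same visitor always lands in the same bucket."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rates(results):
    """results maps variant -> (conversions, visitors)."""
    return {v: conv / visits for v, (conv, visits) in results.items()}

# Illustrative numbers: variant B's headline lifts conversion
# from 3.0% to 4.5% over 1,000 visitors each
rates = conversion_rates({"A": (30, 1000), "B": (45, 1000)})
```

Before acting on a difference like this, you'd still want a significance check (e.g. a two-proportion test), so a lucky day doesn't get mistaken for a winner.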
You can still read Marc's previous guest post Did the BBC and not Video kill the Radio Star?