Yearly archive for 2013

Should You Fear Negative SEO?

Manual spam actions and algorithmic penalties are two things that many webmasters lose a lot of sleep over, and with good reason. If you get hit by a penalty, you can easily drop off the front page of the SERPs and lose almost all of your organic search traffic. Penalties are typically handed out for unnatural-looking link profiles and spammy marketing tactics, so it’s natural for a webmaster to wonder whether a competitor could aim those same “SEO” tactics at them.

Sadly, the answer to that question is yes, negative SEO is possible. It is incredibly difficult to pull off against an established site, but it can be done. Until recently, negative SEO was a popular black hat tactic. The good news is that there are several defences against negative SEO, and Google is getting much better at helping webmasters recover from ranking penalties. Firstly, Google allows you to disavow links that point to your own website (you can’t disavow someone else’s links). This means that if you keep an eye on your Webmaster Tools profile you can disavow low quality links when they crop up, helping to protect you against most penalties. If you have any concerns, get in touch with Google themselves: they’re more willing to help than you might first expect, and will be happy to reconsider your website if you think that it has been hit with an unfair penalty.
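
To make the disavow step more concrete, here is a minimal sketch of how you might assemble a disavow file once you have reviewed your links in Webmaster Tools. The link list, domain list and filename are made up for illustration; the “#” comment and “domain:” syntax follow the format Google documents for its disavow tool.

```python
# Illustrative only: turning a reviewed list of bad links into a disavow file.
# The URLs and domains below are hypothetical examples.
suspect_urls = [
    "http://spammy-directory.example/your-site",
    "http://link-farm.example/page/42",
]
suspect_domains = ["spammy-directory.example", "link-farm.example"]

with open("disavow.txt", "w") as f:
    f.write("# Low-quality links reviewed in Webmaster Tools\n")
    for url in suspect_urls:
        f.write(url + "\n")
    f.write("# Disavow every link from these domains\n")
    for domain in suspect_domains:
        f.write("domain:" + domain + "\n")
```

The resulting text file is what you would upload through the disavow links tool once you have satisfied yourself that the links really are low quality.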

Secondly, if your site has plenty of high quality links, and a solid presence in social media, it will be much harder for a competitor to water down your SEO efforts. Just think of the number of low quality links that Wikipedia has coming in. Google doesn’t care about those links because the overwhelming majority of links coming in to the site are high quality, and the anchor text is varied. The same goes for almost any other popular website. Google understands that you cannot control the behaviour of other webmasters.

Negative SEO can happen, in theory, but it’s highly unlikely to happen to you, and even if it does happen the chances of your competitor’s efforts negatively affecting your website are slim. Instead of worrying about negative SEO, focus on building a better site and promoting it effectively. Not only is building a better website the best defence against negative SEO, it’s also the best way to boost your rankings.

Dominos Using Content Hub for SEO

Domino’s Pizza has just launched a new content hub which it hopes will propel it to the top of the organic search results, ahead of rival pizza companies. The chain’s SEO efforts make an interesting case study, because the company has put a lot of effort into coming up with quirky, engaging content for its target audience. The new website includes a range of light-hearted and humorous blog posts, as well as shareable widgets, competitions, infographics and even video content. The chain will be combining this SEO effort with social media campaigns on Instagram and Twitter, as well as more typical voucher-based promotions.

Domino’s is hosting the content site on a subdomain of its main domain, so that it will not confuse users who simply want to visit the ecommerce site. The company has decided to focus on best-practice SEO, and also plans to expand the number of sales channels it uses and to get to know its customers better through the content campaign.

A Bigger Social Media Presence
Domino’s decision to invest in organic SEO shows that not even the biggest companies can afford to ignore the search engines. Nick Dutch, the head of digital at the company, said that the brand had been built on a strong social media presence and that it had enjoyed a lot of success with online content marketing in the past. Since the company now takes a lot of orders online, adding content to its own website was a natural step forward. The hope is that the content will attract a wide range of new visitors, and that those visitors will head over to the ecommerce section of the site.

The next step in Domino’s online marketing plan is to make it easier for customers to leave feedback on the brand’s blog. Building engagement is just as important as having that initial online presence, and the company will be working hard to build a community around its brand. So far, its efforts seem to be working. In March 2013, Domino’s launched an online sitcom which generated one million views, and purchase frequency among viewers reportedly increased by 15 percent over the following few months, suggesting that if your brand is fresh in the minds of your customers, they are more likely to spend money.

Google Updates AdWords Ad Rank Formula

Google has just announced another change to its Ad Rank calculation. The Ad Rank system is used to determine where your ad is shown and how much you are charged per click. Now, in addition to factoring in your maximum CPC bid and the Quality Score of the ad, the system takes into account the expected impact of ad formats and extensions.
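
Google does not publish the exact formula, but a toy calculation helps to show why the change matters. The function, the weighting and the numbers below are assumptions for illustration only, not Google’s actual Ad Rank calculation.

```python
# A deliberately simplified Ad Rank comparison -- not Google's real formula.
def toy_ad_rank(max_cpc_bid, quality_score, expected_extension_impact=0.0):
    """Toy score: bid x quality, nudged upward by the expected impact of extensions."""
    return max_cpc_bid * quality_score * (1 + expected_extension_impact)

# Two hypothetical advertisers competing on the same keyword:
a = toy_ad_rank(max_cpc_bid=1.30, quality_score=7, expected_extension_impact=0.10)  # uses sitelinks
b = toy_ad_rank(max_cpc_bid=1.35, quality_score=7)                                  # no extensions

print(round(a, 2), round(b, 2))  # 10.01 vs 9.45 -- the lower bidder wins the higher spot
```

The point of the illustration is simply that, under the new system, well-chosen extensions can compensate for a slightly lower bid.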

What the Ad Rank Changes Mean for Advertisers

Google has launched many different ad formats and extensions recently, and it wants to encourage advertisers to use as many of them as possible, as long as those extensions make sense for their niche. Judicious use of extensions improves both CTR and overall campaign performance: extensions are eye-catching and give potential customers more reasons to click. A better CTR means more money for Google, and hopefully more money for you too.

Google plans to serve extensions automatically for advertisers depending on the context of their ads. This will reduce the workload for advertisers, but it also removes some control from advertisers who would prefer to opt out of a certain type of extension. If you have avoided extensions to keep the cost of your campaigns as low as possible, this “improvement” won’t be a welcome one.

However, the way that Google describes the system does sound useful. For example, if your ad earns the lowest paid-ad spot, Google might show the sitelinks and seller rating extensions alongside it, because those are the extensions that perform best in that position; ads in the top spot may show image extensions instead, and so on.

Save Time and Get More Clicks

Google says that it will automatically select the best-performing ad formats and extensions for advertisers, increasing CTR and ensuring that the most relevant and prominent ads are shown for each search. Because extensions and formats now feed into Ad Rank, advertisers with a low Quality Score might see their AdWords performance deteriorate. You can increase your Quality Score by pausing campaigns that are performing poorly, tightening up your keyword selections and improving the quality of your landing pages.

Currently, the AdWords update affects only paid search results, and not other forms of advertising within the network. This means that you have some time to tweak your campaigns. If you haven’t experimented with extensions on your own yet, now is a good time to do so.

Bing Teams Up With TripAdvisor For Travel Results

TripAdvisor and Microsoft have been working closely together over the last few months. TripAdvisor recently announced the launch of a new travel app for Windows 8.1, and they also signed a deal with Bing which sees TripAdvisor’s content and travel search features being added to the Bing search page.

When users search for travel-related content, images and reviews from TripAdvisor will be displayed, along with information from TripAdvisor’s price comparison tool. The results benefiting from TripAdvisor content include hotel, attraction and restaurant searches. Currently, only destinations in the US are being indexed in this way, and it is unclear whether the partnership will extend to other countries in the future.

Why Not Google?
TripAdvisor’s decision to work with Bing is a huge benefit for the search engine. TripAdvisor and Google don’t have a particularly good relationship; in fact, TripAdvisor is a member of FairSearch, an organization that has spent a lot of time lobbying against Google. It is easy to understand, then, why the travel group would choose to help another search engine extend its “knowledge graph”-style features.
Is TripAdvisor Fair?

Not all hotels and attractions will welcome the addition of TripAdvisor results to Bing’s search feature, however. While the price comparison and review site is popular with occasional travellers, many frequent travellers resent the site’s reliance on consumer reviews. TripAdvisor does try to moderate the content that appears on its site, but there are plenty of reviews posted from accounts with only one or two reviews to their name, and others from travellers who appear to have little experience with the kind of venue they are reviewing.

Questions about shills, fake reviews planted by rival hotel or restaurant owners, and bad reviews from overly picky one-off guests have plagued the site. Any business owner who has had problems with negative postings on TripAdvisor may now find those postings even more visible, thanks to the extra exposure the review site gains from its partnership with Bing.

Online Reputation Management
If you are a hotel, restaurant or attraction owner, now is a good time to start pro-actively managing your online reputation. Even if you are not based in an area covered by this new agreement, you should start encouraging your most loyal customers to spread the word about your business. An engaged community is good for SEO, and will stand you in good stead should Bing extend its TripAdvisor partnership.

Cutts Warns Webmasters Not To Duplicate Meta Descriptions

Matt Cutts, head of search spam at Google, has published a video advising webmasters on the issue of meta descriptions. For many years, it has been “common knowledge” that having correctly configured meta tags is essential to good SEO. Conventional wisdom dictates that every single page should have a meta title, description and keywords.

While it is easy enough to add meta tags to pages on a relatively static website, it can be difficult to create effective titles and descriptions for bigger websites that have fresh content added daily. Many webmasters get around this by configuring default meta descriptions, but according to Cutts this is counter-productive. In his latest webmaster tips video, he explained that duplicate meta descriptions can actually harm your site’s performance in the search results. It is better to have unique descriptions, and if you can’t write unique descriptions, it is better not to add descriptions at all. Cutts went on to say that he doesn’t bother to create meta descriptions for his own blog; rather, he allows Google to auto-generate descriptions based on the content of his pages and the searches that they appear for.
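
For larger sites where hand-writing every description is impractical, one pragmatic middle ground is to generate a unique description from each page’s own content rather than falling back on a shared default. The helper below is a rough sketch of that idea; the function name and the 155-character limit are assumptions, not anything Cutts prescribes.

```python
import re

def unique_description(page_title, body_text, limit=155):
    """Build a per-page meta description from the page's own opening text."""
    text = re.sub(r"\s+", " ", body_text).strip()
    if len(text) > limit:
        # Truncate at a word boundary so the description doesn't end mid-word.
        text = text[:limit].rsplit(" ", 1)[0] + "..."
    return page_title + ": " + text

print(unique_description(
    "Widget Buying Guide",
    "Our guide walks you through choosing the right widget for your project, "
    "covering sizes, materials and typical prices.",
))
```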

The role of the meta description is to show users that the content they searched for appears on the page in question, and Google is quite adept at generating accurate snippets these days. Some webmasters have expressed dissatisfaction with Cutts’ advice, complaining that the duplicate content penalty is being taken too far when it is applied to meta descriptions. From Google’s point of view, meta descriptions are not supposed to be ads, and a duplicate meta description adds no value to the search listing, because it is unlikely to accurately reflect the content of the page.
Allowing Google to create its own snippets should save webmasters a lot of time; however, the algorithm it uses is far from perfect. Many webmasters have found that Google erroneously displays cookie warnings, text from the header or footer, or irrelevant snippets, rather than a useful piece of information. This is particularly problematic for ecommerce sites, which would be better served by an informative, generic description than by an auto-generated snippet that may or may not include information that is useful to would-be visitors.

Google Allows Searchers to Prevent Encryption

Google’s decision to encrypt almost all search keyword data so that it no longer appears in Google Analytics attracted a lot of negative attention from webmasters last month. The search giant made the decision to encrypt keyword data to protect the privacy of searchers, but it has recently revealed a workaround that will give searchers the choice as to whether or not their keyword data is encrypted.

Using URL Parameters
The simplest version of the workaround is to append the nord=1 parameter to your search URL (after a ? if the URL has no query string yet, or an & if it does). This prevents Google from redirecting you to the SSL version of its site, ensuring that your search keywords appear in the logs of the webmasters whose sites you visit.
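
As a quick illustration of the parameter in practice, the snippet below appends nord=1 to an existing search URL. It assumes a standard google.com search URL; the helper name is made up.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_nord(url):
    """Append nord=1 to a search URL, preserving any existing parameters."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["nord"] = "1"  # the "no redirect to SSL" flag described above
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_nord("http://www.google.com/search?q=example+query"))
# http://www.google.com/search?q=example+query&nord=1
```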

If you run a network and want to ensure that Google’s SSL searches are disabled for all devices on that network, you can achieve similar results by removing SSL at the network level. Google provides detailed advice on how to do this, and on using content filtering at the network level, on its support website.

What This Means For Webmasters
This is a small positive step for webmasters, but it is unlikely to have a big impact on the amount of referrer data that you are able to capture, unless your target demographic is webmasters and the webmaster community as a whole decides to help each other out by allowing their keywords to be collected.

For webmasters whose target audience is not particularly tech-savvy, this news means relatively little. The average surfer doesn’t bother to log in to their Google account and check their account options, let alone change URL parameters in their browser, so the number of people who will make this change is likely to be small, especially given that there is no real incentive for anyone to do so.

Currently, webmasters who want to see referrer data are limited to the most popular keywords for their website, which are listed in their Webmaster Tools panel. The listing in Webmaster Tools includes only the most popular organic search keywords. Webmasters who have large sites that attract a diverse range of searchers cannot view all of the keywords that send traffic to their website. It is unclear whether Google will ever change their minds about this new system, or whether they have a goal of pushing webmasters to use AdWords as their primary method of collecting search and traffic data.

Google Agrees to EU Oversight

The New York Times is reporting that Google has come to an agreement with the EU Competition Commission that would see the company overseen by a “watchdog”. The role has been laid out in a 96-page job description, which describes the position as “Monitoring Trustee”. The trustee will be paid by Google and will serve a five-year term, and is forbidden from having other links to Google during that term and for a three-year period afterwards.

As part of the position, the Monitoring Trustee will be responsible for confirming that Google is complying with the terms of the Competition Commission’s settlement. In addition, they will receive complaints from competitors and assess the validity of any competitive issues and incidences of non-compliance. The full terms of the settlement have not yet been agreed, and there is no guarantee that a settlement will even be reached. Google will be trying hard to reach an agreement with the Competition Commission, because failing to do so could cost the company up to $5 billion in combined fines and penalties.

More Input Requested
The competition commissioner, Joaquin Almunia, has asked for a second round of input from Google’s rivals after Google agreed to some additional concessions as part of a settlement proposal. Among those concessions, Google agreed to place three outgoing links to rival search providers on its results pages. Google has already implemented these links on some European servers, but the system is still only in testing. Even in its current state, the rival links have not been well received, with competitors calling the solution “problematic”, flawed and ineffective.

The Competition Commission is still waiting for feedback on Google’s concessions. Many competitors are calling for Google to do more to ensure search neutrality. They have accused Google of search bias and traffic diversion (placing its own verticals near the top of the SERPs), and are not satisfied with the concessions that Google has made so far. The alternative, eliminating Universal Search completely, is not something that Google is willing to consider.

A similar case filed with US regulators was abandoned with no action taken against Google, because the regulators believed that there was not enough evidence to put together a full case. However, the European regulators have more power to make significant demands of corporations.

Google AdWords Bid Simulator Adds Conversion Metrics

Google has added a conversion estimates feature to the AdWords Bid Simulator. The tool will now provide users with estimates of the number of conversions they can expect from a given ad, as well as impression and click estimates. This will help advertisers work out how bid changes might impact their conversion volume and their expected sales.

The tool will provide advertisers with the number of conversions (both one-per-click and many-per-click) and conversion values, assuming the advertiser has assigned some. The conversion estimates reflect the number of conversions that Google believes the advert could attain in one day, calculated from a recent one-week period. It is unclear whether that seven-day window is a rolling period covering the last seven days, or whether it is chosen in some other way.

Google says that the conversion data is most accurate for advertisers that have a longer history and higher conversion volume. The estimates may be poor for low conversion campaigns and for campaigns that are relatively new. You need at least two weeks’ worth of conversion tracking data for the simulator to offer useful information.
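
To get a feel for the arithmetic behind these estimates, here is a rough illustration. The click figures, conversion rate and conversion value below are entirely hypothetical, and this is not how Google actually computes its estimates.

```python
# Hypothetical bid simulator figures for a single campaign.
click_estimates = {0.50: 120, 0.75: 180, 1.00: 220}  # max CPC bid (GBP) -> estimated clicks/day
conversion_rate = 0.03                                # taken from your own conversion tracking
value_per_conversion = 40.0                           # conversion value you have assigned

for bid, clicks in click_estimates.items():
    conversions = clicks * conversion_rate
    value = conversions * value_per_conversion
    print("bid {:.2f}: ~{:.1f} conversions/day, ~{:.0f} in conversion value".format(
        bid, conversions, value))
```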

New Opportunities
Another recent addition to Google AdWords is the Opportunities tab. The tab was relaunched last week and now includes several new kinds of ad opportunity. Advertisers will be presented with opportunities based on how their campaigns have performed over the last seven days. The suggestions that the tab offers are automated, and include things such as breaking out ad groups, or even lowering bids where it is possible to do so without reducing the number of clicks that your campaigns earn.

The opportunities tab is a useful offering for novice advertisers who are not well versed in the way that the AdWords system works. However, the system is not perfect, and it may sometimes make erroneous recommendations. If you are an experienced AdWords user, then you should generate your own reports and perform your own analysis on the data within your account before making any changes to your campaigns.

Many webmasters have criticised these new measures from Google, expressing concern that the measures are focused on bringing advertisers further into the Google ecosystem, and pushing them to upgrade their accounts. While the features will help advertisers to increase their conversions, most will find that this comes at the price of spending more money on advertising.

Bing Continues to Gain Market Share

The October search market share data collected by comScore has finally been published, and it appears that Bing is continuing to gain market share at a steady rate, although that share is coming from Yahoo, rather than Google. Google’s share of the search market remains flat at 66.9 percent, a level that it has held since August. Bing now enjoys 18.1 percent market share, while Yahoo has slipped down to just 11.1 percent, losing some ground to Bing, and some to the “other” minor search engines.

The comScore data tracks only desktop search activity. Mobile search traffic is not included. However, data from other sources indicates that Google is, as you would expect, equally dominant in the mobile space. Google is the default search engine on Android devices, which dominate the mobile market in many parts of the world. Windows Phone 8 has proven to be popular in the western world, and ships with Bing as the default search engine, but it is likely that many users choose to open a browser and go to Google, rather than using the Bing tile on their phone.

What Does The Future Hold for Bing?
While Bing is steadily growing, it may not have a future at Microsoft. Stephen Elop, a potential candidate for the Microsoft CEO role, has reportedly said that if he were to take charge of the company he would consider selling Bing, along with some other Microsoft properties that he believes are a distraction from its core focus. While this is simply speculation at this point, Microsoft has already tried to sell Bing at least once in the past. Last time around, Microsoft was in talks with Facebook, but it was unsuccessful in selling the property. Since those negotiations, Facebook has added Graph Search to its own platform, so it may be interested in Bing from a technology point of view.

Moving away from search would be a big step for Microsoft, however. The Bing platform is closely integrated with Windows Phone and Windows 8. If they give up having an in-house search platform then they will need to find a friendly buyer who will license the platform back to them. Doing so is a large gamble, especially considering how important search, and the web, both are today. Why give up on a search property to focus on software sales when the cloud is becoming an increasingly important part of the future?

Bing Renews Deal With Twitter

Bing has again renewed its deal with Twitter to include Tweets in its search results. Bing and Twitter first formed a partnership in 2009, and renewed the deal in 2011. The partnership has proven quite successful, and the two companies announced the renewal of the deal on November 1st.

Bing users have the option of searching for Tweets using the Social Search page, but Tweets also appear in Bing’s main search results. Twitter content is indexed quickly, so users searching for major events should see social media mentions of those events in the sidebar. Bing does not appear to give precedence to any particular social media accounts, so any recent, noteworthy mentions of a trending topic may appear there.

Neither Bing nor Twitter has made much noise about the deal. There was a mention of the renewal on their Twitter pages, but Bing posted only a short blog post about it, and Twitter has not elaborated at all, not even to say how long the extension will run. Most analysts, however, expect the deal to run for another two years.

What This Means For Webmasters
Bing’s decision to renew the deal with Twitter confirms something that most webmasters were probably already fairly confident of: social media is here to stay, and Twitter is an important place for news and discussion. Both Google and Bing place a lot of importance on social media mentions, and webmasters should remember that when planning their online marketing strategies.

Bing is not the only search engine offering Twitter search features. Mozilla signed a deal with Twitter to include Twitter search functionality in Firefox, and Yandex also pulls data from the social network. Google tracks social media mentions, and favours sites that are highly rated across Facebook, Twitter, Wikia and other popular social or user-generated content services. If you don’t already have a presence on Twitter, or if your presence is limited and you are not putting much effort into building engagement on the service, then you should reconsider your marketing efforts.

Many web users think of Twitter as an interactive RSS feed, and use it as their primary source of information for major news events. Building a following on Twitter and posting updates about the latest news in your industry will increase your company’s visibility.
