Yearly archive for 2013

Rival Links Deemed Unacceptable in Antitrust Case

The European Union Competition Commissioner has just announced that Google’s latest round of proposals in the EU antitrust case is “unacceptable”. Commissioner Joaquin Almunia has said that the rival links proposal, which formed part of the second set of proposals Google made in an attempt to settle the case, was not good enough, and that Google must find a better way to give rivals’ services the same treatment as its own content in vertical searches.

Almunia was involved in the negotiation of the original proposal, which was condemned by European competitors almost immediately. The proposal was put to the test and rivals were given the opportunity to comment on it. Google then came back with the proposal to display three “rival links” on the results pages for certain verticals. The lobbying group FairSearch conducted a study on that system and argued that the results showed that those links did very little to drive traffic to the alternative services. The modified proposals do not address that problem.

Google’s Properties In The Spotlight
A second study, conducted by the Institute of Communication and Media Research in Germany, found that Google’s rival links are positioned poorly on the page. The study used eye-tracking technology to determine which areas of the page received the most attention, and found that Google’s sponsored results were the most prominent part of the page and drew the most attention from visitors. Organic links received a negligible amount of attention, and the alternative search sites, or “rival links”, did not attract enough attention to generate a significant number of click-throughs.

The combined research results from ICOMP and FairSearch indicate that Google’s latest proposal is not sufficient to answer the concerns of the competition commission. The commission reviewed the results carefully, and will of course have taken into account that both studies were conducted by organizations that have indicated they want to see Google restrained in a number of ways. The European Commission will be discussing another proposal with Google.

At this stage, it is unclear whether Google will be willing to go through another round of back-and-forth, or whether it will decide that the restrictions are too great and simply risk being fined by the commission. The company has accepted fines in other territories when the alternative would have placed too many restrictions on its business.

Why Secure Search Matters

The year 2013 saw several major algorithm changes that affected search marketers, as well as the Hummingbird update, which Google bills as a whole new algorithm. However, those algorithm changes are not what most search marketers are talking about. The big news of the year is the advent of secure search.

Over the course of the last year, the proportion of searches that show up as the keyword “(not provided)” in websites’ Google Analytics (or other) reports has increased dramatically, from less than 25% to more than 75%. Google intends to encrypt every single search performed on its service in the near future.
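
If you want to gauge how far along this shift your own site is, a rough measurement can be made from a keyword export. The following is a minimal Python sketch, assuming a CSV export with “keyword” and “sessions” columns; those column names are placeholders, so adjust them to whatever your analytics tool actually produces.

    import csv

    def not_provided_share(csv_path):
        """Percentage of organic sessions whose keyword is reported as "(not provided)"."""
        total = hidden = 0
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                sessions = int(row["sessions"])  # assumed column name
                total += sessions
                if row["keyword"].strip().lower() == "(not provided)":  # assumed column name
                    hidden += sessions
        return 100.0 * hidden / total if total else 0.0

    if __name__ == "__main__":
        print("(not provided) share: %.1f%%" % not_provided_share("organic_keywords.csv"))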

What Does Secure Search Really Mean?
Some search marketers have used the introduction of secure search as a chance to proclaim, loudly, that SEO is dead. This is not strictly true. While secure search is problematic for online marketers, sound SEO strategies still work. In the pre-secure-search world, you had the benefit of being able to view all the keywords that were used to reach your site, and could combine that data with heatmaps, goals and other analytics tools to get detailed information about visitor preferences, behaviour and conversions. Now that the search data is not being provided, you no longer have that luxury.

However, search keyword data is just one part of the SEO toolbox. The best webmasters have already adapted to life without that single tool. There are other things you can do to learn about your visitors, such as looking at which pages are the most popular and tracking activity on your on-site search tool, rather than relying purely on Google’s data. SEO means more than targeting a few keywords. That should have been clear from the Penguin and Panda updates, but it is something that many online marketers refused to accept, until now.
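
As an illustration of the first-party data that remains available, the sketch below tallies the most common on-site search terms straight from a standard web server access log. It assumes the internal search page lives at /search and takes its query in a “q” parameter; both are assumptions, so change them to match your own site.

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # Matches the request field of a common/combined-format access log line.
    REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

    def top_site_searches(log_path, n=20):
        counts = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as f:
            for line in f:
                match = REQUEST_RE.search(line)
                if not match:
                    continue
                url = urlparse(match.group(1))
                if url.path == "/search":  # assumed search results path
                    for term in parse_qs(url.query).get("q", []):  # assumed parameter name
                        counts[term.strip().lower()] += 1
        return counts.most_common(n)

    if __name__ == "__main__":
        for term, hits in top_site_searches("access.log"):
            print("%6d  %s" % (hits, term))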

The new school of SEO focuses more on on-page optimization, click tracking, social media and high quality content. If you want to succeed in 2014, you will need to embrace modern SEO techniques. Google’s secure search roll-out is an unwelcome wake-up call for many webmasters, but in a way Google is doing those webmasters a favour. By depriving webmasters of data that they were using as an SEO crutch, it is forcing them to learn modern SEO tactics. Are you ready for a new year of SEO evolution?

Optimizing Your Site for The Sale Season

While Black Friday and Cyber Monday are traditionally seen as American phenomena, they are slowly gaining traction in other parts of the world. Europeans may not celebrate Thanksgiving, but retailers love any opportunity to host a seasonal sale, and customers love discounts whenever they are available. With some creative marketing and SEO, you can take advantage of the open wallets and eager spending habits that come with this time of year.

Create New Landing Pages
If you decide to take part in any of the major sales, the first thing you need to do is create new landing pages for your biggest discounts. Don’t just send the customer to a cluttered, confusing home page and expect them to find the best deals. Pick some special deals and make pages for them. Those landing pages could be seasonally themed or broken down by product: the choice is yours. The important thing is to make it easy to find the best deals.

Speed Up Your Site
If you run some huge discounts, your site will get a lot of traffic. If you want those visitors to think positively about your brand, then you need to make sure that you give them the best user experience possible. Consider using a CDN or an improved caching system to ensure that your website does not go down when it gets a lot of users.
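
How aggressively you can cache depends on your stack, but the general idea is to let browsers and CDN edge nodes hold on to static assets while keeping prices and stock levels fresh. The toy WSGI app below, using only the Python standard library, sketches that split; the one-year max-age and the /static/ prefix are assumptions, and versioned filenames are advisable before caching that hard.

    from wsgiref.simple_server import make_server

    ONE_YEAR = 60 * 60 * 24 * 365  # seconds

    def app(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        headers = [("Content-Type", "text/plain; charset=utf-8")]
        if path.startswith("/static/"):
            # Static assets: let browsers and CDN edge nodes keep them for a long time.
            headers.append(("Cache-Control", "public, max-age=%d, immutable" % ONE_YEAR))
        else:
            # Dynamic pages (prices, stock levels) should stay fresh during the sale.
            headers.append(("Cache-Control", "no-cache"))
        start_response("200 OK", headers)
        return [("Requested %s\n" % path).encode("utf-8")]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()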

Head to Social Media
PPC advertising is a good way to attract customers who know what they want to your website. However, during big sales periods – such as the run up to Christmas, the January Sales, or the period surrounding Black Friday – you aren’t aiming for people who want something specific. If you want to attract customers shopping for “big discounts” you need to announce those discounts to the world. Social media is a great way to broadcast your special offers. Don’t limit yourself to just Facebook and Twitter. Post pictures of your most discounted items on Pinterest, head to the relevant deals section of Reddit, and post on forums that relate to your niche.

Traditional search marketing will help you to make your sale a success, but it’s the other tools that will really expand your reach. If your deals are easy to find, and your site works well on both desktop PCs and mobile devices, your sale should be a big success.

The SEO Year In Review

The world of SEO is always changing, and search marketers should always strive to stay abreast of the latest developments. It’s fair to say, however, that 2013 was a particularly turbulent year for webmasters and SEO experts. This year saw the Penguin 2.0 update, the Hummingbird algorithm change, and some huge changes to Analytics and AdWords, including the removal of keyword referral data and changes to the AdWords extension system.

If 2012 was the year that Google clamped down on spam, then 2013 was the year that it cleaned up poor content. Today, website owners cannot afford to rely on spun articles, bulk press releases and thin content to market their websites. Google made it clear with the Hummingbird release that it’s not enough to write content for search engines; your visitors must be your top priority.

The Year of The Mobile
Many analysts are calling 2013 the year of the mobile. Google was quite vocal about how important mobile users are, and the search giant has warned webmasters that they should focus on responsive design, make their sites load more quickly, and take mobile visitors into account in everything they do. While their algorithms are certainly starting to favour sites that are well-designed for mobile users, there is still a lot of progress to be made in terms of mobile optimization. If anything, we can expect to see even more emphasis on mobile factors in 2014.

Google and the Competition
This year saw Google embroiled in antitrust battles all over the world. The company is currently engaged in a back-and-forth discussion with the competition commission over several proposals to improve competition in the search space. One such proposal, the inclusion of “rival links” for things like travel and product search, is particularly controversial. The companies that brought the case against Google feel that the search giant is over-emphasizing its own properties in the search results, and want to see alternatives given more screen space. Google’s rival links proposal is its second attempt at reaching a mutually agreeable solution, but it has been called too little, too late by many involved with the case.

In 2013, Bing managed to gain a significant amount of market share, but most of that gain came at the expense of other search engines rather than Google’s search traffic. Growth prospects for rival search engines seem extremely limited at this time.

Canada Competition Bureau Investigating Google Antitrust Case

The Canadian Competition Bureau has just announced plans to proceed with a formal antitrust investigation into Google. The CCB has already been engaged in a preliminary inquiry, and has just requested several documents that it claims will show how Google has abused its market power and engaged in anti-competitive behaviour when conducting business in Canada.

The CCB’s case is similar in many ways to the case that is ongoing with the European Commission, and the allegations are much the same. The Financial Post says that the CCB’s filing indicates that Google controls “substantially or completely” the search industry in Canada. The filing mentions a 90 percent share, and third party sources suggest this is an accurate assessment. For example, StatCounter puts Google’s share of the Canadian search market at 88 percent.

The CCB claims that Google has exclusive syndication and distribution agreements with many hardware manufacturers, so it is set as the default search engine on many OEM machines. The bureau also points to Google’s universal and vertical search result pages, which showcase the company’s own properties and content sites instead of those of rivals. This is the cornerstone of the European Commission’s case against Google, and is something Google is trying to remedy with “rival links” on its search results pages. The “rival links” proposal has been poorly received by Google’s competitors, who feel that it is too little, too late, that the links attract too low a click-through rate, and that they are not promoted well enough on the results pages. Google’s own properties, such as YouTube, are given a much more prominent position in the SERPs, and the search engine appears to be making a concerted effort to keep visitors within its own ecosystem for as long as possible.

Another issue raised by the CCB is the use of anti-competitive terms and strict restrictions on how AdWords API data can be used by third parties. Most of these issues were resolved after Google reached a settlement with the FTC.

The court filing made by the CCB is an extensive document, more than 120 pages in length. If the CCB is successful in its case against Google, then the company will face fines and may be forced to change some of its operations. The fines should not be an issue for Google, since the company has already been through several similar cases in other parts of the world.

Creating Content For the Hummingbird Era

A few years ago, writing content for the web was a highly specialised skill, one that differed from writing for books or for print. Skilled copywriters managed to create articles that ranked well in the search engines and read well to human visitors, but there were many “SEO writers” that produced stilted, repetitive articles which made little sense to human beings.

Since those dark days, Google’s algorithms have become far more sophisticated, and thanks to technologies such as latent semantic analysis, old keyword-stuffing techniques have become obsolete. Today, if you’re a webmaster writing about Google’s Hummingbird update, or an open source software lover writing about Ubuntu’s Maverick Meerkat release, you don’t have to worry about competing with nature websites for rankings on those animal-related keywords. Google understands context, so there’s no need to over-optimize your copy to cover every possible phrase a searcher might use.

Questions, Answers and Quality
Hummingbird rewards webmasters that answer questions and help searchers complete the objective that they had in mind when they made their search. Google wants webmasters to get away from thin content sites, and from sites that have a lot of duplicate content (such as cross-posted news, or product descriptions copied from a manufacturer’s website). Instead, Google wants webmasters to write useful content: How-To guides, tips for using products, detailed reviews, or even simple community engagement posts. The Internet is full of junk, and there’s no need for webmasters to keep adding to it.

Currently, a search for Hummingbird brings up the bird, a bakery of the same name, a movie with that title, a brand of fishing equipment, and references to the Google algorithm. Add the word “content”, however, and Google instantly understands the context of the query. Hummingbird is just the first iteration of what will inevitably be a long journey, and the algorithm is far from perfect; there may be other multi-meaning keywords that are not so easily understood. For this reason, it’s important to write your copy with context in mind: make it clear what your content is about, and focus on writing content that serves the user well.

The content you write is primarily aimed at teaching users, but if you write it well it will teach the search algorithms too. As more webmasters write high quality copy, the algorithms will get better, and poorly written web copy will become much less prevalent through necessity.

Gmail’s Image Caching Hurts Email Marketers

Google has recently started caching images included in email messages. This change to the way that Gmail works was done with little fanfare, and has probably escaped the notice of most webmasters. However, those who rely heavily on email for their marketing are dissatisfied with the change.

Now that Google is caching images, email marketers have lost out on an essential piece of information. Under normal circumstances, when a user opens an HTML email message and downloads the images inside the message, a lot of information is sent to the server hosting the image. The most obvious information is a) how many times the email message has been opened, and b) the geographical location of the person opening the email. However, there are some other bits of information that can be gleaned from the download.

One particularly interesting piece of information is revealed by the referrer data. The referrer data can reveal whether the viewer was using a mobile app, email client or a web browser, and what folder your email ended up in. This is important because it helps email marketers determine whether or not their emails are being filtered into junk/spam or other folders, or actually making it into the inbox of their readers.
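
To see why that matters, it helps to look at what a tracking pixel actually receives. The sketch below is a minimal, hypothetical open-tracking endpoint built with Python’s standard library; every value it logs (client IP, User-Agent, Referer) is exactly the kind of detail that Gmail’s image proxy now hides or rewrites. The /open.gif path and the port are placeholders.

    import base64
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # A widely used 1x1 transparent GIF.
    PIXEL = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

    class PixelHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path.startswith("/open.gif"):  # assumed endpoint
                # Client IP hints at location; User-Agent at the mail client;
                # Referer (when present) at the folder or webmail view in use.
                print("open:", self.client_address[0],
                      "| UA:", self.headers.get("User-Agent", "-"),
                      "| Referer:", self.headers.get("Referer", "-"))
                self.send_response(200)
                self.send_header("Content-Type", "image/gif")
                self.send_header("Content-Length", str(len(PIXEL)))
                self.end_headers()
                self.wfile.write(PIXEL)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("", 8080), PixelHandler).serve_forever()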

Why Google Made the Change
Google changed the way that it handles email because it wants to protect the privacy of its users. Gmail already protects users by masking the IP address of people when they send emails, and this change protects people when they read them as well. This goes hand in hand with the recent “Not Provided” search keyword data change which hides the keywords that a user searched for from webmasters. These changes may benefit people who like to keep their web usage private, but they make life difficult for webmasters that rely heavily on analytics.

Another possible reason for the change is to improve users’ perception of the performance of Gmail. Many users may not realise that Gmail is not responsible for the serving of images that are contained inside emails, and may blame Google if an email is slow to load. By caching images, Google regains some control over the quality of the user experience.

Thanks to these changes, email marketers can expect to see some confusing and inaccurate statistics, at least until their analytics programs are updated to work around, or adjust for, data from Gmail users.

Google Busts Anglo Rank Link Network

The Importance of Site Search in Marketing

If you have a large website, you probably have a site search feature, and that feature will allow you to view keyword reports, see which items are the most popular on your website, and gain insight into the behaviour of your visitors. That information is incredibly valuable and can be used to augment your marketing campaigns. However, according to a recent study conducted by SLI Systems, more than 56% of website owners do not make use of the data gathered from site searches to improve their marketing, either because they don’t know how, or because they simply don’t think it will be of any use.
SLI Systems has worked with several major ecommerce brands, and its report mentions some impressive case studies, highlighting how site search can be used to improve email marketing, integrate with social media, and even fuel retargeting advertisements or custom landing pages.

Admittedly, accessing site search data and using it to generate reports may be difficult if you are using a bespoke search solution, or do not have a lot of experience in working with analytics and reporting tools. Around thirty percent of the companies that were surveyed said that they did not know how to use site search data to improve their marketing and the end user experience on their website. However, the majority of website owners do appear to be committed to offering an efficient and intuitive user experience. Basic features such as autocomplete were offered on more than 60% of the websites included in the survey, and many webmasters are planning to improve their internal search offerings in 2014.
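
Autocomplete itself does not have to be complicated. The sketch below is a minimal Python example that suggests terms by prefix and ranks them by how often they have been searched; the catalogue terms and their counts are placeholder data standing in for your own site search logs.

    from bisect import bisect_left

    class Autocomplete:
        def __init__(self, term_counts):
            # Keep terms sorted so every prefix maps to one contiguous slice.
            self.terms = sorted(term_counts)
            self.counts = term_counts

        def suggest(self, prefix, limit=5):
            prefix = prefix.lower()
            start = bisect_left(self.terms, prefix)
            matches = []
            for term in self.terms[start:]:
                if not term.startswith(prefix):
                    break
                matches.append(term)
            # Most-searched terms first.
            return sorted(matches, key=lambda t: -self.counts[t])[:limit]

    if __name__ == "__main__":
        # Placeholder data standing in for real site-search logs.
        ac = Autocomplete({"winter coat": 120, "winter boots": 95, "wireless mouse": 60})
        print(ac.suggest("wi"))  # ['winter coat', 'winter boots', 'wireless mouse']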

Around 20 percent of webmasters are planning refinements for their search feature, and will be adding personalized search histories in 2014. One quarter of webmasters plan to add quick view windows, and other features such as auto complete with graphics, and pop-ups that appear on mouse-over are also on the cards for many of the websites that took part in the survey.

Testing is another thing that webmasters are starting to take seriously. Just 21 percent of webmasters have A/B testing implemented for their site search feature at the moment, but another 28% say that they plan to add it next year. Webmasters that are not taking their testing seriously are missing out on some powerful marketing tools and information that could improve their SEO, paid search and other marketing techniques.
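
One low-effort way to run such a test is deterministic bucketing: hash a stable visitor identifier so that each visitor always lands in the same variant, with no stored state. The sketch below illustrates the idea in Python; the experiment name and the 50/50 split are assumptions to adjust for your own test.

    import hashlib

    def assign_variant(visitor_id, experiment="site-search-v2", split=0.5):
        """Return 'A' or 'B' for this visitor, stably, with `split` of traffic in B."""
        digest = hashlib.sha256((experiment + ":" + visitor_id).encode("utf-8")).hexdigest()
        # Map the first 8 hex digits onto [0, 1] and compare with the traffic split.
        bucket = int(digest[:8], 16) / 0xFFFFFFFF
        return "B" if bucket <= split else "A"

    if __name__ == "__main__":
        for vid in ("visitor-1", "visitor-2", "visitor-3"):
            print(vid, "->", assign_variant(vid))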
