In previous weeks on this blog I have examined the roles of authority and relevance in determining search engine rankings. This week I will examine the role of social media in determining SERP rankings, something generally referred to as social search.
Social activity such as liking, sharing, commenting or +1 votes correlates with high-ranking websites, but it does not actually cause high rankings; it is backlinks that cause sites to rank more highly. Widely shared content attracts more backlinks, which in turn lead to higher search rankings. This makes social media what the military refer to as a force multiplier: in a nutshell, successful use of social media will amplify the impact of your content.
A force multiplier is a factor that dramatically increases (hence "multiplies") the effectiveness of an item or group (source: Wikipedia).
First I would like to clarify what exactly is meant by the social graph. It is not the traditional diagram with X and Y axes you may remember from maths at school; instead, the social graph maps the connections and shared interests of your social network. In social search, a person's search results are influenced by their social network, in contrast to traditional search, which uses backlinks and algorithms to determine results. Trond Lyngbø suggests that
‘In the algorithmic ranking model that search engines used in the past, relevance of a site is determined after analyzing the text and content on the page and link structure of the document. In contrast, search results with social search highlight content that was created or touched by other users who are in the Social Graph of the person conducting a search.’
With social search, connections are valued more highly than backlinks, which should cut down on the amount of link spam. A further benefit is that social networks act as a form of intermediary or arbiter: content deemed excellent (i.e. shared a lot) should have an impact on search rankings. In this way social networks serve to validate content, hopefully minimising the abuses of link building seen in the past, such as buying links and link farms. There will always be some companies trying to take shortcuts to good search engine results, for example by trading in backlinks or social shares; hopefully the search engines will rapidly adapt to counter these abuses.
Ideally, you want to gain references from social accounts with good reputations. Having your own social presence that is well regarded is important. So participate on relevant social platforms in a real, authentic way, just as you would with your web site, or with customers in an offline setting.(source:Searchengineland.com)
Social search represents a challenge for smaller businesses, especially those strapped for time, as it takes an investment of time to build up a decent network. It certainly underlines the importance of good quality website content; otherwise your social network will have nothing to talk about or share. Carl Potts Designs offers both social media management and engaging content development services designed specifically for smaller businesses; to discover more, contact me today.
Meta descriptions are the way in which web developers describe the contents of their web pages in a format readily understood by search engines. If written correctly, meta descriptions are displayed on SERPs; they are not visible when visiting a site unless the underlying HTML code is examined, something the typical website visitor is unlikely to do. Meta descriptions are limited to 170 characters (including spaces), so they should be concise and contain your best keywords. The character limit restricts the number of keywords that can be included, which means keyword stuffing is a bad idea.
The following list contains best practices for creating effective descriptions:
• Contains a call to action
• Important pages have unique descriptions
• Contains a clear description of your company/page contents
• Avoids non-alphanumeric characters
• Makes selective use of your best keywords
Each of these best practices will now be examined in further detail.
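As a rough illustration, most of the checklist above can be expressed as a simple validation routine. The 170-character limit follows this post; the function, the example keywords and the list of call-to-action words are my own sketch, not a standard tool:

```python
import re

MAX_LENGTH = 170  # character limit discussed above, including spaces

def check_description(description, keywords,
                      calls_to_action=("call", "contact", "check", "discover")):
    """Return a list of problems found with a meta description draft."""
    problems = []
    if len(description) > MAX_LENGTH:
        problems.append(f"too long: {len(description)} characters")
    if not any(cta in description.lower() for cta in calls_to_action):
        problems.append("no call to action")
    if not any(kw.lower() in description.lower() for kw in keywords):
        problems.append("none of the target keywords appear")
    if re.search(r"[\"']", description):
        problems.append("contains quote marks, which Google may strip")
    return problems

draft = ("Carl Potts Designs offers web design and SEO for small "
         "businesses in Leeds. Contact us today for a free quote.")
print(check_description(draft, ["web design", "SEO"]))  # → []
```

A check like this only covers a single description; the "unique description per page" practice would need to be checked across the whole site.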
Matt Cutts, the head of the search spam team at Google, advises that it is best to ensure every page has a unique description; it is better to leave page descriptions empty than to have duplicated descriptions. The practical approach, Cutts suggests, is to ensure that the most important pages all have unique page descriptions.
Calls to action encourage the reader to take a particular course of action; very basic examples include "Call us today" or "Check our latest offers!". They are important for prompting the potential customer to act.
Google tends to remove non-alphanumeric characters such as quote marks from descriptions, so it is best not to use them to differentiate text within the description.
I find that businesses often want to target every possible keyword in their particular niche. I feel this is ill advised: it is better to target the important keywords well than to overdo it, and if you are targeting keywords you need good quality content to back them up. Standard good practice holds true; text should be written with the reader in mind rather than for the benefit of search engines.
Algorithms are the calculations performed by search engines such as Google and Bing to determine the rankings of the search engine results pages (SERPs) for given search queries. The exact nature of these algorithms is a closely guarded secret, and because the search engines are constantly striving to provide "better" results, the algorithms are constantly being updated.
Penguin is designed to ensure that websites have good quality backlink profiles and to reduce the influence of spammy or poor quality backlinks; the first Penguin update was introduced in April 2012. Previous Penguin updates have led to real concerns throughout the search engine industry, with Google introducing a disavow tool so webmasters can refute backlinks from poor quality websites.
Backlinks are links from other websites back to your website. Google and the other search engines treat these as votes for your website, the logic being that websites which are useful and provide good quality content will attract lots of backlinks naturally. Predictably, some webmasters tried to game the system by buying links from poor quality websites known as link farms; it is this sort of backlink that Penguin targets, with sites that have poor quality link profiles being demoted in the search engine results pages (www.seroundtable.com).
There have been some suggestions that Google favours larger businesses, and that it is beyond the resources of smaller businesses to employ specialist SEO teams to develop website content designed to raise a company's SERP rankings. I agree completely with directtrafficmedia's conclusion:
‘As you can see, Google does not make it easy for small businesses to do well in search results. It is now more important than ever for online businesses to understand the environment that their site exists in. An SEO strategy is needed in order to avoid penalties, with budgets being put aside for external agencies or in house SEO. It is no longer acceptable in the world of search engine rankings to just build a site in the cheapest and simplest way possible and expect good rankings. Every small online business owner is now also an SEO, without it, online businesses cannot succeed.’(directtrafficmedias.co.uk)
Penguin underlines the importance of building a healthy backlink profile and not resorting to artificial means of boosting the number of backlinks. It would be naive to assume the trading of backlinks has disappeared completely; however, it does appear to be less high profile than it was a few years ago. The best way to ensure your website has nothing to fear from this or future Penguin updates is to avoid link buying schemes and to develop good quality website content which answers the needs of clients and attracts backlinks naturally. Linkarati suggest that "Penguin was very effective at finding and punishing spammy and automated links. This prompted a fundamental change in the SEO industry as automating your link building to create thousands of irrelevant links was no longer effective" (Linkarati.com). The monthly Carl Potts Designs newsletter regularly covers topics such as Penguin and provides suggestions on how smaller businesses can compete effectively online; sign up using the contact form on this website.
Definition of mobile search: using a web-enabled mobile device – feature phone, smartphone or media tablet – to query a search engine, using a relevant word or phrase – e.g. "emergency plumber in Manhattan" – known as a search term. (Source: Mobiforge.com)
Mobile search is the practice of querying a search engine from an internet-connected handheld device such as a smartphone. (Source: techtarget.com)
My definition is as follows:
Mobile search is any search conducted using a mobile device (smartphone or tablet), which may take advantage of the device's GPS tracking.
Mobile search continues to account for an increasing share of searches, with mobile search predicted to overtake desktop search in the UK in 2014 (intelligentpositioning.com). Research by Fresh Egg shows that just over half (52%) of UK smartphone owners search daily, a figure likely to continue growing as 4G technology becomes more widely available in the UK.
Icebreaker Consulting found that 40% of mobile users will navigate elsewhere if presented with a search result which does not cater to a mobile audience (source: vocus.com). In addition, vocus.com found that mobile users expect a website to load quickly (in less than 3 seconds).
Mobile searchers value convenience highly and are likely to visit another site if dissatisfied with a search result that fails to cater to their needs.
Google recommends three approaches for accommodating mobile phone users:
I. Responsive Web Design
II. Adaptive web design
III. Dedicated Mobile site
There is some debate as to which is the best approach, with Google preferring the responsive route.
Susan Walders suggests that the mobile web of today reminds her of the standard web of 1999. She also questions why so little effort is spent checking the quality of mobile websites compared to their desktop counterparts (source: searchengineland.com); if even major brands are failing to provide good quality mobile websites, the problem is likely to be much more severe at SME level.
Mobile search continues to grow in importance, and this trend is likely to continue as tablet ownership becomes more widespread and 4G-capable smartphones become widely available. This means that businesses cannot afford to neglect mobile users, and their websites need to cater to the specialist requirements of those users.
Aleyda Solis argues that
‘Mobile SEO differs from desktop SEO since it’s specifically targeted to the mobile search environment, taking into consideration the specific mobile user’s search behaviour and intent, and the characteristics, requirements and restrictions of the mobile web platform from a content, interface and technical perspective.’ (econsultancy.com)
This suggests that simply having a responsive website is not enough to completely cater for the requirements of mobile users; content also needs to be tailored accordingly, with less emphasis on long tail keywords.
This week's article will examine Pigeon, a recent algorithm change introduced by Google. Note that Pigeon is a moniker bestowed by the searchengineland.com website and not Google's official name for the change. I'll examine what has changed as a result of Pigeon, how this will affect smaller businesses, and any implications it may have.
The most obvious change has been to the appearance of the local pack in the Google search engine results page (SERP). On Search Engine Land, Greg Gifford highlights this with a number of searches:
• “used cars” = seven-pack
• “used cars louisville” = no map pack
• “used cars louisville ky” = three-pack
He argues that before Pigeon all of these searches would have resulted in a seven-pack display. This is no longer the case, although results can be somewhat random. Detailed research by companies such as BrightEdge shows that local pack results vary by industry, with industries like real estate/estate agents taking quite a big hit as a result of Pigeon and having much less likelihood of gaining a seven-pack listing (source: brightedge.com).
Another change has been to correct what some industry observers have described as "the Yelp problem".
The Yelp online directory complained to Google about Yelp SERP listings appearing below listings for Google-owned services such as Google Plus and organic Google local results. SearchEngineLand.com suggests that this problem has been rectified for a range of local business directories, not just Yelp listings. Here is Search Engine Land's verdict on this aspect of the Pigeon algorithm changes:
It looks like Yelp and other local directory-style sites are benefiting with higher visibility after the Pigeon update, at least in some verticals. And that seems logical since, as Google said, this update ties local results more closely to standard web ranking signals. That should benefit big directory sites like Yelp and TripAdvisor — sites that have stronger SEO signals than small, individual restaurants and hotels are likely to have. (Search Engine Land.com)
Some industry experts such as Mike Blumenthal and Andrew Shotland have suggested that the Pigeon update results in greater hyper-localisation of search results.
Hyperlocal searches are conducted in a specific locale; womeninbusiness.com offer the following definition: hyperlocal = local community. Google offers this definition: "Hyperlocal distance information lets your customers know how close they are to your business. Available on smartphones, hyperlocal ads gives users down-to-the-block-level detail about your business including your address, phone number, and where you are on Google Maps for Mobile." (Google.co.uk) Given the growing importance of mobile/local search, adjusting the search engines to really take advantage of smartphone technology makes a lot of sense. The advice offered by Google is fairly standard for local search: make sure your business's NAP (Name, Address, Phone number) is displayed on every page of your website, and ensure that this information is consistent around the web (i.e. the same NAP is used in business directories).
The seven-pack is the name of the area reserved for local company listings on the Google search engine results page (SERP). All of the major search engines (Google, Bing and Yahoo) have an area set aside for local search results; as the Google area frequently contains seven company listings, it is known as the seven-pack.
Citations are an essential element when developing a profile for local search. Rutledge suggests that "the best way to target your area is to make sure your business is listed on every local web directory available" (Rutledge 2014); these listings are classified as citations. Any website which makes reference to your company is a citation. Citations have been identified as a key component of the ranking algorithms for both Google and Bing (moz.com), who go on to suggest that citations also validate that a business is part of the community; this is especially true in less competitive niches, where competitors are less likely to have company websites.
Building an effective citation profile can be quite a time consuming process, so a number of companies have developed tools to speed it up and to check for errors such as inconsistencies in your citation profile. Unfortunately, not all of these tools cater for the UK market.
Consistency is a vital part of the effort to obtain a seven-pack listing for your website. By this I mean that the same NAP (company name, address, phone number) needs to be used across the various websites which form your local search profile; Google and the other search engines will penalise inconsistencies in this data, and it will affect your chances of attaining a seven-pack listing. For these reasons I find it useful to draw up a "standard profile" before my profile building efforts get underway. This is usually a Word document containing company name, address, phone number, website URL, email address and company description, so I can simply copy and paste the information required when creating company listings and accounts on websites such as Google Plus, Yelp et al. It makes life a lot easier in the long run if profiles are created with consistent data rather than trying to adjust them later.
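The consistency check described above can be sketched in a few lines of code: compare the NAP data submitted to each directory against a single standard profile. The directory names and business details below are placeholders for illustration, not real listings:

```python
# The single "standard profile" all listings should match.
STANDARD_PROFILE = {
    "name": "Carl Potts Designs",
    "address": "1 Example Street, Leeds",
    "phone": "0113 496 0000",
}

# NAP data as it currently appears on each directory (hypothetical values).
listings = {
    "Google Plus": {"name": "Carl Potts Designs",
                    "address": "1 Example Street, Leeds",
                    "phone": "0113 496 0000"},
    "Yelp":        {"name": "Carl Potts Design",  # note the missing 's'
                    "address": "1 Example Street, Leeds",
                    "phone": "0113 496 0000"},
}

def find_inconsistencies(standard, listings):
    """Return (directory, field, value) triples that differ from the standard."""
    return [(site, field, data[field])
            for site, data in listings.items()
            for field in standard
            if data[field] != standard[field]]

for site, field, value in find_inconsistencies(STANDARD_PROFILE, listings):
    print(f"{site}: {field} is '{value}', expected '{STANDARD_PROFILE[field]}'")
```

The commercial citation tools mentioned above do essentially this at scale, crawling the directories rather than relying on data you keep by hand.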
On-page optimisation also has a role to play in local search, with rich snippets being especially important. This involves using microdata based on the schema.org vocabulary, which puts your company data into a format readily understood by the search engines when they crawl your website.
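The post mentions microdata, but the same schema.org vocabulary can also be expressed as JSON-LD, which is easier to show as a standalone sketch. The property names below follow schema.org's LocalBusiness type; the business details are placeholder values:

```python
import json

# Hypothetical NAP details expressed with the schema.org LocalBusiness vocabulary.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Carl Potts Designs",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Leeds",
    },
    "telephone": "0113 496 0000",
    "url": "https://www.example.com",
}

# This JSON would be embedded in the page inside a
# <script type="application/ld+json"> ... </script> element
# so crawlers can pick it up.
print(json.dumps(local_business, indent=2))
```

Whichever format is used, the structured data should carry exactly the same NAP as the rest of your citation profile.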
This week's blog post will examine relevance, one of the most important factors in determining search engine rankings; the other important concept, popularity, was covered in a previous post. Wikipedia defines relevance thus:
In information science and information retrieval, relevance denotes how well a retrieved document or set of documents meets the information need of the user(Wikipedia)
Dover and Dufforn point to the importance of these factors:
'Popularity and Relevance are two concepts that make up the bulk of Search Engine Optimisation theory' (Dover and Dufforn 2011)
Other important factors include:
• Keyword use in page title tags
• Anchor text of inbound links
• Keyword use in page headings
• Keyword use in body text
The content of a web page has a huge impact on its relevance score, which is perfectly logical when you think about it. The search engines perform a detailed analysis of a web page's content and build a map (known as the semantic map) of the page's data. This map is then analysed so that the search engine can "understand" exactly what the web page is discussing and can retrieve relevant results in response to the search query (Enge et al 2010).
The search engine then needs to sort through huge numbers of results which match the search query, so relevance is somewhat query dependent, relying heavily on matching query terms to important on-page elements (discussed earlier). Using a combination of factors, the search engine returns the most relevant and authoritative results to the end user; this is why on-page SEO factors are so important when optimising a website.
The search engine calculates statistics such as tf-idf (term frequency times inverse document frequency) to determine how important a word is in a web page; this is somewhat more complex than the keyword density measure used for this purpose in the past. Keyword density refers to how often a keyword phrase is repeated within a web page's content. In the past this was a ranking factor, but it no longer is, and it is regarded as bad practice to repeat keywords too often within a page's content in the mistaken belief that it will boost the page's search ranking; it is actually counterproductive and will alienate website visitors without any perceivable effect on the page's search engine results. A keyword density of around x% is regarded as perfectly adequate.
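Tf-idf itself is straightforward to compute. As an illustration, here is the textbook form of the calculation; the three-document corpus is my own toy example and is in no way Google's actual implementation:

```python
import math

def tf_idf(term, doc, corpus):
    """Score how important `term` is to `doc` relative to `corpus`.

    tf  = term frequency: how often the term appears in the document,
          normalised by document length.
    idf = inverse document frequency: log of (number of documents /
          number of documents containing the term); rare terms score higher.
    """
    words = doc.lower().split()
    tf = words.count(term) / len(words)
    docs_with_term = sum(1 for d in corpus if term in d.lower().split())
    idf = math.log(len(corpus) / docs_with_term)
    return tf * idf

corpus = [
    "plumber services in leeds",
    "emergency plumber available now",
    "web design services in leeds",
]
# "plumber" appears in 2 of the 3 documents, "emergency" in only 1,
# so "emergency" carries more weight in the second document.
print(tf_idf("emergency", corpus[1], corpus))
print(tf_idf("plumber", corpus[1], corpus))
```

Note that simply repeating a keyword raises tf but leaves idf untouched, and a term that appears in every document gets an idf of zero; this is one reason tf-idf is more robust than raw keyword density.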
As stated previously, authority and relevance are the two most significant factors in determining website rankings, and the search engine optimisation process needs to take this into account when developing and promoting your website; carefully researched keywords should be targeted to ensure your website provides relevant answers to the search queries people are using.
This week I will be returning to the theme of good quality content. Depending on the author you are reading, this is known by a variety of names: McGovern refers to "killer content", whereas I prefer the "valuable content" label used by Sonja Jefferson; same concept, different label. Both McGovern and Jefferson point to the decline in traditional broadcast marketing, suggesting that the way a business can successfully utilise the web is to produce good quality content which answers the questions of potential customers, building trust and rapport with these leads; basically, what is generally known as content marketing. Jefferson suggests that
'Good marketing has always been valuable but the difference today is that buyers no longer tolerate or respond to marketing which is less than good' (Jefferson 2012)
The perceived lack of quality of much online content is a concern many authors share, and one I certainly agree with: a lot of online content is rather poor in quality, with only a few golden nuggets available. McGovern complains about the prevalence of filler over killer content. I'll use the rest of this post to compare and contrast two definitions of "good quality" content, then develop my own definition from them, which will be used as the quality benchmark for the rest of the content added to this site.
Redish (2012) highlights the usefulness of a website's content and its role in converting visitors into customers; she goes on to suggest that content needs to be:
• easy to find
• easy to understand
• up to date
• credible (Redish 2012)
Redish places a lot of emphasis on the usefulness of content
‘People come for information that answers their question and helps them to complete their task’ (Redish 2012)
This tallies completely with Jay Baer’s Youtility concept and Jefferson’s recommendation that valuable content needs to be useful.
This week we've looked at a number of viewpoints on what constitutes good content. This is important because a number of prominent authors and thought leaders bemoan the variable quality of online content, a concern I share: a lot of the online content I encounter is rather mediocre, and excellent content stands out and will differentiate you from the competition. Now I'll attempt to develop my own definition of excellent quality content which builds upon the previous definitions. In my opinion, excellent quality content needs to tick all these boxes:
• Useful: either informative or entertaining
• Easily found
• High quality
• Mapped to the sales funnel
I personally find that most online content is beginner level only; there is a lot less intermediate and advanced level content available. This is a problem which extends to printed books as well as online content; perhaps the beginner market is more lucrative? There is a real dearth of intermediate or advanced level content freely available online, and I don't feel this is through lack of looking on my part. Carl Potts Designs offers content marketing services designed to:
1. Boost Search Engine Performance
2. Improve Brand awareness and trust
3. Improve client retention/satisfaction
If this interests you, contact me today either by phone or using the contact form.
This week I'll be examining authority, one of the two main criteria (along with relevance) which determine how a website performs on search engine results pages; if you wish to be at the top of Google, it is well worth taking the time to understand this concept.
Copyblogger identify four core components which determine the authority of a website:
1. Sites which solve the problems or answer the questions of potential leads; this is the Youtility which Jay Baer refers to in his book of the same name
2. Sites which attract links from other authoritative publishers
3. Sites which attract a lot of attention where the audience helps to share your content
4. Sites with a confident, ethical sales process that converts attention into business. (Source: Copyblogger)
Hopefully this list makes it pretty obvious why SEOs constantly harp on about content (I know I do!). This post is focused more on search authority than on content, although the two factors are definitely interrelated: a site with little or no content will fail on nearly all of the four points, as a site needs good quality content in order to satisfy them.
Note that I highlighted the words authoritative publishers in point 2. This means that links from dubious sites carry little weight when Google determines SERP rankings; Google does not approve of the purchasing of links and will penalise sites it suspects of engaging in the practice, and reciprocal linking schemes (if you link to my site, I'll link to yours) also carry a lot less weight these days. The best way to generate quality backlinks is to develop good quality website content which is worth linking to, a point made by link building expert Eric Ward (aka Link Moses), who argues that
'The less useful your content, the less likely you are to ever receive a link to it' (Ward 2013)
This accentuates the link (no pun intended) between website content and good search engine performance.
Point 3 highlights the link between content and social media. Google and the other search engines take account of how much attention sites get on the social networks; a site with thousands of links from social media is perceived as having more authority by the search engines and will perform well on the SERPs:
'The larger your social footprint, the more impact social media will have on your SEO efforts.' (Search Engine Land 2011)
Experience suggests that it is relatively easy to publicise a site full of interesting, entertaining or informative content on social media (a fact the likes of buzzfeed.com and quickmeme.com utilise); conversely, a site with little or no content will receive little or no attention on the various social networks.
Although this article is ostensibly about website authority, the concept is completely intertwined with website content and backlinks. In a nutshell, a website with plenty of good content will develop authority naturally, leading to better SERP rankings and website traffic, which is something most small business owners want. Carl Potts Designs offers search engine optimisation services, social media support and content creation designed specifically to increase your website's traffic.
Meta simply means "data about data", and there are a number of special meta tags for describing the contents of a web page. The information contained in the meta tags is not displayed on the page; it is there to tell browsers (or other web services) specific information about the page. Simply, it "explains" the page so a browser can understand it (Search Engine Watch). Meta tags are one of the biggest sources of misconceptions I encounter when developing sites for clients: the meta keywords tag is no longer supported by the major search engines (Google, Yahoo and Bing), so it will have negligible impact on the relevancy or search rankings of your website, although Bing does use the meta keywords tag to detect spam. For these reasons I no longer bother to complete the meta keywords tag; it also provides an easy way for your competitors to identify the keywords you are targeting.
The meta keywords tag is a relic from the days when the search engines were not smart enough to decipher and identify the contents of a web page and needed a helping hand; unfortunately, it was widely abused by webmasters hoping to drive more website traffic, and for this reason the search engines eventually stopped supporting it.
This is not to say that all meta tags are obsolete; the meta description is still an important part of a website's on-page SEO, although its use is more nuanced these days. Carefully crafted meta descriptions can have an impact on the click-through rate your website achieves on SERPs (search engine results pages), so it is worth taking time to ensure they are well written and focused on the keyword phrases you are targeting.
Google's Matt Cutts (searchengineland.com) suggests that it is better to let Google create snippets for your web pages than to have duplicate page descriptions spread throughout your website; the best solution is to carefully create unique meta descriptions for the most important pages on your site. So a rule of thumb when creating a new page: if you can create a unique meta description for it, that is the best solution, but don't just copy one meta description tag and use it over and over; in that case it is better to leave it blank. (Search Engine Watch 2013)
The reason the SERP description does not always match your carefully crafted meta description is that Google has auto-generated a snippet to describe your page contents. Your control over this is limited, although steps can be taken to minimise the risk of snippets being generated:
1. Focus your meta description
2. Remove Duplicate METAs
3. Block Your ODP Listing
4. Block your snippet (use with caution)
5. Leave the meta description alone (moz.com)
Step 1 means ensuring the meta description contains the keyword phrases you are targeting; these will be displayed in bold text on the SERPs, hopefully further boosting your click-through rate (CTR).
Meta tags are not the magical solution to your SERP woes, but it is worth taking the time and care to ensure that certain meta tags are carefully completed.