SEO: Search Engine Optimization
Last week, we found an interesting bit of research suggesting it is possible to project a couple’s chances at building a successful romance by looking at the Facebook friend networks of the two people in the relationship. The study by Lars Backstrom of Facebook and Jon Kleinberg of Cornell University concluded that the more diverse the two sets of friends are from one another, the better the chance of the relationship lasting beyond a two-month period.
This conclusion runs against the common misconception that the more friends two people have in common, the better their chances at romantic “success.” A two-month time frame might not qualify as a successful long-term relationship, but it is significant that the structure of the couple’s friend groups may signal their chances of going into the next change of seasons with the same romantic interest.
The methods behind this study are also significant. The researchers used billions of records and huge data sets to reach their conclusions. Looking at how those conclusions were reached provides some insight into search engine algorithms, particularly the latest “Hummingbird” update.
The recent Google algorithm update, commonly referred to as “Hummingbird,” arrived with little fanfare outside of SEO circles late last summer. Although the immediate effect of the change was minor, touching a fairly low percentage of existing queries, its future impact could be much more significant.
Good Quality Results
The one thing all search engines compete for is, of course, users. The real competitive question, then, is which search engine has the best quality results for each query. Google dominates search owing to its ability to understand and deliver the most relevant results for each user search. The more relevant the results, the more likely the user is to return to that search engine for subsequent searches.
How Do Search Engines Tell If They Got It Right?
This is where it gets interesting. The search engines know which links and results they present for each search request. They can also monitor user behavior once the reader leaves the search engine results page and opens a page the engine referred. If the reader opens a referred page and then hits the back button to try a different result, the search engines capture this behavior and judge that their referral may not have been what the reader actually wanted. If, on the other hand, the reader stays on the page, consumes the content and perhaps clicks deeper into the site for more information, the search engines judge the referral a success, as validated by the reader.
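The back-button behavior described above can be sketched as a simple dwell-time classification. This is only an illustration of the idea: the log format, function name and 30-second threshold are invented assumptions, not how any search engine actually measures satisfaction.

```python
# Illustrative sketch: classify a search referral as a "pogo-stick"
# (quick bounce back to the results page) or a satisfied visit.
# The 30-second threshold is an assumed cutoff for this example.

POGO_THRESHOLD_SECONDS = 30

def classify_referral(click_time, return_time=None):
    """Judge a referral from the click-out timestamp and an optional
    return-to-results timestamp (both in seconds).

    No return at all, or a long stay, suggests a satisfied reader;
    a quick return suggests the result missed the mark.
    """
    if return_time is None:
        return "satisfied"        # reader never came back to the results page
    dwell = return_time - click_time
    if dwell < POGO_THRESHOLD_SECONDS:
        return "pogo-stick"       # quick bounce: try a different result
    return "satisfied"

print(classify_referral(100, 110))  # bounced after 10 seconds -> pogo-stick
print(classify_referral(100, 200))  # stayed 100 seconds -> satisfied
print(classify_referral(100))       # never returned -> satisfied
```

Aggregated over billions of such events, even a crude signal like this becomes a meaningful measure of result quality.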
Like the Facebook relationship study, this kind of analysis requires understanding huge volumes of user history and data. Analyzing data sets at that scale, however, is more feasible than ever before thanks to powerful data processing systems and technologies.
In order to continue providing the highest quality search results, Google launched Hummingbird to provide better and more relevant results to questions and queries that users are now submitting. Google has seen growth in the types of search requests that use longer, more conversational questions. Understanding what the user really wants is key to delivering great results. Distinguishing between what the user asks for and what the user wants is a tricky game, but Google has determined that it is one worth winning.
Longer, more conversational queries are not the only types of searches the search engine giant is trying to handle better. A reader searching for “animal rescue information” might also find value in results for “dog rescue” or “cat rescue information.” Returning the dog- and cat-oriented rescue websites might actually be more relevant than a result for a company specializing in getting squirrels out of chimneys.
Using history and readership statistics to determine that a “cat” is actually an “animal” when used in the context of “animal rescue” seems to be a main thrust of the Hummingbird update. Understanding the actual meaning of the user query is key to delivering the best quality results.
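The “cat is an animal” idea can be illustrated with a toy query-expansion sketch. Real systems infer these word relationships from massive usage data; the tiny hand-built hypernym map and the `expand_query` helper here are invented purely for illustration.

```python
# Toy sketch: expand a query containing a general term ("animal") into
# variants using more specific terms, so that "dog rescue" pages can
# match an "animal rescue" query. The map below is a made-up stand-in
# for relationships a search engine would learn from usage data.

HYPERNYMS = {"dog": "animal", "cat": "animal", "squirrel": "animal"}

def expand_query(query):
    """Return the set of query variants after hypernym substitution."""
    words = query.split()
    variants = {query}
    for i, word in enumerate(words):
        for specific, general in HYPERNYMS.items():
            if word == general:
                variants.add(" ".join(words[:i] + [specific] + words[i + 1:]))
    return variants

print(sorted(expand_query("animal rescue")))
# ['animal rescue', 'cat rescue', 'dog rescue', 'squirrel rescue']
```

A page optimized for “dog rescue” can then be matched against any of the expanded variants, which is the kind of meaning-aware matching Hummingbird appears to aim for.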
As the Facebook relationship study shows, it is possible to gather huge amounts of information and build models that reasonably predict value or behavior, and thereby understand much more about what search engine users really want.
Web search guru Cyrus Shepard posted on Moz earlier this week with some great insight into how search engines evaluate and use ‘satisfaction’ in the algorithms that determine the order of search engine results. With all the attention given to external linking, social media activity and other interactions that can also influence SERP placement, Shepard really got back to basics. In discussing the fundamentals of user satisfaction, he raised the simple point that user behavior once on a Web page probably influences future search engine placement, perhaps very significantly.
Since all search engines endeavor to deliver exactly the information the user has requested, it only follows that they will evaluate the activity that takes place immediately following a search engine referral to see if they got it right. Does the user immediately hit the back button and try a different result? Does the reader stay on the page, consume the content and then venture onto other pages within that Website?
Google sees all! All this activity is tremendously important to understand. The ultimate Big Data collector, Google captures an incredible volume of search activity and interaction, from which it can build finely tuned measurement systems to evaluate how readers interact with a given Web page. The longer readers stay on a page or site after a search engine referral, the more confidence the search engine has that it ranked its results properly.
This brings up multiple areas to look at when trying to improve your Web page and its search engine placement. Considerations for improving ‘satisfaction,’ and thus SERP placement, include:
- Rapid page load speed
- Easy to understand navigation
- Easy access to content and page value
- Good quality content (of course!)
- Depth of content (to keep readers on the page)
This last point, depth of content, may indicate a need to redesign or restructure some of the more important pages on your site to provide more value and give readers a reason to stay engaged. Landing pages should of course give readers reasons to stay, but they should also be easy to use and avoid anything that inhibits interest in consuming the content (such as an annoying pop-up or request for more information).
With so much of the focus of SEO being on attracting natural links, this is a refreshing way to look at the SEO function. After all, what good is it to optimize a site and attract readers only to have them drop off?
In the end, building a Website that gives readers what they are searching for is just good business. And… it keeps your site in good position in the search engine results.
Penguin 2.0 and Image Search Results
One of the interesting observations following the roll-out of Penguin 2.0 was the number of clients whose images now appear as search results! This development isn’t necessarily the result of changes in the Google algorithm, but may be related to some of the things we (SEO consultants in Denver) did to prepare for it.
As rumors of a new Penguin update grew louder, we recommended to several clients that we again review their on page optimization and update any page assets to make sure we were clearly communicating to the search engines the unique content of each page. Included in this effort was attention given to image descriptions. This is a process that had occasionally been overlooked in previous reviews, but with the looming threat of a new algorithm update, we thought it best to give extra effort to all page assets.
Optimizing images for search is nothing new. Properly set and descriptive alt tags make the page more readable and searchable. However, prior to Penguin 2.0, we encouraged clients to give a little extra effort to making sure that image files and alt tag descriptions were consistent with the keyword and term strategy for each page. Renaming a few image files as well as resetting the alt tags for some of our client images didn’t take long and was something done almost as an afterthought to the entire Penguin preparation effort. In the end, there were some very positive results. Several clients now have their images appearing in non-branded searches.
Product Image Optimization
Organizations that have unique product images on their website can use those images to compete for more search engine traffic. In addition to the website appearing as an organic search result, images that match search terms may appear as separate results. Clicking an image opens a larger version on Google, beneath which is a link to the website the image was pulled from. In a sense, this is an additional opportunity to compete for reader attention and to show potential buyers any visual product advantages.
Inspiring the search engines to display images, then, seems to require a simple, two-fold strategy:
- strong on page optimization
- image optimization consistent with on page term strategy
On page optimization is essentially the same methodology of identifying keyword terms and writing descriptive title, description and even keyword tags around them. Making these consistent across all meta data types has proven important in building each page to attract traffic.
Image optimization requires the Web operator to make sure the file name from which the image is pulled is somewhat descriptive of the image displayed. File names such as “snowthrowerimages” instead of “images1” work well. Setting alt tag content to a more descriptive level is a simple matter of describing the product, preferably using a keyword term. An alt tag description such as “Red Snow Thrower” then describes the image further. When the search engines crawl the page, they will note that the images come from a file named “Snow Thrower Images” and that one of the pictures is of a “Red Snow Thrower.” If the page is optimized around the term “Snow Thrower,” these images are then more likely to be picked up and displayed as a search result.
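The two image checks described above (a descriptive file name and a descriptive alt attribute) are easy to audit automatically. This is a minimal sketch using only the Python standard library; the sample HTML, the list of “generic” names and the `ImageAudit` class are invented for illustration.

```python
# Minimal sketch: flag images whose file name is generic or whose
# alt attribute is missing. Sample page and name list are invented.
from html.parser import HTMLParser

GENERIC_NAMES = {"image", "images1", "img", "photo", "pic"}

class ImageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        # Strip the path and extension to get the bare file name.
        name = src.rsplit("/", 1)[-1].rsplit(".", 1)[0].lower()
        if name in GENERIC_NAMES:
            self.problems.append(f"generic file name: {src}")
        alt = (attrs.get("alt") or "").strip()
        if not alt:
            self.problems.append(f"missing alt text: {src}")

page = '''
<img src="/img/images1.jpg">
<img src="/img/red-snow-thrower.jpg" alt="Red Snow Thrower">
'''

audit = ImageAudit()
audit.feed(page)
for problem in audit.problems:
    print(problem)
# generic file name: /img/images1.jpg
# missing alt text: /img/images1.jpg
```

Running a check like this across a site makes it much harder for image optimization to be “occasionally overlooked” in a page review.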
In the end, thinking of image display as part of the meta data exercise may be beneficial. Whenever meta data is modified or updated, including the image file name and alt tags in the modification process enables the search engines to more clearly understand what each page contains. And when some of the expected search terms are used, the search engines may be more likely to grab an image from the website to use in their results display.
Much of this is not news to experienced SEOs, but we’ve seen a number of Webmasters forget to include image optimization in their page maintenance programs. And after the latest Penguin update, we are seeing more client product images appear on search results pages.
Service Businesses and Image Optimization
What does this mean for service-oriented businesses? Images can be developed for any line of work. Services that develop images or infographics describing improved performance or a better user experience may also have some opportunity for image result display. Google in particular, though, seems able to sniff out non-product images fairly easily, so image display for non-product Websites may be a bit more challenging. Optimizing these images may require talking more about results, achievements and relationships, and then describing them in a fashion consistent with the rest of the page’s meta data.
We are working on a couple of trials using descriptive service images, so we will keep everyone updated in this blog space.
Penguin 2.0 Ranking Observations
After working through some search results following the latest Google algorithm update, we were able to isolate some additional characteristics of websites that ranked better after last week’s Penguin 2.0 launch. In reviewing results from multiple pages and sites, and comparing rankings from before and after the 2.0 push, we have concluded that two areas seem to make the difference in performing well on the search engines. These are:
- Good Quality “On-Page” Optimization, and
- Natural Links
Good Quality On Page Optimization
Google in particular has really pushed the need for good quality content, but there seems now to be an extension into good quality optimization. As part of ‘Good Quality’ optimization, the pages we’ve noted as being more successful after Penguin 2.0 have been optimized around well-researched keyword terms. Additionally, and maybe more importantly, these sites have good quality internal linking strategies that utilize these keyword terms in naturally appearing language.
Sites that did well after the latest Penguin roll-out managed to fit their keyword strategies into the recommended character limits for title and description tags. We could not tell that the style of title tag made much difference: titles that included the keyword terms in a more natural sentence structure did just as well as those that were ‘pipe-separated.’
Sites that did well also placed the keyword terms toward the leading portion of the meta description. Although it is nothing new that more important keywords should be placed at the beginning of title and description tags, it was at least interesting to see that sites that structured their tags this way now seem to be rewarded.
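The two tag checks above (length limits and early keyword placement) can be sketched as a small audit helper. Note the 60- and 160-character limits are widely cited rules of thumb, not official Google figures, and the `check_tag` function and sample tags are invented for illustration.

```python
# Illustrative sketch: check a title or description tag against a
# character limit and verify the keyword appears in the leading half.
# Limits are common rules of thumb, not official figures.

TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def check_tag(text, keyword, limit):
    """Return a list of issues found in a title or description tag."""
    issues = []
    if len(text) > limit:
        issues.append(f"over {limit} characters ({len(text)})")
    pos = text.lower().find(keyword.lower())
    if pos == -1:
        issues.append("keyword missing")
    elif pos > len(text) // 2:
        issues.append("keyword appears late in the tag")
    return issues

# Keyword up front, within the limit: no issues.
print(check_tag("Snow Thrower Reviews and Buying Guide",
                "snow thrower", TITLE_LIMIT))
# Same keyword, but buried at the end of the tag.
print(check_tag("Buying guide for the best snow thrower",
                "snow thrower", TITLE_LIMIT))
```

A pass like this over every page’s meta data makes it easy to spot tags that run long or bury the keyword.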
We didn’t see any changes or ranking improvements for sites that provided a great deal of additional markup. One of our sites that did well had invested some time in providing additional markup content through things like abstract tags. However, the improvement in this site’s rankings may have more to do with the ‘spammy’ link profiles of its competitors, which may have caused them to drop in search engine placement. This will require a bit more research, but looking at the importance of additional markup may be worth future effort.
Natural Links
Google has been harping for some time on the ‘natural link’ methodology for building web page strength. Links inspired by good quality content are ideal, but other methods, such as guest blogging, are a popular, if time-consuming, way of building external links. However, we discovered that the client sites that placed well after Penguin 2.0 were ones that didn’t invest a great deal in guest blogging. Several did invest in press release efforts for event and milestone announcements, but guest blogging didn’t seem to be an important part of their link building strategy. Note: can we conclude, then, that guest blogging is not important? Well… no… but we just didn’t see any examples where a guest blogging strategy produced improved results over the past week.
Natural links still seem to be the main difference between sites that place well on the search engines and sites that struggle for attention. In the end, good quality content promoted through press releases and social media seems to inspire the link strength necessary to rank well. The slow-growth method of building link strength through quality content seems, at least at first glance, to pay off.
We will have more observations throughout the upcoming weeks on what Penguin 2.0 might mean for other types of link building and optimization.