Understanding SEO + SMO (Part 2): Specifics
This is the second of two posts that aim to provide a fundamental understanding of SEO (Search Engine Optimization) and SMO (Social Media Optimization). If you haven’t already, I’d recommend reading Part 1: Background first. Now for the specifics of how to do well on search and social media:
Content is king. Both search engines and social media reward relevant, original, and meaty content. In the past you could get away with shallow articles and well-placed keywords, but now, algorithms and social media users are smarter. If you’re doing shoddy work, it will show. Duplicate content is easily found and will be penalized in results, so make sure any published content is unique.
It’s important to publish content frequently as well. Search gives fresh content an initial boost in rankings, especially when any new information on a subject is uncovered. In social media, consistent publishing will keep conversations going and establish you or your brand as an active voice in the community.
Pique the interest of your audience with the content you create (and with your content’s title). By incorporating your audience’s vernacular — the words and phrases that are used frequently in a specific group or industry — your content will be more easily understood and shared. Imagine what sorts of questions users will ask to get to your page, and with Google’s Hummingbird update, keep mobile voice search in mind. If you’re having trouble gauging user interest in specific topics, keyword and topic research tools can help.
Since users will likely read a short description of your content first, include key phrases in the first 100 characters of post text. If the value of content can be communicated in the first few sentences, you’re on the right track.
Providing content that is valuable to your audience is the ultimate goal and will keep them coming back. It’s the main reason that content marketing is so popular and successful when executed correctly. Ultimately, you can have the most optimized and well-designed page, but good content is not something anyone can “build” for you.
In one of Viget’s recent posts, developer Chris Jones provided a detailed walkthrough of Docker OS X issues and the workarounds to deal with them. This post struck a chord with the developer community — we found a high number of users sharing and searching for this type of content after it was published. An SEO and SMO win!
Appropriate On-Page Tags
At their core, search engines and social media want to help you, but you need to speak their language. On-page tags help them help you by communicating the most important information about a page.
Meta tags are important — but only some of them. At the bare minimum, your page should have a title tag and a meta description. These give you control over what shows up on search engine results pages (SERPs) and an opportunity to summarize the information offered. When implementing title tags, use a unique and relevant title for each page. Search engines usually show up to 50 characters of a title. The most important words and phrases should be included in the meta description, of which search engines generally show up to 150 characters. Make it concise. If a site includes many pages, these tags can be automatically generated by a content management system (CMS) or plugin based on the content of a page.
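As a quick sketch, a minimal head section with these two tags might look like the following (the title and description text here are invented for illustration):

```html
<head>
  <!-- Shown as the clickable headline in SERPs; put the key phrase early -->
  <title>Responsive Design: An Overview | Viget</title>
  <!-- Shown as the snippet below the headline; aim for roughly 150 characters -->
  <meta name="description" content="A practical overview of responsive design: what it is, why it matters, and how to get started.">
</head>
```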
No one uses the meta keywords tag anymore. Sure, in the early days of the internet it communicated additional information about pages, but after spammers tried to game the system, it became useless. See? This is why we can’t have nice things. In fact, Google has publicly stated that they haven’t used the keywords tag since 2009(!), and Bing treats it as a spam signal. ‘Nuff said.
In recent years, social media meta tags have become increasingly relevant. They determine what content is shown when something is shared, such as images, title, and author. The most-used social meta tag protocol is certainly Open Graph, the standard structure for anything shared on Facebook. It’s also used on LinkedIn and Pinterest, while reserved as a fallback for Twitter and Google+. The Open Graph protocol has a number of different tags available, and here are just a few:
Open Graph tags for Facebook, LinkedIn, and Pinterest
- og:title - The title of the content, such as “My First Post”
- og:type - The Open Graph type that best describes the content
- og:image - The location of an image that should be shared with the content
- og:url - The canonical URL of the content
- og:description - A short description of the content
- og:site_name - The top-level name of the website
- og:video - The location of a video that should be shared with the content
Facebook also allows you to include an admin ID:
- fb:admins - The Facebook ID of a site admin; establishes ownership and grants access to Facebook functionality directly on the site
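Put together, a hypothetical article page might carry Open Graph tags like these (all URLs, titles, and IDs below are placeholders):

```html
<!-- Open Graph tags live in the page head; note the property attribute -->
<meta property="og:title" content="My First Post">
<meta property="og:type" content="article">
<meta property="og:image" content="http://example.com/images/first-post.jpg">
<meta property="og:url" content="http://example.com/posts/my-first-post">
<meta property="og:description" content="A short summary shown when the post is shared.">
<meta property="og:site_name" content="Example Blog">
<meta property="fb:admins" content="1234567890">
```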
Twitter Cards are meant to add more media content to tweets. Using Twitter Cards helps a tweet stand out in a feed and allows users to view content like photos or video without having to navigate away from their browsing experience. Some cards also allow the collection of email addresses (lead generation) or provide product information. Here are a few tags used to create Twitter Cards:
- twitter:card - The type of Twitter Card used (e.g. Summary Card, Photo Card, App Card)
- twitter:site - The Twitter username of the website sharing content
- twitter:title - The title of the content
- twitter:description - A short description of the content
- twitter:creator - The Twitter username of the content creator (may be the same as or different from twitter:site)
- twitter:image - The location of an image that should be shared with the content
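A basic Summary Card could be sketched like this (the usernames and paths are placeholders; note that Twitter Card tags use the name attribute rather than property):

```html
<meta name="twitter:card" content="summary">
<meta name="twitter:site" content="@example">
<meta name="twitter:creator" content="@authorhandle">
<meta name="twitter:title" content="My First Post">
<meta name="twitter:description" content="A short summary shown in the tweet.">
<meta name="twitter:image" content="http://example.com/images/first-post.jpg">
```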
Google+ and Google’s search engine use Schema.org as the primary protocol for on-page tags. Tags from Schema.org were originally designed as a single language to communicate between search engines and websites but are now also being used for social media platforms. These attributes can be applied to any HTML element anywhere in the markup of a page. Schema.org is extensive, but a few common tags are:
- itemtype="http://schema.org/" - The type of content referenced
- itemprop="description" - A short description of the content
- itemprop="url" - The URL of a specific page element
- itemprop="image" - The specific image that should be shared with the content
Google loves to tie content back to verified sources. Eric Schmidt, executive chairman of Google said, “Information tied to verified online profiles will be ranked higher than content without such verification.” Google Authorship tags were initially introduced to work in tandem with Google+ profiles, and although Google has dropped authorship markup from search results, it’s highly doubtful that they’re ignoring authorship completely.
A couple of final notes on social meta tags:
- Every social platform has different standards for image sizing. Typically, it's easier to keep it simple and choose one image size that will work for all services.
- If you can choose only one type of social meta tag to include, your best bet is the basic set of Open Graph tags, since they work on almost every major social media platform.
To emphasize important text, use header tags (e.g. h1, h2, h3) to prioritize importance. For example, wrap the most important header in an h1 tag, followed by slightly less important information in an h2 tag, and so on. Despite what you may hear from some guides, you can actually have more than one h1 tag; just don’t go overboard.
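Using this post’s own headings as an example, the hierarchy might look like:

```html
<h1>Understanding SEO + SMO</h1>   <!-- most important: the page topic -->
<h2>Appropriate On-Page Tags</h2>  <!-- a major section -->
<h3>Meta Tags</h3>                 <!-- a subsection within that section -->
```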
Images are often overlooked as part of an SEO strategy, but using “alt” attributes and intuitive file names for images helps communicate the subject matter of a piece of content. There are no hard guidelines, but as a quick rule of thumb, alt attributes should be accurate, succinct, unique, and should not include “image of” or “graphic of.” They’re also a necessary addition for members of your audience who may be visually impaired. Interestingly enough, Google’s crawlers basically see a website the same way a visually impaired person does.
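For instance, a descriptive file name and alt text together might look like this (the file path is a placeholder):

```html
<!-- Accurate, succinct alt text; no "image of" prefix -->
<img src="/images/golden-gate-bridge-at-sunset.jpg"
     alt="Golden Gate Bridge at sunset">
```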
You may encounter spammers who target your content in the hopes of gaining more links to their own sites. Often, this manifests itself in blog comments or message boards. Nofollow tags are used to combat this black hat behavior by removing the effect of tagged links to search engines. Essentially, nofollow tags communicate that a specific link should not be followed by search engines, negating any effect on search engine rankings.
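In practice, a nofollow tag is just a rel attribute on the link (the URL here is a placeholder), which is why many blog platforms apply it to comment links automatically:

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit -->
<a href="http://example.com/untrusted-comment-link" rel="nofollow">user-submitted link</a>
```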
On the other hand, links that should be followed by a search engine should have appropriate anchor text. Anchor text is the text displayed when linking to another page. Search engines use anchor text as one way to establish whether the subject of a piece of content is deemed important by others in the community. However, search engines also look at the spread of anchor text leading to a page, and if too many terms are exact or come from websites without relevant content (showing signs of someone trying to “game” the system), it can harm rankings.
There may be times when a site must display the same content on multiple pages. For example, if one product is available in different colors and each color has its own page, the content on each page will be duplicated. Because duplicate content is penalized by search engines, indicate the preferred URL with a canonical tag. If a page has simply moved to a new URL, it can easily retain its link equity with a 301 redirect. This is also the type of redirect used by most link-shortening services like bit.ly and goo.gl, which makes them SEO-friendly.
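In the product example above, each color variant page would carry the same tag pointing at the preferred URL (the path below is a placeholder):

```html
<!-- Placed in the head of every duplicate/variant page -->
<link rel="canonical" href="http://example.com/products/widget">
```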
Robots, Sitemaps, and Tools
While not technically on-page tags, a site’s robots.txt file and XML sitemap are two other ways to communicate with search engines. As a search engine’s crawlers review a website, they check for these two files automatically.
The robots.txt file is a simple text file uploaded to a website’s root directory to guide site indexing. Its basic function is to identify which pages to index and which to ignore. Alternatively, you can use an on-page robots meta tag if need be (although the text file is recommended by all major search engines). You can also specify the location of a sitemap in the robots.txt file.
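A hypothetical robots.txt that allows everything except an admin area, and points crawlers at the sitemap, might read:

```text
User-agent: *
Disallow: /admin/
Sitemap: http://example.com/sitemap.xml
```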
An XML sitemap is a comprehensive list of website URLs that should be indexed by search engines. It also contains more detailed information about each page such as the last time it was modified and how important it is in relation to other pages. As a quick note, an HTML sitemap differs from an XML sitemap in that it’s made to be seen by site visitors and is used primarily for user navigation.
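A minimal XML sitemap entry could be sketched like this (the URL, date, and priority value are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/posts/my-first-post</loc>
    <lastmod>2014-09-01</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```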
Search engines also offer webmaster tools that provide insight into how a site is crawled and indexed, allowing you to:
- Optimize Schema.org tags to communicate page content and organization
- Review how page listings will appear in results
- Check for any crawling errors
- Scan for security issues
- Submit sitemaps to search engines
Intuitive Site IA (Information Architecture)
From the beginning of any project, aim to create an optimal experience for all audiences using intuitive IA. This means providing clear organization and navigation to establish an intuitive flow and structure for human visitors. Notice the focus on human visitors here. Building a site foundation with people in mind will result in positive performance on both search engines and social media platforms.
As humans, we have an instinctive need to organize information and establish hierarchy. We need to relate and structure material in an intuitive way, especially material with large amounts of information. Having a well-organized and well-documented sitemap will ensure information is grouped intuitively and that important pages are indexed. Navigation should be built on an intuitive internal link structure that guides users along the easiest possible path. A common goal is to reduce the number of clicks required to get to a given page.
Another important way to organize information is by URL slug, the part of the URL after the domain name. It’s important that slugs are easy to understand, since they appear in search engine results and social media posts. Choose a clear URL like “viget.com/advance/responsive-design-an-overview” and avoid strings of random characters, so that a user can get a clear indication of the content just by glancing at the URL. Using a URL to communicate content information is an easy way for both search engine and social media users to decide whether or not a piece of content is worth reading (and ultimately, sharing).
Do your best to make sure you have no broken links! These give users a negative experience and increase the likelihood of exiting a website. Commonly, users encounter 404 error pages whenever reaching a non-existent page, either after typing in the wrong URL or clicking a dead link. Build a useful (and fun) 404 page to guide visitors to find the right information and surface relevant content.
Here’s a pro tip - choose a decent CMS (Content Management System) to do much of the legwork for you.
Capable Front End Design
Front end design is the first thing a visitor notices on a page and also plays a large part in SEO and SMO. Few things are as frustrating as navigating to a page and having to hunt for the promised information. Design to serve your audience’s needs, and convey information efficiently.
If you don’t already have one, create a design strategy for all devices. That means having a good user experience on all screen sizes: desktops, laptops, tablets, and smartphones. We’re big fans of responsive design and Google states that they prefer responsively designed sites over those with separate mobile experiences. Sites without mobile-friendly design may perform well in desktop searches, but Google’s Hummingbird update rightly docks cumbersome mobile experiences in mobile SERPs. If you do have a separate mobile site, make sure to alert search engines through their corresponding webmaster tools.
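For sites that do maintain a separate mobile experience, the on-page annotations Google recommends for separate URLs can be sketched as follows (example.com and m.example.com are placeholder domains):

```html
<!-- On the desktop page: point to the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page: point back to the desktop (canonical) version -->
<link rel="canonical" href="http://example.com/page">
```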
It’s key to surface content quickly to visitors and capture their attention. If you have or plan to have advertisements on your site, limit their intrusion. The same goes for affiliate links. Overstuff content with them, and you’ll be hit right in the search rankings (ow!).
Creating lean pages is important for search engines. When first visiting a page, a browser must request all necessary files from the site’s server; those files are then stored in the cache so only unique page elements must be loaded in the future. Google has stated that site speed plays a factor in search rankings, and even if it didn’t, visitors will have a poor experience if pages load slowly (another reason not to use code-heavy Flash). Plugins are often used to solve problems, especially on popular CMS platforms such as WordPress. Many plugins are helpful, but unfortunately, many add superfluous code to pages.
To encourage sharing of content across the web, it can be useful to include social sharing buttons. These buttons allow users to easily share content by hooking into social media APIs. However, there are a few things to think about before adding sharing buttons. Users tend to share content that’s already popular, so content with low (or no) shares may actually discourage some users from sharing. Buttons can also get in the way of user experience depending on size, and too many social sharing options can actually negatively impact social sharing and lead to user indecision.
Sturdy Back End Systems
The back end systems of a page provide the foundation of any experience, which plays a significant role in SEO and SMO. Users may not notice the back end when it’s functioning properly, but they’ll certainly realize when it’s not! There are a number of different SEO and SMO factors affected by a page’s back end: speed, reliability, IP address, security protocol, and content management.
Site speed has officially been a factor in Google’s algorithm for a number of years now. Knowing this, it’s imperative to build pages that load quickly. More important to speed than overall page size is the need for good hosting. Quality, trusted servers located near most of your audience are necessary to provide a smooth experience. For example, if most site users come from Scandinavia, it would be detrimental to have a server in New Zealand. Aside from SEO implications, a fast site will also be more engaging, making it more likely for users to consume and share content.
Of all speed factors, the metric with the highest correlation to search rankings is not overall page load time (which is still very important to user experience) but rather “time to first byte,” or the amount of time it takes for a browser to get a response from a server when requesting a specific page.
It’s important to know who your neighbors are, especially online. If using shared hosting, you’re likely to have a number of other sites on your same C-Block. A C-Block is essentially the collection of all IP addresses in which the first three octets are identical. For example, the IP addresses 10.10.10.3 and 10.10.10.8 are on the same C-Block. If the majority of websites on a certain C-Block are acting shady and you’re caught in the middle, your rankings can suffer. While the situation might not occur frequently, you can avoid this by dealing with reputable hosting services.
Recently, Google announced HTTPS as an SEO ranking factor. HTTPS is a secure protocol which helps prevent malicious activity like “man-in-the-middle attacks.” By protecting visitor security, websites can take advantage of better SEO. Switching to HTTPS can require more resources, and it may not be for everyone, but there are a number of other benefits that arise from using the protocol, such as access to additional analytics data.
Fortunately, many common SEO issues can be addressed with a structured back end built on a content management system (CMS). Although already touched on earlier in this post, some advantages include automated:
- URL structure and slugs that give a preview of content
- Meta tags to inform search engines and social media sites of key information
- Image alt tags to communicate image content and make a page more friendly to those with certain disabilities
- XML sitemaps that tell search engines which pages to index
- 404 error page handling, which occurs when a visitor clicks a broken link or enters an incorrect URL
Healthy Digital Presence
The final component of doing well on search and social is maintaining a healthy presence online. Communicating with audiences is now simpler than ever before, but it’s also easy to make very public mistakes. Succeeding digitally requires careful reputation management. It’s important to build online relationships with reputable, trustworthy sources and stay away from sketchballs.
With the advent of search engine algorithm factors like PageRank, links have played a huge role in the rankings of specific pages. The quantity and quality of inbound links is a major factor in the organic search ranking algorithms. However, this has also brought about the black hat SEO tactic of link building, where webmasters attempt to inflate search rankings by pointing links to a page from unscrupulous sources.
However, as discussed earlier, updates such as Google’s Penguin combat this tactic. Be aware that even a seemingly honest site may be guilty by association. Search engines have communicated on numerous occasions that “earning” links works much better than “building” them in the long term. That being said, some consistently buy links and swear by it. I’m not here to make a judgment call, but you should know where your links come from and how they reflect upon you. As the saying goes, “Show me your friends and I’ll show you who you are.”
A similar mindset should be kept when thinking about social media. Online reputation plays an integral part in the dissemination of information in channels like Facebook, Twitter, and LinkedIn. Social media platforms are working to keep out shady tactics by cracking down on the practice of click-baiting and updating sharing algorithms to reward those who truly engage their audience. Facebook’s news feed algorithm, often referred to as EdgeRank, takes into account many factors, including affinity, edge weight, and time decay. While some social media accounts actively buy followers, algorithms that track how much of an audience engages with a piece of content put these accounts at a disadvantage when fake audiences don’t engage with any content. Perhaps that’s one of the reasons Twitter recently announced a roadmap including key algorithm changes.
Engaging an audience is the reason social media is useful, so it’s important to actually contribute to the conversations going on. Far too many accounts and brands broadcast without engaging users or don’t publish relevant content. Both of these mistakes can be avoided with a four-step strategic cycle: gather data, analyze, implement change, repeat. You’ll likely find that certain media resonate better with an audience than others, that answering questions is eagerly welcomed, and that participating in conversations with users and groups will reap much more engagement than simply tweeting blindly into the great unknown. In short, be relevant and speak to your audience.
Of course, it’s helpful to actually have a social media presence in the first place. When thinking about which platforms to use, research which platforms are most appropriate for your audience (hint: Google likes Google+). Take time early on to create a solid strategy and focus on the core values and messages to communicate. Grow, but make sure you grow with the right audience. Ultimately, posts on services like Facebook gain momentum based on engagement, dictated by the news feed algorithm. If your fans/followers/lemmings aren’t interested in what you have to offer, a potentially great piece of information could easily fall flat.
Managing social media accounts can be cumbersome, but there are many social media management tools developed to make scheduling and posting much easier. When choosing how often to post to social media, it can be hard to determine the appropriate frequency. Fortunately, research has already uncovered general best practices. For example, posting to Twitter many times a day is generally welcomed, but that type of frequency may be overkill for other social media platforms like LinkedIn. That being said, every audience is unique, and analyzing the results of different strategies will help yield the most fruitful combination.
Social & Search are Converging
As the digital space continues to crowd, basic SEO and SMO knowledge are becoming table stakes. The two are now interconnected in many ways, and social media presence plays a huge role in search engine traffic (at the least, showing a strong correlation). If you haven’t already, check out our first post in the series for more information on the background of both SEO and SMO, Understanding SEO + SMO (Part 1): Background. For even more information, check out the list of resources below.