Now, after reading the title, you may think, "What's new to read here? I see similar articles on different blogs at least every month." I can say without a doubt that you'll like this post, because it is based on unique research. Every SEO specialist checks sites with the help of some SEO service, and I work at one of the most popular all-in-one SEO platforms, Serpstat. Every year our team analyzes the site audit results of our users to find out which SEO errors are really the most common. In this article, I'll shed light on the results we got for the last year.

Serpstat research: Results we've got

During 2018, our users carried out 204K audits and checked 223M pages through Serpstat. Our team analyzed this data and compiled the statistics. You can see all the stats in the infographic below the text; here I just want to call out some of the facts.

After the research, we discovered that most sites had problems with meta tags, markup, and links. The most common errors concerned headlines, HTTPS certificates, and redirects. Issues with hreflang, multimedia, content, indexing, HTTP status codes, AMP (accelerated mobile pages), and loading time were the least likely. We also analyzed country-specific domains to get more precise information. Those stats show that 70% of ".com" domains have their most common problems with links, loading time, and indexing. The situation is the same for ".uk" and ".ca" domains.

The most common mistakes and how to fix them

1. Meta tags

Meta tags are quite important despite the fact that they aren't visible to website users. They tell search engines what a page is about, take part in snippet creation, and affect your website's ranking. Errors in them can spoil user signals. According to our research, you should first check the length of the title and description (a quick check is sketched at the end of this section).

2. Links, markup, and headings

External links (their number and quality) affect your site's position in the SERP, as search engines evaluate link profiles very carefully. You should also always keep internal link factors in mind (nofollow attributes and URL optimization). The Serpstat team also found that errors with markup and headings are quite common, despite the fact that both are very important for websites. Markup and headings contain attributes that mark up and structure the data on a page; they also help search engines and networks crawl and display the site correctly. The most common errors in this chapter are with:
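To make the title, description, and heading checks concrete, here is a minimal audit sketch in Python. It assumes the third-party requests and beautifulsoup4 libraries; the length thresholds are common rules of thumb rather than Serpstat's exact limits, and the URL is a placeholder.

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Commonly cited display limits -- not Serpstat's exact thresholds.
TITLE_MAX = 60
DESCRIPTION_MAX = 160

def audit_meta(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = (desc_tag.get("content") or "").strip() if desc_tag else ""
    h1_count = len(soup.find_all("h1"))

    if not title:
        print(f"{url}: missing <title>")
    elif len(title) > TITLE_MAX:
        print(f"{url}: title is {len(title)} chars (over {TITLE_MAX})")
    if not description:
        print(f"{url}: missing meta description")
    elif len(description) > DESCRIPTION_MAX:
        print(f"{url}: description is {len(description)} chars (over {DESCRIPTION_MAX})")
    if h1_count != 1:
        print(f"{url}: found {h1_count} <h1> tags (expected exactly 1)")

audit_meta("https://example.com/")  # placeholder URL
```

Run against a handful of key pages, a check like this catches the most frequent meta tag and heading errors before a full audit does.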
3. HTTPS certificate

This certificate is an important ranking factor, as it ensures a secure connection between the website and the browser. If your website handles personal information, don't forget to pay attention to it. The most common mistake here is an HTTPS website referring to HTTP pages.

4. Redirects, hreflang attribute, multimedia

Redirects send users from the requested URL to another one you choose. According to our statistics, errors with them are among the most common, so set them up carefully.

If you have a multilingual interface, it's necessary to apply the hreflang attribute to the same content in different languages (see the sketch after this section). That way, search engines can understand which version of your texts users prefer.

Multimedia elements don't affect SEO directly, but they can cause bad user signals and indexing errors, and images affect the website's loading time. That's why multimedia still matters. You can find more info about the errors in this section in the infographic.

5. Indexing

Search engines find out what sites are about while indexing them. If a site is closed to indexing, users can't find it in the SERP. Some weak spots of a site that often lead to errors are the following:
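To illustrate the hreflang point above, here is a small sketch that generates the reciprocal link tags for one page available in several languages. The domain and paths are hypothetical.

```python
# Hypothetical language-to-URL mapping for one piece of content.
variants = {
    "en": "https://example.com/en/pricing/",
    "de": "https://example.com/de/preise/",
    "fr": "https://example.com/fr/tarifs/",
}

# Every language version should carry the same full set of tags,
# including a self-reference and an x-default fallback.
tags = [
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in variants.items()
]
tags.append(f'<link rel="alternate" hreflang="x-default" href="{variants["en"]}" />')

print("\n".join(tags))
```

Leaving out the self-referencing entry, or making the annotations one-way (page A points to B, but B never points back), is a frequent cause of hreflang errors.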
6. HTTP status codes, AMP, and content

HTTP status codes are the answers a server delivers in response to a user's request. Errors with them are serious problems that negatively affect a site's position in the SERPs (a simple bulk check is sketched below). AMP (accelerated mobile pages) are pages optimized for mobile devices; you should use such technologies to improve the site's loading time. Poor content also causes ranking positions to deteriorate. The most common problems here are:
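On the status code side, a lightweight script can flag problem pages across a URL list before they hurt rankings. A minimal sketch with the requests library; the URLs are hypothetical.

```python
# pip install requests
import requests

urls = [
    "https://example.com/",
    "https://example.com/old-page/",
    "https://example.com/amp/article/",
]

for url in urls:
    try:
        # HEAD keeps the check lightweight; some servers only answer GET.
        response = requests.head(url, allow_redirects=False, timeout=10)
        if response.status_code != 200:
            print(f"{response.status_code} {url}")
    except requests.RequestException as exc:
        print(f"ERROR {url}: {exc}")
```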
7. Loading time

A long loading time can worsen the site's usability and waste crawl budget. The Serpstat team found that the most common problems here are associated with the use of the browser cache and with image, JavaScript, and CSS optimization (a rough spot check is sketched below). You can view the detailed infographic here.

How to correct these errors

To find all the above-mentioned errors on your own site, you can start a project in the Serpstat Audit tool. You can check the whole site or even just a separate page. The module checks 20 pages per second and finds more than 50 errors that can potentially harm your site. In its reports, Serpstat sorts errors by importance and category and lists the pages on which each problem was found. It also offers recommendations on how to resolve a specific problem. Some findings are not errors in the true sense ("Information"); they are only shown so you are aware of them.

Summary

There are a lot of errors that can damage your site and its rankings. Despite this fact, you can find them all at once with the help of audit tools. First, pay attention to the most common weaknesses:
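Since loading time is on that list, here is a rough spot check you can run from Python. It is a sketch only: requests measures the server response, not full browser rendering, so treat it as a first signal rather than a replacement for a proper audit. The URL is a placeholder.

```python
# pip install requests
import requests

url = "https://example.com/"  # placeholder URL
response = requests.get(url, timeout=30)

# elapsed covers the time from sending the request until the response
# headers arrive -- a server-speed signal, not a rendering metric.
print(f"Time to response: {response.elapsed.total_seconds():.2f}s")
print(f"Page size: {len(response.content) / 1024:.0f} KB")

# A missing Cache-Control header points to the browser-cache
# issues mentioned above.
cache = response.headers.get("Cache-Control")
print(f"Cache-Control: {cache or 'MISSING'}")
```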
Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.

The post Research: The most common SEO errors appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/17/the-most-common-seo-errors-research-infographics/

If you're looking for a way to optimize your site for technical SEO and rank better, consider deleting your pages. I know, crazy, right? But hear me out. We all know Google can be slow to index content, especially on new websites. But occasionally, it can aggressively index anything and everything it can get its robot hands on, whether you want it or not. This can cause terrible headaches, hours of cleanup, and ongoing maintenance, especially on large sites and/or ecommerce sites.

Our job as search engine optimization experts is to make sure Google and other search engines can first find our content so that they can then understand it, index it, and rank it appropriately. When we have an excess of indexed pages, we are not being clear about how we want search engines to treat our pages. As a result, they take whatever action they deem best, which sometimes translates to indexing more pages than needed. Before you know it, you're dealing with index bloat.

What is index bloat?

Put simply, index bloat is when you have too many low-quality pages on your site indexed in search engines. Similar to bloating in the human digestive system (disclaimer: I'm not a doctor), the result of processing this excess content can be seen in search engines' indices when their information retrieval process becomes less efficient. Index bloat can even make your life difficult without you knowing it. In this puffy and uncomfortable situation, Google has to go through much more content than necessary (most of the time low-quality and internally duplicated content) before it can get to the pages you want it to index.

Think of it this way: Google visits your XML sitemap to find 5,000 pages, then crawls all your pages and finds even more of them via internal linking, and ultimately decides to index 30,000 URLs. This comes out to an indexation excess of approximately 500% or even more (the arithmetic is sketched below). But don't worry, diagnosing your indexation rate to measure index bloat can be a very simple and straightforward check. You simply need to cross-reference the pages you want indexed against the ones Google is actually indexing (more on this later). The objective is to find that disparity and take the most appropriate action. We have two options:
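Before weighing those options, the arithmetic behind the example above is worth making explicit. A trivial sketch, using the hypothetical sitemap and index counts:

```python
# Numbers from the example above: 5,000 pages you want indexed
# versus 30,000 URLs actually indexed.
pages_wanted = 5_000    # e.g., your XML sitemap count
pages_indexed = 30_000  # e.g., from a site:domain.com search or GSC

excess = pages_indexed - pages_wanted
bloat_pct = excess / pages_wanted * 100

print(f"{excess} excess URLs ({bloat_pct:.0f}% over target)")
# -> 25000 excess URLs (500% over target)
```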
You will find that most of the time, fixing index bloat means removing a relatively large number of pages from the index by adding a "noindex" meta tag. However, through this indexation analysis, it is also possible to find pages that were missed during the creation of your XML sitemap(s); they can then be added to your sitemap(s) for better indexing.

Why index bloat is detrimental for SEO

Index bloat can slow processing time, consume more resources, and open up avenues outside of your control in which search engines can get stuck. One of the objectives of SEO is to remove roadblocks that hinder great content from ranking in search engines; these are very often technical in nature, for example slow load speeds, using noindex or nofollow meta tags where you shouldn't, or not having proper internal linking strategies in place.

Ideally, you would have a 100% indexation rate, meaning every quality page on your site would be indexed: no pollution, no unwanted material, no bloating. But for the sake of this analysis, let's consider anything above 100% as bloat. Index bloat forces search engines to spend more of their limited resources than needed processing the pages in their database. At best, index bloat causes inefficient crawling and indexing, hindering your ranking capability. At worst, it can lead to keyword cannibalization across many pages on your site, limiting your ability to rank in top positions and potentially hurting the user experience by sending searchers to low-quality pages. To summarize, index bloat causes the following issues:
Sources of index bloat

1. Internal duplicate content

Unintentional duplicate content is one of the most common sources of index bloat. Most sources of internal duplicate content revolve around technical errors that generate large numbers of URL combinations that end up indexed, for example using URL parameters to control the content on your site without proper canonicalization. Faceted navigation has also been one of the "thorniest SEO challenges" for large ecommerce sites, as Portent describes, and has the potential to generate billions of duplicate content pages by overlooking a simple feature.

2. Thin content

It's important to mention an issue introduced by version 7.0 of the Yoast SEO plugin around attachment pages. This WordPress plugin bug led to "Panda-like problems" in March of 2018, causing heavy ranking drops for affected sites as Google deemed them lower in the overall quality they provided to searchers. In summary, there is a setting within the Yoast plugin to remove attachment pages in WordPress, a page type created for each image in your library with minimal content, the epitome of thin content for most sites. For some users, updating to the newest version (7.0 at the time) caused the plugin to overwrite the previous selection and default to indexing all attachment pages. This meant that having five images per blog post would 5x the number of indexed pages, with roughly 16% actual quality content per URL, causing a massive drop in domain value.

3. Pagination

Pagination refers to splitting content into a series of pages to make it more accessible and improve the user experience. This means that if you have 30 blog posts on your site, you may have ten blog posts per page, going three pages deep. Like so:
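For example, with a typical WordPress-style URL structure (hypothetical paths):

yoursite.com/blog/ (posts 1-10)
yoursite.com/blog/page/2/ (posts 11-20)
yoursite.com/blog/page/3/ (posts 21-30)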
You’ll see this often on shopping pages, press releases, and news sites, among others. Within the purview of SEO, the pages beyond the first in the series will very often contain the same page title and meta description, along with very similar (near duplicate) body content, introducing keyword cannibalization to the mix. Additionally, the purpose of these pages is for a better browsing user experience for users already on your site, it doesn’t make sense to send search engine visitors to the third page of your blog. 4. Under-performing contentIf you have content on your site that is not generating traffic, has not resulted in any conversions, and does not have any backlinks, you may want to consider changing your strategy. Repurposing content is a great way to maximize any value that can be salvaged from under-performing pages to create stronger and more authoritative pages. Remember, as SEO experts our job is to help increase the overall quality and value that a domain provides, and improving content is one of the best ways to do so. For this, you will need a content audit to evaluate your own individual situation and what the best course of action would be. Even a 404 page that results in a 200 Live HTTP status code is a thin and low-quality page that should not be indexed. Common index bloat issuesOne of the first things I do when auditing a site is to pull up their XML sitemap. If they’re on a WordPress site using a plugin like Yoast SEO or All in One SEO, you can very quickly find page types that do not need to be indexed. Check for the following:
Whether the pages in your XML sitemap are low-quality and need to be removed from search really depends on the purpose they serve on your site. For instance, some sites do not use author pages on their blog but still leave those pages live, which is unnecessary. "Thank you" pages should not be indexed at all, as indexing them can cause conversion tracking anomalies. Test pages usually mean there's a duplicate somewhere else. Similarly, some plugins or developers build custom features into web builds and create lots of pages that do not need to be indexed; if you find an XML sitemap dedicated to that kind of auto-generated page, it probably doesn't need to be indexed.
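To run this kind of scan at scale, here is a minimal sketch that downloads a sitemap and flags URLs matching suspect patterns. The sitemap URL is a placeholder, and the patterns are examples to adapt to the page types above.

```python
# pip install requests
import requests
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Example patterns that often mark low-value pages; adjust per site.
SUSPECT_PATTERNS = ("/tag/", "/author/", "/attachment/", "?s=", "/thank-you")

sitemap_xml = requests.get("https://example.com/sitemap.xml", timeout=10).text
root = ET.fromstring(sitemap_xml)

urls = [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")]
print(f"{len(urls)} URLs in sitemap")

for url in urls:
    if any(pattern in url for pattern in SUSPECT_PATTERNS):
        print(f"Review: {url}")
```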
Different methods to diagnose index bloat

Remember that our objective here is to find the greatest contributors of low-quality pages bloating the index. Most of the time it's very easy to find these pages at scale, since a lot of thin content pages follow a pattern. This is a quantitative analysis of your content, looking for volume discrepancies between the number of pages you have, the number of pages you are linking to, and the number of pages Google is indexing. Any disparity between these numbers means there's room for technical optimization, which often results in an increase in organic rankings once solved. You want to make these sets of numbers as similar as possible. As you go through the various methods to diagnose index bloat below, look out for patterns in URLs by reviewing the following:
Next, I will walk you through a few simple steps you can take on your own using some of the most basic tools available for SEO. Here are the tools you will need:
As you start finding anomalies, add them to a spreadsheet so they can be manually reviewed for quality.

1. Screaming Frog crawl

Under Configuration > Spider > Basics, configure Screaming Frog to crawl your site thoroughly (check "crawl all subdomains" and "crawl outside of start folder", and manually add your XML sitemap(s) if you have them). Once the crawl has completed, take note of all the indexable pages it has listed. You can find this in the "Self-Referencing" report under the Canonicals tab. Take a look at the number you see. Are you surprised? Do you have more or fewer pages than you thought? Make a note of the number. We'll come back to this.

2. Google Search Console

Open up your Google Search Console (GSC) property and go to the Index > Coverage report. Take a look at the valid pages. In this report, Google is telling you how many total URLs it has found on your site. Review the other reports as well; GSC can be a great tool for evaluating what Googlebot finds when it visits your site. How many pages does Google say it's indexing? Make a note of the number.

3. Your XML sitemaps

This one is a simple check. Visit your XML sitemap and count the number of URLs included. Is the number off? Are there unnecessary pages? Are there not enough pages? Conduct a crawl with Screaming Frog, add your XML sitemap to the configuration, and run a crawl analysis. Once it's done, you can visit the Sitemaps tab to see which specific pages are included in your XML sitemap and which ones aren't. Make a note of the number of indexable pages.

4. Your own Content Management System (CMS)

This one is a simple check too; don't overthink it. How many pages does your site have? How many blog posts do you have? Add them up. We're looking for quality content that provides value, but for now in a quantitative fashion; the actual quality of a piece of content can be measured via a content audit. Make a note of the number you see.

5. Google

At last, we come to the final check in our series. Sometimes Google throws a number at you and you have no idea where it comes from, but try to be as objective as possible. Do a "site:domain.com" search on Google and check how many results Google serves from its index. Remember, this is purely a numeric value and does not truly determine the quality of your pages. Make a note of the number you see and compare it to the other numbers you found.

Any discrepancies you find indicate symptoms of inefficient indexation. Completing a simple quantitative analysis will help direct you to areas that may not meet minimum qualitative criteria. In other words, comparing numeric values from multiple sources will help you find pages on your site that provide low value. The quality criteria to evaluate against can be found in Google's Webmaster Guidelines.

How to resolve index bloat

Resolving index bloat is a slow and tedious process, but you have to trust the optimizations you're performing on the site and have patience, as the results may be slow to become noticeable.

1. Deleting pages (Ideal)

In an ideal scenario, low-quality pages would not exist on your site at all, and thus would not consume any of search engines' limited resources.
If you have a large number of outdated pages that you no longer use, cleaning them up (deleting them) can often lead to other benefits: fewer redirects and 404s, fewer thin-content pages, and less room for error and misinterpretation by search engines, to name a few. The less control you give search engines, by limiting their options on what action to take, the more control you will have over your site and your SEO. Of course, this isn't always realistic, so here are a few alternatives.

2. Using noindex (Alternative)

Using a noindex meta tag at the page level (please don't add a site-wide noindex; that happens more often than we'd like) or across a set of pages is probably the most efficient alternative, as it can be completed very quickly on most platforms.
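Once noindex tags are in place, it's worth verifying that each page actually serves the directive, either as a meta robots tag in the HTML or as an X-Robots-Tag HTTP header. A minimal verification sketch, assuming the requests and beautifulsoup4 libraries and a hypothetical URL:

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def has_noindex(url):
    response = requests.get(url, timeout=10)

    # noindex can arrive as an HTTP header...
    header = response.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        return True

    # ...or as a meta robots tag in the HTML head.
    soup = BeautifulSoup(response.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    return bool(robots and "noindex" in (robots.get("content") or "").lower())

print(has_noindex("https://example.com/thank-you/"))  # placeholder URL
```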
Low-value page types like those can be noindexed and removed from your XML sitemap(s) with a few clicks in WordPress if you use Yoast SEO or All in One SEO.

3. Using robots.txt (Alternative)

Using the robots.txt file to disallow sections or pages of your site is not recommended for most websites, unless it has been explicitly recommended by an SEO expert after auditing your website. It's incredibly important to look at the specific environment your site is in and how a disallow of certain pages would affect the indexation of the rest of the site; a careless change here may result in unintended consequences. Now that we've got that disclaimer out of the way: disallowing certain areas of your site means you're blocking search engines from even reading those pages. If you add a noindex and also disallow the page, Google won't ever read the noindex tag or follow your directive, because you've blocked it from access. The order of operations, in this case, is absolutely crucial for Google to follow your directives.

4. Using Google Search Console's manual removal tool (Temporary)

As a last resort, an action item that does not require developer resources is the manual removal tool within the old Google Search Console. Using this method to remove pages, whole subdirectories, and entire subdomains from Google Search is only temporary, but it can be done very quickly; all it takes is a few clicks. Just be careful what you're asking Google to deindex. A successful removal request lasts only about 90 days, but it can be revoked manually. This option can also be combined with a noindex meta tag to get URLs out of the index as soon as possible.

Conclusion

Search engines despise thin content and try very hard to filter out all the spam on the web, hence the never-ending search quality updates that happen almost daily. To appease search engines and show them all the amazing content we spent so much time creating, webmasters must make sure their technical SEO is buttoned up as early in the site's lifespan as possible, before index bloat becomes a nightmare. The different methods described above can help you diagnose any index bloat affecting your site so you can figure out which pages need to be deleted. Doing so will improve your site's overall quality evaluation in search engines, help you rank better, and produce a cleaner index, allowing Google to find the pages you're trying to rank quickly and efficiently.

Pablo Villalpando is a Bilingual SEO Strategist for Victorious. He can be found on Twitter @pablo_vi.

The post Delete your pages and rank higher in search – Index bloat and technical optimization 2019 appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/16/delete-your-pages-and-rank-higher-in-search-index-bloat-and-technical-optimization-2019/

Keyword research is one of the most important digital marketing tasks, and it lies at the foundation of any business strategy or campaign you are planning. Keyword research provides useful insight into organic ranking opportunities, persona building, competitive research, product development, you name it! Another reason I love keyword research is that it's a highly creative process. There is no such thing as "enough tools" when it comes to keyword research: each data source, and the way its data is presented, brings something new to the table. Sometimes when I feel stuck, all I need is to play with a new keyword intelligence tool.
With that in mind, I decided to create a roundup of free (and freemium) keyword research tools, i.e. tools you can run right now, without having to pay first. Some of these tools are freemium (meaning you can pay for an upgrade), but all of them are quite usable for free (which is what I recommend trying first before deciding whether you need to upgrade). Finally, I am not going to include obvious tools like Google Ads Keyword Planner and Google Search Console, as I am sure SEW readers are well aware of them. New tools inspire new tactics, which is what I hope you'll end up with.

1. Rank Tracker: Aggregated keyword suggestions from multiple sources (Freemium)

Rank Tracker's free version gives you access to its keyword research feature, which draws on around 20 different keyword research sources, including Google Ads Keyword Planner, Google Suggest, Wordtracker, SEMrush, and more. Rank Tracker is a downloadable tool, and you do have to provide your name and email to start downloading. Other than that, the installation takes seconds, and running it won't kill your browser. The free version includes a keyword analysis feature that helps you discover the most promising keywords to include in your content strategy. These metrics include:
You can export the whole list into an Excel file to explore further. The premium features include collaboration, cross-tool reporting, a task scheduler, multiple projects, and more. You can see the full list of features you'll get free access to here.

2. Answer The Public: Google Suggest driven questions and more (Freemium)

Answer The Public is a completely free keyword research tool that requires no registration. It uses Google Suggest data to discover questions, comparison-based queries, and keywords containing prepositions. Answer The Public lets you view the data in two ways: Visualization (i.e. a mind map) and Data. You can also export all the results to a CSV file or save any visualization as a PNG file. The recently launched premium version allows you to target keywords by location, compare data, and add team members for collaboration. You can see the version comparison here.

Tip: You can also upload your Answer The Public spreadsheet to this tool to add Google search volume to each question. This will help you focus on the questions that are most often searched in Google.

3. Text Optimizer: Related concepts and terms (Freemium)

Text Optimizer is a semantic analysis tool that helps you identify the related concepts behind each topic or query. It uses Google's search snippets to analyze the keyword context and come up with related concepts and entities that help Google understand and classify the topic. Text Optimizer is both a content optimization and research tool that can direct your whole content creation process. Don't be misled, though: it's not about stuffing your content with the suggested terms. Use the tool for deeper topic understanding and as a writing aid. The premium version lets you use geo-targeting, build whole sentences to help you in writing, and access your historical records. You can see it in action here.

4. Kparser: Clustered keyword suggestions (Freemium)

Kparser is a freemium tool that runs the whole keyword analysis for free, without requiring registration. You won't be able to export the keyword list unless you upgrade, but you can use the keyword filters on the left to group and cluster your list by a common modifier. Kparser combines multiple keyword sources, including Google Trends, eBay, Amazon, and YouTube. It's a somewhat basic approach to keyword clustering, but it's nice to have it completely for free, as it helps you discover more queries to optimize for. The premium features include unlimited searches, geo-targeting, and more. Read more about Kparser here.

Bonus: Analyze keyword performance (Free trial)

Finteza is a nice, affordable alternative to Google Analytics with a strong focus on conversion optimization and monetization. One of its most useful features is the search analysis section, which shows you which keywords brought the most clicks to your site. It's a great way to identify more queries to focus on. If you select any of the queries and keep browsing the site, you'll see data related to that keyword only, e.g. its conversion rate, the associated conversion funnel analysis, and user demographics. Finteza also recently added a retargeting feature allowing you to serve specific content based on the initial referral or engagement. You can read more on Finteza's traffic analytics here.

Which keyword research tools do you know that are usable free of charge? Please share yours in the comments!

The post Four cool keyword research tools you can use for free now appeared first on Search Engine Watch.
from https://searchenginewatch.com/2019/07/15/keyword-research-tools-free/

Featured snippets, also known as "position zero" placements on Google, have been receiving their fair share of glory and blame lately. While some big corporations like Forbes have gone ahead and questioned whether Google is stealing traffic with the featured snippet, content creators like me have found it an easy way to get more traffic, thanks to being able to rank small sites in a featured snippet. This post will give you a brief idea of how you can rank a page in Google's featured snippet, without building any links to that page.

Understand the types

There are three major types of featured snippets that you can go for. As most of our clients are bloggers, we tend to go for either paragraph snippets or list snippets. The table snippet is another popular type you can target. Here's a quick graph from Ahrefs showing the snippet types and their percentages.

Targeting the right keywords

Once you settle on the type of snippet you want to go for, it is time to dig deep into your keyword research to find keywords that suit your blog and match the requirements of that snippet type. If you are going for a paragraph snippet, you will have to find keywords that are primarily related to these types:
If you are trying to rank for a numeric list (numbered list or bullet points), the idea is to structure your content so that it offers a step-by-step guide. In our experience, Google only shows a numeric list in a featured snippet when the keyword tells Google that the searcher is looking for a list. For table snippets, the idea is to have structured schema data on your website that compares at least two sets of data on the page. You don't have to have a properly formatted column-based table to rank for table snippets, as long as the comparison and the schema are there.

Understanding the type and targeting the right keywords will do more than half of the job for you when it comes to ranking your website in the featured snippet with zero links. However, you are not going to win the battle by out-throwing an already existing featured snippet; this alone will only work for keywords that don't already have a featured snippet ranking on Google. To grab featured snippets from the existing competition, you will need to perform a few more steps.

Copying your competitor

Some will call it "being inspired", but essentially, what you are doing is copying the structure of an existing featured snippet article and trying to make it better (both with content and, if possible, with links). What do I mean by copying the structure of an existing page and making it better? If you want to win the featured snippet for the keyword "best cat food brands", and the page ranking at this moment already has a list of 20, you will have to create a list of 25, in the exact same format the current one is using. Once that's done, the final step is simply to make sure you have proper schema on the page. Note: It is very unlikely that this method will help you outrank an existing featured snippet unless you also rank in the top ten for that keyword.

How do we find keywords for featured snippets?

As you can imagine, finding the right keyword to target is half the battle when it comes to ranking in featured snippets. I use SEMrush, but feel free to use your own tools. Here's what our agency's process looks like. Let's assume, for the purpose of this article, that I run a pet blog and I am interested in ranking for multiple featured snippets. I would go to SEMrush and search for one of my competitors.

Source: SEMrush

Now click on "Organic Research", select positions, and from the advanced filters, select Include > Search features > Featured snippet.

Source: SEMrush

This will give you a huge list of keywords that are currently ranking as featured snippets. As you can see, we found about 231 opportunities to target here:

Source: SEMrush

It is time to add another condition to our advanced filters. Let's select Include > Word count > Greater than five. Here's what the new result looks like:

Source: SEMrush

From here on, simply sort the keywords by volume and then select the ones you think match your target market. As with any keyword research, you will want keywords that have low competition and moderate search volume. Personally, I try to go for keywords that have fewer than 500 monthly searches. Make sure that you are following the initial three steps that we discussed. You will almost always have a higher chance of ranking in a featured snippet following this strategy.

Khalid Farhan blogs about internet marketing at KhalidFarhan.com. He can be found on Twitter @iamkhalidfarhan.
The post How to grab featured snippet rankings with zero link building effort appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/12/how-to-get-featured-snippets-no-link-building/

Change is a natural part of business, particularly when it comes to your digital presence. The need to rebrand, switch up the CMS (content management system), consolidate your resources, or revamp the architecture and user journey of your website is ultimately inevitable. And whatever the goal may be, it is not uncommon for all major initiatives to fall under the umbrella of a contemporary digital marketer.

How does Google feel about changes

One thing to keep in mind, however, is Google's tendency to be less than accommodating towards major website changes, especially URL changes. And who can blame them? Whilst Google's algorithm may be able to detect semantic differences between websites, it's somewhat unrealistic to expect it to also realize that the similarities between store.hmv.com and hmv.com mean they're both the same brand. Without acknowledging this, many domain changes result in staggering losses of traffic and rankings, and suddenly the most well-known brand in an industry becomes non-existent within Google's universe. It is therefore imperative to ensure the changes you're making can be correctly comprehended by Google.

How to understand Google

Expecting a lonesome digital marketer to be a jack of all channels is quite unrealistic. But luckily, you don't need to be. There's a whole industry of people dedicating their days to figuring out how to think exactly like Google, and they can help you avoid the risk of decimating your hard-earned keyword rankings (unless you're using black hat tactics, in which case those rankings aren't very hard-earned after all). This industry is SEO.

Three pillars of SEO

Before we dive into the value of SEO, here's a quick summary of the three key pillars:
So with that crash course, we can now connect the dots between SEO expertise and high-level migration requirements.

Why you need SEO

Whilst a website's appearance is important, first and foremost it's crucial to understand how you're going to explain the changes you're making to Google. We suggest a handwritten note: "Dear Google, don't worry, some things are changing but we still love you, so here is a comprehensive, incredibly large map of URL redirects detailing the new versions of the exact same pages you know and ranked the first time around." On a more serious note, here are five ways in which the expertise of an SEO professional can propel your website towards a successful migration.

1. Taking the complexity out of URL mapping and redirects

Since a site's internal linking and page equity are an essential part of SEO, SEOs deal with redirect handling, URL mapping, and all the complications that come along with them all the time. You have to make sure each redirect makes sense, and also that each page is able to take on the new status. Common issues at this stage can include:
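Whatever the specific issues turn out to be, the redirect map itself can be sanity-checked programmatically before launch. Here is a minimal sketch using Python's requests library; the old-to-new URL pairs are hypothetical.

```python
# pip install requests
import requests

# Hypothetical slice of a migration redirect map: old URL -> new URL.
redirect_map = {
    "https://store.example.com/widgets": "https://example.com/shop/widgets/",
    "https://store.example.com/about": "https://example.com/about/",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=True, timeout=10)
    hops = response.history  # every redirect response along the way

    if not hops:
        print(f"NO REDIRECT: {old_url}")
    elif any(hop.status_code != 301 for hop in hops):
        print(f"NON-301 HOP: {old_url} -> {[hop.status_code for hop in hops]}")
    elif len(hops) > 1:
        print(f"CHAIN ({len(hops)} hops): {old_url}")
    elif response.url != expected:
        print(f"WRONG TARGET: {old_url} -> {response.url}")
```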
Just in case you’re not convinced, here’s a scary graph of what happens when you don’t do this properly. Source: Croud The process of telling Google what’s what extends beyond redirect mapping, it also includes on-page work. Specifically, the canonical tag. Fun fact: 301 redirects don’t actually stop Google from indexing your pages, so if you left it at that, you would just end up with some poor rankings and some confused users. Luckily, your friendly neighborhood SEO knows all about the various ways to help encourage Google to drop your old page out of the index as it goes along your new site. 2. Understanding your website’s behaviorSo, you’ve done all the mapping and have set up just how to introduce Google to your new site. While that’s very exciting, we do have to remember the “understanding” part of these first several weeks. The primary reason for site migration is to provide a new and improved site that will (hopefully) gain more traffic and drive more business. However, without understanding how your original site performed, it’s very difficult to establish if your new site is actually superior. This, therefore, highlights the importance of benchmarking. Of course, you may know how much traffic your ad campaigns – and even your website in general – are pulling in, but you’ll need to know more than that to be successful. As SEOs, our aim is to understand your site as much as the search engines do, which as explained above, is much more than just content on your pages. To paint the best picture of your website before you migrate, use several tools that provide a variety of key SEO data points:
By aggregating the different metrics and views of each tool, you can create a detailed portrait of how your website behaves and how it's interpreted by both search engines and users. Astute benchmarking will allow for in-depth, helpful post-migration analysis, particularly for metrics that can only be recorded at a particular moment: there's no way to tell how fast your pages loaded, or how many pages returned non-200 status codes, last week. If you don't gather this information beforehand, you won't be able to fully report the impact of the migration. After you complete the migration, you can gather this data again to truly judge your results. Everyone will remember to check the new traffic statistics, and even the new rankings, but only an SEO will remember to check that those numbers make sense and that you haven't accidentally orphaned half of your product pages. SEOs will make sure users aren't just on your site, but crawlers are too. With proper data at your disposal, you can set about making the iterative improvements that will undoubtedly be necessary.

3. Migrating your tracking tools

All this talk about performance and results is for naught if you can't actually track any of it. Much like Google's search engine, Google's tools aren't so keen on supporting your site migration either. You therefore have to make sure you're ready to start tracking the new site, ideally without losing your old data. Dealing with various tracking tools and codes all the time, an SEO has to be a Google Analytics expert too (it's commonly a requirement on most resumes). So how do you avoid a scenario in which you either have no historical data and can't measure success, or have two different accounts and have to do the performance comparisons by hand? By making plans to migrate your tracking tools. Ideally, you'll use the same analytics tracking code for the migrating site, so that the old metrics can be directly compared to the new numbers once the migration takes place. Need some more persuasion? Take a look at this graph detailing a successful site migration.

Source: Croud

4. Testing and the importance of the human touch

So you've planned all your new pages, and your new site is built. What's next? Hopefully, it's built in a staging environment and not actually live. If it's not, you run the risk of causing all sorts of issues with duplicate content and ranking cannibalization. However, your SEO can easily take charge of this with a robots.txt directive (which will haunt them until the site is live and they can change it). Despite its purpose, a staging environment doesn't always reflect a search engine's behavior, since it lives in isolation. There's no way to track backlinks or see exactly what a page will look like in a SERP at this stage. Often, Googlebot doesn't even fully crawl staging environments, because doing so is seen as time-wasting. Therefore, your SEO's brain is your very best test. Everyone will check that the pages are set up as planned, but your SEO will be the one who can thoroughly re-test each individual redirect at 2 am. This will likely be the last time any mistakes can be caught before launch, so it's critical to make sure that every redirect behaves as expected and that they are all 301 status codes. Lastly, you'll need to make sure that a single XML sitemap file stays live on the legacy site, containing all the legacy URLs. This will be used to push Googlebot through the old URLs and onto the new site, expediting your meticulously mapped redirects.
5. Launching and mitigating loss

Finally, you're ready to flip the switch, and the champagne bottles are out. So you turn on the new site, and congratulations: you've just lost 20% of your traffic. No, really, congratulations. In case you forgot the daunting chart we shared earlier in this post, website migrations can cause damaging losses, and sites that don't prepare accordingly often never recover. However, if you're smart and hired an SEO expert to take charge of this project, they'll have the task well in hand. Your traffic loss is a product of search engines and users not recognizing your new site, temporarily. Your SEO will have made sure everything is set up properly, so Googlebot will quickly figure out that your new site contains all the same high-ranking, trustworthy content as your old one. It's still a little miffed at you for changing on it, so you may only get back onto the second page of results at first. You'll still have some further optimization to do, but it's much easier to go from page two to page one than from page ten to page one. Just remember, we're guiding this migration from an SEO perspective. Googlebot is basically a person, so as long as it can read the site, we assume that users will enjoy their experience too.

Kailin Ambwani is a Digital Associate at global digital agency Croud, based in their New York office.

The post Why an SEO should lead your website migration appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/11/seo-lead-site-migration/

The search landscape has always been evolving. Right now the big discussion is around Google's control over how much search traffic goes to publishers versus stays on Google.com. This might end up being a really big point for Google when it comes to challenging the long-held view that competition is just one click away. Over the past ten years that I've been writing for this publication, I've written an article each year reviewing how paid and organic search can work together and how brands appear in search listings. Over the past few years, this article has evolved into a look at how Google has changed the search page. These changes include the addition of more paid search results, shopping, and local listings. As you will see from the data, there is certainly a trend, fueled both by Google's growth objectives as a publicly traded company and by consumer behavior shifts (namely, mobile and local).

The first piece is the overlap of paid and organic listings. What I'm tracking here is the number of times a brand appears in both paid and organic search results as a percentage of total paid results. For example, if there are three paid search ads, and GEICO, Progressive, and Liberty Mutual all appear in both the paid and organic listings, that would score 100%. I've been tracking five verticals since 2010. What's really interesting is that over the past few years the amount of overlap has gone up on average. However, this year the overlap dropped by 44% year over year. This had a lot to do with the drops across financial services, travel, and technology.

Source: Google Search Data

Factors driving this change

I think this trend is driven by two factors:
Source: Google Search Data

So what has been happening in the other areas of optimization, especially local and shopping? I have also been tracking these areas over the last three years, and the change is exactly what you would expect. Over the past three years, the percentage of search terms showing local listings has more than tripled, from 11% in 2017 to 38% in 2019. Retail continues to show the map pack on 100% of listings. This validates the importance of a local presence to Google, brands, and consumers alike. It also gives additional credence to optimizing and cleansing your location data, not just on Google, but across the web.

Source: Google Search Data

Shopping ads have come on in a big way

Shopping ads have also continued to have a strong presence and have grown slightly: they are up from 43% in 2017 to 47% in 2019. Shopping ads provide a more visual experience for the consumer and some very strong conversion rates for brands. Google has also continued to evolve its shopping product, announcing a redesigned shopping experience in May. This included new ad formats, online-to-in-store options, and Smart Campaigns (which help encourage SMBs to get into the game). All these changes and enhancements demonstrate a commitment to the product and its value to both consumers and brands.

What should you be doing as a search marketer?

So what is the impact of this data on us as search marketers? I think there are two key takeaways:
The search engine results page will continue to evolve as consumer behavior and technology evolve. Think about the continued expectations around online-to-offline buying behavior, real-time inventory, or the impact 5G will have on the marketplace. Remind yourself to take a look around at the macro level to see the trends, rather than always focusing on detailed keyword-level optimizations. You will often find some great trends to help put your strategy in context. P.S. Special thanks to Audrey Goodrick, who helped pull together this data. Thank you for your help this summer, Audrey.

The post Search engine results: The ten year evolution appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/10/evolution-of-search-engine-results/

There's always more content to write. Sometimes that can be encouraging, even exhilarating. You've got plenty of space for all your ideas, and countless opportunities to engage with potential customers and build stronger relationships with existing ones. But producing a constant stream of content can be exhausting. You'll find yourself running out of ideas and running out of steam. And at that point, it can be really difficult to keep creating high-quality content on a regular basis. Even if you're in a position to hire someone to help, you'll still need a fair amount of involvement in content production, supplying ideas and outlines at the very least. So how can you keep up with all the content you need to produce? Before we dig into some specific tips, let's take a look at how much you actually need to create.

How frequently should you post on your blog and your social media accounts?

There are no rules here; different blogs do different things, often within the same industry. In the content marketing world, for instance:
As a rough guideline, you'll probably want to aim for at least one weekly blog post, one daily Facebook and/or Instagram post, and three or more posts a day on fast-moving networks like Twitter. (According to Louise Myers, the "general consensus" is that anything from three to 30 tweets per day is fine.) So how do you keep up with this level of content, week after week?

How to create great content without burning out

Here are ten ways to keep up your content production without getting to the point of feeling so burned out that you simply give up. You can use them as a step-by-step process, or pick and choose the ideas that will make your existing process go more smoothly.

1. Decide how often you'll post content

While there's no "right" answer to how often to post content, there's definitely a "wrong" one: posting content whenever you feel like it, at wildly varying frequencies. It's best, for you and for your audience, to have a consistent posting schedule, both on your blog and on social networks. That might mean, for instance, two blog posts each week, one Facebook post each day (more may be counter-productive), and five Twitter posts each day. While you might vary your schedule a little, having a clear idea of what to aim for makes it much more likely that you'll write and publish regular posts.

2. Come up with a suitable pattern for your content

With social media in particular, it's helpful to "pattern" your content. This is also a useful practice for blog posts, especially if you post twice a week or more. Rather than starting with a blank page when it comes to generating ideas, you can have a pre-set "pattern" for the content you're going to create. For instance, if you're writing five Twitter posts each day, you might decide to have:
3. Brainstorm lots of ideas

Simply coming up with ideas for content can take a lot of time. Instead of sitting down and staring at a blank page, try "batching" the idea generation process: set aside time once every week or two to come up with a whole list of ideas. Some great ways to find content ideas include:
4. Outline longer pieces of content

With short posts on Twitter and Facebook, you probably don't need an outline – just a clear idea of what you're trying to accomplish. For blog posts, though, you'll find it's much faster to write when you've got a solid outline in place, especially if you're producing long-form content. Again, it's often a good idea to "batch produce" your outlines, by picking four or so ideas and outlining all those posts at once. That way, when it's time to write those posts, a lot of the hard work is already done. Plus, if you outline several posts in a single session, you'll find it much easier to create links between them.

5. Write several short pieces of content at once

Instead of opening up HootSuite (or your favorite social media management tool or app) every single time you want to send a tweet or create a post, write lots of posts ahead of time. You might want to queue up a week's worth of posts all at once. Buffer is a great tool for this, allowing you to schedule posts to go out at any time you want – making it easier to reach potential clients in other timezones or those on unusual schedules.

6. Set aside focused time for longer pieces

Creating content requires a lot of focus – it's not something you can easily do while you're fielding phone calls or responding to emails every few minutes. Block out periods of time (ideally two hours long) in advance, where you can shut your office door, ignore your email, and let calls go to voicemail.

7. Get an editor involved to review your content

While you may have no choice but to self-edit your content, if it's possible, get an editor involved. This might be someone already on your team, or a freelancer external to your company. A good editor will go far beyond correcting spelling mistakes and grammatical slips. They'll help to ensure your content is well structured, that it flows smoothly, and that it's as engaging as possible.

8. Have an assistant format and upload your content

If you're uploading all your own posts on your blog and social media, you'll be spending time finding images, selecting categories, adding hashtags, including links, and so on. While these tasks are an important part of the content creation process, they don't need to be done by you. Delegate as much of the repetitive work as possible to an assistant so that you can free up more time to write or design the content itself.

9. Get ahead and take time off

If content creation is starting to feel like a treadmill that you can't get off, then you're probably heading for burnout. Plan your schedule so you can get ahead, perhaps by creating an extra piece or two of content each week. That way, you can take a week off from content creation occasionally (plus, you'll also be covered for any unexpected events, like a particularly busy period, or illness).

10. Repurpose your existing content

There may well be excellent blog posts in your archive that rarely get read, and your social media posts will almost certainly only gather fleeting attention. Instead of always coming up with fresh ideas and creating new pieces from scratch, how about reusing some of your existing content? That might be as simple as writing an updated version of a blog post and republishing it – or it could be something more involved, like turning a series of tweets into a blog post, or turning a post into an infographic. Valuable, high-quality content is great for your business, your potential and existing customers, and your SEO.
By trying some or all of the tips above, you can keep up the flow of content without burning out. If you have a tip for creating lots of great content consistently, feel free to share it with us in the comments below.

Joe Williams is the founder of Tribe SEO. He can be found on Twitter at @joetheseo.

The post Ten ways to pump out a stream of great content without burning out appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/09/productive-content-creation-tips/

HTTP is the standard protocol defining how information passes between your visitor's browser and the server hosting your site, and HTTP status codes are your handy way of knowing exactly what is happening within that process. For web marketers, it's well worth the effort to get familiar with these status codes. By understanding your site's backend activity, you can recognize errors that demand attention and find opportunities to help improve (or at least not hinder) your SEO efforts.

A quick HTTP status code overview

HTTP status codes are three-digit numbers. The first digit indicates which of five categories each code belongs to. The categories refer to either a type of request or a type of error, as follows:

1xx status codes: Information request

Status codes beginning with a "1" communicate that a server is processing information but has not yet completed the request.

2xx status codes: Success

These status codes indicate that a requested information transfer was completed successfully. For marketers seeking to improve SEO, these codes mean that no action is required; everything is working correctly.

3xx status codes: Redirection

These redirect codes communicate that your visitor requested information that was not available at the targeted address.

4xx status codes: Client error

These codes signal that the client (the browser accessing the site) has encountered an error when trying to receive server information.

5xx status codes: Server error

5xx codes point to server-side errors: the client request was issue-free, and yet the server could not finish the transfer.

Six HTTP status codes that are arguably most critical to SEO

While there are more than 60 HTTP status codes to be aware of, some are more relevant from an SEO perspective than others. The following six status codes are especially important to understand and watch out for.

1) 404 – Not found

A 404 "page not found" error is perhaps the most commonly known HTTP status code, and it signals to marketers that a page is failing to deliver content to visitors. The server cannot return information because the resource or URL doesn't exist. Landing on a 404 page is detrimental to SEO because unavailable content leads to a bad experience for both your audience and the search engine crawlers that are so critical to your SEO success. To address these errors, ensure that any 404 pages use a 301 redirect to reach an available and relevant page.

2) 301 – Moved permanently

You'll recognize this code as the prescribed solution to the 404 errors just mentioned: a 301 status code means that the requested resource or URL has been permanently redirected somewhere else. This code is a valuable tool for sending visitors to relevant content that is available on the site. Marketers can and should set up 301 redirects for pages that are no longer available, so that their audience lands on useful content instead of error pages. The 301 code tells search engines to update their index for the page.
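A quick way to see which redirect type a URL actually returns is to walk the chain and print each hop. A minimal sketch with Python's requests library; the starting URL is a placeholder.

```python
# pip install requests
import requests

def show_redirect_chain(url):
    """Print every hop so you can see whether 301s or 302s are in play."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(f"{hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
    print(f"{response.status_code} {response.url} (final)")

show_redirect_chain("http://example.com/")  # placeholder URL
```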
3) 302 – Found

Similar to code 301, code 302 is another useful redirect to know. However, this one is temporary rather than permanent. A 302 code directs browsers to a new URL, ensuring that visitors reach relevant content, but it stops short of instructing search engines to update the page's index entry.

4) 307 – Temporary redirect

This code offers a more specific redirect method than the 302 code and has the browser perform the redirect instead of the server. This is useful for sites served over HTTPS that are on an HTTP Strict-Transport-Security (HSTS) preload list. Side note: if you are running an HTTP site, it's definitely in your best interest to migrate to HTTPS. Using codes 301, 302, and 307, marketers can optimize SEO by closely controlling search engine crawlers' understanding of what content exists and how they ought to crawl and index that content.

5) 503 – Service unavailable

This error indicates that the server cannot process a request due to a temporary technical issue. The 503 code informs search engines that processing was stopped on purpose and tells them not to de-index the page (as they would when seeing other server errors). However, if the 503 error isn't resolved over a long period of time, search engines can begin to view it as a permanent error that warrants deindexing. Marketers should therefore address 503 errors as rapidly as possible to avoid deindexing of the unavailable page and the negative SEO impact that would come hand-in-hand with that scenario.

6) 410 – Gone

This dramatic-sounding code means that a resource or URL is unavailable because it was deleted on purpose and was not redirected. When search engines see a 410, they will remove the page from the index instead of redirecting. Marketers should be sure to properly correct any page issues or implement effective redirects so that visitors arrive at content pertinent to their search needs. By understanding at least the most relevant HTTP status codes, and properly addressing the website fixes that can make or break SEO success, marketers can help ensure their sites function smoothly and offer the intended experiences for both search engines and potential customers.

Kim Kosaka is Director of Marketing at Alexa.com.

The post Six HTTP status codes most critical to your SEO success appeared first on Search Engine Watch.

from https://searchenginewatch.com/2019/07/08/six-http-status-codes-seo/