Metrics can tell us more than just how we’re performing—good data can also provide background for proactive site improvement. There are already dozens of guides written on SEO metrics, so rather than rehash the basics, let’s focus on three important metrics that you won’t find in most SEO KPI reports:
1. Percentage of Traffic from SEO
Don’t you want SEO to be the largest source of traffic to your site? If you’re the SEO manager, the answer might be yes. If you’re the CEO, the answer is definitely no.
Generally speaking, the sites of strong brands get a lot of direct traffic and referral traffic. If more than 50 percent of your traffic is coming from SEO, it’s a sign that you may need to build a stronger brand before you can build your SEO traffic much higher. Think about it this way: If your site is providing something valuable, then presumably some of the people who visit will want to come back again.
If you have lots of organic traffic but little direct traffic, it means you aren’t getting a lot of repeat visitors. Some sites may have good reasons for this to be the case, but for most sites it’s a red flag.
Those repeat visitors are going to be the ones who help create the signals that tell Google you’re a legit player in the marketplace. They will talk about you online, link to your site, write reviews about you, share your content and generally make your life much easier. If you’re getting a lot of one-and-done visitors, think about how you can make your site stickier, so that you develop a relationship with the people who visit.
Conversely, if your percentage of site traffic from SEO is on the low side—especially anything under 20 percent—it likely means that your site is under-optimized and that there’s plenty of opportunity for you to improve. In other words, somewhere between 20 and 50 percent of total traffic from organic search is the sweet spot to aim for.
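To make the arithmetic concrete, here is a minimal sketch of how you might compute that share yourself, assuming you have already pulled session counts by channel out of your analytics package; the channel names and numbers below are purely hypothetical.

```python
# Hypothetical session counts by channel, e.g. exported from an analytics report.
sessions_by_channel = {
    "Organic Search": 42_000,
    "Direct": 31_000,
    "Referral": 9_500,
    "Social": 6_000,
    "Email": 4_500,
}

total_sessions = sum(sessions_by_channel.values())
organic_share = sessions_by_channel["Organic Search"] / total_sessions * 100

print(f"Organic search share: {organic_share:.1f}% of {total_sessions} sessions")

# Rough interpretation based on the thresholds discussed above.
if organic_share > 50:
    print("Over 50%: consider building brand and direct traffic before pushing SEO harder.")
elif organic_share < 20:
    print("Under 20%: the site is likely under-optimized; there is room to grow SEO traffic.")
else:
    print("20-50%: a healthy balance between organic search and other channels.")
```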
2. Average Daily Crawl
The Crawl Stats report in Google Search Console shows the average number of pages crawled per day over the last 90 days. If you have fewer pages indexed than you would like, it’s possible that your average daily crawl is the limitation.
If the number of pages in your XML sitemaps (which should be the number of pages you are trying to get indexed) is more than six times the average daily crawl reported by GSC, you may not have the crawl budget needed to get all of those pages indexed.
If your XML sitemaps don’t accurately reflect your indexation goals, the principle still stands—in order to get pages indexed, you need them to be crawled on at least a semi-regular basis. (The number six is an approximation—the actual number may be higher or lower in your market, and likely depends on a number of factors that are unique to your site, but it’s a good benchmark for determining if crawl budget may be a limiting factor.)
If your daily crawl budget falls short of where you need it to be, it’s a sign that you either need to increase the number of links and positive brand signals to your site, or that a technical issue is causing Google to curtail its crawling. Potential culprits include poor time to first byte, slow page load times, or a large number of duplicative and/or low-quality pages on your site.
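Here’s a rough sketch of that comparison, assuming a standard single-file XML sitemap and an average daily crawl figure read manually from GSC’s Crawl Stats report; the sitemap URL and crawl number are placeholders, and a sitemap index file would need an extra level of parsing.

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
AVG_DAILY_CRAWL = 1500  # read manually from GSC Crawl Stats (placeholder value)

def count_sitemap_urls(sitemap_url):
    """Count <url> entries in a standard XML sitemap (not a sitemap index)."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return len(tree.findall("sm:url", ns))

url_count = count_sitemap_urls(SITEMAP_URL)
budget_threshold = 6 * AVG_DAILY_CRAWL

print(f"URLs in sitemap: {url_count}")
print(f"6 x average daily crawl: {budget_threshold}")

if url_count > budget_threshold:
    print("Crawl budget may be a limiting factor for indexation.")
else:
    print("Crawl budget is probably not the bottleneck.")
```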
3. Percentage of Indexable Pages
This isn’t a metric you can just pull out of Google Analytics, but there are a number of crawlers on the market that should be able to give you at least a good approximation of this number (including Stone Temple’s proprietary crawler that we use with our clients). Ideally you’ll be using a crawler that follows robots.txt instructions for Googlebot and keeps track of URLs that would have been crawled but weren’t because they were blocked.
Although the need to feed Google primarily high-quality, unique pages is well established, it often isn’t well planned for in site design. So we have to rely on a number of solutions—noindex, robots.txt and canonical tags are the primary suspects—that tell Google we don’t think those pages would be good choices for its index.
While each of these solutions is much better than not having them (assuming they are applied appropriately), they all cause inefficiencies in PageRank flow and/or crawl budget. If more than 50 percent of the URLs you’re exposing to Google fall into one of these categories, there’s a strong likelihood that your site architecture could be improved in terms of the pages it exposes to crawl. The closer you can get to 100 percent of the URLs exposed to Google being crawlable and indexable (provided they are unique, good-quality pages), the stronger your site’s organic search traffic will be.
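As a simplified illustration (not the crawler mentioned above), the sketch below approximates this metric for a list of URLs by checking whether each one is blocked for Googlebot in robots.txt, carries a noindex directive, or canonicalizes to a different URL. It assumes the third-party requests library, uses crude string checks rather than real HTML parsing, and the URL list is a placeholder.

```python
from collections import Counter
import urllib.parse
import urllib.robotparser

import requests  # third-party: pip install requests

# Placeholder list; in practice this would be the set of URLs your crawler
# discovers (or the URLs in your XML sitemaps).
URLS = [
    "https://www.example.com/",
    "https://www.example.com/category?page=2",
    "https://www.example.com/private/report",
]

_robots_cache = {}

def robots_allows(url, user_agent="Googlebot"):
    """Check robots.txt for the URL's host, caching one parser per host."""
    parts = urllib.parse.urlsplit(url)
    root = f"{parts.scheme}://{parts.netloc}"
    if root not in _robots_cache:
        parser = urllib.robotparser.RobotFileParser(root + "/robots.txt")
        parser.read()
        _robots_cache[root] = parser
    return _robots_cache[root].can_fetch(user_agent, url)

def classify(url):
    """Very rough classification: robots_blocked, noindex, canonicalized or indexable."""
    if not robots_allows(url):
        return "robots_blocked"
    html = requests.get(url, timeout=10).text.lower()
    # Crude string checks; a real crawler would parse the HTML and HTTP headers.
    if 'name="robots"' in html and "noindex" in html:
        return "noindex"
    if 'rel="canonical"' in html and f'href="{url.lower()}"' not in html:
        return "canonicalized"  # canonical tag present and not self-referencing
    return "indexable"

counts = Counter(classify(url) for url in URLS)
indexable_pct = counts["indexable"] / len(URLS) * 100
print(dict(counts))
print(f"Indexable: {indexable_pct:.0f}% of URLs exposed to Google")
```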
Summary
While it’s important to track metrics that tell you how you’re doing, it’s also useful to look at metrics that can inform and potentially diagnose where you can make improvements. These metrics are just a few examples of ways to look at your site from a different perspective.