Crawl directives archives
Ranking in the search engines requires a website with flawless technical SEO. Luckily, the Yoast SEO plugin takes care of (almost) everything on your WordPress site. Still, if you really want to get the most out of your website and keep on outranking the competition, some basic knowledge of technical SEO is a must. In this post, … Read: "What is crawlability?"
There are several reasons for cloaking or redirecting affiliate links. For instance, it’s easier to work with affiliate links when you redirect them, plus you can make them look prettier. But do you know how to cloak affiliate links? We explained how the process works in one of our previous posts. This Ask Yoast is … Read: "Ask Yoast: Should I redirect my affiliate links?"
If you have a big eCommerce site with lots of products, layered navigation can help your users narrow down their search results. Layered or faceted navigation is an advanced way of filtering by providing groups of filters for (many) product attributes. In this filtering process, you might create a lot of URLs though, because the user … Read: "Ask Yoast: Nofollow layered navigation links?"
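As a brief illustration of the idea in that post (the URL, filter parameters, and link text below are made up, not taken from the article), a faceted-navigation filter link marked nofollow could look like this:

```html
<!-- Hypothetical faceted filter link on a product listing page.
     rel="nofollow" hints that this filter URL shouldn't pass link value. -->
<a href="/shoes/?color=red&amp;size=42" rel="nofollow">Red, size 42</a>
```

Note that nofollow is only a hint; pages reachable this way may still be crawled or indexed unless you also use other crawl directives.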
We regularly consult for sites that monetize, in part, with affiliate links. We usually advise people to redirect affiliate links. In the past, we noticed that there wasn’t a proper script available online that could handle this for us, so we created one to tackle this problem. In this post, I explain how you can get … Read: "How to cloak your affiliate links"
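A minimal sketch of the redirect idea, not the actual script from that post: a short, local "pretty" URL is sent to the real affiliate URL with a temporary redirect. Both URLs below are placeholders.

```apacheconf
# Apache example (httpd.conf or .htaccess): redirect a cloaked
# /out/ URL to the full affiliate URL. Paths and the target URL
# here are illustrative only.
Redirect 302 /out/example-product https://affiliate.example.com/?ref=123
```

Typically you would also block the /out/ folder from crawling (for example in robots.txt) so search engines don’t follow the affiliate redirects.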
Traditionally, you will use a robots.txt file on your server to manage what pages, folders, subdomains or other content search engines will be allowed to crawl. But did you know that there’s also such a thing as the X-Robots-Tag HTTP header? Here, we’ll discuss what the possibilities are and how this might be a better … Read: "Playing with the X-Robots-Tag HTTP header"
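To make the header concrete, here is one common pattern: sending X-Robots-Tag for file types that can’t carry a meta robots tag, such as PDFs. This is an Apache sketch (mod_headers must be enabled); the file pattern is just an example.

```apacheconf
# Apache config / .htaccess: tell search engines not to index
# or follow links in any PDF file served from this site.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Unlike robots.txt, which controls crawling, X-Robots-Tag (like a meta robots tag) controls indexing of content the crawler has already fetched.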
In 2015, Google Search Console started actively warning webmasters not to block CSS and JS files. Back in 2014, we told you the same thing: don’t block CSS and JS files. We feel the need to repeat this message now. In this post, we’ll explain why you shouldn’t block these specific files from Googlebot. Why you … Read: "Don’t block CSS and JS files"
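If an existing robots.txt already contains overly broad Disallow rules, one way to re-open stylesheets and scripts is an explicit Allow rule using Google’s wildcard syntax. The rules below are illustrative, not a recommended robots.txt:

```text
# robots.txt sketch: an Allow rule can override a broader Disallow
# so Googlebot can still fetch CSS and JS and render pages properly.
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

The cleaner fix, of course, is to remove the Disallow rules that block those files in the first place.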
What is a crawl budget? Crawl budget is the number of pages Google will crawl on your site on any given day. This number varies slightly from day to day, but overall it’s relatively stable. The number of pages Google crawls, your “budget”, is generally determined by the size of your site, the “health” of your … Read: "Crawl budget optimization"
The robots.txt file is a file you can use to tell search engines where they can and cannot go on your site. Learn how to use it to your advantage! Read: "robots.txt: the ultimate guide"
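For readers who have never seen one, here is a small illustrative robots.txt in a common WordPress-style shape (the paths, sitemap URL, and domain are placeholders, not a recommendation from the guide):

```text
# Example robots.txt: block the admin area, but keep the AJAX
# endpoint crawlable, and point crawlers at the XML sitemap.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml
```

robots.txt lives at the root of the (sub)domain, e.g. https://www.example.com/robots.txt, and controls crawling, not indexing.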
The canonical URL allows you to tell search engines that certain similar URLs are actually one and the same. Learn how to use rel=canonical! Read: "rel=canonical: the ultimate guide"
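In its simplest form, a canonical is a single link element in the head of a page that points to the preferred URL for that content. The URL below is a placeholder:

```html
<!-- Placed in the <head> of a duplicate or parameter variant of a page,
     pointing search engines at the preferred version of the URL. -->
<link rel="canonical" href="https://www.example.com/original-page/" />
```

Every indexable page can self-reference its own canonical URL; duplicates point at the original.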
The robots.txt file is a very powerful file if you’re working on a site’s SEO. At the same time, it has to be used with care. It allows you to deny search engines access to certain files and folders, but that’s very often not what you want to do. Over the years, Google especially has changed … Read: "WordPress robots.txt example for great SEO"