HTTP/2 is a newer protocol for transporting data that can drastically speed up the web and help your SEO. The ‘old’ HTTP/1.1 protocol only allows web servers to send files down a single connection one at a time, and that connection has to open and close after each file has been sent – a …Read: "What is HTTP/2?"
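The difference can be sketched with a back-of-envelope model. All the numbers below are illustrative assumptions (a typical browser opens around six parallel HTTP/1.1 connections per host), not measurements of any real page:

```python
import math

def http1_rounds(assets, parallel_connections=6):
    """Round trips needed when each connection serves one file at a
    time; browsers typically open ~6 HTTP/1.1 connections per host."""
    return math.ceil(assets / parallel_connections)

def http2_rounds(assets):
    """HTTP/2 multiplexes all streams over one connection, so every
    asset can be requested in a single round trip."""
    return 1 if assets else 0

# A page with 30 assets and a 100 ms round trip (example numbers):
rtt_ms = 100
print(http1_rounds(30) * rtt_ms)  # 500 ms spent on round trips
print(http2_rounds(30) * rtt_ms)  # 100 ms
```

Real-world gains depend on latency, asset sizes and server push, but the model shows why fewer round trips matter.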
Site speed is crucial. There are several ways to improve how fast your site loads, and one of them is moving to the newer HTTP/2 protocol, which speeds up your connection by sending multiple files over a single connection. Find out what HTTP/2 is and how it can help you!
Must-read articles about HTTP headers
Using robots.txt has its disadvantages. In particular cases, the X-Robots-Tag HTTP header can help you work around those; find some examples here.
An HTTP 503 header is a very useful tool for site maintenance. This post explains why and gives some pro tips on how to use it!
Yoast SEO Premium, as of version 3.1, allows you to set an HTTP 451 header when content has been blocked for legal reasons.
HTTP status codes determine how your site is accessed by visitors and spiders alike. Find out what all these codes mean for SEO.
Recent HTTP headers articles
Sometimes, your site will need some downtime, so you can fix things or update plugins. Most of the time, this tends to be a relatively short period in which Google will most likely not attempt to crawl your website. However, in the case that you need more time to get things fixed, chances are much …Read: "HTTP 503: Handling site maintenance correctly for SEO"
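A minimal sketch of such a maintenance response, using Python's built-in `http.server`. The one-hour `Retry-After` value is just an example; pick whatever window fits your maintenance:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 Service Unavailable plus a
    Retry-After header, so crawlers come back later instead of
    treating your pages as gone."""

    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # seconds; example value
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance, back soon.</h1>")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# To try it locally:
# HTTPServer(("127.0.0.1", 8000), MaintenanceHandler).serve_forever()
```

In practice you would set this up in your web server or CMS rather than a standalone script, but the headers are the same.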
We said it in 2009, and we’ll say it again: it keeps amazing us that there are still people using just a robots.txt file to prevent indexing of their site in Google or Bing. As a result, their site shows up in the search engines anyway. You know why it keeps amazing us? Because robots.txt …Read: "Preventing your site from being indexed, the right way"
HTTP status codes, like 404, 301 and 500, might not mean much to a regular visitor, but for SEOs they are incredibly important. What’s more, search engine spiders like Googlebot use them to determine the health of a site. These status codes offer a way of seeing what happens between the browser and the server. …Read: "HTTP status codes and what they mean for SEO"
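As a rough cheat sheet, the main code classes can be summarised in a small helper. These interpretations are deliberately simplified; the full article covers the nuances:

```python
def seo_meaning(status):
    """Rough SEO interpretation of an HTTP status code (simplified)."""
    if 200 <= status < 300:
        return "OK: page can be crawled and indexed"
    if status in (301, 308):
        return "permanent redirect: link signals pass to the new URL"
    if status in (302, 307):
        return "temporary redirect: search engines keep the old URL"
    if status == 404:
        return "not found: will eventually drop out of the index"
    if status == 410:
        return "gone: removed from the index faster than a 404"
    if 500 <= status < 600:
        return "server error: repeated errors can hurt crawling"
    return "other: check the specific code"

print(seo_meaning(301))
```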
Traditionally, you will use a robots.txt file on your server to manage what pages, folders, subdomains or other content search engines will be allowed to crawl. But did you know that there’s also such a thing as the X-Robots-Tag HTTP header? Here, we’ll discuss what the possibilities are and how this might be a better …Read: "Playing with the X-Robots-Tag HTTP header"
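One thing robots.txt cannot do is attach per-response directives to non-HTML files such as PDFs and images; the X-Robots-Tag header can. A minimal sketch of that idea — the file-type rules below are example assumptions, not recommendations:

```python
def x_robots_tag_for(path):
    """Pick an X-Robots-Tag header per file type. Unlike robots.txt,
    this still lets search engines crawl the files, and it works for
    file types that can't carry a meta robots tag in their markup."""
    if path.endswith((".pdf", ".doc")):
        return ("X-Robots-Tag", "noindex, nofollow")
    if path.endswith((".png", ".jpg", ".gif")):
        return ("X-Robots-Tag", "noimageindex")
    return None  # HTML pages can use a meta robots tag instead

print(x_robots_tag_for("report.pdf"))
```

In production you would set this header in your web server configuration (Apache, nginx) rather than application code, but the values are the same.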
At the end of last year, a new HTTP status code saw the light. This status code, HTTP 451, is intended to be shown specifically when content has been blocked for legal reasons. If you’ve received a takedown request, or are ordered by a judge to delete content, this is the status code that …Read: "HTTP 451: content unavailable for legal reasons"
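Served from code, that could look like the sketch below. The blocked path and the legal-notice URL are placeholders; RFC 7725, which defines the code, also suggests a `Link` header with `rel="blocked-by"` identifying the party that required the block:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_PATHS = {"/removed-article/"}  # placeholder: paths under a takedown order

class LegalBlockHandler(BaseHTTPRequestHandler):
    """Serve 451 for legally blocked paths, normal content elsewhere."""

    def do_GET(self):
        if self.path in BLOCKED_PATHS:
            self.send_response(451)  # Unavailable For Legal Reasons
            # RFC 7725 suggests identifying the blocking party:
            self.send_header("Link", '<https://example.com/legal>; rel="blocked-by"')
            self.end_headers()
            self.wfile.write(b"Unavailable for legal reasons.")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Regular content.")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet
```

Yoast SEO Premium sets this header for you; the sketch just shows what happens on the wire.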