Google Shares Guidance On Pagination For SEO

When to Use Pagination
Here is the question:
“I’m having issues with canonicals on my site and I believe it’s due to the angular JS used in the HTML. For thousands of pages on the site Google is ignoring the canonical link and selecting their own, the issue is the page Google is selecting is incorrect.”
The problem with that question is that the OP is using the wrong tool: rel=canonical wasn’t designed to solve a pagination problem.
John Mueller answered:
“That’s basically a question of how to handle pagination…
Some sites feel all pages in a paginated series are important, so they keep them indexed (the fancy ones using rel-next/prev). Some sites cap paginated series at a certain number, perhaps letting the first one get indexed, and the rest not.
The decision is also sometimes based on the content of the paginated series. For example, if it’s a list of linked detail pages, then you could decide by whether or not you can reach all pages even if you don’t have the full paginated set indexed (if you cross-link to related posts/products, then usually that’s the case).”
John Mueller then warned about using rel=canonical to try to force Google to stick to the first page of a series. That’s actually the wrong way to do it.
Here’s how John Mueller explained it:
“The main thing to avoid, since this post is about canonicalization, is to use the rel=canonical on page 2 pointing to page 1. Page 2 isn’t equivalent to page 1, so the rel=canonical like that would be incorrect, practically speaking. Short of page 2 potentially being indexed, it wouldn’t break anything significantly though.”
John Mueller then said that normal indexing should typically be able to handle a few groups of paginated content that may exist on a website:
“If there are just a few paginated sets across your site like this (which sounds like it might be the case), then I wouldn’t sweat it and just let them get indexed normally, without any special rel=prev/next linking or no-indexing.”
This is interesting because it seems to suggest that the rel=prev/next is more useful for Google in situations where there’s a lot of paginated content. This is typically the case in active forum discussions, where a great many discussions could go on for many pages.
Here is what Google’s pagination guidance stated:

“Use rel=”next” and rel=”prev” links or headers to indicate the relationship between component URLs. This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.”
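As a minimal sketch of that markup, here is a small Python helper that builds the link tags for one page in a paginated series. The function name and the ?page=N URL scheme are assumptions for illustration, not anything Google specifies:

```python
def pagination_link_tags(base_url, page, last_page):
    """Return the rel=prev/next <link> tags for one page in a series.

    Page 1 is assumed to live at base_url itself; later pages use
    ?page=N (an assumed URL scheme -- adjust to match your site).
    """
    def url_for(n):
        return base_url if n == 1 else f"{base_url}?page={n}"

    tags = []
    if page > 1:
        # Every page except the first points back to its predecessor.
        tags.append(f'<link rel="prev" href="{url_for(page - 1)}">')
    if page < last_page:
        # Every page except the last points forward to its successor.
        tags.append(f'<link rel="next" href="{url_for(page + 1)}">')
    return tags
```

Note how the first page gets only a rel=next tag and the last page only a rel=prev tag; that is how the boundaries of the sequence are signaled.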
The key takeaways about rel=next/prev are:

It indicates a relationship between the URLs.
It behaves like a canonical tag in that it’s a hint, not a directive.
Inbound links to the various pages are consolidated.
Google usually sends searchers to the first page of the paginated content.

Do Not Use Rel=Prev/Next for an Article Series
Google’s official guidance warns against using rel=prev/next for a series of articles. For example, suppose you create a group of articles about how to groom a dog, with separate articles for cutting nails, brushing and so on. Those articles, although they are part of a series, should not be joined together using pagination.
According to Google, Rel=Prev/Next is meant for use on a single article (or document), not for a series of articles. Here’s what Google’s developer’s page states:
“You should not use this technique merely to indicate a reading list of an article series; you should use this to indicate a single long piece of content that is broken into multiple pages.”
That’s good information about pagination. SEO has so many technical details, and this is one of them. It’s good to do a periodic check-up because some of these details, like pagination, come in handy.
Read the entire Reddit discussion here.
Images by Shutterstock, Modified by Author
John Mueller on Site Migrations
Google published a new video about site migrations. John Mueller offered insights into how Google handles website migrations and how long they can take. The major takeaways are that site migrations can be difficult and that a comprehensive plan needs to be in place before the migration.
The video begins with a question:
“We’re currently going through a site migration and we’d also like to restructure the URLs on the site. Does this impose any risks?”
Migrating a site typically means changing the domain name, sometimes because the company is merged with another one or because the branding changed.
Joining two sites together is the trickiest scenario because you have to choose which URLs will remain and which will be merged into similar existing pages.
John Mueller answered:
“Unfortunately, while this may at first sound like a small change within a website, it’s not that simple for search engines.
In particular, search engines like Google store their index on a per-page basis.
So if you change the address or the URL of a page, that page’s data has to be forwarded somehow, otherwise it gets lost.
It doesn’t matter if you’re completely rebuilding a website or if you’re just removing a slash from the end of URLs. These are all essentially site moves.”

John Mueller Offers Site Migration Tips

1. Research the Options and Potential Effects
Site moves can be disruptive so it’s important to plan out the move by mapping one site to another. One way to do it is to divide the two sites into sections and see if sections can map to each other.
From there it’s a matter of mapping URLs one to one and deciding which URLs cannot be moved to the new site and should resolve to a 404 response. That can be tough if there are links pointing to those pages, which is why it’s important to plan ahead thoroughly.

2. Create a List of Old and New URLs
This is an important step.
According to John:
“…This tip will help you to track and check the changes afterward.”
Once you’ve got the redirects in place and the new URLs up, you can check the work by uploading the list of the old site structure to Screaming Frog so that it can crawl the URLs.
Screaming Frog will crawl the old URLs in the list and show you which URLs are redirecting to the new URLs and which are not, instead returning a 404 Page Not Found response code.
The 404 URLs may be the URLs that didn’t make it over to the new site (if they aren’t URLs you purposely planned to drop).
You’ll have to determine if the 404 is the correct response (you meant to do that) or if the URL was unintentionally left out of the site migration and needs to be mapped to a new URL.
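That triage can be sketched as a small function once you have crawl results. The input format (a mapping of old URL to observed status code) and the bucket names here are assumptions for illustration:

```python
def triage_old_urls(crawl_results, planned_drops):
    """Sort old URLs into buckets after a post-migration crawl.

    crawl_results: dict mapping old URL -> HTTP status code observed.
    planned_drops: set of old URLs that were intentionally not migrated.
    """
    buckets = {"redirected": [], "intended_404": [],
               "unintended_404": [], "needs_review": []}
    for url, status in sorted(crawl_results.items()):
        if status in (301, 308):
            # A permanent redirect is in place, as planned.
            buckets["redirected"].append(url)
        elif status == 404:
            key = "intended_404" if url in planned_drops else "unintended_404"
            buckets[key].append(url)
        else:
            # 200, 302, 5xx etc. on an old URL deserves a closer look.
            buckets["needs_review"].append(url)
    return buckets
```

Anything landing in the unintended_404 bucket is a URL that was left out of the migration and needs to be mapped to a new URL.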
Related: The Ultimate Guide for an SEO-Friendly URL Structure

3. Implement the Migration
“301 redirect all the old URLs to the new ones, also update all internal mentions such as:
and the sitemap file”
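As a sketch of the redirect step, here is a helper that turns an old-to-new URL mapping into nginx-style 301 rules. The nginx rewrite syntax is just one option (an assumption for this example); Apache .htaccess rules or a CMS redirect plugin would work just as well:

```python
def nginx_redirect_rules(url_map):
    """Emit one permanent (301) redirect rule per old->new URL pair.

    url_map: dict of old path -> new absolute URL. The output uses
    nginx 'rewrite' syntax; adapt for your own server software.
    """
    return [f"rewrite ^{old}$ {new} permanent;"
            for old, new in sorted(url_map.items())]
```

Generating the rules from the same old/new list used for the crawl check keeps the redirect config and the audit in sync.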
Related: Site Migration Issues: 11 Potential Reasons Traffic Dropped

4. Monitor the Migration
“Check all pages for the redirect. In Google’s search console report you should see a quick change for the most important pages and then a slower change as our systems reprocess the rest.”
Mueller cautioned that this last part can take months to finish. He has previously said that determining overall site quality can also take months: Google basically has to learn what a site is about, including its quality, and understand where the site fits within the Internet.

Mueller Recommends Leaving Redirects for At Least 1 Year
Google’s John Mueller recommended leaving the redirects in place for at least one year.
In my experience, it may be necessary to leave the redirects in place for longer than a year, because old URLs that have links from other sites pointed at them will become broken links if the redirects are removed.
You can run an outreach campaign to contact the sites that are linking to you and ask them to update their links to point to the changed URLs.
But you have to be aware that doing this kind of outreach can backfire because some sites, for many reasons, may decide to remove the link altogether.
Also, there may be links out there that you don’t know about, so you can never be sure that you had all the inbound links updated. So for that reason, it may be necessary to keep those redirects in place and be ready to update them should some of the URLs change again, to avoid creating chained redirects.
And chained redirects are a reason why you might not want to keep redirects up permanently.
A chained redirect is when an old URL redirects to another old URL, which itself redirects to yet another old URL before reaching the final URL. Over the years this can create a chain of redirects that becomes problematic for crawling.
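A chain like that can be caught and collapsed mechanically. Here is a sketch, assuming the redirects are available as a simple old-to-new mapping (the function name and data shape are assumptions):

```python
def flatten_redirects(redirect_map):
    """Collapse redirect chains so each old URL points directly
    at its final destination.

    redirect_map: dict of source URL -> immediate redirect target.
    """
    flat = {}
    for src, target in redirect_map.items():
        seen = {src}
        # Follow the chain until we reach a URL that is not itself
        # redirected (the loop guard protects against redirect cycles).
        while target in redirect_map and target not in seen:
            seen.add(target)
            target = redirect_map[target]
        flat[src] = target
    return flat
```

Re-pointing every source URL at the final target keeps each hop to a single 301, which is the behavior you want crawlers to see.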
Google Updates Structured Data Guidance

Google updated the structured data guidance to better emphasize that all three structured data formats are acceptable to Google and also to explain why JSON-LD is recommended.
The page that was updated is the Supported Formats section of the Introduction to Structured Data Markup in Google Search page on Search Central.
The most important changes were to add a new section title (Supported Formats) and to expand that section with an explanation of the supported structured data formats.

Three Structured Data Formats

Google supports three structured data formats: JSON-LD, Microdata, and RDFa.
But only one of those formats, JSON-LD, is recommended.
According to the documentation, the other two formats (Microdata and RDFa) are still fine to use. The update to the documentation explains why JSON-LD is recommended.
Google also made a minor change to the title of a preceding section to reflect that the section addresses structured data vocabulary.
The original section title, Structured data format, is now Structured data vocabulary and format.
Google added a section title to the section that offers guidance on Google’s preferred structured data format.
This is also the section with the most additional text added to it.

New Supported Formats Section Title
The updated content explains why Google prefers the JSON-LD structured data format, while confirming that the other two formats are acceptable.
Previously this section contained just two sentences:
“Google Search supports structured data in the following formats, unless documented otherwise:
Google recommends using JSON-LD for structured data whenever possible.”
The updated section now reads:

“Google Search supports structured data in the following formats, unless documented otherwise.
In general, we recommend using a format that’s easiest for you to implement and maintain (in most cases, that’s JSON-LD); all 3 formats are equally fine for Google, as long as the markup is valid and properly implemented per the feature’s documentation.
In general, Google recommends using JSON-LD for structured data if your site’s setup allows it, as it’s the easiest solution for website owners to implement and maintain at scale (in other words, less prone to user errors).”

Structured Data Formats
JSON-LD is arguably the easiest structured data format to implement, the easiest to scale, and the most straightforward to edit.
Most, if not all, WordPress SEO and structured data plugins output JSON-LD structured data.
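For illustration, a minimal JSON-LD Article block can be assembled with nothing more than the standard library’s json module. The helper name and property values here are made up:

```python
import json

def article_jsonld(headline, author, date_published):
    """Build a minimal schema.org Article block as a JSON-LD script tag.

    The set of properties is a bare-bones example; real pages should
    include whatever properties the relevant Google feature requires.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data, indent=2)
            + "</script>")
```

Because the markup is plain JSON inside a script tag, it can be generated, validated, and edited independently of the page’s HTML, which is much of why it scales well.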
Nevertheless, it’s a useful update to Google’s structured data guidance in order to make it clear that all three formats are still supported.
Google’s documentation on the change can be read here.
Featured image by Shutterstock/Olena Zaskochenko
Google Webmaster Tools: A Guide for Marketers and Site Owners

What is Google Webmaster Tools (GWT)?
Google Webmaster Tools is a system built by Google that gives you feedback on your website as Google sees it. It shows everything from the phrases used to find your site through to pages it can’t distinguish or access via internal and external links.

Why is it important?
With Google accounting for over 90% of searches in the UK and many other European countries, any insights that Google provides about the effectiveness of your website are worth reviewing. Google Webmaster Tools shows you how Google sees your website and alerts you to problems it finds.
Online businesses often overlook the basic aspects of natural search management, but with this simple interface you can quickly see if you are ticking all the boxes.

About this marketer’s guide

SEO specialists will be aware of these features in Google Webmaster Tools and others besides – please let us know what you see as important!
In this guide we’ll show you how to get the most from it in these ten steps. For full details, examples and screengrabs, download the PDF at the end of the 10 steps.

Step 1. Setup and verification
A necessary evil for gaining access to the insights that Google Webmaster tools offer. We have put together a simple guide to help you through the process. Google offer a similar one too!
You can remove Sitelinks if you don’t like an individual one at this stage – which is often handy!

Step 2. Review current keyphrase ranking

Step 3. Site indexing effectiveness audit

Step 4. Sitemaps

Step 5. Robots.txt
It may be a little ‘old school’ but the robots.txt file can be your friend as much as it can be your enemy. Is your file working hard for you and your website by allowing search engines to focus on the content that is most relevant to them? In this section we cover the tools Google has gifted us to test and create a robots.txt file, as well as things to consider to improve your use of robots.txt for your business.

Step 6. Crawl errors
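Python’s standard library includes a robots.txt parser that is handy for exactly this kind of testing. A sketch, with made-up rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block the checkout flow, allow everything else.
ROBOTS_TXT = """\
User-agent: *
Disallow: /checkout/
Allow: /
"""

def is_crawlable(url, robots_txt=ROBOTS_TXT, agent="*"):
    """Return True if the robots.txt rules allow `agent` to fetch `url`."""
    parser = RobotFileParser()
    # parse() accepts the file's lines directly, so rules can be
    # tested without fetching anything over the network.
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)
```

Testing rules this way before deploying them helps avoid accidentally blocking search engines from content you want indexed.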
Technology often lets us down and websites are no different. As sites develop and grow you tend to find broken links, pages that display errors and so on. Especially for bigger sites this can become difficult to manage. In this section we introduce Google’s tool which displays and informs you of errors Google encountered on your website. While it’s better to be proactive than reactive, this tool can help you stay on top of what can be a tiresome task.

Step 7. Three Ws (canonical URLs)
Canonical URLs were an appreciated gift from Google. The various content management systems that are widely available are extremely good (for all the wrong reasons) at creating duplicate content on your behalf. In this section we cover tools that allow you to manage duplicate content at the site level as well as page by page.

Step 8. Site performance
As the speed of the internet has evolved, so have websites: more images, videos and so on have meant slow loading pages and frustrating experiences for all of us at some point. In this section we look at how your site performs and how you can use Google Webmaster Tools to identify issues and move forwards with solutions.

Step 9. Inbound Link Analysis
A crucial part of any natural search strategy. This section covers Google’s insights into the links into your website, including things like where the links come from and the anchor text of your links.

Step 10. HTML Suggestions
One of the lesser used sections of Google Webmaster Tools, this area allows you to manage basic on-page optimisation tactics as it gives you data surrounding missing or duplicate title and description tags. A key part of your on-page natural search efforts.
You can download the guide here.
SEO & Linking for Yahoo, MSN or Google?
This week some SEO sites have revisited the old argument of whether to target all search engines with a site or to focus SEO techniques on one specific search engine. When representing a business entity, though, white-glove optimization that should attract respectable rankings among all search engines is a good rule of thumb.
However, for sales, product, or niche target demographic oriented sites, optimizing for only one search engine (or the Yahoo-MSN tag team of average head of household, money spending Internet users) could result in high rankings and enhanced conversions.
Andy Hagans discusses the technique of targeting site SEO towards the algorithm of one search engine over another in terms of affiliate site SEO:
Are we getting to a point where it’s more cost effective for affiliates to take an either/or approach to ranking in Google or MSN/Yahoo! ? In my experience, if you target both, it’s hard to do GREAT in either (For simplicity’s sake I’m lumping Yahoo! and MSN together as they both seem to reward linking in the same general way.)
MSN/Yahoo! is my bread and butter for new affiliate sites. The tradeoff always comes down to this: I can do XYZ and it will probably screw the site in Google, when it may have ranked well there 2 years down the road — but I’ll rank in MSN in a month, and Yahoo! in three months. Or I can skip XYZ which will make ranking in Yahoo!/MSN impossible but hey who knows in 2 years Google may want to rank my site! Doesn’t even seem like much of a choice, to me…
Barry Schwartz discusses some techniques used for Yahoo oriented SEO which may differ a bit from those used by sites looking for high rankings via link building on Google:
* KEY: Try to get the attention of local media
I’ll dig a bit deeper into what Barry is listing.
In my opinion, the emphasis on getting the ‘attention of local media’ is quite important for search engines like Yahoo and Google which are working on methods of changing the results served to users based upon user profiling via gender, region and online behavior.
Article distribution as a link building tool, make that mass article distribution, is a bit more effective for building Yahoo backlinks than it is for Google, as Google has a tendency to identify syndicated duplicate content and omit it from its results.
Directory links also tend to lean more towards Yahoo & MSN SEO than Google, as some in the SEO community tend to reap the rewards of focusing on building site value and definition via targeted directory listings.
Other link building techniques which in my experience have high value in Yahoo include:
And what about linking for Google?
My opinion is that hard work, link baiting, slow and targeted linking, and most importantly building yourself or your business as an authority on a subject both on and offline are the ways to differentiate your site from the rest.
Ask An SEO: Does Server Location Matter?

This week for Ask An SEO, Vimal of London reached out concerning his business and his choice of a hosting company and its location. Vimal asks:
“I have a question with respect to the server location. My business is 100% UK based and particularly around a 50km radius around London. I recently migrated from GoDaddy to SiteGround. The questions I have are as per follows:
Given my geographical business focus, does the location of the host make a difference?
When I select my host in Google to verify my site, it has GoDaddy listed but not SiteGround, so I select “other”. I raise this because it makes me feel that Google has a preference for GoDaddy; I do not mind switching back.
My competitors are typically IP Server and Geolocation UK based but I don’t know the specifics. I have read on the internet from Google representatives that Server/host location does make a difference but the article I read is 6-7 years old.”
There are two aspects to consider about the location your hosting company uses to host your website as it relates to your organic rankings.

Site Speed
The first aspect is the effect it has on your site speed.
The closer a site visitor is to the datacenter that hosts your website, the quicker it will load for them.
For a local business, I always recommend finding a local datacenter in or close to the city my client works in and targets.
While it has not been proven that search engines use server location as a ranking factor, there has been some strong evidence to show that the IP address of a website can affect the rankings of the site.
For instance, if it has an IP address assigned that belongs to the wrong country.
Most people just choose a hosting company based on name or recommendation and put little thought into where the server resides or the geolocation of its IP address; with mass hosts, you get placed on a random server with a shared IP address.
I suspect that if you have a .co.uk domain name but your web server and its IP address are located in the United States, Google and other search engines would figure that out, and it won’t be an issue.
For this reason, I would personally choose to take the extra step to make sure your server and IP address are assigned to your own country.
I’d recommend reaching out to your hosting company to confirm that your site is hosted on one of their UK servers and, if not, requesting that it be fixed ASAP.

Verifying a Domain Property in Search Console
As for your second question, you are referring to the process that Google goes through to verify a domain property in Google Search Console.
Google does have a list of companies available to connect to in order to verify your site by adding a DNS record.
DNS settings are usually, though not always, managed at your domain’s registrar.
GoDaddy, along with the other companies listed, is one of the more common domain registrars out there, which is why it made the list.
Yes, some people do manage their DNS and domain name registration at their hosting company, but it’s not as common a practice.
Google could only integrate so many options, so don’t let the fact that your hosting company is not listed lead you to assume that Google has a preference for any of these companies.
Especially not in a way that translates to your rankings in any way, shape, or form.