How to avoid duplicate content when WoodMart filters and pagination are used together

When filtering and pagination are enabled at the same time, search engines can easily pick up a large number of pages with highly similar content. These pages are useful for users, but a hidden risk for SEO. Duplicate content is not a WoodMart "bug": the theme's default behavior simply places no restrictions on search engines. To solve the problem, you need to work at three levels: URL structure, indexing control, and theme settings.


I. How content duplication arises

When a user uses a filter on a category page, the URL usually takes this form:

  • /shop/?filter_color=black
  • /shop/?filter_color=black&page=2
  • /shop/?page=2&filter_color=black

From a user's perspective, these are different views of the same set of items. From a search engine's perspective, they are multiple different URLs with highly similar body content. Common sources of duplication include:

  • URLs that differ only in parameters while listing largely the same products
  • Filter + pagination combinations generating a large number of accessible URLs
  • Sorting and price-range parameters being indexed

If left unchecked, search engines will waste crawl budget on these URLs and may even lower the overall ranking weight of the category page.
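To see how fast these combinations multiply, here is a minimal Python sketch; the filter names, values, and page count are hypothetical:

```python
from itertools import product

# Hypothetical filter values and page depth for one category — adjust to your catalog.
colors = ["black", "white", "red"]
sizes = ["s", "m", "l"]
pages = range(1, 6)  # 5 pages of results

# Every combination becomes a distinct, crawlable URL with near-identical content.
urls = [
    f"/shop/?filter_color={c}&filter_size={s}&page={p}"
    for c, s, p in product(colors, sizes, pages)
]

print(len(urls))  # 3 colors x 3 sizes x 5 pages = 45 URLs for one category
```

Two filters with three values each already produce 45 crawlable URLs for a single category; real stores with more attributes generate thousands.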

II. First distinguish which pages should be indexed


Before changing anything, you need a clear standard of judgment.

Pages suitable for indexing

  • Main category pages
  • Pagination without filters (e.g. page 2, page 3)
  • Core category pages identified as having search value

Pages not suitable for indexing

  • Pages generated by arbitrary filter parameter combinations
  • Deep filter + pagination pages
  • URLs that differ only in sort order

This step is a prerequisite for all settings.
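The judgment above can be sketched as one small function. This is a minimal Python sketch, assuming the usual WooCommerce/WoodMart parameter prefixes (`filter_`, `query_type_`, `orderby`, `min_price`, `max_price`) — verify these against the URLs your own site actually produces:

```python
from urllib.parse import parse_qs, urlsplit

# Assumed parameter prefixes — check the URLs your filter widgets actually emit.
NOINDEX_PREFIXES = ("filter_", "query_type_", "orderby", "min_price", "max_price")

def should_index(url: str) -> bool:
    """Index the main category and plain pagination; exclude any URL
    carrying filter, sort, or price-range parameters."""
    params = parse_qs(urlsplit(url).query)
    return not any(name.startswith(NOINDEX_PREFIXES) for name in params)

print(should_index("/shop/"))                       # True
print(should_index("/shop/page/2/"))                # True
print(should_index("/shop/?filter_color=black"))    # False
print(should_index("/shop/?orderby=price&page=2"))  # False
```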

III. Control filter pages with noindex


The most direct and stable method is to let search engines crawl these pages without indexing them.

The approach follows a single principle:

  • As long as a URL contains a filter parameter, set noindex, follow on it
  • Allow search engines to crawl the links on the page, but not index the page itself

There are three common ways to implement this:

  • Use an SEO plugin's rules for URL parameters
  • Apply noindex to parameter patterns such as ?filter_ and ?query_type_
  • Keep the main categories and plain pagination in their normal indexing state

The advantages of this approach:

  • It does not affect user filtering
  • It does not block internal link equity from flowing
  • There are no 404s or redirects to confuse crawlers
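As a sketch of the decision logic (this is not WoodMart's or any plugin's actual API — the parameter prefixes are assumptions to adapt):

```python
from urllib.parse import parse_qs, urlsplit

# Assumed filter prefixes — adjust to what your filter widgets emit.
FILTER_PREFIXES = ("filter_", "query_type_")

def robots_directive(url: str) -> str:
    """Return the meta-robots value: keep filtered variants out of the
    index while still letting crawlers follow their links."""
    params = parse_qs(urlsplit(url).query)
    if any(name.startswith(FILTER_PREFIXES) for name in params):
        return "noindex, follow"
    return "index, follow"

print(robots_directive("/shop/?filter_color=black"))  # noindex, follow
print(robots_directive("/shop/page/3/"))              # index, follow
```

The returned value is what ends up in `<meta name="robots" content="...">` or an X-Robots-Tag response header.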

IV. Keep pagination + filter combinations out of the index

Pagination is not necessarily a problem in itself. The problem arises when every pagination + filter combination is treated as a separate indexable page.

The common handling principles are:

  • Allow /category/page/2/ to be indexed
  • Keep combinations like ?filter=xxx&page=2 out of the index

If your SEO plugin supports rule matching, control this directly via URL parameters.
If it does not, it is enough to ensure the filter pages are noindexed; their paginated variants are then covered automatically.
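A hedged Python sketch of this rule — path-based pagination passes, while any pagination combined with filter parameters fails (the parameter prefixes are assumptions):

```python
import re
from urllib.parse import parse_qs, urlsplit

def paging_allowed(url: str) -> bool:
    """True only for pagination URLs free of filter parameters,
    e.g. /category/page/2/ but not ?filter=xxx&page=2."""
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    paginated = bool(re.search(r"/page/\d+/?$", parts.path)) or "page" in params
    filtered = any(k.startswith(("filter", "query_type")) for k in params)
    return paginated and not filtered

print(paging_allowed("/category/page/2/"))             # True
print(paging_allowed("/category/?filter=xxx&page=2"))  # False
```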

V. Do not block filter pages directly with robots.txt


This is a common pitfall. The problems with robots.txt are:

  • It blocks crawling, but does not necessarily prevent indexing
  • Search engines may still keep URLs they have already discovered
  • Internal link equity gets cut off

A more prudent strategy is:

  • Use robots.txt only to block obvious system-type parameters
  • Handle content-level filter pages with noindex

In other words: control indexing rather than forcibly blocking access.
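For illustration, a robots.txt along these lines — the disallowed parameters here are examples (`orderby` and `add-to-cart` are common WooCommerce system parameters); adapt them to your site:

```
User-agent: *
# Block only system-type parameters that never carry content value
Disallow: /*?*orderby=
Disallow: /*?*add-to-cart=
# Do NOT disallow ?filter_ URLs here — handle those with noindex instead
```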

VI. Coordinate WoodMart's own settings


WoodMart provides many filter-related options, but its defaults put functionality first.

The following checks are recommended:

  • Whether filtering is AJAX-based (AJAX mode is better at reducing URL exposure)
  • Whether "Clean URL" is enabled without matching SEO rules
  • Whether multiple sorting parameters take effect at the same time on category pages

If filters exist only for the user experience and not as a source of search traffic, minimize the chances of them generating indexable URLs.

VII. Summary

In WoodMart + WooCommerce scenarios, the more stable practice is:

  • Let main category pages and plain pagination pages index normally
  • Apply noindex uniformly to all pages carrying filter parameters
  • Do not force-block filtered URLs with robots.txt
  • Keep filter pages out of sitemaps
  • Ensure the internal linking structure remains unbroken

This keeps filtering fully usable for visitors while preventing search engines from getting bogged down in duplicate content.


This article was written by WoW