Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Noindex vs. page removal - Panda recovery
-
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages is different from Google Panda's perspective? Does noindexing pages have less value, when removing poor-quality content, than physically removing the pages, i.e. either 301ing or 404ing them and removing the links to them from the site?
I presume that removing pages has a positive impact on the amount of link juice that reaches some of the remaining pages deeper in the site, but I also presume this doesn't have any direct impact on the Panda algorithm?
Thanks very much in advance for your thoughts, and corrections on my assumptions.
-
I think it can get pretty complicated, but a couple of observations:
(1) In my experience, NOINDEX does work - indexation is what Google cares about primarily. Eventually, you do need to trim the crawl paths, XML sitemaps, etc., but often it's best to wait until the content is de-indexed.
(2) From an SEO perspective (temporarily ignoring Panda), a 301 consolidates link juice - so, if a page has incoming links or traffic, that's generally the best way to go. If the page really has no value at all for search, either a 404 or NOINDEX should be ok (strictly from an SEO perspective). If the page is part of a path, then NOINDEX,FOLLOW could preserve the flow of link juice, whereas a 404 might cut it off (not to that page, but to the rest of the site and deeper pages).
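For reference, the NOINDEX,FOLLOW variant discussed above is set with a robots meta tag in the page's head (the X-Robots-Tag HTTP response header is the equivalent for non-HTML resources):

```
<!-- De-index the page but let crawlers follow its links, preserving link flow to deeper pages -->
<meta name="robots" content="noindex,follow">
```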
(3) From a user perspective, 301, 404, and NOINDEX are very different. A 301 is a good alternative to pass someone to a more relevant or more current page (and replace an expired one), for example. If the page really has no value at all, then I think a 404 is better than NOINDEX, just in principle. A NOINDEX leaves the page lingering around, and sometimes it's better to trim your content completely.
So, the trick is balancing (2) and (3), and that's often not a one-size-fits-all solution. In other words, some groups of pages may have different needs than others.
-
Agreed - my experience is that NOINDEX definitely can have a positive impact on index dilution and even Panda-level problems. Google is mostly interested in index removal.
Of course, you still need to fix internal link structures that might be causing bad URLs to roll out. Even a 404 doesn't remove a crawl path, and tons of them can cause crawler fatigue.
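When fixing internal link structures, it helps to be able to audit whether a template is actually emitting the noindex directive. A minimal sketch (the regex parsing is deliberately naive and the sample page is hypothetical; a real audit would fetch live URLs and also check the X-Robots-Tag header):

```python
import re

def robots_directives(html: str) -> set[str]:
    """Extract directives from any <meta name="robots"> tags in an HTML page."""
    directives = set()
    # Scan each <meta ...> tag; accept attributes in any order, case-insensitively.
    for tag in re.findall(r'<meta\b[^>]*>', html, flags=re.IGNORECASE):
        if re.search(r'name\s*=\s*["\']robots["\']', tag, flags=re.IGNORECASE):
            m = re.search(r'content\s*=\s*["\']([^"\']+)["\']', tag, flags=re.IGNORECASE)
            if m:
                directives.update(d.strip().lower() for d in m.group(1).split(","))
    return directives

page = '<html><head><meta name="robots" content="NOINDEX, follow"></head></html>'
print("noindex" in robots_directives(page))  # prints: True
```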
-
I disagree with everyone
The reason Panda hit you is that you were ranking for low-quality pages that you were telling Google you wanted it to index and rank.
When you
a) remove them from sitemap.xmls
b) block them in robots.txt
c) noindex,follow or noindex,nofollow them in metas
you are removing them from Google's index and from the equation of good-quality vs. low-quality pages indexed on your site.
That is good enough. You can still have them return a 200 and be live on your site AND be included in your user navigation.
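As a concrete sketch of steps (a) and (b) (the path is hypothetical), the robots.txt entry would look like this. One caveat worth flagging: if a URL is blocked in robots.txt, Google cannot crawl it to see an on-page noindex meta tag, so (b) and (c) generally work as alternatives rather than in combination on the same URL.

```
# robots.txt - block crawling of thin user-profile pages (hypothetical path)
User-agent: *
Disallow: /user-profiles/
```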
One example is user-generated pages, where users sign up and get their own URL - www.mysite.com/tom-jones, for example. Those pages can be live but should not be indexed, because they usually have no content other than a name.
As long as you are telling Google "don't index them; I don't want them considered in the equation of pages to show up in the index", you are fine keeping these pages live!
-
Thanks guys
-
I would agree that noindex is not as good as removing the content, but it can still work as long as there are no links or sitemaps that lead Google back to the low-quality content.
I worked on a site that was badly affected by Panda in 2011. I had some success by noindexing genuine duplicates (pages that looked really alike but did need to be there) and removing low-quality pages that were old and archived. I was left with about 60 genuine pages that needed to be indexed and rank well, so I had to pay a copywriter to rewrite all of them (originally we had the same affiliate copy on there as lots of other sites). It took about 3 months for Google to lift, or at least reduce, the penalty and for our rankings to return to the top 10.
Tom is right that just noindexing is not enough. If pages are low quality or duplicates, then keep them out of sitemaps and navigation so you don't link to them either. You'll also need redirects in case anyone else links to them. In my experience, Google will eventually drop them from the index, but it doesn't happen overnight.
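For the redirects mentioned above, a minimal sketch in Apache's mod_alias syntax (paths are hypothetical; point each removed URL at the closest relevant live page rather than dumping everything on the homepage):

```
# .htaccess - 301 a removed low-quality page to its closest relevant replacement
Redirect 301 /old-duplicate-page /good-page

# Or pattern-match a whole removed section
RedirectMatch 301 ^/archive/.* /blog/
```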
Good luck!
-
Thanks Tom
Understand your points. The idea behind noindexing is that you're telling Google not to take any notice of the page.
I guess the question is whether that works:
- Not at all
- A little bit
- A lot
- Is as good as removing the content
I believe it's definitely not as good as actually removing the content, but not sure about the other three possibilities.
We did notice a small improvement in rankings when we noindexed a large portion of the site and actually took several hundred other pages down. It's hard to say which of those two things caused the improvement.
We've heard of it working for others, which is why I'm asking...
Appreciate your quick response
Phil
-
I don't see how noindexing pages would help with regards to a Panda recovery if you're already penalised.
Once the penalty is in place, my understanding is that it will remain until all offending pages have been removed or rewritten with unique content. Therefore, noindexing would not work - particularly if the page is still accessible via an HTML/XML sitemap or the site navigation. Even then, I would presume that Google will have the URL logged, and if it remained as-is, any penalty removal would not be forthcoming.
Noindexing pages that have duplicate content but haven't been penalised yet would probably prevent (or rather postpone) any penalty - although I'd still rather avoid the issue outright where possible. Once a penalty is in place, however, I'm pretty sure it will remain until the offending content is removed, even if noindexed.