Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How long does Google take to show results in the SERPs once pages are indexed?
-
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions -
A little background: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed the URLs in that sitemap.
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
2. Does Google give more preference in the SERPs to websites with a larger number of pages than to those with fewer pages? (I have just 50 pages.) Does the NUMBER of pages really matter?
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts would be highly appreciated. Thanks!
-
No problem my friend. You are most welcome. As most of your site is served over https, you need to have the http versions of your URLs redirected to their https equivalents. I repeat, HTTP to HTTPS. Make sure the redirection returns an HTTP header status of 301 and nothing else. If you do so, you will not lose the effort you put into building links, whether they point to the http or the https version.
You can check the HTTP header status for your URLs with a tool like the one found here: http://web-sniffer.net
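If you prefer checking from a script rather than a web tool, a short sketch with Python's standard library can inspect the first response without following redirects. The helper names here are my own invention, not part of any particular tool:

```python
import urllib.request
import urllib.error


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the first response stays visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def first_hop(url, timeout=10):
    """Return (status_code, Location header) for the first response to a URL."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        with opener.open(url, timeout=timeout) as resp:
            return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # With redirect-following suppressed, 3xx responses arrive as HTTPError.
        return e.code, e.headers.get("Location")


def is_clean_301_to_https(status, location):
    """True only for a permanent (301) redirect pointing at an https URL."""
    return status == 301 and bool(location) and location.startswith("https://")
```

For example, `first_hop("http://www.peprismine.com")` returning `(200, None)` instead of a 301 would confirm the problem described above.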
Best regards,
Devanur Rafi.
-
Hey thanks Moosa.
-
Hello Devanur,
Thanks for the prompt reply. Never knew that http & https would be so much trouble. Will get this one resolved. Btw, I just wanted to know: after making these changes (https to http), will the link value be passed/redirected from https to http, or will I lose the entire effort made on the https pages? Thanks again. Awaiting your reply.
Regards,
PepMoBot
-
Sorry, but I am a little lazy at writing, so I will try to keep it short and simple.
There is no fixed time for it... but your website should appear for branded terms. For example, if your website is www.exampleABC.com, it should at least appear for "example ABC". If you want to target more keywords and have your website appear for them, then besides optimized pages you need some targeted links pointing back to your website.
-
Hi,
There is no fixed time after which an indexed page starts appearing in the SERPs.
I just checked your sitemap.xml file and it contains only the https versions of your URLs. In the index, however, non-https versions of URLs are also listed, so there is no consistency. You have decided to serve the entire site over https, yet parts of it are still non-https. Serving pages over https puts an overhead on your server, which might result in poorer page loading times. If you have good resources on the server side, this should not be a problem.
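To spot this kind of inconsistency quickly, you can parse the sitemap and flag any non-https `<loc>` entries. A rough sketch using Python's standard library (the function name is mine, and the check assumes a standard sitemaps.org-namespaced file):

```python
import xml.etree.ElementTree as ET

# Default namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def non_https_locs(sitemap_xml):
    """Return every <loc> URL in a sitemap document that is not served over https."""
    root = ET.fromstring(sitemap_xml)
    urls = [el.text.strip() for el in root.iter(SITEMAP_NS + "loc")]
    return [u for u in urls if not u.startswith("https://")]
```

Running this against the live sitemap (fetched however you like) should return an empty list once everything is consistent.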
Though the folks at Google say they don't care whether URLs are https or http when it comes to ranking, site loading time is an official ranking factor. So when Google comes across two similarly capable and eligible pages competing for the same keyword, the one with better loading times will be favored. By the way, can you let me know the reason behind serving the entire site over https?
Your linking profile is not at all consistent. You build links to both http://www.peprismine.com and https://www.peprismine.com.
Please be aware that although http://www.peprismine.com takes you to https://www.peprismine.com, it does not return an HTTP header status of 301; instead it gives a status 200. This should be fixed immediately. Once it is fixed, I think you should be fine technically, but be careful with pages served over SSL, as this sometimes hurts page loading times. You might want to look into this. Don't blindly go by page speed test scores; look at the actual page loading times instead. You can run a test at http://www.urivalet.com, and also at webpagetest.org (check out the performance review section).
Best regards,
Devanur Rafi.
-
Hi Devanur,
Thanks for the reply. I have posted a query below, in continuation of my previous query. It would be good if you could let me know your thoughts.
-
Hi Moosa,
Thanks for the reply. I have posted a query below, in continuation of my previous query. It would be good if you could let me know your thoughts.
-
Hi Moosa & Devanur,
Thanks for your responses. However, I would like some more information on my 1st query.
After making the necessary changes to our web pages, how long will it usually take to rank for a particular keyword or keywords (assuming we have optimized these pages as required)? I read on some websites that it takes a minimum of 1 month after indexing is done. Is this really true, or a myth? What have been your experiences?
P.S.: I'm unable to see my URL for any of my keywords yet (not even on the last page).
Regards,
PepMozBot
-
Hi there,
Straight into the meat:
1. My pages were indexed by Google a few days back. How long does Google take to display a URL in the SERPs once the pages get indexed?
A. Once the pages are in the index, they become eligible to appear in the SERPs. But where they appear, on which page and in which position, depends on a lot of factors: the competition for the search term, your content, the backlinks you have, and the list goes on.
2. Does Google give more preference in the SERPs to websites with a larger number of pages than to those with fewer pages? (I have just 50 pages.) Does the NUMBER of pages really matter?
A. To a little extent, and in some cases, yes. But this again depends on the quality (in terms of relevance, uniqueness, originality, etc.) of the content on a website, the quality of its link popularity, and all the other 200+ factors that Google considers before positioning a website in the SERPs. To put it straight, you do not need to worry about the number of pages if your content is of pristine quality and highly relevant as far as Google is concerned.
3. Does removal or change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
A. If the URLs being removed had duplicate content, then their removal will not have any negative effect.
Over time, on an as-needed basis, keep adding pages that target one search term per page with relevant, unique, and up-to-date content. This will result in a positive change in your organic traffic numbers. And very importantly, do not build links desperately from all over the place. Earn links, that is what I would say; you have to earn links by giving visitors a reason to visit your website.
1. Try to earn links from authority sites in your niche. Links like this fall in the tier 1 category.
2. Get links from generic authority websites (like Wikipedia) by posting quality content. This would be your tier 2.
3. Get links from similar theme (sites that operate in your niche) websites. These links can be your tier 3.
4. Finally, earn links from generic web properties like forums, blogs, social networking sites, social bookmarking sites etc. These would be your tier 4 links.
A very important thing to keep in mind while doing the above is the quality of the content being posted. Be specific and try to address an issue or provide a solution in your posts. Never engage in low-quality link exchanges or bulk link building. Above all, keep asking yourself: "Why should anyone visit my website?", "What can I do to make a visitor's time on my website worthwhile?" and "What should I do to give my website a better user experience, or a better advantage, than my competitors'?"
With questions like these in mind, you will be able to secure a good, enduring position in the SERPs for your website.
Also, be an active participant on social sites to attract good social buzz. Social signals are very good for your search engine optimization efforts and can give them a boost.
Wish you good luck.
Best regards,
Devanur Rafi.
-
OK, when you say the URLs are indexed, this simply means they are appearing in the SERPs; you can type your exact URL into the Google search bar to see whether the page appears or not. Appearing for a keyword is a completely different topic; it has nothing to do with indexing alone.
It's good to have more pages, but if the extra pages are not producing any value and your overall website is getting low value from them, then you should prefer fewer pages with more value.
Removal or change of a URL can have an impact on rankings. For instance, if one of your URLs is ranking on the first page for some "XYZ" keyword and you change or remove that URL, it is obviously going to lose its rankings.
It is always recommended to add a 301 redirect from the old URL to the new one when changing or removing a URL.
Hope this helps...