Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
How to combine 2 pages (same domain) that rank for same keyword?
-
Hi Mozzers,
A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP. Always right next to each other (for example, position #7 and #8 or #3 and #4). So in the SERP it looks something like:
1. www.mycompetition1.com
2. www.mycompetition2.com
3. www.mywebsite.com/page1.html
4. www.mywebsite.com/page2.html
5. www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1.
Does anybody have any experience in this? Any advice is much appreciated.
-
Hi there,
Realistically, the rel="canonical" tag should be used for duplicates, yes. How "duplicated" a page is, is subjective: a page with 50% of the same content as another page is probably going to count as a duplicate as far as Google is concerned... exactly where that line of acceptable duplication lies isn't something any of us really know.
For pages where the content is totally different besides the header and footer, you technically shouldn't use canonicalisation. However, experiments have shown that Google honours the tag, even if the pages aren't duplicates. Dr. Pete did an experiment when the tag came out (admittedly a few years ago) where he showed that you could radically reduce the number of pages Google had indexed for a site by canonicalising everything to the home page. I personally had a client do this by accident a couple of years ago, and sure enough, their number of indexed pages dropped very quickly, along with all the rankings those pages had. As an ecommerce site that was ranking for clothing terms, this was very very bad. It took about six weeks to get those rankings back again after we fixed the tags, and the tags were fixed within about five days (should have been quicker but our urgent request went into a dev queue).
So the answer would be that Google seems to honour the tag no matter the content of the pages, but I am pretty sure that if you asked a Googler, they'd tell you that it should only be used for dupes or near-dupes.
-
Hi Jane,
Thanks for the advice. One question. I was under the impression that the rel="canonical" tag was for two pages that had the same content to let google know that the page it is pointing to is the original and should be the one to rank. Do you have any experience using them between 2 pages that have totally different content (minus the header and footer)?
Thanks again.
-
If you are happy for the second page to still exist but not rank, you should use the canonical tag to point the second page to the first one. This will lend the first page the majority of the strength of the second page and perhaps improve its authority and ranking as a result. However, the second page will no longer be indexed because the canonical tag tells Google: "ignore this page over here; it should be considered the same as the canonical version, here."
Again, this can benefit the first page, but it does mean that the second page will no longer rank at all. Only do this if you are okay with that scenario.
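For reference, the tag itself is just a single line in the `<head>` of the secondary page. A minimal sketch, using the example URLs from the question (and assuming the site runs on https):

```html
<!-- Placed in the <head> of page2.html (the page you don't want to rank),
     pointing at the version you do want to rank -->
<link rel="canonical" href="https://www.mywebsite.com/page1.html" />
```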
Cheers,
Jane
-
I'm afraid that there isn't a perfect solution, but there are various options to consider.
1. The only way to "combine the SEO juice of both pages" is to 301 redirect one of the pages to the other (and add the content from the old page to the remaining one). However, this means that the second page will no longer exist for your website visitors (coming from organic search or not).
2. You can use a rel=canonical tag pointing from the secondary page to the preferred one to encourage Google to list only the preferred one of the two pages in search results. Alternatively, you could use the robots.txt file (which blocks crawling) or a noindex meta tag (which blocks indexing, and is the preferred option here) to keep the page out of search results. However, this will not "combine the SEO juice."
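As a sketch of the two options above, assuming an Apache server and the example URLs from the question (adapt paths and setup to your own site):

```apache
# Option 1: .htaccess (Apache) - 301 redirect page2 to page1,
# passing most of its link equity; page2 no longer exists for visitors
Redirect 301 /page2.html /page1.html
```

And for keeping the page live but out of the index:

```html
<!-- Option 2 alternative: in the <head> of page2.html - keep the page
     available to visitors but ask search engines not to index it -->
<meta name="robots" content="noindex, follow">
```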
Assuming that it is crucial that the second page still exist on your website, I would probably not do anything. You appear twice on the first page of results -- great! Why mess with that? I would just focus on following SEO best practices and earning more links to those two pages to push them higher over time. (Of course, if I knew your exact situation, I would probably have additional suggestions.)