Targeting local areas without creating landing pages for each town
-
I have a large ecommerce website which is structured very much for SEO as it existed a few years ago, with a landing page for every product/town combination nationwide (it's a lot of pages).
Then along came Panda...
I began shrinking the site in Feb last year in an effort to tackle duplicate content. We had initially used a template, changing only the product/town name.
My first change was to reduce the number of pages by half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both products.
Next, I have rewritten the content for every product to ensure they are now as individual as possible.
However, with 46 products, each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages; I have already started writing content for my most important areas, again to make these pages as individual as possible.
The problem I have is that nobody can write enough unique content to target every town in the UK via an individual page (multiplied by 46 products), so I want to reduce these too.
QUESTION: If I have a single page for "Croydon", will mentioning other local surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns?
I have approx. 25 Google local place/map listings, and growing, and am working outwards from these areas. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once.
Any examples of big sites that have reduced in size since Panda would be great.
I have a headache... Thanks community.
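PS: in case it's useful context, here's roughly how I've been triaging which town pages to merge first, with a quick near-duplicate check. This is just a sketch using Python's stdlib difflib; the sample content and the 0.8 threshold are made up, and a real audit would crawl the live pages rather than compare hard-coded strings.

```python
# Rough sketch: flag near-duplicate town pages so the worst offenders
# get merged or rewritten first. Sample content is invented; a real
# audit would fetch the rendered pages and strip the shared template.
from difflib import SequenceMatcher
from itertools import combinations

pages = {
    "croydon": "Professional oven cleaning in Croydon by friendly local experts.",
    "mitcham": "Professional oven cleaning in Mitcham by friendly local experts.",
    "purley": "Purley's trusted cleaning specialists, fully insured and vetted.",
}

def similarity(a: str, b: str) -> float:
    """Crude text similarity in [0, 1] using difflib's ratio."""
    return SequenceMatcher(None, a, b).ratio()

for (town_a, text_a), (town_b, text_b) in combinations(pages.items(), 2):
    score = similarity(text_a, text_b)
    if score > 0.8:  # threshold is a guess; tune it against real page text
        print(f"{town_a} vs {town_b}: {score:.0%} similar -- merge candidates")
```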
-
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
-
Thank you Miriam. I appreciate you sharing with me the broad idea of the type of structure that you feel a site should have in this instance (if starting from scratch).
You have pretty much echoed my proposal for a new site structure, built for how Google works nowadays rather than 2-3 years ago. We are currently reducing the size of the current site to bring it as close to this type of model as possible. However, the site would need a complete redesign to make this type of structure viable.
I guess what I've been looking for is some kind of reassurance that we are moving in the right direction! It's a scary prospect, reducing such a huge number of pages down to a compact, targeted set. With the prospect of losing so much long-tail traffic, it can make us a little hesitant.
However, the on-site changes we have made so far seem to be having a positive effect. And thank you for giving me some ideas about content creation for each town. I really like this as an idea to move forward with after the changes are complete, which will hopefully be by the new year!
-
Hi Silkstream,
Thank you so much for clarifying this! I understand now.
If I were starting with a client like this, from scratch, this would be the approach I would take:
1. View content development as two types of pages. One set would be the landing pages for each physical location, optimized for each city, with unique content. The other set would be service pages, optimized for the services but not for a particular city.
2. Create a Google+ Local page for each of the physical locations, linked to its respective landing page on the website. So, let's say you now have 25 city pages and 46 service pages. That's a fairly tall order, but certainly do-able.
3. Build structured citations for each location on third-party local business directories. Given the number of locations, this would be an enormous job.
4. Build an onsite blog and designate company bloggers, ideally one in each physical office. The job of these bloggers would be something like each of them creating one blog post per month about a project that was accomplished in their city. In this way, the company could begin developing content under its own steam that showcases a given service in a given city. Over time, this body of content would grow the pool of queries for which they have answers.
5. Create a social outreach strategy, likely designating brand representatives within the company who could be active on various platforms.
6. You'll likely need to develop a link earning strategy tied in with steps 4 and 5.
7. Consider video marketing. A good video or two for each physical location could work wonders.
I'm painting in broad strokes here, but this is likely what the overall strategy would look like. You've come into the scenario midway and don't have the luxury of starting from scratch. You are absolutely right to be cleaning up duplicate content and taking other measures to reduce the spaminess and improve the usefulness of the site. Once you've got your cleanup complete, I think the steps I've outlined would be the direction to go in. Hope this helps.
-
Hi Miriam,
Thanks for jumping in.
The business model is service-based, so when I refer to "46 products" they are actually 46 different types of service available.
The customer will typically book and pay online, through the website, and is then served at their location, which is most often either their home or place of work. They actually have far more than the 25 locations, much closer to 120 I believe. However, I only began their SEO in February, AFTER they were hit by Panda, so building up their local listings is taking time, as the duplicate content issue seems far more urgent. I'm trying to strike a balance and fix this all slowly over time, to lay a solid foundation for inbound marketing, as it's currently being diluted by the poor site structure.
Does this help? Am I doing the right things here?
-
Hi Silkstream,
I think we need to clarify what your business model is. You say you have a physical location in each of your 25 towns. So far, so good, but are you saying that your business has in-person transactions with its customers at each of the 25 locations? The confusion here is arising from the fact that e-commerce companies are typically virtual, meaning that they do not have in-person transactions with their customers. The Google Places Quality Guidelines state:
Only businesses that make in-person contact with customers qualify for a Google Places listing.
Hence my wanting to be sure that your business model is actually eligible, given that you've described it as an e-commerce business, which would be ineligible. If you can clarify your business model, I think it will help you receive the most helpful answers from the community.
-
You scared me then, Chris!
-
Of course, if you've got the physical locations, you're in good shape there.
-
"It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back and haunt you unless you in the form of merging or penalization."
Why? The business has a physical location in every town, so why should they not have a page for every location? This is what we were advised to do?
"If there was no other competition, you would almost certainly rank for your keywords along with the town name"
I have used this tactic before for another nationwide business, but on a smaller scale, and it worked: they ranked (middle of page 1), but for non-competitive keywords and with a page that had strong backlinks. With this site, the competition is stronger and the pages will not have a strong backlink profile at first.
My biggest worry is cutting all the existing pages and losing the 80% of long-tail traffic the site currently pulls in. But what other way is there to tackle so much duplicate content?
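For what it's worth, the way I'd try to protect that long tail is a blanket 301 map: every retired product/town page redirecting to the retained area page that absorbs it. Here's a rough sketch; the URL pattern and the area groupings are hypothetical, not the site's real structure.

```python
# Rough sketch: emit a 301 rule for every retired product/town page,
# pointing it at the retained "main area" page that absorbs that town.
# URL patterns and groupings are hypothetical -- adjust to the real site.

RETAINED_AREAS = {
    # retained area -> towns whose pages it absorbs
    "croydon": ["mitcham", "purley", "thornton-heath"],
    "kingston": ["surbiton", "new-malden"],
    # ... roughly 150 main areas in total
}

PRODUCTS = ["oven-cleaning", "carpet-cleaning"]  # 46 services in reality

def build_redirect_map() -> dict:
    """Return {old_url: new_url} for every retired product/town page."""
    redirects = {}
    for area, absorbed_towns in RETAINED_AREAS.items():
        for town in absorbed_towns:
            for product in PRODUCTS:
                redirects[f"/{product}/{town}/"] = f"/{product}/{area}/"
    return redirects

if __name__ == "__main__":
    # Apache-style output; an nginx map or a CSV for the dev team works too
    for old, new in sorted(build_redirect_map().items()):
        print(f"Redirect 301 {old} {new}")
```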
-
It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization. If not that, it's likely to stop being worth the time as a visibility tactic.
As far as whether or not mentioning local surrounding towns in your page copy will be enough to get you to rank for them, it would depend on competition. If there was no other competition, you would almost certainly rank for your keywords along with the town name, but with competition, all the local ranking factors start coming into play, and your ability to rank for each town will depend on a combination of all of them.