Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Disavow - Broken links
-
I have a client who worked with an SEO who built poor-quality links for their site.
When I drilled down in Open Site Explorer, there are quite a few links where the sites no longer exist - so I thought I could test out Disavow on them, maybe just about six - then we'll build good-quality links to try and tackle this problem with a more positive approach.
I just wondered what the consensus was?
-
Thanks everyone.
Well, this is an example: http://www.hotmarketable4you.com/SpecialReports/PRE30141.html
I had checked lots of these links maybe two weeks ago and they had (poor) content on them then - but now they all seem to be broken, so I suspect it was a link farm.
And Mike - it was more irrelevant than "bad" content.
I think I'll build links over the next few weeks and then evaluate where we are - hopefully rankings will start to improve.
-
I think that the better tactic would be to create new content for those broken links. Unless these links are located on a very bad domain (link farm, etc.), I would just create a new page.
Be careful before you start messing with the disavow tool. The only time I would use it is if the link is obviously bad. Like obviously, obviously bad (if that makes sense). Many people assume that their rankings tanked because of some algo update and start disavowing links without really checking into it. Just be careful before using that tool, and research the hell out of a link before you throw it away.
Here is a good article that gives you the dos and don'ts of using the Disavow tool:
http://www.portent.com/blog/seo/google-disavow-links-tool-best-practices.htm
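For what it's worth, if you do end up disavowing, the file Google expects is plain text: one URL or `domain:` directive per line, with `#` for comments. A minimal sketch of assembling one from links you've already reviewed (the helper name is mine, and the second URL is hypothetical):

```python
from urllib.parse import urlparse

def build_disavow(urls):
    """Turn a list of flagged backlink URLs into domain-level
    disavow directives, deduplicated and sorted."""
    domains = {urlparse(u).netloc for u in urls}
    lines = ["# Links flagged after manual review"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    return "\n".join(lines)

# Links flagged after manual review (second one is hypothetical)
flagged = [
    "http://www.hotmarketable4you.com/SpecialReports/PRE30141.html",
    "http://spammy-directory.example/listing/42",
]
print(build_disavow(flagged))
```

Domain-level directives are usually safer than per-URL ones for link-farm domains, since the same site often links from many pages.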
Good luck!
-
I think if the links are broken and Google has been made aware of it - i.e. it has recrawled and cached the page (simply add "cache:" in front of the URL to see the last cached copy; if the URL itself is broken, check whether it is still indexed in Google) - then it will know that the link is broken and shouldn't count it.
If that's the case, I don't think the disavow would have any benefit, unless of course the link were to return, which could be a possibility.
If the page is cached and the cached version shows the broken link = no worries.
If the URL is broken and the page is no longer indexed = no worries.
If the URL is broken and still indexed = check whether any other links point to that URL (including the site's navigation and/or sitemap, if applicable). If not, it should deindex soon. If there are links, I'd disavow.
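The three cases above can be sketched as a small triage function - a rough rendering of the reasoning, not a standard API; the flag names are mine:

```python
def triage_broken_link(url_resolves, page_indexed, other_links_point_in):
    """Apply the three cases to a single broken backlink.

    url_resolves         -- the linking URL still returns a page
    page_indexed         -- Google still has that page in its index
    other_links_point_in -- other links (navigation, sitemap) still
                            point at the linking URL
    """
    if url_resolves and page_indexed:
        # Google's cached copy shows the broken state: no worries.
        return "no action"
    if not url_resolves and not page_indexed:
        # Gone and deindexed: no worries.
        return "no action"
    if not url_resolves and page_indexed and not other_links_point_in:
        # Nothing keeps it in the index; it should drop out soon.
        return "wait for deindexing"
    return "disavow"

print(triage_broken_link(False, True, True))  # → disavow
```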
Just my two pennies, hope it helps!
-
Links that don't exist, or links to pages that don't exist?
Heck, either way I'd ignore them and focus on phase 2 of your plan. Disavow seems to be a bit overused, in my opinion. It's more of a last-ditch effort for penalty recovery, IMHO.
And if it's 404 errors you're trying to fix: Google will eventually stop crawling those URLs after they've returned 404 long enough. Don't even worry about it (unless they're links you want, in which case put a relevant redirect in place).
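For the "links you want" case, the redirect can be a one-liner in .htaccess; a sketch with hypothetical paths:

```apache
# Preserve a wanted inbound link whose target moved or now 404s
Redirect 301 /old-special-report.html /reports/new-special-report/
```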
Hope this was helpful.
Related Questions
-
Do links from subdomains pass the authority and link juice of main domain ?
Hi, there is a subdomain whose root domain has a DA of 90. I can earn a backlink from that subdomain. The subdomain is fresh, with no traffic yet. Do I get the ranking boost and authority from the subdomain? Example: I can earn a do-follow link from https://what-is-crm.netlify.app/ but not from https://netlify.app
White Hat / Black Hat SEO | teamtc
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, from 50-100 obviously spammy porn-related websites to just plain unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without no-follow tags and with similar/same anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow in the meantime. I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that links to various sections on domain.com without no-follow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. More generally, any advice as to how to turn this around? The website is in the travel vertical.
White Hat / Black Hat SEO | ShawnW
-
Pinging Links
Interested to know if anybody still uses the strategy of pinging links to make sure they get indexed - there are a number of sites out there that offer it. Is it considered dangerous/spammy?
White Hat / Black Hat SEO | seoman10
-
Site Footer Links Used for Keyword Spam
I was on the phone with a proposed web relaunch firm for one of my clients, listening to them talk about their deep SEO knowledge. I cannot believe that this wouldn't be considered black-hat, or at least very spammy, in which case a client could be in trouble. On this vendor's site I noticed that they stack the footer site map with about 50 links that are basically keywords they are trying to rank for. But here's the kicker, shown by way of example from one of the themes in the footer - 9 footer links:
Top PR Firms
Best PR Firms
Leading PR Firms
CyberSecurity PR Firms
Cyber Security PR Firms
Technology PR Firms
PR Firm
Government PR Firms
Public Sector PR Firms
Each link goes to a unique URL that is basically a knock-off of the homepage with a few words or at most one sentence swapped out to include the footer link keyword phrase; sometimes there is a different title attribute, but generally they are close matches of each other. The canonical for each page links back to itself. I simply can't believe Google doesn't consider this spammy. Interested in your view.
Rosemary
White Hat / Black Hat SEO | RosemaryB
-
Advice needed! How to clear a website of a Wordpress Spam Link Injection Google penalty?
Hi guys, I am currently working on a website that has been penalised by Google for a spam link injection. The website was hacked and 17,000 hidden links were injected. All the links have been removed and the site has subsequently been redesigned and rebuilt. That was the easy part 🙂 The problem comes when I look at Webmaster Tools. Google is showing thousands of internal spam links to the homepage and other pages within the site. These pages do not actually exist, as they were cleared along with all the other spam links. I do believe, though, that this is causing problems with the website's rankings. Certain pages are not ranking on Google and the homepage keyword rankings are fluctuating massively. I have reviewed the website's external links and these are all fine. Does anyone have any experience of this, and can you provide any recommendations / advice for clearing the site of the Google penalty? Thanks, Duncan
White Hat / Black Hat SEO | CayenneRed89
-
Disavow wn.com?
I am cleaning up some spammy backlinks for a client and will be submitting a disavow at Google. This particular company website has 2,000+ backlinks from the domain wn.com, which appears to be "World News". If you go to it, it appears to be nothing more than content scraped from other sites. Here is a recent example where my client is linked to (I don't even see the backlink on the page, but it is in the source code!):
http://article.wn.com/view/2013/11/22/Hungarian_Woman_Sentenced_to_One_Year_in_Prison_for_Her_Role/#/related_news
But when I look at Moz metrics, wn.com has a domain authority of 90! So I don't want to disavow something that could POTENTIALLY be helping us. The client's website gets zero traffic from wn.com, and I've never seen my client linked to in anything worthwhile... it kinda looks spammy to me. If you were me, after looking at wn.com and taking everything into account, would you disavow it? This client really needs to create a healthier backlink profile. Thanks!
White Hat / Black Hat SEO | gbkevin
-
Is it worth getting links from .blogspot.com and .wordpress.com?
Our niche ecommerce site has only one thing going for it: we have numerous opportunities on a weekly basis to get reviews from "mom bloggers". We need links - our domain authority is depressing. My concern is that these "mom bloggers" tend to have blogs that end in .blogspot.com or .wordpress.com. How do I screen for "reviewers" that are worth getting links from, and how can I make the most of the community we have available to us?
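One quick first-pass screen is simply separating free-hosted blogs from self-hosted ones; a sketch (the helper name and the domains are hypothetical, and this is only a coarse filter, not a quality judgment):

```python
# Free blog-host suffixes to screen out in a first pass
FREE_HOSTS = (".blogspot.com", ".wordpress.com")

def is_free_hosted(domain):
    """True if the blog lives on a free subdomain host rather
    than its own domain (often a weaker link prospect)."""
    return domain.lower().endswith(FREE_HOSTS)

reviewers = ["craftymom.blogspot.com", "momreviews.example.com"]
print([d for d in reviewers if not is_free_hosted(d)])
```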
White Hat / Black Hat SEO | Wilkerson
-
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation: 1. Would this work? Google would still see the link on the offending domain, but by blocking that domain, are you preventing any strength or penalty associated with that domain from impacting your site? 2. If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants: creating high-quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant
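For reference, the mechanics the question describes look roughly like this in .htaccess (the domain is hypothetical; this uses the Apache 2.2-era mod_setenvif / mod_access syntax). Note the caveat that makes the idea moot: Googlebot does not send a Referer header when crawling, so it arrives directly at the offending page and sees the link regardless of any referrer block on your site:

```apache
# Block human visitors arriving via a spammy referring domain (hypothetical)
SetEnvIfNoCase Referer "spammy-linkfarm\.example" bad_referer
Order Allow,Deny
Allow from all
Deny from env=bad_referer
```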