Can hidden backlinks ever be ok?
-
Hi all,
I'm very new to SEO and still learning a lot.
Is it considered a black hat tactic to wrap a link in a DIV tag, with display set to none (hidden div), and what can the repercussions be?
From what I've learnt so far, this is a very unethical thing to be doing, and the site hosting these links can end up being removed from the Google/Bing/etc. indexes completely. Is this true?
The site hosting these links is a group/parent site for a brand, and each hidden link points to one of the child sites (similar sites, but different companies in different areas).
Thanks in advance!
-
Hi Ryan,
Thanks for the quick feedback.
This clears things up for me a bit. Thanks,
Stephen
-
The line between black hat and white hat tactics is generally clear. The simple question is: does the code exist for the benefit of your site's visitors, or solely to manipulate search engines?
DIV tags are used to apply CSS rules to specific pieces of code. If you have a link contained in a DIV and the display set to none, that link would clearly never be seen by the site's visitors. It is apparent the link exists solely to manipulate search engine results, and therefore is a black hat tactic.
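For illustration, here is a minimal sketch of the pattern being described (the URL and anchor text are hypothetical):

```html
<!-- A link wrapped in a div hidden via CSS. No visitor can ever
     see or click it, so it exists solely for search engine
     crawlers, which is what makes it a black hat tactic. -->
<div style="display: none;">
  <a href="https://child-site.example.com/">Child company keyword anchor</a>
</div>
```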
When Google and other search engines discover black hat tactics being used on a site, they will take action. That action can be relatively minor, such as ignoring the link; mid-range, such as removing the page containing the link from the index; or, at the extreme end, removal of the entire site from the index.
Each search engine has its own internal guidelines on how to handle these issues. Some are handled automatically by algorithms, while others are handled by manual review. There are no published standards on exactly which punishment will be handed out for a given violation, so it is simply best to avoid anything black hat entirely.
Related Questions
-
Can you use Screaming Frog to find all instances of relative or absolute linking?
My client wants to pull every instance of an absolute URL on their site so that they can update them for an upcoming migration to HTTPS (the majority of the site uses relative linking). Is there a way to use the extraction tool in Screaming Frog to crawl one page at a time and extract every occurrence of href="http://"? I have gone back and forth between an XPath extractor and a regex, with no luck from either. Ex. XPath: //*[starts-with(@href, "http://")][1] Ex. Regex: href=\"//
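A hedged sketch of expressions that may work better here, assuming Screaming Frog's custom extraction (which wants an XPath that resolves to the value you're after and, for a regex, returns the first capture group):

```
XPath: select the href attribute itself, and drop the [1] predicate,
which restricts matching to the first qualifying element per context:

  //a[starts-with(@href, 'http://')]/@href

Regex: capture the URL between the quotes:

  href="(http://[^"]*)"
```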
Technical SEO | | Merkle-Impaqt0 -
ADA, WCAG, Section 508 Accessibility and hidden text
I am working on fixing accessibility issues on a client's site, and they have contracted with a vendor who provides both tools to monitor the site and consulting to help us fix the issues that are found. When there are spatial relationships between elements on a page that would not be evident to someone listening via a screen reader, a strategy they recommended to us is to add text helpers that are not visible but are still read by screen readers. An example: "Directions to our Fifth Avenue Store". I have seen this technique used on a major brand's site, but I am concerned that their brand strength insulates them from a hidden-text penalty far more than my client's brand would. Also, their implementation uses class names like "ada_hidden", which may help search engines understand the intent, or may not at all. I am looking for opinions regarding the use of this technique. Normally I wouldn't use it for risk of penalty, but here the intent is to improve the user experience of the pages. Anyone used similar techniques for ADA/WCAG, or solved the problem in a more SEO-friendly way? Thanks, Will
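For what it's worth, the common implementation of this kind of helper is the "visually hidden" CSS pattern sketched below (the class name and markup are illustrative). Note that display:none or visibility:hidden would not work here, as most screen readers skip such elements; this pattern keeps the text in the accessibility tree:

```html
<style>
  /* Visually removes the element while keeping it readable by
     screen readers (unlike display:none). */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    margin: -1px;
    padding: 0;
    overflow: hidden;
    clip: rect(0, 0, 0, 0);
    white-space: nowrap;
    border: 0;
  }
</style>

<a href="/stores/fifth-avenue">
  <span class="visually-hidden">Directions to our Fifth Avenue Store</span>
  <img src="/img/map-pin.svg" alt="" />
</a>
```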
Technical SEO | | WillW0 -
Can a CMS affect SEO?
As the title says, really. I run www.specialistpaintsonline.co.uk, and six months ago, when I first took it on, it had bad links that Google had penalised, so it had lost its value. However, the penalty was lifted in September; the site now conforms to all the guidelines, and SEO work has been done and is constantly monitored. The issue I have is that sales and visits have not gone up. We are failing fast, and running on 2 or 3 sales a month isn't enough to cover any sort of cost, let alone wages. Hence my question: can the CMS have anything to do with it? I'm at a loss and going grey; any help or advice would be great. Thanks in advance.
Technical SEO | | TeamacPaints0 -
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed. But down the road, if I want to use one of those URLs again, can I just remove the 410 status code, put new content on that page, and have it indexed again?
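Mechanically, on an Apache server this is a one-line config change either way; a minimal sketch, with a hypothetical path:

```apache
# .htaccess: return 410 Gone for the retired URL (mod_alias).
Redirect gone /old-page

# To reuse the URL later, delete the line above and publish new
# content at /old-page; crawlers can then index it again.
```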
Technical SEO | | WebServiceConsulting.com0 -
Can I mark up breadcrumbs without showing them? (responsive design)
I am working on a site that has a responsive design. We use faceted search for the desktop version but implemented a style of breadcrumbs for the mobile version, as sidebars take up too much screen real estate. In the desktop design we are putting display:none on the breadcrumbs. If we mark up those breadcrumbs and they are behind a display:none, can we still get the rich snippets? Will Google see this as cloaking? As a follow-up, is there a way to mark up breadcrumbs in the <head> or somewhere else that is constant?
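On the follow-up: one way to keep breadcrumb markup constant and out of the visible layout is schema.org's BreadcrumbList in JSON-LD, which can sit in the <head>. A minimal sketch (names and URLs are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2,
      "name": "Widgets", "item": "https://www.example.com/widgets/" }
  ]
}
</script>
```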
Technical SEO | | MarloSchneider0 -
What should we do about backlinks that are now 404?
Hi All, What about the backlinks we have if the pages linking to us are now 404? Open Site Explorer shows 1,000s of links, and when I check, many are 404; those are spammy links we had, but now those sites are gone. I am doing a link-profile check to clean up all spammy links. Should I take any action on them, since Open Site Explorer and Google still show these links in the searches? Should we list these URLs in a disavow file in Google Webmaster Tools? Thanks
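If disavowing does turn out to be warranted, the disavow file Google accepts is plain text with one domain or URL per line; a short sketch (the domains are hypothetical):

```
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
# Disavow a single linking page:
http://spam.example.org/links/page.html
```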
Technical SEO | | mtthompsons0 -
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last months we introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signs (traffic, indexation of the XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
Technical SEO | | TruvoDirectories0 -
Can you do a 301 redirect without a hosting account?
Trying to retire domain1 and 301 it to domain2 - just don't want to get stuck having to pay the old hosting provider simply to serve a .htaccess file with the redirect rule.
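For reference, the redirect at stake is only a few lines of .htaccess (a sketch assuming Apache with mod_rewrite; the domain names are the placeholders from the question), which is why paying a host just to serve it stings:

```apache
# 301-redirect every path on domain1 to the same path on domain2.
# This still requires a server answering for domain1, which is the
# crux of the question; the no-hosting alternative is typically the
# registrar's domain-forwarding feature, where offered.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L]
```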
Technical SEO | | TitanDigital0