Good robots.txt for Magento
-
Dear Community,
I am trying to improve the SEO of my website www.rijwielcashencarry.nl (Magento). My next step will be implementing a robots.txt file to exclude some pages from crawling.
Does anybody have a good Magento robots.txt for me? And what exactly do I need to copy? Thanks, everybody!
Greetings,
Bob
-
This is fine, as long as you don't want to exclude robots from crawling any part of your site.
-
I have this problem too. Can someone help with setting up robots.txt?
My current configuration is:
Sitemap: http://www.myweb/sitemap.xml
User-agent: *
Disallow:
Is this good?
-
Hi Ruth,
Thanks also for your response!
Greetings,
Bob
-
Hi Peter,
Thanks for your response! I am going to follow your advice and build a good robots.txt.
Greetings,
Bob
-
Peter is correct - your search, admin and user pages are common pages to block for Magento. What you block is up to you, though. Don't forget that a page that is blocked by robots.txt can still be found by search engines, so if it's a page that will contain private information you should protect it with a password.
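If a page really must stay private, one option is HTTP authentication at the web server. A minimal sketch, assuming an Apache server with .htaccess support (the paths are placeholders; on nginx the setup is different):
# .htaccess in the directory to protect (paths are placeholders)
AuthType Basic
AuthName "Restricted area"
AuthUserFile /var/www/.htpasswd
Require valid-user
The password file referenced above would be created once with something like htpasswd -c /var/www/.htpasswd someuser. Unlike a robots.txt rule, this actually stops both visitors and crawlers from reading the page without credentials.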
-
Hi there! Did Peter's response take care of this for you? If so, please mark it as a "Good Answer."
-
Hi,
Creating a robots.txt file is one of the most important steps for a site. You need to understand your website or store's basic needs: what to keep private and what to make public. In a Magento store you will typically want to block your internal search pages (?*sid), admin pages, and user dashboard pages. Here are some example links: Robots.txt for Magento and Robots.txt File Examples.
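To make that concrete, here is a minimal robots.txt sketch along the lines of those examples. It assumes a default Magento installation with the standard paths, so treat it as a starting point and check every rule against your own store's URLs:
User-agent: *
# Admin / backend pages (adjust if your admin path has been renamed)
Disallow: /admin/
# Customer account and dashboard pages
Disallow: /customer/
# Internal catalog search results
Disallow: /catalogsearch/
# URLs carrying session IDs (the ?sid parameter mentioned above)
Disallow: /*?SID=
# Optional: point crawlers at your XML sitemap (this path is an assumption)
Sitemap: http://www.rijwielcashencarry.nl/sitemap.xml
Anything beyond that depends on your store's catalogue and URL structure, so this is a sketch rather than a drop-in file.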
Related Questions
-
Our client's Magento 2 site has lots of obsolete categories. Advice on SEO best practice for setting server-level redirects so I can delete them?
Our client's Magento website has been running for at least a decade, so it has a lot of old legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL redirects in Magento, so my question is: Is there an SEO-efficient way to set up permanent redirects at the server level (nginx) that Google will crawl, so that at some point we can delete the categories and the Magento URL redirects? If this is good practice, can you then delete the server redirects at some point once Google has marked them as permanent?
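For what it's worth, a permanent redirect at the nginx level is only a couple of lines per URL. A rough sketch, using made-up category paths rather than the client's real ones, placed inside the site's server block:
# Hypothetical obsolete brand category redirected to a current page
location = /old-brand-category.html {
    return 301 /brands/current-category.html;
}
Google treats a 301 as a permanent signal, but it still revisits old URLs from time to time, so the usual advice is to keep the redirects in place while anything external still links to those paths.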
Technical SEO | Breemcc
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. Below are the specific areas I would like to discuss:
a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page, for example https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black and https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric. Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.
b. Page URLs with parameters: Some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget? My understanding is that by doing so, search engines can avoid spending resources on indexing redundant variations of the same content.
Additionally, I would welcome any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. If you require any further information or clarification, please let me know. I look forward to hearing from you. Cheers!
Technical SEO | williamhuynh
-
Google Search Console says 'sitemap is blocked by robots'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix this issue?
Technical SEO | Extima-Christian
-
The W3C Markup Validation Service - Good, Bad or Impartial?
Hi guys, it seems that nowadays it is almost impossible to achieve 0 (zero) errors when testing a site via the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels and all kinds of tracking and social media scripts running, it seems to be an unachievable task. My two questions to you fellow SEOs out there: 1. How important is validation, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what you shouldn't bother with? 2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation? As a note, I mostly refer to WordPress-driven sites. Would love to hear your take. Daniel.
Technical SEO | artdivision
-
Hiding h1 tags in Magento
Hi Moz Community, I know that hiding h1 tags isn't a good practice for SEO and Google, but we have banners that look much nicer than the stock text Magento uses for its titles. The banners have the same text, and the h1 is in the source code, just not visible on the front end. The option Magento gives is "hide title on the page," so I'm not sure whether this is actually the bad way to hide it or whether it's fine for search engines. Thanks,
-Reed
Technical SEO | IceIcebaby
-
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain that were indexed in Google. We have now disabled the URLs (returning a 404 - for other reasons we cannot redirect from www1 to www) and blocked them via robots.txt, but the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
Technical SEO | zeepartner
-
Googlebot does not obey robots.txt disallow
Hi Mozzers! We are trying to get Googlebot to steer away from our internal search results pages by adding a parameter "nocrawl=1" to facet/filter links and then disallowing all URLs containing that parameter in robots.txt. We implemented this in late August, and since then the GWMT message "Googlebot found an extremely high number of URLs on your site" stopped coming. But today we received yet another one. The weird thing is that Google gives many of our now robots.txt-disallowed URLs as examples of URLs that may cause us problems. What could be the reason? Best regards, Martin
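For context, the rule described above would normally look something like this in robots.txt (a sketch only; the real file obviously depends on the rest of their configuration):
User-agent: *
# Block any URL that contains the nocrawl=1 parameter
Disallow: /*nocrawl=1
Googlebot supports this kind of wildcard pattern, but a disallow only stops crawling; URLs discovered before the rule went live, or found through links elsewhere, can still be reported or remain indexed.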
Technical SEO | TalkInThePark
-
DISQUS COMMENTS backlinks-good for seo? YES/NO?
Disqus comments backlinks - good for SEO, yes or no? I have just started commenting on "powered by Disqus" websites in the Disqus comments box and leaving a link to my website in the name field. Having Googled whether Disqus comments backlinks are any good for SEO purposes, I have discovered that opinion is split 50/50, with some people saying they are a "goldmine" for getting high-PR backlinks and others saying they are a waste of time because Googlebot cannot read Java. My own experience of commenting on Disqus-powered websites is that WordPress blogs powered by Disqus comments ARE indexed by Google and the backlink IS in the source of the page. When I comment on normal websites using the Disqus comment system, I have found that my Disqus comments are NOT indexed by Google and there is NO backlink in the page source! Has anybody got any views on whether Disqus comments backlinks are any good?
Technical SEO | Freebetsuk2