<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:media="http://search.yahoo.com/mrss/" 
	>
<channel>
	<title>
	Comments on: Collection of Robots.txt Files	</title>
	<atom:link href="https://dailyblogtips.com/collection-of-robotstxt-files/feed/" rel="self" type="application/rss+xml" />
	<link>https://dailyblogtips.com/collection-of-robotstxt-files/</link>
	<description>DailyBlogTips.com takes you from SEO to CEO. You’ll learn everything you need to know to master blogging, SEO, marketing, and web design, leading you to passive income.</description>
	<lastBuildDate>Mon, 24 Jul 2023 21:54:15 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.4</generator>
	<item>
		<title>
		By: Dan from Lawn Mower Reviews		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-1291030</link>

		<dc:creator><![CDATA[Dan from Lawn Mower Reviews]]></dc:creator>
		<pubDate>Fri, 25 Mar 2011 05:00:40 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-1291030</guid>

					<description><![CDATA[Hi,
I use the All in One SEO plugin on most of my sites. Will this automatically insert a robots.txt file for me? I have heard about this before, but to be honest I kind of overlooked it and didn&#039;t think it was too important; however, now I am a little worried that I am missing a vital trick here.

I would guess that WordPress automatically creates the robots.txt file for you, but I am just double-checking.

Thanks]]></description>
			<content:encoded><![CDATA[<p>Hi,<br />
I use the All in One SEO plugin on most of my sites. Will this automatically insert a robots.txt file for me? I have heard about this before, but to be honest I kind of overlooked it and didn&#8217;t think it was too important; however, now I am a little worried that I am missing a vital trick here.</p>
<p>I would guess that WordPress automatically creates the robots.txt file for you, but I am just double-checking.</p>
<p>Thanks</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: shabi		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-1231876</link>

		<dc:creator><![CDATA[shabi]]></dc:creator>
		<pubDate>Mon, 16 Aug 2010 12:56:03 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-1231876</guid>

					<description><![CDATA[hi... nice to see this... but can anyone tell me, what is the function of these lines in robots.txt?
&quot;User-Agent: Googlebot
Disallow: /index.xml$
Disallow: /excerpts.xml$&quot;
I have seen these lines used by Gizmodo and other bloggers.
Do these two lines help in removing duplicate content?
Please, I would like an answer.
Thanks]]></description>
			<content:encoded><![CDATA[<p>hi&#8230; nice to see this&#8230; but can anyone tell me, what is the function of these lines in robots.txt?<br />
&#8220;User-Agent: Googlebot<br />
Disallow: /index.xml$<br />
Disallow: /excerpts.xml$&#8221;<br />
I have seen these lines used by Gizmodo and other bloggers.<br />
Do these two lines help in removing duplicate content?<br />
Please, I would like an answer.<br />
Thanks</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Free Web Directory		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-885514</link>

		<dc:creator><![CDATA[Free Web Directory]]></dc:creator>
		<pubDate>Wed, 03 Jun 2009 08:17:11 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-885514</guid>

					<description><![CDATA[Nice stuff; you gave me ideas for changing my robots.txt]]></description>
			<content:encoded><![CDATA[<p>Nice stuff; you gave me ideas for changing my robots.txt</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Elle		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-866055</link>

		<dc:creator><![CDATA[Elle]]></dc:creator>
		<pubDate>Sun, 10 May 2009 20:54:15 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-866055</guid>

					<description><![CDATA[Should the /wp-content folder be included as well...where all the themes and plugins reside?  I&#039;ve not noticed it listed in any robots.txt file anywhere.

Thanks.]]></description>
			<content:encoded><![CDATA[<p>Should the /wp-content folder be included as well&#8230;where all the themes and plugins reside?  I&#8217;ve not noticed it listed in any robots.txt file anywhere.</p>
<p>Thanks.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Bang Kritikus		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-762243</link>

		<dc:creator><![CDATA[Bang Kritikus]]></dc:creator>
		<pubDate>Mon, 02 Feb 2009 04:11:32 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-762243</guid>

					<description><![CDATA[But my Blogspot blog&#039;s robots.txt is not editable.]]></description>
			<content:encoded><![CDATA[<p>But my Blogspot blog&#8217;s robots.txt is not editable.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: SEO Freelancer		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-343693</link>

		<dc:creator><![CDATA[SEO Freelancer]]></dc:creator>
		<pubDate>Sat, 08 Mar 2008 08:59:38 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-343693</guid>

					<description><![CDATA[Nice collection - this will help me and other new webmasters and web designers create the robots.txt file they want]]></description>
			<content:encoded><![CDATA[<p>Nice collection &#8211; this will help me and other new webmasters and web designers create the robots.txt file they want</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: John		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-306965</link>

		<dc:creator><![CDATA[John]]></dc:creator>
		<pubDate>Fri, 08 Feb 2008 08:59:30 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-306965</guid>

					<description><![CDATA[Hello...!

Can anyone give me a list of websites that archive other websites? Pandora and the Internet Archive&#039;s Wayback Machine are some examples; I want to know all of the web-archiving websites, please.]]></description>
			<content:encoded><![CDATA[<p>Hello&#8230;!</p>
<p>Can anyone give me a list of websites that archive other websites? Pandora and the Internet Archive&#8217;s Wayback Machine are some examples; I want to know all of the web-archiving websites, please.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Sangesh		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-258182</link>

		<dc:creator><![CDATA[Sangesh]]></dc:creator>
		<pubDate>Thu, 03 Jan 2008 15:23:22 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-258182</guid>

					<description><![CDATA[I got to know more about the &quot;robots.txt&quot; in this article.

Thanks.]]></description>
			<content:encoded><![CDATA[<p>I got to know more about the &#8220;robots.txt&#8221; in this article.</p>
<p>Thanks.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: 东莞网站建设		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-196498</link>

		<dc:creator><![CDATA[东莞网站建设]]></dc:creator>
		<pubDate>Tue, 20 Nov 2007 09:06:12 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-196498</guid>

					<description><![CDATA[Very good learning]]></description>
			<content:encoded><![CDATA[<p>Very good learning</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: AskApache		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-84689</link>

		<dc:creator><![CDATA[AskApache]]></dc:creator>
		<pubDate>Thu, 09 Aug 2007 13:35:39 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-84689</guid>

					<description><![CDATA[The real benefit of learning about the robots.txt file and how it works is that it teaches you to think like the web crawlers.  Especially when you start targeting different user-agents/bots...

webmasterworld is definitely the coolest, and 2nd is of course askapache.com]]></description>
			<content:encoded><![CDATA[<p>The real benefit of learning about the robots.txt file and how it works is that it teaches you to think like the web crawlers.  Especially when you start targeting different user-agents/bots&#8230;</p>
<p>webmasterworld is definitely the coolest, and 2nd is of course askapache.com</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Visitor413		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-68986</link>

		<dc:creator><![CDATA[Visitor413]]></dc:creator>
		<pubDate>Thu, 19 Jul 2007 16:18:37 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-68986</guid>

					<description><![CDATA[Your site found in Google:  ]]></description>
			<content:encoded><![CDATA[<p>Your site found in Google:  </p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Visitor367		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-68984</link>

		<dc:creator><![CDATA[Visitor367]]></dc:creator>
		<pubDate>Thu, 19 Jul 2007 16:18:04 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-68984</guid>

					<description><![CDATA[I have visited your site 623 times]]></description>
			<content:encoded><![CDATA[<p>I have visited your site 623 times</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-64695</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Sat, 14 Jul 2007 16:16:49 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-64695</guid>

					<description><![CDATA[The effect on the individual ranking of your pages should not be huge, so do not expect to go from the tenth page to the first page of Google just because you are using a robots.txt file.

That said, your search engine traffic will probably increase a lot if many of your pages were in supplemental hell. First and foremost because now you will be covering many more keywords and terms.]]></description>
			<content:encoded><![CDATA[<p>The effect on the individual ranking of your pages should not be huge, so do not expect to go from the tenth page to the first page of Google just because you are using a robots.txt file.</p>
<p>That said, your search engine traffic will probably increase a lot if many of your pages were in supplemental hell. First and foremost because now you will be covering many more keywords and terms.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: TechZilo		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-64509</link>

		<dc:creator><![CDATA[TechZilo]]></dc:creator>
		<pubDate>Sat, 14 Jul 2007 11:04:51 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-64509</guid>

					<description><![CDATA[I&#039;d like to echo Zath&#039;s question, since my number of indexed pages has gone down too. Will it affect SERPs?]]></description>
			<content:encoded><![CDATA[<p>I&#8217;d like to echo Zath&#8217;s question, since my number of indexed pages has gone down too. Will it affect SERPs?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: vijay		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-54629</link>

		<dc:creator><![CDATA[vijay]]></dc:creator>
		<pubDate>Sun, 01 Jul 2007 14:05:18 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-54629</guid>

					<description><![CDATA[Hmm. I haven&#039;t thought yet about updating my robots.txt.
It&#039;s as simple as problogger.net, no complications ;-)
I actually avoided that part because I am not that aware of robots.txt changes and their effects!
Soon I will give some time to that.
Thanks for the advice anyway.]]></description>
			<content:encoded><![CDATA[<p>Hmm. I haven&#8217;t thought yet about updating my robots.txt.<br />
It&#8217;s as simple as problogger.net, no complications 😉<br />
I actually avoided that part because I am not that aware of robots.txt changes and their effects!<br />
Soon I will give some time to that.<br />
Thanks for the advice anyway.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Zath		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-54470</link>

		<dc:creator><![CDATA[Zath]]></dc:creator>
		<pubDate>Sun, 01 Jul 2007 08:56:51 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-54470</guid>

					<description><![CDATA[I recently set up a robots.txt file and have noticed that my supplemental links on Google have gone down from around 2000 pages to about 250.

I&#039;m thinking that&#039;s pretty good, but like others have said, I&#039;m not quite sure how much of a difference it makes to my site rankings.

Will this give more search engine traffic going forward or increase the chances of a better PageRank?]]></description>
			<content:encoded><![CDATA[<p>I recently set up a robots.txt file and have noticed that my supplemental links on Google have gone down from around 2000 pages to about 250.</p>
<p>I&#8217;m thinking that&#8217;s pretty good, but like others have said, I&#8217;m not quite sure how much of a difference it makes to my site rankings.</p>
<p>Will this give more search engine traffic going forward or increase the chances of a better PageRank?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Matt Wardman		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-45007</link>

		<dc:creator><![CDATA[Matt Wardman]]></dc:creator>
		<pubDate>Fri, 15 Jun 2007 13:42:08 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-45007</guid>

					<description><![CDATA[&#062;Wow, I&#039;m surprised that so many SEO experts don&#039;t include a line for sitemap autodiscovery. It&#039;s not like it&#039;s difficult to implement or anything

If you have a Google sitemap plugin for WordPress, it pings Google every time you post anyway.

And:

The robots.txt for webmasterworld.com has a blog in it. Fun.]]></description>
			<content:encoded><![CDATA[<p>&gt;Wow, I&#8217;m surprised that so many SEO experts don&#8217;t include a line for sitemap autodiscovery. It&#8217;s not like it&#8217;s difficult to implement or anything</p>
<p>If you have a Google sitemap plugin for WordPress, it pings Google every time you post anyway.</p>
<p>And:</p>
<p>The robots.txt for webmasterworld.com has a blog in it. Fun.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: CypherHackz		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44878</link>

		<dc:creator><![CDATA[CypherHackz]]></dc:creator>
		<pubDate>Fri, 15 Jun 2007 07:17:15 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44878</guid>

					<description><![CDATA[I have a list of robots.txt links; you can see it here: Big Websites with Big Robots]]></description>
			<content:encoded><![CDATA[<p>I have a list of robots.txt links; you can see it here: Big Websites with Big Robots</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Pchere		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44840</link>

		<dc:creator><![CDATA[Pchere]]></dc:creator>
		<pubDate>Fri, 15 Jun 2007 05:03:07 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44840</guid>

					<description><![CDATA[I am also tweaking my robots.txt to remove duplicate content in WordPress. It was very insightful to see how top sites are dealing with the issue.]]></description>
			<content:encoded><![CDATA[<p>I am also tweaking my robots.txt to remove duplicate content in WordPress. It was very insightful to see how top sites are dealing with the issue.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Jordan McCollum		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44707</link>

		<dc:creator><![CDATA[Jordan McCollum]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 22:16:26 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44707</guid>

					<description><![CDATA[What timing! I was just contemplating roboting out my category and archive pages.  Thanks for this!]]></description>
			<content:encoded><![CDATA[<p>What timing! I was just contemplating roboting out my category and archive pages.  Thanks for this!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Patrix		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44698</link>

		<dc:creator><![CDATA[Patrix]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 21:48:29 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44698</guid>

					<description><![CDATA[I have been tweaking my robots.txt file for quite some time now, mostly to reduce duplicate content (get pages out of supplemental hell), but haven&#039;t noticed any appreciable difference.

I have been checking a few A-bloggers&#039; blogs for their robots.txt files, so thanks for doing this.

BTW, why does Shoemoney disallow some directories with and without the forward slash? What is the difference?]]></description>
			<content:encoded><![CDATA[<p>I have been tweaking my robots.txt file for quite some time now, mostly to reduce duplicate content (get pages out of supplemental hell), but haven&#8217;t noticed any appreciable difference.</p>
<p>I have been checking a few A-bloggers&#8217; blogs for their robots.txt files, so thanks for doing this.</p>
<p>BTW, why does Shoemoney disallow some directories with and without the forward slash? What is the difference?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44571</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 16:10:56 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44571</guid>

					<description><![CDATA[Adnan, I will need to tweak mine as well. So far I am getting pretty good results with a minimalist one, though, just excluding feeds, trackbacks and WP files.]]></description>
			<content:encoded><![CDATA[<p>Adnan, I will need to tweak mine as well. So far I am getting pretty good results with a minimalist one, though, just excluding feeds, trackbacks and WP files.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Adnan		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44570</link>

		<dc:creator><![CDATA[Adnan]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 16:06:35 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44570</guid>

					<description><![CDATA[Hey Daniel - thanks for that compilation - it&#039;s very interesting to see how some SEO sites like SearchEngineJournal were minimal, but how SEOMoz has something different.
Now I need to decide which one to choose ;)]]></description>
			<content:encoded><![CDATA[<p>Hey Daniel &#8211; thanks for that compilation &#8211; it&#8217;s very interesting to see how some SEO sites like SearchEngineJournal were minimal, but how SEOMoz has something different.<br />
Now I need to decide which one to choose 😉</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Hugh &#124; A Politically Incorrect Entrepreneur		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44557</link>

		<dc:creator><![CDATA[Hugh &#124; A Politically Incorrect Entrepreneur]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 15:14:30 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44557</guid>

					<description><![CDATA[While crawling around the interweb a few days ago, I found the robots.txt file for the White House (whitehouse.gov/robots.txt).

I just thought the things they disallowed were interesting.]]></description>
			<content:encoded><![CDATA[<p>While crawling around the interweb a few days ago, I found the robots.txt file for the White House (whitehouse.gov/robots.txt).</p>
<p>I just thought the things they disallowed were interesting.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: John Wesley		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44538</link>

		<dc:creator><![CDATA[John Wesley]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 14:34:22 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44538</guid>

					<description><![CDATA[Very interesting post. I actually started using something similar to Chow&#039;s after he published it on his blog last week. It seems to be adding a bit of Google traffic.]]></description>
			<content:encoded><![CDATA[<p>Very interesting post. I actually started using something similar to Chow&#8217;s after he published it on his blog last week. It seems to be adding a bit of Google traffic.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44528</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 14:05:37 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44528</guid>

					<description><![CDATA[Nia, sorry for that. I just updated the article with a link to an introductory post I wrote some time ago:

 ]]></description>
			<content:encoded><![CDATA[<p>Nia, sorry for that. I just updated the article with a link to an introductory post I wrote some time ago:</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Nia		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44525</link>

		<dc:creator><![CDATA[Nia]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 14:01:19 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44525</guid>

					<description><![CDATA[This looks valuable except I don&#039;t know how to use it yet. I&#039;ve put it in my RSS shares and when I figure it out I&#039;ll implement the lesson and post about it. Thanks. ;)]]></description>
			<content:encoded><![CDATA[<p>This looks valuable except I don&#8217;t know how to use it yet. I&#8217;ve put it in my RSS shares and when I figure it out I&#8217;ll implement the lesson and post about it. Thanks. 😉</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Pablo		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44521</link>

		<dc:creator><![CDATA[Pablo]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 13:56:54 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44521</guid>

					<description><![CDATA[Nice stuff, I already changed my robots.txt]]></description>
			<content:encoded><![CDATA[<p>Nice stuff, I already changed my robots.txt</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44520</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 13:54:29 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44520</guid>

					<description><![CDATA[Stephen, I don&#039;t think the &quot;autodiscovery&quot; factor is related to how easy it is to implement.

The question is: will it bring tangible improvements?]]></description>
			<content:encoded><![CDATA[<p>Stephen, I don&#8217;t think the &#8220;autodiscovery&#8221; factor is related to how easy it is to implement.</p>
<p>The question is: will it bring tangible improvements?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Stephen		</title>
		<link>https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44516</link>

		<dc:creator><![CDATA[Stephen]]></dc:creator>
		<pubDate>Thu, 14 Jun 2007 13:50:14 +0000</pubDate>
		<guid isPermaLink="false">https://dailyblogtips.com/collection-of-robotstxt-files/#comment-44516</guid>

					<description><![CDATA[Wow, I&#039;m surprised that so many SEO experts don&#039;t include a line for sitemap autodiscovery.  It&#039;s not like it&#039;s difficult to implement or anything...]]></description>
			<content:encoded><![CDATA[<p>Wow, I&#8217;m surprised that so many SEO experts don&#8217;t include a line for sitemap autodiscovery.  It&#8217;s not like it&#8217;s difficult to implement or anything&#8230;</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
