Getting the Most from a Robots.txt File in DSpace

Fri, 2014-06-20 13:59 -- carol

From Bram Luyten, @mire

Heverlee, Belgium: Are you getting the most visibility out of Google, the source of roughly 50% of a typical DSpace repository's traffic?
 
@mire has deployed a prototype of a free tool that analyzes a few SEO properties of your DSpace repository. More precisely, the tool examines your robots.txt file and gives you an analysis of how robust it is for web crawlers.
 
You can access the tool here: https://bitly.com/or14-analysis
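
If you would like to run a similar check yourself, here is a minimal sketch using Python's standard urllib.robotparser module. The repository URL and handle below are placeholders, so substitute your own; the sketch simply asks whether a crawler such as Googlebot may fetch an item page and whether the dynamically generated search pages are blocked.

    # Minimal sketch: probe a live robots.txt with Python's stdlib robotparser.
    # BASE and the handle path are hypothetical placeholders.
    from urllib import robotparser

    BASE = "https://demo.dspace.org"

    rp = robotparser.RobotFileParser()
    rp.set_url(BASE + "/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    # Item pages should be crawlable; search/browse UIs usually should not be.
    for path in ["/handle/123456789/1", "/discover", "/search-filter"]:
        allowed = rp.can_fetch("Googlebot", BASE + path)
        print(f"{path}: {'allowed' if allowed else 'blocked'} for Googlebot")

    # Python 3.8+: list any Sitemap: directives declared in robots.txt
    print("Sitemaps:", rp.site_maps())

Item pages should come back as allowed; if they are blocked, or if no sitemaps are declared, the documentation linked below explains how to fix that.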
 
If the tool identifies certain problems, you can find more information at: https://wiki.duraspace.org/display/DSDOC4x/Search+Engine+Optimization
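
As a point of reference, the guidance on that wiki page boils down to keeping item pages and sitemaps crawlable while blocking dynamically generated search and browse pages. The sketch below illustrates that shape; the exact Disallow paths and sitemap URLs depend on your DSpace version and UI, so treat them as assumptions to verify against the documentation (repository.example.org is a placeholder).

    User-agent: *
    # Block dynamically generated search and browse pages
    Disallow: /discover
    Disallow: /search-filter
    # Never disallow /handle or the site root: item pages must stay crawlable
    # Advertise sitemaps so crawlers can find every item directly
    Sitemap: https://repository.example.org/sitemap
    Sitemap: https://repository.example.org/htmlmap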
 
This tool was developed in part at the Open Repositories 2014 Developer Challenge. A short slide deck about the tool is available at: http://www.slideshare.net/bramluyten/big-elephant
 
If you have any suggestions for this tool, please feel free to send them our way. We intend to extend the tool with a few more tests in the future.