Does SmallWiki have support for robots.txt? There needs to be a way to tell
the robots to stay away from history and edit pages. The way I did this in
WikiWorks was to generate a robots.txt file that tells them to stay away.
For every wiki XXX, it puts in lines of the form:

    wiki.cs.uiuc.edu/XXX/EDIT
    wiki.cs.uiuc.edu/XXX/HISTORY

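For what it's worth, in an actual robots.txt the Disallow rules take path
prefixes relative to the site root rather than full host names, so I'd expect
the generated file to come out looking roughly like this (just a sketch, with
XXX standing in for the wiki name):

    # sketch of the generated robots.txt; XXX is a placeholder for the wiki name
    User-agent: *
    Disallow: /XXX/EDIT
    Disallow: /XXX/HISTORY
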
This only works because the page name comes AFTER the command. SmallWiki
puts the command after the page name, so you'd have to generate a line for
every command on every page. Or am I missing something?
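To spell out the difference (I'm guessing at the exact URL shapes here, so
treat this as a sketch): robots.txt matching is prefix-only, with no
wildcards, so the command has to come first for one rule to cover every page.

    # command before page name (WikiWorks style): one rule covers all edits
    Disallow: /XXX/EDIT
    # page name before command (SmallWiki style): you'd need a rule per page
    Disallow: /XXX/SomePage/edit
    Disallow: /XXX/AnotherPage/edit
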
Suppose I wanted to change the URLs around so that I could prevent robots
from executing actions. Would this be a major change, or easy?
-Ralph