Cleanup HTML using BeautifulSoup or Tidy.
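
A minimal sketch of the BeautifulSoup variant (assumes the bs4
package; Tidy would instead run as an external process):

    # Repair and normalize broken HTML with BeautifulSoup (bs4).
    from bs4 import BeautifulSoup

    def cleanup_html(raw_html):
        # html.parser is lenient: it closes unclosed tags and fixes
        # bad nesting while building the parse tree.
        soup = BeautifulSoup(raw_html, "html.parser")
        # Re-serialize the repaired tree as indented HTML.
        return soup.prettify()

    print(cleanup_html("<html><body><p>Unclosed<p>paragraphs"))
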
Parse downloaded files and extract JavaScript redirects.
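
A sketch of one way to pull JavaScript redirects out of a downloaded
page; the single regex below is deliberately naive and only an
illustration (real pages redirect in many more ways):

    import re
    from bs4 import BeautifulSoup

    # Matches location / location.href / window.location assignments.
    REDIRECT_RE = re.compile(
        r"""(?:window\.|document\.)?location(?:\.href)?\s*=\s*["']([^"']+)["']""")

    def find_js_redirects(html):
        soup = BeautifulSoup(html, "html.parser")
        urls = []
        for script in soup.find_all("script"):
            urls.extend(REDIRECT_RE.findall(script.get_text()))
        return urls

    page = '<script>window.location.href = "http://example.com/moved"</script>'
    print(find_js_redirects(page))  # ['http://example.com/moved']
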
More and better documentation.
Merge "writers" into storage managers.
New storage managers: shelve, SQL, ZODB, MetaKit.
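
A sketch of what a shelve-backed storage manager could look like; the
class name and method set are assumptions, not the project's actual
interface:

    import shelve

    class ShelveStorage:
        def __init__(self, path):
            self.db = shelve.open(path)

        def store(self, url, record):
            self.db[url] = record   # record: any picklable object

        def load(self, url):
            return self.db.get(url)

        def close(self):
            self.db.close()

    storage = ShelveStorage("bookmarks.shelve")
    storage.store("http://example.com/", {"title": "Example", "status": "200 OK"})
    print(storage.load("http://example.com/"))
    storage.close()
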
More robots (URL checkers): threading- and asyncore-based.
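
A sketch of the threading variant, with a shared task queue feeding
worker threads; the worker count and the use of urllib are
assumptions about how such a robot might be built:

    import queue
    import threading
    import urllib.request

    def worker(tasks, results):
        while True:
            url = tasks.get()
            if url is None:         # sentinel: no more work
                break
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    results.append((url, resp.status))
            except Exception as exc:
                results.append((url, repr(exc)))

    tasks, results = queue.Queue(), []
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(4)]
    for t in threads:
        t.start()
    for url in ["http://example.com/", "http://example.org/"]:
        tasks.put(url)
    for _ in threads:
        tasks.put(None)             # one sentinel per worker
    for t in threads:
        t.join()
    print(results)
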
Configuration file for defaults: global defaults for the system and
local defaults for subsystems.
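
A sketch of the layered-defaults idea using configparser, whose
[DEFAULT] section supplies system-wide values that per-subsystem
sections inherit and may override; all section and option names here
are illustrative:

    import configparser

    CONFIG = """
    [DEFAULT]
    timeout = 30

    [robot]
    timeout = 10
    max_threads = 4

    [storage]
    manager = shelve
    """

    config = configparser.ConfigParser()
    config.read_string(CONFIG.replace("\n    ", "\n"))  # dedent
    print(config.getint("robot", "timeout"))     # 10 (local override)
    print(config.getint("storage", "timeout"))   # 30 (global default)
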
Ruleset-based mechanisms to filter which URLs to check: filtering by
URL scheme, host, port, path, filename, extension, etc.
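
A sketch of such a filter; the rule format (URL attribute plus a set
of allowed values) is an assumption for illustration, and rules on
path or extension would work the same way:

    from urllib.parse import urlsplit

    RULES = [
        ("scheme", {"http", "https"}),   # skip ftp:, mailto:, javascript:
        ("hostname", {"example.com"}),   # only check these hosts
    ]

    def should_check(url):
        parts = urlsplit(url)
        for attr, allowed in RULES:
            if getattr(parts, attr) not in allowed:
                return False
        return True

    print(should_check("https://example.com/index.html"))  # True
    print(should_check("ftp://example.com/file"))          # False
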
Detailed reports on robot runs: what's old, what's new, what has been
moved, etc.
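
A sketch of computing the old/new/moved sets between two robot runs;
the snapshot format (a dict mapping each URL to its final location)
is an assumption:

    def report(previous, current):
        old = previous.keys() - current.keys()
        new = current.keys() - previous.keys()
        moved = {url: current[url]
                 for url in previous.keys() & current.keys()
                 if previous[url] != current[url]}
        return old, new, moved

    prev = {"http://a.example/": "http://a.example/",
            "http://b.example/": "http://b.example/"}
    curr = {"http://b.example/": "http://b.example/new-home",
            "http://c.example/": "http://c.example/"}
    print(report(prev, curr))
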
Web interface to the report.
Bigger database. Multiuser database. The robot should operate on a
part of the database.
Web interface to the database. Users should be able to
import/export/edit bookmarks, schedule robot runs, etc.