X-Git-Url: https://git.phdru.name/?a=blobdiff_plain;f=doc%2FTODO;h=f98d0769a9116d382c103c62e6340ea8ce50012a;hb=27c6253f3e707d0b90e67ee52f78e1335482e17e;hp=a74684b467cee497e87c307fb42f3bbd859812b5;hpb=037c50a4a20df82a51375d9fcb075d4ac5add0b8;p=bookmarks_db.git

diff --git a/doc/TODO b/doc/TODO
index a74684b..f98d076 100644
--- a/doc/TODO
+++ b/doc/TODO
@@ -1,23 +1,34 @@
- Cleanup HTML before parsing using BeautifulSoap or Tidy.
- Parse downloaded file and get javascript redirects.
+Robot based on urllib2: handle timeout and ftp.
 
- More and better documentation.
+A new robot based on PycURL.
 
- Merge "writers" to storage managers.
- New storage managers: shelve, SQL, ZODB, MetaKit.
- More robots (URL checkers): threading, asyncore-based.
+HTML parser based on BeautifulSoup4.
 
- Configuration file to configure defaults - global defaults for the system
- and local defaults for subsystems.
+A program to publish bookmarks with icons.
 
- Ruleset-based mechanisms to filter out what types of URLs to check: checking
- based on URL schema, host, port, path, filename, extension, etc.
+Fetch description from <META NAME="description"> and store it in
+bookmark.description if the description is empty. (How to update old
+descriptions without replacing my own comments?)
 
- Detailed reports on robot run - what's old, what's new, what has been moved,
- errors, etc.
- WWW-interface to the report.
+Parse (or interpret) downloaded file and get javascript redirects.
 
- Bigger database. Multiuser database. Robot should operates on a part of
- the DB.
- WWW-interface to the database. User should import/export/edit bookmarks,
- schedule robot run, etc.
+More and better documentation.
+
+Merge "writers" to storage managers.
+New storage managers: shelve, SQL, ZODB, MetaKit.
+More robots (URL checkers): threading, asyncore-based;
+robots that test many URLs in parallel.
+Configuration file to configure defaults - global defaults for the system
+and local defaults for subsystems.
+
+Ruleset-based mechanisms to filter out what types of URLs to check: checking
+based on URL schema, host, port, path, filename, extension, etc.
+
+Detailed reports on robot run - what's old, what's new, what has been moved,
+errors, etc.
+WWW-interface to the report.
+
+Bigger database. Multiuser database. Robot should operate on a part of
+the DB.
+WWW-interface to the database. User should import/export/edit bookmarks,
+schedule robot run, etc.
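
The new TODO item "Fetch description from <META NAME="description"> and store it in bookmark.description" could be sketched roughly as below. The TODO plans an HTML parser based on BeautifulSoup4; this sketch uses only the standard library's html.parser so it stands alone, and the function and class names are illustrative, not part of the project's code.

```python
from html.parser import HTMLParser


class MetaDescriptionParser(HTMLParser):
    """Collect the content of the first <meta name="description" content="..."> tag."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us
        if tag == 'meta' and self.description is None:
            attrs = dict(attrs)
            if attrs.get('name', '').lower() == 'description':
                self.description = attrs.get('content')


def get_meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description


page = '<html><head><META NAME="description" CONTENT="A test page"></head></html>'
print(get_meta_description(page))  # -> A test page
```

A real robot would only assign the result to bookmark.description when the stored description is empty, which sidesteps the open question of overwriting hand-written comments.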
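
For the item "Parse (or interpret) downloaded file and get javascript redirects": fully interpreting javascript is out of scope for a checker robot, but the common static cases can be caught with a regular expression. The pattern below is an assumption about which redirect idioms matter (assignments to location/location.href and location.replace() calls with a literal URL); it is a heuristic sketch, not the project's implementation.

```python
import re

# Matches common static javascript redirect idioms such as
#   window.location = "http://...";   location.href = '...';
#   window.location.replace("...")
_JS_REDIRECT_RE = re.compile(
    r"""(?:window\.)?location(?:\.href)?\s*=\s*['"](?P<url>[^'"]+)['"]"""
    r"""|(?:window\.)?location\.replace\(\s*['"](?P<url2>[^'"]+)['"]\s*\)""",
    re.IGNORECASE)


def find_js_redirects(html):
    """Return redirect target URLs found in static javascript code."""
    urls = []
    for match in _JS_REDIRECT_RE.finditer(html):
        urls.append(match.group('url') or match.group('url2'))
    return urls


page = '<script>window.location.href = "http://example.com/moved";</script>'
print(find_js_redirects(page))  # -> ['http://example.com/moved']
```

Redirects built by concatenating strings or computed at run time will slip through; catching those would need the "interpret" half of the TODO item.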