X-Git-Url: https://git.phdru.name/?a=blobdiff_plain;f=doc%2FTODO;h=7e98a6ece9e6a47a7772281cc1747b7f3aaa1e4a;hb=7915c0f51c9b7556d6af1de9582f68e98f473799;hp=a74684b467cee497e87c307fb42f3bbd859812b5;hpb=037c50a4a20df82a51375d9fcb075d4ac5add0b8;p=bookmarks_db.git

diff --git a/doc/TODO b/doc/TODO
index a74684b..7e98a6e 100644
--- a/doc/TODO
+++ b/doc/TODO
@@ -1,23 +1,32 @@
-Cleanup HTML before parsing using BeautifulSoap or Tidy.
-Parse downloaded file and get javascript redirects.
+A new robot based on PycURL.
 
-More and better documentation.
+HTML parser based on BeautifulSoup4.
 
-Merge "writers" to storage managers.
-New storage managers: shelve, SQL, ZODB, MetaKit.
-More robots (URL checkers): threading, asyncore-based.
+A program to publish bookmarks with icons.
 
-Configuration file to configure defaults - global defaults for the system
-and local defaults for subsystems.
+Fetch description from <meta name="description"> and store it in
+bookmark.description if the description is empty. (How to update old
+descriptions without replacing my own comments?)
 
-Ruleset-based mechanisms to filter out what types of URLs to check: checking
-based on URL schema, host, port, path, filename, extension, etc.
+Parse (or interpret) downloaded file and get javascript redirects.
 
-Detailed reports on robot run - what's old, what's new, what has been moved,
-errors, etc.
-WWW-interface to the report.
+More and better documentation.
 
-Bigger database. Multiuser database. Robot should operates on a part of
-the DB.
-WWW-interface to the database. User should import/export/edit bookmarks,
-schedule robot run, etc.
+Merge "writers" to storage managers.
+New storage managers: shelve, SQL, ZODB, MetaKit.
+More robots (URL checkers): threading, asyncore-based;
+robots that test many URLs in parallel.
+Configuration file to configure defaults - global defaults for the system
+and local defaults for subsystems.
+
+Ruleset-based mechanisms to filter out what types of URLs to check: checking
+based on URL schema, host, port, path, filename, extension, etc.
+
+Detailed reports on robot run - what's old, what's new, what has been moved,
+errors, etc.
+WWW-interface to the report.
+
+Bigger database. Multiuser database. Robot should operate on a part of
+the DB.
+WWW-interface to the database. User should import/export/edit bookmarks,
+schedule robot run, etc.
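The "fetch description" item added above could be sketched with the stdlib html.parser module as a stand-in (the TODO proposes BeautifulSoup4 for the real HTML parser). The class and function names below are hypothetical, not part of bookmarks_db:

```python
from html.parser import HTMLParser


class MetaDescriptionParser(HTMLParser):
    """Collect the content of the first <meta name="description" ...> tag."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names, but not values
        if tag == "meta" and self.description is None:
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content")


def get_meta_description(html):
    """Return the meta description of an HTML page, or None."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description


page = '<html><head><meta name="Description" content="A test page"></head></html>'
print(get_meta_description(page))  # -> A test page
```

This only answers the "if the description is empty" half of the item; deciding how to refresh old descriptions without clobbering hand-written comments is the open question the TODO raises.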
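For the "javascript redirects" item, a minimal parse-only sketch is a regular expression over common `location` assignment idioms; the pattern below is a rough assumption, and real pages would need actual interpretation, as the "(or interpret)" wording hints:

```python
import re

# Very rough pattern for common JS redirect idioms:
#   location.href = "..."; window.location = '...'; location.replace("...")
_JS_REDIRECT = re.compile(
    r"""(?:window\.)?location(?:\.href)?\s*(?:=\s*|\.replace\(\s*)
        ["']([^"']+)["']""",
    re.VERBOSE,
)


def find_js_redirects(html):
    """Return the URLs of simple literal javascript redirects in a page."""
    return _JS_REDIRECT.findall(html)


page = '<script>window.location.href = "https://example.com/new";</script>'
print(find_js_redirects(page))  # -> ['https://example.com/new']
```

A redirect built up from string concatenation or variables would slip past this, which is why the TODO leaves interpreting the script as an alternative.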
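The "new storage managers" item lists shelve first; one way such a manager might look, sketched with the stdlib shelve module (the class and method names are assumptions, not the project's actual writer/storage interface):

```python
import os
import shelve
import tempfile


class ShelveStorage:
    """Hypothetical storage manager keeping bookmark records in a shelve file."""

    def __init__(self, path):
        self.db = shelve.open(path)

    def save_bookmark(self, href, record):
        self.db[href] = record

    def load_bookmark(self, href):
        return self.db.get(href)

    def close(self):
        self.db.close()


path = os.path.join(tempfile.mkdtemp(), "bookmarks")
storage = ShelveStorage(path)
storage.save_bookmark("https://example.com/", {"title": "Example"})
print(storage.load_bookmark("https://example.com/"))  # -> {'title': 'Example'}
storage.close()
```

Keying records by URL, as here, is one design choice; merging the existing "writers" into such managers (the preceding item) would mean the same interface also handles export formats.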