X-Git-Url: https://git.phdru.name/?a=blobdiff_plain;f=doc%2FTODO;h=b4ff4c1da7a80c226a7c4c42a610bdcfb7d24f80;hb=0e76f1851882b99da63a7c8a9e4cdf0c4a48657f;hp=51af6655a4cc59683d7f96cb117a03d55351ae0f;hpb=c9c1b9838ef53785e761e41fc4eac028ce91dec6;p=bookmarks_db.git

diff --git a/doc/TODO b/doc/TODO
index 51af665..b4ff4c1 100644
--- a/doc/TODO
+++ b/doc/TODO
@@ -1,22 +1,33 @@
- Parse downloaded file and get javascript redirects.
+New robot based on PycURL.
- Documentation.
+Use lxml to parse broken HTML.
- Merge "writers" to storage managers.
- New storage managers: shelve, SQL, ZODB, MetaKit.
- Robots (URL checkers): threading, asyncore-based.
+New database format: pyyaml.
- Configuration file for configuring defaults - global defaults for the system
- and local defaults for subsystems.
+A program to publish bookmarks with icons.
- Ruleset-based mechanisms to filter out what types of URLs to check: checking
- based on URL schema, host, port, path, filename, extension, etc.
+Fetch description from <meta name="description"> and store it in