X-Git-Url: https://git.phdru.name/?a=blobdiff_plain;f=doc%2FTODO;h=8893578ebf264cd2427b3923bceef0f44e105a3d;hb=bbcb4777fc62ff721ff60ebc1697ac61acbf0617;hp=6b4e748da00e3372adb33feee13066b42ebf24d3;hpb=9e35a705cfd7aa7640069ed805919a084ba2405c;p=bookmarks_db.git

diff --git a/doc/TODO b/doc/TODO
index 6b4e748..8893578 100644
--- a/doc/TODO
+++ b/doc/TODO
@@ -1,23 +1,25 @@
-Cleanup HTML using BeautifulSoap or Tidy.
-Parse downloaded file and get javascript redirects.
+Store favicon.ico in attributes a la FF 2.0.
 
-More and better documentation.
+Clean up HTML before parsing, using BeautifulSoup or Tidy.
+Parse downloaded file and get javascript redirects.
 
-Merge "writers" to storage managers.
-New storage managers: shelve, SQL, ZODB, MetaKit.
-More robots (URL checkers): threading, asyncore-based.
+More and better documentation.
 
-Configuration file for configuring defaults - global defaults for the system
-and local defaults for subsystems.
+Merge "writers" into storage managers.
+New storage managers: shelve, SQL, ZODB, MetaKit.
+More robots (URL checkers): threading, asyncore-based.
 
-Ruleset-based mechanisms to filter out what types of URLs to check: checking
-based on URL schema, host, port, path, filename, extension, etc.
+Configuration file to configure defaults - global defaults for the system
+and local defaults for subsystems.
 
-Detailed reports on robot run - what's old, what's new, what has been moved,
-errors, etc.
-WWW-interface to the report.
+Ruleset-based mechanisms to filter out what types of URLs to check: checking
+based on URL scheme, host, port, path, filename, extension, etc.
 
-Bigger database. Multiuser database. Robot should operates on a part of
-the DB.
-WWW-interface to the database. User should import/export/edit bookmarks,
-schedule robot run, etc.
+Detailed reports on a robot run - what's old, what's new, what has been moved,
+errors, etc.
+WWW-interface to the report.
+
+Bigger database. Multiuser database. Robot should operate on a part of
+the DB.
+WWW-interface to the database. User should import/export/edit bookmarks,
+schedule robot runs, etc.