X-Git-Url: https://git.phdru.name/?a=blobdiff_plain;ds=sidebyside;f=doc%2FTODO;h=64ef4ca4e8591fc4317a69e9613576119e96f026;hb=a8d53ec6ab657ced2a8778ad8616ed9fc0e6a332;hp=887e5a9b0afc9a4790418482df7ecd86ab553491;hpb=fb5c3b2b91aeeb615d6d6d890491af3fdff69556;p=bookmarks_db.git
diff --git a/doc/TODO b/doc/TODO
index 887e5a9..64ef4ca 100644
--- a/doc/TODO
+++ b/doc/TODO
@@ -1,25 +1,38 @@
- Parse downloaded file and get some additional information out of headers
- and parsed data - title, for example. Or redirects using <META HTTP-EQUIV="Refresh">.
- (Partially done - now extracting title).
+Split simple robot: separate network operations and URL handling.
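+
+A possible shape for that split (class and method names here are
+hypothetical, not the existing code):
+
+    class Fetcher:
+        # Network side: download the document, return headers and body.
+        def fetch(self, url):
+            raise NotImplementedError  # simple, forking, threading, ...
+
+    class URLHandler:
+        # URL side: bookkeeping that is the same for every fetcher.
+        def __init__(self, fetcher):
+            self.fetcher = fetcher
+        def check(self, bookmark):
+            return self.fetcher.fetch(bookmark.href)
+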
- Documentation.
+Allow parameters in BKMK_ROBOT; for example, 'forking:urllib'.
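+
+A minimal sketch of splitting such a value (the helper name and the
+'simple' default are assumptions):
+
+    import os
+
+    def parse_robot_spec(spec):
+        # Hypothetical: "forking:urllib" -> ("forking", ["urllib"]),
+        # "simple" -> ("simple", []).
+        name, _, rest = spec.partition(':')
+        return name, rest.split(':') if rest else []
+
+    robot, params = parse_robot_spec(os.environ.get("BKMK_ROBOT", "simple"))
+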
- Merge "writers" to storage managers.
- New storage managers: shelve, SQL, ZODB, MetaKit.
- Robots (URL checkers): threading, asyncore-based.
- Aliases in bookmarks.html.
+A new robot based on urllib2.
- Configuration file for configuring defaults - global defaults for the system
- and local defaults for subsystems.
+A new robot based on PycURL.
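+
+The core of such a robot might look like this (standard PycURL calls;
+error handling and the robot wrapper are omitted):
+
+    import pycurl
+    from io import BytesIO
+
+    buf = BytesIO()
+    curl = pycurl.Curl()
+    curl.setopt(pycurl.URL, 'http://example.com/')
+    curl.setopt(pycurl.WRITEFUNCTION, buf.write)
+    curl.setopt(pycurl.FOLLOWLOCATION, True)
+    curl.perform()
+    status = curl.getinfo(pycurl.RESPONSE_CODE)
+    curl.close()
+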
- Ruleset-based mechanisms to filter out what types of URLs to check: checking
- based on URL schema, host, port, path, filename, extension, etc.
+HTML parser based on BeautifulSoup4.
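+
+With bs4, title extraction is only a few lines (the 'html.parser'
+backend is an arbitrary choice):
+
+    from bs4 import BeautifulSoup
+
+    def extract_title(html):
+        soup = BeautifulSoup(html, 'html.parser')
+        if soup.title and soup.title.string:
+            return soup.title.string.strip()
+        return None
+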
- Detailed reports on robot run - what's old, what's new, what was moved,
- errors, etc.
- WWW-interface to the report.
+A program to publish bookmarks with icons.
- Bigger database. Multiuser database. Robot should operate on a part of
- the DB.
- WWW-interface to the database. User will import/export/edit bookmarks,
- schedule robot run, etc.
+Fetch description from <META NAME="description" CONTENT="..."> and store it in
+bookmark.description if the description is empty. (How to update old
+descriptions without replacing my own comments?)
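+
+One way to honor that, assuming bookmark objects carry a description
+attribute (names here are hypothetical):
+
+    def update_description(bookmark, meta_description):
+        # Only fill empty descriptions so hand-written comments survive.
+        if meta_description and not bookmark.description:
+            bookmark.description = meta_description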
+
+Parse (or interpret) the downloaded file and extract javascript redirects.
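+
+A crude first cut could be a regular expression over inline scripts
+(the pattern is an assumption; real pages vary a lot):
+
+    import re
+
+    # Matches e.g.: location.href = "http://example.com/";
+    JS_REDIRECT = re.compile(
+        r'(?:window\.|document\.)?location(?:\.href)?\s*=\s*["\']([^"\']+)["\']')
+
+    def find_js_redirects(script_text):
+        return JS_REDIRECT.findall(script_text)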
+
+More and better documentation.
+
+Merge "writers" to storage managers.
+New storage managers: shelve, SQL, ZODB, MetaKit.
+More robots (URL checkers): threading, asyncore-based;
+robots that test many URLs in parallel.
+Configuration file to configure defaults - global defaults for the system
+and local defaults for subsystems.
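+
+A sketch of parallel checking with a thread pool (the worker count is
+arbitrary; check_url is a stand-in for the real robot):
+
+    import threading
+    import queue
+
+    def check_url(url):
+        # Stub: the real robot would fetch the URL and return its status.
+        return 'ok'
+
+    def worker(tasks, results):
+        while True:
+            url = tasks.get()
+            if url is None:  # sentinel: no more work
+                break
+            results.put((url, check_url(url)))
+
+    def check_in_parallel(urls, nworkers=10):
+        tasks, results = queue.Queue(), queue.Queue()
+        threads = [threading.Thread(target=worker, args=(tasks, results))
+                   for _ in range(nworkers)]
+        for t in threads:
+            t.start()
+        for url in urls:
+            tasks.put(url)
+        for _ in threads:
+            tasks.put(None)
+        for t in threads:
+            t.join()
+        return [results.get() for _ in urls]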
+
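+For the configuration file, ConfigParser-style [DEFAULT] handling
+already models "global defaults, local overrides"; a sketch (the
+option names are made up):
+
+    from configparser import ConfigParser
+
+    SAMPLE = "[DEFAULT]\ntimeout = 30\n\n[robot]\ntimeout = 60\n"
+    config = ConfigParser()
+    config.read_string(SAMPLE)
+    config.get('robot', 'timeout')    # '60' - local override
+    config.get('DEFAULT', 'timeout')  # '30' - global default
+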
+Ruleset-based mechanisms to select which URLs to check: rules based on
+URL scheme, host, port, path, filename, extension, etc.
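+
+One possible shape for the rules (the rule format is an assumption):
+
+    from urllib.parse import urlsplit
+
+    # Each rule: (attribute of the split URL, value, verdict);
+    # the first matching rule wins.
+    RULES = [
+        ('scheme', 'mailto', 'skip'),
+        ('hostname', 'localhost', 'skip'),
+        ('scheme', 'http', 'check'),
+        ('scheme', 'https', 'check'),
+    ]
+
+    def should_check(url):
+        parts = urlsplit(url)
+        for attr, value, verdict in RULES:
+            if getattr(parts, attr) == value:
+                return verdict == 'check'
+        return False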
+
+Detailed reports on robot runs - what's old, what's new, what has been moved,
+errors, etc.
+WWW-interface to the report.
+
+Bigger database. Multiuser database. The robot should operate on a part of
+the DB.
+WWW-interface to the database. Users should be able to import/export/edit
+bookmarks, schedule robot runs, etc.