-Split simple robot: separate network operations and URL handling.
-
-Allow parameters in BKMK_ROBOT; for example, 'forking:urllib'.
-
-A new robot based on urllib2.
-
-A new robot based on PycURL.
-
-HTML parser based on BeautifulSoup4.
+Robot based on PycURL.
A program to publish bookmarks with icons.
+Configuration file to configure defaults - global defaults for the system
+and local defaults for subsystems.
+
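One way such a configuration file could be laid out is with Python's standard
configparser, where [DEFAULT] carries the global defaults and each subsystem
section overrides them as needed. This is only a sketch; the section and
option names below are invented for illustration, not taken from the project.

```python
import configparser

# Hypothetical layout: [DEFAULT] = global defaults, one section per subsystem.
SAMPLE = """
[DEFAULT]
timeout = 30

[robot]
timeout = 60
user_agent = bkmk_robot

[storage]
format = flad
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)

# Subsystems inherit global defaults unless they override them:
# the [storage] section has no timeout of its own, so it falls back to 30,
# while [robot] overrides it with 60.
```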
Fetch description from <META name="description" content="..."> and store it in
bookmark.description if the description is empty. (How to update old
descriptions without replacing my own comments?)
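Extracting the description could be done with the standard-library HTML
parser, roughly as below. This is a sketch: `bookmark.description` is the
project's attribute, everything else (class and function names) is made up
here for illustration.

```python
from html.parser import HTMLParser

class DescriptionParser(HTMLParser):
    """Collect the content of the first <meta name="description" ...> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and self.description is None:
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content")

def fetch_description(html):
    parser = DescriptionParser()
    parser.feed(html)
    return parser.description  # None if the page has no description
```

A robot would then only assign the result when bookmark.description is empty,
leaving hand-written comments untouched.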
More and better documentation.
Merge "writers" into storage managers.
-New storage managers: shelve, SQL, ZODB, MetaKit.
-More robots (URL checkers): threading, asyncore-based;
-robots that test many URLs in parallel.
-Configuration file to configure defaults - global defaults for the system
-and local defaults for subsystems.
+New storage managers: SQL (SQLite?).
+More robots (URL checkers): asyncore-based;
+robot(s) that test many URLs in parallel.
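A thread-pool is one simple way to test many URLs in parallel. In this
sketch the `check` callable is a stand-in for whatever per-URL test a robot
performs (a real robot would do a network request there); the function name
and signature are assumptions, not the project's API.

```python
from concurrent.futures import ThreadPoolExecutor

def check_urls(urls, check, max_workers=8):
    """Run check(url) for every URL in parallel; return {url: result}."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so zip pairs each URL
        # with its own result.
        return dict(zip(urls, pool.map(check, urls)))
```

For example, `check_urls(urls, head_request)` with a `head_request` helper
that issues an HTTP HEAD would check a whole bookmark file concurrently.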
Ruleset-based mechanism to select which URLs to check: filter by URL
scheme, host, port, path, filename, extension, etc.
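Such a ruleset might be represented as (field, allowed-values) pairs matched
against the split URL. The field names and rule shape below are assumptions
made for this sketch, not an existing interface in the project.

```python
import posixpath
from urllib.parse import urlsplit

def url_passes(url, rules):
    """Return True if the URL satisfies every (field, allowed-values) rule."""
    parts = urlsplit(url)
    fields = {
        "scheme": parts.scheme,
        "host": parts.hostname or "",
        "port": str(parts.port or ""),
        "path": parts.path,
        "ext": posixpath.splitext(parts.path)[1],
    }
    return all(fields[field] in allowed for field, allowed in rules)
```

A robot could then skip, say, non-HTTP schemes or binary extensions before
ever touching the network.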
WWW-interface to the database. The user should be able to import/export/edit
bookmarks, schedule robot runs, etc.
+
+A program to collect and check links from a site.