-Robots no longer have one global temporary file - there are at least two
-(html and favicon), and in the future there will be more for
-asynchronous robot(s) that would test many URLs in parallel.
+ Split the simple robot: separate network operations from
+ URL handling/HTML parsing.
+
+ Change parse_html to parse strings, not files.
+
+ Split __main__.py out of parse_html/__init__.py.
+
+ Adapt the JSON storage to the recent Mozilla export format.
+
+ Add ChangeLog.
+
+ Allow parameters in BKMK_* environment variables; for example,
+ BKMK_ROBOT=forking:subproc=urllib or
+ BKMK_STORAGE=json:filename=bookmarks_db.json.
+
+ Pass the subproc parameter to the subprocess to allow different robots.
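
The `name:key=value` syntax described in the BKMK_* entries above could be parsed along these lines. This is only a sketch; `parse_env_value` is a hypothetical helper, and the real code may split parameters differently.

```python
def parse_env_value(value):
    """Split "name:key1=val1:key2=val2" into the main name and a
    dict of parameters. Hypothetical helper, not the project's API."""
    name, _, rest = value.partition(':')
    params = {}
    if rest:
        for part in rest.split(':'):
            key, _, val = part.partition('=')
            params[key] = val
    return name, params

# Example from the ChangeLog entry BKMK_ROBOT=forking:subproc=urllib:
print(parse_env_value("forking:subproc=urllib"))
# → ('forking', {'subproc': 'urllib'})
```

A value without parameters, such as `BKMK_STORAGE=json`, would yield an empty parameter dict under this scheme.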