diff --git a/doc/ANNOUNCE b/doc/ANNOUNCE
index abcefea..f0f2617 100644
--- a/doc/ANNOUNCE
+++ b/doc/ANNOUNCE
@@ -2,80 +2,43 @@
Bookmarks Database and Internet Robot
WHAT IS IT
- There is a set of classes, libraries, programs and plugins I use to
-manipulate my bookmarks.html. I like Netscape Navigator, but I need more
-features, so I write and maintain these programs for my own needs. I need to
-extend Navigator's "What's new" feature (Navigator 4 calls it "Update
-bookmarks").
+ A set of classes, libraries, programs and plugins I use to manipulate my
+bookmarks.html.
+WHAT'S NEW
+Version 4.6.0 (2014-07-06)
-WHAT'S NEW in version 3.4.0 (2004-08-04)
- Updated to m_lib version 1.2. Extended support for Mozilla;
-keywords in bookmarks.
+ Split simple robot: separate network operations and
+ URL handling/HTML parsing.
+ Change parse_html to parse strings, not files.
-WHAT'S NEW in version 3.3.2
- parse_html.py can now recode unicode entities in titles.
+ Split parse_html/__init__.py into __main__.py.
+ Adapt JSON storage to recent Mozilla export format.
-WHAT'S NEW in version 3.3.0
- Required Python 2.2.
- HTML parser. If the protocol is HTTP, and there is a Content-Type header, and
-the content type is text/html, the object is parsed to extract its title; if the
-Content-Type header has a charset, or if the HTML has a <META> tag with a charset, the
-title is converted from the given charset to the default charset. The object is
-also parsed to extract a <META> refresh tag with a redirect.
+ Add ChangeLog.
+ Allow parameters in BKMK_* environment variables; for example,
+ BKMK_ROBOT=forking:subproc=urllib or
+ BKMK_STORAGE=json:filename=bookmarks_db.json.
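The BKMK_* values above follow a name:param=value pattern. A minimal sketch of how such a spec string could be split (this helper is illustrative only, not the project's actual code):

```python
def parse_plugin_spec(spec):
    """Split a BKMK_* value like 'forking:subproc=urllib' into a
    plugin name and a dict of parameters (illustrative sketch)."""
    name, _, params_str = spec.partition(':')
    params = {}
    if params_str:
        # Multiple parameters would be comma-separated: a=1,b=2
        for pair in params_str.split(','):
            key, _, value = pair.partition('=')
            params[key] = value
    return name, params

print(parse_plugin_spec('forking:subproc=urllib'))
# -> ('forking', {'subproc': 'urllib'})
```

With BKMK_STORAGE=json:filename=bookmarks_db.json the same split yields the storage name 'json' and the parameter filename=bookmarks_db.json.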
-WHAT'S NEW in version 3.0
- Complete rewrite from scratch. Created mechanism for pluggable storage
-managers, writers (DB dumpers/exporters) and robots.
+ Pass subproc parameter to the subprocess to allow different robots.
WHERE TO GET
- Master site: http://phd.pp.ru/Software/Python/#bookmarks_db
+ Home page: https://phdru.name/Software/Python/#bookmarks_db
+ git clone https://git.phdru.name/bookmarks_db.git
+ git clone git://git.phdru.name/bookmarks_db.git
- Faster mirrors: http://phd.by.ru/Software/Python/#bookmarks_db
- http://phd2.chat.ru/Software/Python/#bookmarks_db
+ Requires: Python 2.5+, m_lib 2.0+.
AUTHOR
- Oleg Broytmann
+ Oleg Broytman
COPYRIGHT
- Copyright (C) 1997-2002 PhiloSoft Design
+ Copyright (C) 1997-2017 PhiloSoft Design
LICENSE
GPL
-
-STATUS
- Storage managers: pickle, FLAD (Flat ASCII Database).
- Writers: HTML, text, FLAD (full database or only errors).
- Robots (URL checker): simple, simple+timeoutsocket, forking.
-
-TODO
- Parse downloaded file and get some additional information out of headers
- and parsed data - title, for example. Or redirects using a <META> refresh tag.
- (Partially done - now extracting title).
-
- Documentation.
-
- Merge "writers" to storage managers.
- New storage managers: shelve, SQL, ZODB, MetaKit.
- Robots (URL checkers): threading, asyncore-based.
- Aliases in bookmarks.html.
-
- Configuration file for configuring defaults - global defaults for the system
- and local defaults for subsystems.
-
- Ruleset-based mechanisms to filter out what types of URLs to check: checking
- based on URL schema, host, port, path, filename, extension, etc.
-
- Detailed reports on robot run - what's old, what's new, what was moved,
- errors, etc.
- WWW-interface to the report.
-
- Bigger database. Multiuser database. Robot should operate on a part of
- the DB.
- WWW-interface to the database. User will import/export/edit bookmarks,
- schedule robot run, etc.