What do you need: a way to create such a tool, or a tool that handles it out of the box? I don't have the latter, but I know a way to do the former (and this is a programming forum :-).
The following Delphi / Free Pascal script takes the multi-file HTML output of the tex4ht LaTeX-to-HTML converter, fixes broken links, reconstructs the TOC, and reads keywords for the index from a separate file (kwd):
http://svn.freepascal.org/cgi-bin/viewvc.cgi/trunk/compilelatexchm.pp?view=markup&root=docs
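To give an idea of the link-fixing part, here is a minimal sketch in Python (the real script does this in Free Pascal; the regex and the idea of a rename mapping are my assumptions, since converters like tex4ht tend to rename output files between runs):

```python
import re

def fix_links(html: str, renamed: dict) -> str:
    """Rewrite href targets that point to files renamed during
    conversion. `renamed` maps old file names to new ones; anchors
    (#fragment) are preserved. This is an illustrative sketch, not
    the actual compilelatexchm.pp logic."""
    def repl(m):
        target, frag = m.group(1), m.group(2) or ""
        return 'href="%s%s"' % (renamed.get(target, target), frag)
    # Match href="file.html" or href="file.html#anchor"
    return re.sub(r'href="([^"#]+)(#[^"]*)?"', repl, html)
```

Running this over every generated page with one shared mapping is enough to repair cross-file links that broke during conversion.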
You could improve on this by reusing its HTML-parsing routines to scan for keywords, filtering them manually, and then feeding the result back into a modified version of the script, starting from the output of a CHM decompressor. (CHM decompression tools ship with Free Pascal 2.4.4; the CHM units can also store arbitrary internal files inside a CHM, so you can reuse them during generation.)
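Before you can filter keywords manually, you have to harvest the candidates. A minimal sketch of that scanning step in Python (the original does this in Free Pascal; which tags count as keyword sources is my assumption, so adjust the set to your HTML):

```python
from html.parser import HTMLParser

class KeywordScanner(HTMLParser):
    """Collect text inside heading/identifier tags as candidate
    index keywords. The tag set below is a guess; tune it to the
    markup your converter actually emits."""
    KEYWORD_TAGS = {"h1", "h2", "h3", "code", "tt"}

    def __init__(self):
        super().__init__()
        self.keywords = []
        self._depth = 0  # >0 while inside a keyword-bearing tag

    def handle_starttag(self, tag, attrs):
        if tag in self.KEYWORD_TAGS:
            self._depth += 1

    def handle_endtag(self, tag):
        if tag in self.KEYWORD_TAGS and self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth and data.strip():
            self.keywords.append(data.strip())

def scan_keywords(html: str) -> list:
    scanner = KeywordScanner()
    scanner.feed(html)
    return scanner.keywords
```

Dump the collected list to a text file, prune it by hand, and that pruned file plays the role of the kwd keyword file mentioned above.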
The script looks complicated, but that is partly because it rewrites the HTML itself (topic names, TOC mutation, link correction).
Update in response to the comments below:
The CHM format does not generate indexes by itself. The documentation tool (delphicodetodoc or fpdoc) must pass the appropriate index information to the CHM compiler, either as XML or by building a tree through calls to the relevant methods. If your HTML is generated by a tool such as delphicodetodoc, it is madness to try to reconstruct that index independently with the method above. The tool has a higher-level view of the sources and can generate a much higher-quality index, and adapting it will probably take less effort.
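For reference, the index information handed to the CHM compiler is the HTML Help "sitemap" format (a .hhk file). A sketch of emitting it in Python, assuming you already have (keyword, target-page) pairs from whatever tool or scan produced them (the entry names below are invented):

```python
from html import escape

def make_hhk(entries):
    """Emit an HTML Help index (.hhk) sitemap from (keyword, target)
    pairs. This is the file format the CHM compiler consumes for the
    keyword index; a documentation tool would normally generate the
    equivalent structure itself."""
    lines = ["<HTML><BODY><UL>"]
    for keyword, target in entries:
        lines.append(
            '  <LI><OBJECT type="text/sitemap">'
            '<param name="Name" value="%s">'
            '<param name="Local" value="%s">'
            "</OBJECT>"
            % (escape(keyword, quote=True), escape(target, quote=True)))
    lines.append("</UL></BODY></HTML>")
    return "\n".join(lines)
```

A nested `<UL>` inside an `<LI>` gives sub-entries; the flat form above is the simplest case.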
Please note that I assume you really mean the index, not full-text search, which is automatic.
I don't know how delphicodetodoc works (and because of its draconian build requirements I won't try), but the problems you describe sound like delphicodetodoc is simply not that good at generating CHM.
fpdoc has as a design decision that it will not take documentation from comments in the source (considered messy), and I don't think that will change.