How should I handle autolinking in the contents of a wiki page?

What I mean by autolinking is the process by which wiki links embedded in the content of a page are rendered either as a hyperlink to the target page (if it exists) or as a creation link (if it does not).

The parser I use works in two steps: first, the page contents are analyzed and all links to wiki pages are extracted from the raw markup. Then I pass the array of existing pages back to the parser before the final HTML is generated.
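
Roughly, this is the kind of flow I mean (a simplified sketch, not my actual parser code; the regex, method names and URL scheme are just placeholders):

```ruby
WIKILINK = /\[\[([^\]|]+)(?:\|([^\]]*))?\]\]/

# Pass 1: pull every wikilink target out of the raw markup.
def extract_link_targets(markup)
  markup.scan(WIKILINK).map { |target, _label| target.strip }.uniq
end

# Pass 2: render to HTML once the caller has resolved which targets exist.
def render_links(markup, existing_titles)
  markup.gsub(WIKILINK) do
    target = Regexp.last_match(1).strip
    label  = Regexp.last_match(2) || target
    if existing_titles.include?(target)
      %(<a href="/wiki/#{target}">#{label}</a>)
    else
      %(<a class="new" href="/wiki/#{target}?action=create">#{label}</a>)
    end
  end
end

markup = "See [[Existing Page]] and [[Missing Page|this one]]."
puts extract_link_targets(markup).inspect
puts render_links(markup, ['Existing Page'])
```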

What is the best way to handle this? It seems I should keep a cached list of every page on the site rather than pulling the full index of page titles on every render. Or is it better to check each link individually to see whether its target exists? That could mean a large number of database queries if the list is not cached. Would either approach still be viable for a larger wiki with thousands of pages?
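
One middle ground I can imagine is resolving all of one page's links in a single batched query rather than one query per link or a full cached index. A sketch of that, assuming SQLite and a hypothetical pages(title) table:

```ruby
require 'sqlite3'

# Return only the link targets that actually exist, using one IN (...) query.
def existing_titles(db, link_targets)
  return [] if link_targets.empty?
  placeholders = (['?'] * link_targets.size).join(', ')
  db.execute("SELECT title FROM pages WHERE title IN (#{placeholders})", link_targets).flatten
end

db = SQLite3::Database.new('wiki.db')
existing = existing_titles(db, ['Main Page', 'Sandbox', 'Missing Page'])
# `existing` now holds only the titles that have pages; everything else
# gets a creation link when the markup is rendered.
```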

+4
6 answers

On my own wiki I check every link (with no caching), but my wiki is only used internally by a few people. You should benchmark this kind of thing.

+1

On my own wiki the caching system is fairly simple: when a page is refreshed, it checks its links to make sure they are valid and applies the correct formatting/destination to those that are not. The cached page is then saved as an HTML file in my cache root.

Pages flagged as "not created" during a refresh are inserted into a database table that stores the missing page title together with a CSV list of the pages that link to it.

When someone eventually creates that page, it triggers a scan of each linking page, and each of those pages is re-cached with the correct link and formatting.
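
A rough sketch of that bookkeeping using SQLite; the table and column names are made up for illustration, not taken from my actual setup:

```ruby
require 'sqlite3'

db = SQLite3::Database.new('wiki.db')
db.execute <<~SQL
  CREATE TABLE IF NOT EXISTS wanted_pages (
    title         TEXT PRIMARY KEY,
    linking_pages TEXT  -- CSV of pages that link to the missing title
  )
SQL

# During a page refresh: record that `from_page` links to a missing `title`.
def record_wanted(db, title, from_page)
  row = db.get_first_row('SELECT linking_pages FROM wanted_pages WHERE title = ?', [title])
  if row
    pages = row[0].split(',') | [from_page]
    db.execute('UPDATE wanted_pages SET linking_pages = ? WHERE title = ?', [pages.join(','), title])
  else
    db.execute('INSERT INTO wanted_pages (title, linking_pages) VALUES (?, ?)', [title, from_page])
  end
end

# When the missing page is finally created: re-cache every page that linked to it.
def on_page_created(db, title)
  row = db.get_first_row('SELECT linking_pages FROM wanted_pages WHERE title = ?', [title])
  return unless row
  row[0].split(',').each do |page|
    puts "re-render and re-cache #{page}"  # stand-in for the real re-cache step
  end
  db.execute('DELETE FROM wanted_pages WHERE title = ?', [title])
end
```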

If you are not interested in highlighting pages that have not been created yet, however, you could simply check whether the page exists when someone tries to access it, and redirect to the creation page if it does not. Then you just link to pages as usual from other articles.
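
With something like Sinatra that could look roughly like this (a sketch; it assumes pages are stored as Markdown files under ./pages, which is not necessarily how you would do it):

```ruby
require 'sinatra'

get '/wiki/:title' do
  path = File.join('pages', "#{params[:title]}.md")
  # If the page has never been created, send the visitor to the edit/create form.
  redirect "/wiki/#{params[:title]}/edit" unless File.exist?(path)
  "<pre>#{File.read(path)}</pre>"  # stand-in for the real rendering step
end
```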

+1

I tried to do this once, and it was a nightmare! My solution was an ugly loop inside a SQL stored procedure, and I do not recommend it.

One thing that gave me problems was deciding which link to use for a multi-word phrase. Say your text reads "I use Stack Overflow" and your wiki has three pages called "stack", "overflow" and "stack overflow"... which part of the phrase links to which page? It will happen!
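
One common way around that ambiguity (a sketch, not what I actually did) is to try the longest known titles first, so "stack overflow" wins over "stack" or "overflow":

```ruby
# Build one alternation with the longest titles first; at any given position
# the regex engine then prefers the longest matching title.
def autolink(text, titles)
  pattern = Regexp.union(titles.sort_by { |t| -t.length }.map { |t| /\b#{Regexp.escape(t)}\b/i })
  text.gsub(pattern) { |match| %(<a href="/wiki/#{match}">#{match}</a>) }
end

puts autolink('I use Stack Overflow', ['stack', 'overflow', 'stack overflow'])
# => I use <a href="/wiki/Stack Overflow">Stack Overflow</a>
```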

+1

My approach would be to query for the titles with something like SELECT title FROM articles and check whether each wikilink is in that array of strings. If it is, you link to the page; if not, you link to a creation page.
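
Something along these lines (a sketch assuming SQLite and the articles table above): pull the titles once per render and do membership checks in memory.

```ruby
require 'sqlite3'
require 'set'

db = SQLite3::Database.new('wiki.db')
existing = db.execute('SELECT title FROM articles').flatten.to_set

# View link for existing pages, creation link otherwise.
def link_for(title, existing)
  if existing.include?(title)
    %(<a href="/wiki/#{title}">#{title}</a>)
  else
    %(<a class="new" href="/wiki/#{title}?action=create">#{title}</a>)
  end
end

puts link_for('Some Article', existing)
```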

0

In a personal project I did with Sinatra (link text), after running the content through Markdown I do a gsub to replace WikiWords and other patterns (for example, [[Here is my link]] and so on) with the correct links, checking for each one whether the page exists and linking either to the view page or to the creation page.

It is not the best approach, but I did not build this application with caching or speed in mind. It is a basic, simple wiki.

If speed were more important, you could wrap the application in a cache. For example, a Sinatra app can be wrapped in Rack caching.
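
For example (a sketch using the rack-cache gem; the inline app here just stands in for the real wiki):

```ruby
# config.ru
require 'sinatra/base'
require 'rack/cache'

class WikiApp < Sinatra::Base
  get '/wiki/:title' do
    cache_control :public, max_age: 60   # allow Rack::Cache to keep the response
    "rendered page for #{params[:title]}"
  end
end

use Rack::Cache,
    metastore:   'file:tmp/cache/meta',
    entitystore: 'file:tmp/cache/body'

run WikiApp
```

Rack::Cache only stores responses that carry normal HTTP caching headers, which is why the route sets cache_control.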

0

Based on my experience developing Juli, a stand-alone personal wiki, a static HTML generation approach can solve your problem.

You might think it takes a long time to generate an autolinked wiki page. However, with static HTML generation, the autolinked pages are regenerated only when a wiki page is added or deleted (in other words, not when a page is merely updated), and the regeneration can run in the background, so it usually does not take long from the user's point of view. The user only ever sees the already-generated static HTML.
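
A rough sketch of the idea (not Juli's actual code; the file layout and helper names are assumptions): rebuild the static HTML only when the set of page titles changes, i.e. when a page is added or removed.

```ruby
require 'digest'
require 'fileutils'

def titles_fingerprint(titles)
  Digest::SHA1.hexdigest(titles.sort.join("\n"))
end

def regenerate_if_needed(titles, last_fingerprint, out_dir: 'public')
  current = titles_fingerprint(titles)
  return current if current == last_fingerprint  # no page was added or removed
  FileUtils.mkdir_p(out_dir)
  titles.each do |title|
    # A real implementation would re-run the autolinking renderer here,
    # possibly from a background job; the placeholder body keeps this runnable.
    File.write(File.join(out_dir, "#{title}.html"), "<!-- rendered #{title} -->")
  end
  current
end

fingerprint = regenerate_if_needed(['Home', 'Sandbox'], nil)
regenerate_if_needed(['Home', 'Sandbox'], fingerprint)  # no-op: nothing changed
```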

0

Source: https://habr.com/ru/post/1276358/

