SQLite-Talk

Summary: Talk Page for the SQLite recipe
Maintainer: Petko
Categories: Administration

Please place your comments here. --Petko March 04, 2008, at 05:01 AM


Hi Petko, thank you for this recipe. I tried it and honestly, when I saw the speed I nearly fell off my chair! Before I start converting all my pages to SQLite, I would like to ask you to consider the following development (to prevent future problems with the number of pages and, perhaps, to improve performance when many users read and write pages at the same time):

Multiple databases associated with the same database connection. A fixed number: in my case I would need no more than 26 (alphabetic) databases (e.g. a.sqlite.db, b.sqlite.db, c.sqlite.db, etc.), so that if a page belongs to a group whose name starts with the letter A, the page is stored in the database a.sqlite.db; if it starts with B, it is saved in the database b.sqlite.db; and so on. I quote from an official www.sqlite.org page:

“The ATTACH statement is an SQLite extension that allows two or more databases to be associated to the same database connection and to operate as if they were a single database. The number of simultaneously attached databases is limited to SQLITE_MAX_ATTACHED which is set to 10 by default. The maximum number of attached databases cannot be increased above 125. The maximum number of attached databases can be lowered at run-time using the sqlite3_limit(db,SQLITE_LIMIT_ATTACHED,size) interface.”

I'm a beginner with databases, but I think that the “in-memory (temporary) database” generated by the ATTACH statement (described here) could improve performance when the site has many simultaneous reads and writes. Thanks – Frank July 09, 2015, at 04:33 AM
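For readers unfamiliar with ATTACH, here is a minimal Python/sqlite3 sketch of the mechanism quoted above. This is not the recipe's code; the database names and table layout are invented for illustration, and in-memory databases stand in for files like a.sqlite.db and b.sqlite.db.

```python
import sqlite3

# One connection, two attached databases, as the quoted docs describe.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS a_db")
conn.execute("ATTACH DATABASE ':memory:' AS b_db")
conn.execute("CREATE TABLE a_db.pages (name TEXT, text TEXT)")
conn.execute("CREATE TABLE b_db.pages (name TEXT, text TEXT)")
conn.execute("INSERT INTO a_db.pages VALUES ('Apple.HomePage', '...')")
conn.execute("INSERT INTO b_db.pages VALUES ('Banana.HomePage', '...')")
# Attached databases can be queried together, as if they were one:
names = [r[0] for r in conn.execute(
    "SELECT name FROM a_db.pages UNION ALL SELECT name FROM b_db.pages")]
print(names)
```

Whether splitting pages across 26 files would actually help concurrency would need measuring; SQLite locks per database file, so separate files can reduce write contention between groups.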

Sorry, I no longer use the recipe and I don't have enough free time to work on it. If you or someone else would like to improve it, the GPLv3 license allows it so go for it. [Removed demo code which didn't work well.] --Petko July 09, 2015, at 10:53 AM

Thanks for the quick response. I tried the code, it works well, but it is not what I need. I will send you an email shortly. – Frank July 09, 2015, at 06:24 AM


Good day Petko,

I am setting up PmWiki using BlogIt for the homepage, combined with wiki and publish to PDF capabilities for other pages. How robust is your SQLite recipe? Is it currently used in production environments? Do you continue to support it with bugfixes and improvements?

It is not extensively tested but there are no known bugs. It is currently used in a dozen small sites with 1-3 editors and <1000 visitors/day. Installing it together with other recipes which modify the PageStore class may be tricky. --Petko December 02, 2011, at 05:10 AM

Thank you very much for the warning about CustomPageStore. I will reconsider flatfiles, since PDF capability is essential.
Luke, December 2, 2011, 07:09 AM EST

Also, what is the crossover threshold at which point performance is noticeably improved by switching from flat files to SQLite? How many thousands of text files does it take for standard PmWiki to slow down a bit, where SQLite-enhanced PmWiki is faster? I am especially interested in search speed, as I do not care to set up Google to handle my search function.

The recipe is not extensively tested in various environments. The performance of the flat files depends greatly on the hardware (SSDs are faster than SATA drives, which are faster than older IDE/PATA disks) and on the filesystem -- a recent ReiserFS or Ext4 system will be very fast for many small files. Using it in wikis with many authors editing at the same time may degrade performance. See also this discussion at the mailing list. --Petko December 02, 2011, at 05:10 AM

Thank you,
Luke
December 02, 2011, at 05:31 AM EST


Hi Petko, I am interested in trying this recipe but I want to be sure I have a backup plan in place. Is it possible to uninstall the SQLite recipe but keep page edits that were made while it was installed?

Thanks, Rik Blok February 22, 2010, at 08:10 PM

Hello. No, not yet. --Petko February 22, 2010, at 10:14 PM

Thanks for letting me know, Petko. There's no rush, I appreciate all the great work you're doing! -- Rik Blok February 23, 2010, at 12:01 AM


Isn't it dangerous to store the DB in $WorkDir? It is not a temporary file. Deleting the contents of $WorkDir ought not to lose any data, just caches/indexes. Maxim January 16, 2010, at 06:06 AM

No, it isn't: $WorkDir is 'wiki.d', the directory where the wiki pages (disk files) are stored in a normal PmWiki installation. Moreover, the recipe doesn't impose this directory; an admin can install the database elsewhere. --Petko January 16, 2010, at 06:42 AM


Short URLs no longer work with version 20090414. The short URL page variables (ShortName, ShortPage, ShortURL) now show the default page names. Only the display of the short URLs is broken - the redirection of short URLs still works.
OliverJakoubek April 22, 2009, at 10:14 AM

Thanks, fixed in 20090422 (just released). --Petko April 22, 2009, at 10:42 AM


This recipe is fantastic! I hope it will prove itself stable so it can be used on production sites.
Blues March 05, 2008, at 04:55 AM

Wow, I'm very impressed, Petko! You mention that there's the possibility of storing page-text variables in separate fields... have you done that yet? If so, this could be a much simpler solution than DataQuery for many sites, because DataQuery has all the overhead of ADOdb.
Ben Stallings March 09, 2008, at 11:20 AM

Thanks for your comments! Blues: yes, I plan to use it (if performance is really better) on a number of production sites, so hopefully it will get better. Ben: no, I haven't, but I will in the next few days/weeks for a real website. The code is written so that such local customizations can already be added (in the $SQLiteWriteFunctions list); I'll show some examples soon. In the meantime, reading the code and tinkering with it is allowed... :-) --Petko March 10, 2008, at 04:34 PM


This is an excellent recipe Petko, well done! It generally seems to work well on my Win32 server with PHP 5.2 and the latest PmWiki. If you are interested, I have the following observations:

  • If you have the recipe InlineDiff installed then page diffs do not work. I expect it's something the other cookbook author is doing that is non-standard.
  • If you have existing pages under wiki.d, they are NOT found, contrary to the recipe docs. I've played with the 'PageStore' configuration you specify on your page and it simply will not find my existing pages under wiki.d. It DOES find the pages under wikilib.d, and editing any of them adds it to the SQLite database. My workaround for now is to copy all my wiki.d files to wikilib.d.
  • The page delete protection is great, especially the additional Site.LocalTemplates additions. I've played with this a bit and noticed that when you delete a page which has a number of historical edits, then restore it, all the old history is lost. I expect this is by design, and you clear it out when creating the deleted version of the page. This feature would be perfect if the historical edits were preserved from delete to restore.
  • When I do a (:pagelist fmt=#diffsearch:) on a page I get a great summary listing with links for edit and diff - very nice. For some reason I don't get any number next to the "rev" word, e.g. (3 rev). I assume something is wrong here, as it works fine on your demo site.

--Smc April 13, 2008, at 10:11PM BST

Thanks a lot for your feedback. Here are my comments:

  • There is a conflict with InlineDiff as both recipes try to handle the page history. To make them both work, try adding to config.php (after including SQLite):
 $HandleDiffFmt[array_search('function:PrintDiff', $HandleDiffFmt)] = 'function:PrintDiffInline';

You could also set this line, so that the paginated links default to "markup view" (slightly different from the code in the recipe):

 if(!isset($_GET['source'])) $_GET['source'] = $_REQUEST['source'] = 'y';
  • I do have existing pages in wiki.d and they are found. Maybe use this code instead of the one I suggested:
 $WikiLibDirs = array(
   &$WikiDir,
   new PageStore('wiki.d/{$FullName}'), ### this line changes
   new PageStore('$FarmD/wikilib.d/{$FullName}')
 );
  • "Undeleting" a page should restore all history: there was a bug, fixed in version 84E (just released). Thanks for catching this! :-)
  • The =$RevMatches variable works only with pages that have already been edited (and therefore copied to the sqlite database), AND the pagelist needs to have at least one of the parameters user=, ip=, days= (non-empty).

Thanks, --Petko April 13, 2008, at 06:09 PM

Wow, that was a fast response, thanks for that. Here's my follow-up feedback:

  • Your recommended configuration of 'HandleDiffFmt' is great and now works fine with InlineDiff recipe - cool.
  • Your patch to sqlite.php has sorted out the history restore 'feature' ;-) This works great now.
  • Adding user=* sorted out the 'rev' numbering issue. I was unlucky, as I wanted the summary for everyone; trust me to pick the configuration that is not supported.
  • I still had an issue with the search order for pages. I've been digging and found it was my fault. I thought I had a clean install of PmWiki - something I always use to debug new recipes - however I still had the XESBlog recipe installed. Once I disabled that, all was working again. Further experimenting showed that I could also get it to work with XESBlog if I did an 'include_once' of xesblog.php before the $WikiLibDirs configuration - I had it including sqlite.php, then the $WikiLibDirs configuration, then including xesblog.php; I had to swap the last two to get it working.

In summary, a happy man: you've sorted all the issues I found and deconflicted with the recipes I use!

More icing on the cake would be a way to automatically import 'old' flat files into the sqlite database without editing each one - I have over a thousand!

--Smc April 14, 2008, at 10:23PM BST

  • A note about pagelists: if you just want to list all pages and their full number of revisions held in the database, it is much faster to use days=9999 (a big number of days, longer than the lifetime of your wiki) than the wildcards user=* or ip=*. (Even this is not very optimized if you only want the total number of all revisions; see below for a better one.)
  • You could also display the total number of revisions of any page (in/out of sqlite, any pagelist, no need to user=...) by adding a custom PageVariable in config.php:
    $FmtPV['$RevCount'] = '$page["rev"]';
    and then use in any pagelist template your new variable {=$RevCount}.
  • The blog recipe has a PageStore definition that conflicts with the existing one; I fear that if it is included before SQLite, it may not always find its own pages/templates; it can be done differently, so I'll talk to Crisses. (The line 24 of xesblog.php could be changed to :
        array_splice($WikiLibDirs, isset($WikiLibDirs[1])?-1:1, 0, 
            array(new PageStore('$FarmD/xeslib.d/{$FullName}')));
    or something similar, and then it should be OK to include it after SQLite, which is highly recommended.)
  • The import feature is on my non-urgent todo list.

Thanks. --Petko April 14, 2008, at 05:28 PM


I have just converted 1,280,000 text files into PmWiki files and used the PerGroupSubDirectories recipe to store those files in wiki.d, like this:

$FmtPV['$ixI'] = 'strtoupper(substr($group,0,1))';
$FmtPV['$ixII'] = 'strtoupper(substr($group,0,2))';
$WikiDir = new PageStore('wiki.d/$ixI/$ixII/$Group/{$FullName}');
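As a sketch of what those three config lines do (this is illustrative Python, not PmWiki code), each page is filed under the first letter and the first two letters of its group, uppercased:

```python
def page_path(full_name: str) -> str:
    # Mirror the $ixI/$ixII bucketing of the config above:
    # first letter and first two letters of the group, uppercased.
    group = full_name.split(".", 1)[0]
    ix1 = group[:1].upper()   # $ixI
    ix2 = group[:2].upper()   # $ixII
    return f"wiki.d/{ix1}/{ix2}/{group}/{full_name}"

print(page_path("Recipes.SQLite"))  # → wiki.d/R/RE/Recipes/Recipes.SQLite
```

This keeps any single directory from holding more than a small fraction of the 1,280,000 files, which is what makes plain page views fast even on filesystems that handle large directories poorly.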

PmWiki works fine as long as I don't use pagelists or try to search anything, but I was wondering if I could still use PmWiki for this site, perhaps using your recipe, and I have some questions about it.

My questions are:

  1. Will pagelists and search work if I use your recipe?
  2. Will PmWiki still be slow with this number of files?
  3. Do I have to create more sqlite.db files and folders, just like in PerGroupSubDirectories, to make it work?
  4. Or would it be best to discard PmWiki as the primary alternative and use MySQL with custom programming instead?

Thank you for any advice you may have

CarlosAB July 26, 2008, at 01:16 PM

Hi. First, I do not know anyone who has used this recipe with that many files -- I have at most 200-300 wiki pages in the database. A test with that many files would be great, to see if there is some difference.

Second, there is not yet an "import" function that would copy all wiki pages to the SQLite database: when you install the recipe, only newly created or edited pages will be placed in the database. An import function is on my ToDo list, but I am not sure when it will happen.

About your questions:

  1. I have a feeling that pagelists/searches/includes/PageTextVariables with SQLite run faster than with text files on a filesystem, but this needs to be verified with a large number of files. Some common searches or pagelists could be greatly optimized with a custom function or data table.
  2. The recipe does not need many files, it usually works with one sqlite.db file, where the wiki pages, metadata and edit history are stored.
  3. You do not "have" to, one database file should be able to handle many millions of records, according to the SQLite system developers. You "can" have per-group database files if you need to limit a search to the current group; this however is not trivial and should be done by experienced wiki-admins as some important PmWiki pages need to always be visible.
  4. The recipe uses the SQLite engine as a database storage (in a file), with the included functions in PHP 5, and not MySQL or another external database server. SQLite has some advantages (small, fast, robust) and some limitations that should not be a problem for a PmWiki file storage class.

Hope that helps. --Petko July 30, 2008, at 05:28 AM
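Point 2 above (one database file, one query per search) can be illustrated with a small sqlite3 sketch. The table layout here is hypothetical, not the recipe's actual schema, and an in-memory database stands in for the sqlite.db file:

```python
import sqlite3

# Hypothetical layout: one database holds every page, so a full-text
# search is a single SQL query instead of opening thousands of flat files.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (name TEXT PRIMARY KEY, text TEXT)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?)",
    [(f"Group{i}.Page{i}", f"body text {i}") for i in range(1000)],
)
hits = conn.execute(
    "SELECT name FROM pages WHERE text LIKE ?", ("%text 999%",)
).fetchall()
print(hits)  # → [('Group999.Page999',)]
```

With an index (or SQLite's FTS extension) such queries stay fast as the page count grows, which matches the search speed-ups reported elsewhere on this page.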


Wow, PmWiki with SQLite is fast as hell! Tested with 1000+ pages: pagelists, includes, searches etc. perform A LOT faster. Truly amazing. SQLite support should be included in the PmWiki core, IMHO. Nice job, Petko! -- Arnold

Talk page for the SQLite PageStore class recipe (users).