* r41081 was causing the job queue to be flooded with tiny htmlCacheUpdate jobs that were smaller than the batch size and so, according to the original logic, should have been done immediately. This was causing template updates to be delayed even when the template had few backlinks. This is fixed.
* Introduced a shared cache called BacklinkCache with the main purpose of sharing data from these backlink queries, thus recovering the performance of r41081.
* Refactored backlink partitioning code, which in r40741 was copied from HTMLCacheUpdate to LinksUpdate with a bug intact. The bug caused every htmlCacheUpdate or refreshLinks2 job to be split into at least two pieces even when the number of rows was less than the batch size (see the sketch after this list).
* Fixed a bug from r40741 causing refreshLinks2 jobs with end=false to be ignored.
* Made SquidUpdate::newFromTitles() accept a TitleArray
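For illustration, the intended post-fix behaviour looks roughly like this (a sketch only, not the actual HTMLCacheUpdate/BacklinkCache code; $batchSize, doImmediateUpdate() and queueJob() are made-up names):

    // Small backlink sets are processed right away; only larger sets are
    // partitioned, and ceil() avoids the old "always at least two pieces" bug.
    $numRows = count( $backlinkTitles );
    if ( $numRows <= $batchSize ) {
        doImmediateUpdate( $backlinkTitles ); // hypothetical helper
    } else {
        $numBatches = ceil( $numRows / $batchSize );
        for ( $i = 0; $i < $numBatches; $i++ ) {
            $batch = array_slice( $backlinkTitles, $i * $batchSize, $batchSize );
            queueJob( $batch ); // hypothetical helper
        }
    }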
Doxygen documentation update:
* Changed all @addtogroup to @ingroup. @addtogroup adds the comment to the group description, but doesn't add the file, class, function, ... to the group like @ingroup does. See for example http://svn.wikimedia.org/doc/group__SpecialPage.html where it's impossible to see the related files, classes, ... that should belong to that group.
* Added @file to file descriptions; it seems it has to be explicitly declared for file descriptions, otherwise Doxygen will think the comment documents the first class, variable, function, ... in that file (see the example after this list).
* Removed some empty comments
* Removed some ?>
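As a sketch of the new style (the group name here is just an example), a file-level comment now looks something like:

    /**
     * @file
     * @ingroup SpecialPage
     * Without @file, Doxygen would attach this comment to the first
     * class, variable or function in the file; with @addtogroup the
     * text would only extend the group description instead of adding
     * the file itself to the group.
     */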
Added following groups:
* ExternalStorage
* JobQueue
* MaintenanceLanguage
One more thing: there are still a lot of warnings when generating the doc.
* Convert "$dbw =& wfGetDB( DB_MASTER );" --> "$dbw = wfGetDB( DB_MASTER );"
* Convert "$skin =& $wgUser->getSkin();" --> "$skin = $wgUser->getSkin();"
For the time being the function definitions of wfGetDB() and User::getSkin() have not been changed [i.e. they both still return by reference], so as to ensure the interface does not change for extensions [some of which may still be trying to run on PHP4 environments]. However, presumably at some point this can be changed too.
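For illustration (a sketch, not the real wfGetDB()/getSkin() source): a function declared to return by reference keeps working for callers that use a plain assignment on PHP5, which is why the call sites can drop the =& without touching the definitions:

    // Hypothetical return-by-ref function in the same style
    function &getSharedThing() {
        static $thing = null;
        if ( $thing === null ) {
            $thing = new stdClass;
        }
        return $thing;
    }

    $a =& getSharedThing(); // old call style: explicit reference assignment
    $b = getSharedThing();  // new call style: plain assignment, fine on PHP5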
Also includes tiny tweak to newlines in parserTests - will show 1 rather than 2 newlines between the "Reading tests from" strings when in quiet mode.
Somebody changed the parameters and return value of Image->thumbUrl()
and didn't update all uses. Silly of them!
Also add a paranoia check on the URLs in the list in SquidUpdate
And don't redefine the socket options constants when called a second time
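The guard amounts to something like the following (constant names and values here are illustrative, not necessarily the ones SquidUpdate defines):

    // Only define the socket option constants if a previous call (or the
    // sockets extension) hasn't already defined them.
    if ( !defined( 'SO_RCVTIMEO' ) ) {
        define( 'SO_RCVTIMEO', 20 );
    }
    if ( !defined( 'SO_SNDTIMEO' ) ) {
        define( 'SO_SNDTIMEO', 21 );
    }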
You can now feed both local relative URLs and fully-qualified URLs to the
squid purge functions; local URLs will have $wgInternalServer prepended
if needed. This avoids the broken URLs produced by other functions prepending
it when it wasn't always needed.
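The prepending step is roughly this (a sketch; expandURL() is an illustrative name, not the actual function):

    function expandURL( $url ) {
        global $wgInternalServer;
        if ( substr( $url, 0, 1 ) === '/' ) {
            // Local relative URL: prepend the internal server
            return $wgInternalServer . $url;
        }
        // Already fully qualified: leave it alone
        return $url;
    }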
dealing with moves and deletes of widely-linked pages.
The updaters should be fixed up to handle future versions that lack these tables, without breaking upgrades.
* Introduced image "broken links" allowing the user to quickly upload an image with that name
* "Upload a new version of this image" link from the image description page
* DB_WRITE now called DB_MASTER, DB_READ now called DB_SLAVE
* Converted to use SQL wrapper functions instead of direct SQL in various places (see the sketch after this list)
* Experimental method for preserving the chronological order of events when slave servers are used. Untested.
* Fixes to the new post-parse existence test feature
* Some.. other stuff
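As a sketch of the wrapper style (assuming Database methods along the lines of selectRow()/update(); the table and field names are only illustrative), reads go to a slave and writes to the master:

    $dbr = wfGetDB( DB_SLAVE );   // formerly DB_READ
    $row = $dbr->selectRow( 'cur',
        array( 'cur_id', 'cur_is_redirect' ),
        array( 'cur_namespace' => 0, 'cur_title' => 'Foobar' ) );

    $dbw = wfGetDB( DB_MASTER );  // formerly DB_WRITE
    $dbw->update( 'cur',
        array( 'cur_touched' => $dbw->timestamp() ),
        array( 'cur_id' => $row->cur_id ) );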
* Split MediaWiki namespace into MediaWiki and Template (requires changes to all language files)
* Purge links to on edit of Template namespace
* General refactoring of purging and cache invalidation code
This saves trouble in a number of places where we can now do joins with the link tables to get other info (such as cur_is_redirect!) as well as the name, and fewer bits need to be juggled on page renaming, as outgoing links no longer have to be changed (cur_id remains the same when a page is renamed).
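In rough terms (column names follow the cur/links schema of that era and may not be exact), getting the redirect flag along with the title is now a single join on cur_id:

    // Sketch only: fetch titles and redirect status of everything the
    // page with cur_id $id links to.
    $id = intval( $pageId );
    $sql = "SELECT cur_title, cur_is_redirect
            FROM links, cur
            WHERE l_to = cur_id AND l_from = $id";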
rebuildLinks.inc and some of the tools in the 'maintenance page' still need to be updated to work with the new setup. (Special:Maintenance needs a *lot* of cleanup in general. It's kind of a catch-all of vaguely defined features which suck performance like a hydroelectric dam.)
Also I've slipped in some extra debug code. And, I think 'indexes.sql' is a big waste of time and should all be moved into tables.sql. Building indexes separately doesn't help on InnoDB and won't do anything on MyISAM either if you're just going to replace the table after it's built with an imported one from a dump which creates it with indexes.
The URL-generating functions have been replaced with a set of practical, clear methods:
Title::getLocalURL() - "/wiki/index.php/Foobar" or "/wiki/index.php?title=Foobar&action=edit"
Title::getFullUrl() - ditto with $wgServer on the front
Title::getInternalUrl() - ditto with $wgInternalServer on the front (for some squid-related functions)
Title::escapeLocalUrl() - local URL escaped for HTML output
Title::escapeFullUrl() - full URL escaped for HTML output
All take an optional query parameter.
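A quick usage sketch (the exact output strings depend on configuration; the shapes are as shown above):

    $title = Title::newFromText( 'Foobar' );
    $view = $title->getLocalURL();                  // "/wiki/index.php/Foobar"
    $edit = $title->getLocalURL( 'action=edit' );   // "/wiki/index.php?title=Foobar&action=edit"
    $hist = $title->getFullUrl( 'action=history' ); // ditto, with $wgServer on the front
    echo '<a href="' . $title->escapeLocalUrl() . '">Foobar</a>';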
Title::getURL(), wfFullUrl() and wfFullUrlE() are now officially
deprecated and will result in instant death. wfLocalUrl() and wfLocalUrlE()
will be killed shortly; they are still used in the language files.
* Converted many instances of globals taken from the query string to $_REQUEST
* Renamed near-useless Title::getURL() to Title::getPartialURL()
* Created new Title::getURL(), to replace wfLocalUrl, wfLocalUrlE, wfFullUrl and wfFullUrlE. Replaced most instances throughout the code
* In Parser.php, generalised stripping of <nowiki>, <pre> and <math> to allow more general use such as nesting (see the sketch after this list)
* Moved body of Article::preSaveTransform to Parser.php
* Put lots of comments in Title.php
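The stripping idea, in generic form (a sketch of the marker technique only, not the actual Parser.php code, which also covers <pre>, <math>, escaping and nesting):

    // Replace each <nowiki> section with a unique marker, run ordinary
    // parsing on the rest, then restore the original contents.
    $state = array();
    $i = 0;
    while ( preg_match( '/<nowiki>(.*?)<\/nowiki>/s', $text, $m ) ) {
        $marker = "\x7fUNIQ-" . ( $i++ ) . "-QINU\x7f";
        $state[$marker] = $m[1];
        $text = str_replace( $m[0], $marker, $text );
    }
    // ... wiki parsing runs on $text without touching the stripped parts ...
    $text = strtr( $text, $state );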