The service locator, MediaWikiServices, is intended to facilitate
"manual" dependency injection in static entry points.
See also the Dependency Injection RFC T384 and Service Locator
RFC T124792 for details.
The following key points were implemented according to the
discussion surrounding these RFCs:
* a configurable DI container that allows extensions to add and replace services.
* no auto-wiring, since it's prone to add confusion in large and complex applications.
* no 3rd party framework, since they typically do too much.
The following services in MediaWiki core are made accessible via the service
locator to showcase the bootstrapping mechanism:
* ConfigFactory and MainConfig
* SiteLookup and SiteStore
However, the implementation of these services has not yet been converted to
use proper DI throughout the code.
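A minimal usage sketch (the getter names match the services listed above; the
'Sitename' lookup is only an illustration):

use MediaWiki\MediaWikiServices;

// In a static entry point, fetch services from the locator instead of
// constructing them by hand or reaching for globals.
$services = MediaWikiServices::getInstance();
$config = $services->getMainConfig();      // main Config instance
$siteLookup = $services->getSiteLookup();  // SiteLookup service
$sitename = $config->get( 'Sitename' );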
Bug: T124792
Change-Id: I3c25c0ac17300d3dd13e1cf5100558a605eee15f
I have left the phpdocs in NullStatsdDataFactory
to clearly show the return types of the interface
implemented.
Change-Id: I96cb64b4af16fc087028269a53d539f8c132f81c
SearchInputWidget is similar to a TitleInputWidget, but doesn't have
a visible loading indication, doesn't highlight the first result and
uses the opensearch api endpoint for suggestions, instead of
prefixsearch.
Extra points:
* Improve documentation of mw.widgets.TitleInputWidget's configuration
option validateTitle
Bug: T118443
Change-Id: I8b8098041fe2971389fa908d007d2e77255829ec
This breaks out the toggle checkbox code into a separate class in
includes/ListToggle.php
Bug: T92230
Change-Id: I8d1aefb83008053e63d59abf8b8915b93e15fcc2
This moves static logic out of WatchedItem into a
class that we can slowly fill with watchlist database
and storage things.
This also removes the dual namespace handling in
the new implementation.
Change-Id: Ia67ab1d200ac393c65013b2091e61acefcb3defb
AutoLoaderTest didn't know about traits.
generateLocalAutoload found a trait missing from the autoloader and
a class-map ordering issue.
Change-Id: I34bf2698ad838b6a977c9bf39f6e416330ff0e5d
Add the data values and types to the exception raised when mismatched
session data is processed. This is done by passing the old and new
values on via a new MetadataMergeException class. The attached data is
added to the debug logging context info when caught.
Change-Id: If8a7174399289bc284ca1b36052ba515c8857c50
SessionManager cannot work in the installer since it depends on
ObjectCache which is just an EmptyBagOStuff and so doesn't store
anything. So, introduce a custom SessionProvider which pretends to
persist sessions but actually doesn't.
Bug: T126177
Change-Id: I13d8aa1453c519df7c19ca2f1fb052c99ade043c
* New class ApiMergeHistory handles action=mergehistory
* Merge History functionality moved from SpecialMergeHistory to
MergeHistory
* SpecialMergeHistory now uses MergeHistory for actual merging
* Unit tests and i18n messages for above
Bug: T69742
Change-Id: Ic5078307dae78a2b3687e34a5d0a584988d483a1
By default it still uses PrefixSearch and supports PrefixSearchBackend,
but these can be deprecated and phased out in favor of SearchEngine
extensions.
New APIs:
- SearchEngine
public function defaultPrefixSearch( $search );
public function completionSearch( $search );
public function completionSearchWithVariants( $search );
Search engines should override:
protected function completionSearchBackend( $search );
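A rough sketch of a backend override (the engine class and result strings are
hypothetical; this assumes SearchSuggestionSet::fromStrings() is available to
wrap plain title strings):

class MyCompletionSearchEngine extends SearchEngine {
    protected function completionSearchBackend( $search ) {
        // Ask an engine-specific backend for completion matches, then
        // wrap the resulting title strings in a SearchSuggestionSet.
        $titles = array( 'Main Page', 'MediaWiki' ); // stand-in results
        return SearchSuggestionSet::fromStrings( $titles );
    }
}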
Bug: T121430
Change-Id: Ie78649591dff94d21b72fad8e4e5eab010a461df
Ie161e0f was done in a hurry, and so didn't do things in the best ways.
This introduces a new "CachedBagOStuff" that transparently handles all
the logic that had been copy-pasted all over in Ie161e0f.
The differences between CachedBagOStuff and MultiWriteBagOStuff are:
* CachedBagOStuff supports only one "backend".
* There's a flag for writes to only go to the in-memory cache.
* The in-memory cache is always updated.
* Locks go to the backend cache (with MultiWriteBagOStuff, it would wind
up going to the HashBagOStuff used for the in-memory cache).
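A rough usage sketch (assuming the cache-only write is exposed as a
WRITE_CACHE_ONLY flag passed to set()):

// Wrap a slower backend with a request-local in-memory layer.
$store = new CachedBagOStuff( ObjectCache::getInstance( CACHE_ANYTHING ) );

$store->set( 'session:abc', $data ); // write-through to the backend
$store->set( 'session:abc:local', $extra, 0, CachedBagOStuff::WRITE_CACHE_ONLY ); // in-memory only
$value = $store->get( 'session:abc' ); // served from memory after the first read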
Change-Id: Iea494729bd2e8c6c5ab8facf4c241232e31e8215
The plan here is to take it out of 1.27.0-wmf.12 and put it back in
1.27.0-wmf.13.
Since BotPasswords depends on SessionManager, that's getting temporarily
removed too.
This reverts the following commits:
* 6acd424e0d SessionManager: Notify AuthPlugin before calling hooks
* 4d1ad32d8a Close a loophole in CookieSessionProvider
* fcdd643a46 SessionManager: Don't save non-persisted sessions to backend storage
* 058aec4c76 MessageCache: Don't get a ParserOptions for $wgUser before the end of Setup.php
* b5c0c03bb7 SessionManager: Save user name to metadata even if the user doesn't exist locally
* 13f2f09a19 SECURITY: Fix User::setToken() call on User::newSystemUser
* 305bc75b27 SessionManager: Don't generate user tokens when checking the tokens
* 7c4bd85d21 RequestContext::exportSession() should only export persisted session IDs
* 296ccfd4a9 SessionManager: Save 'persisted' flag in session metadata
* 94ba53f677 Move CSRF token handling into MediaWiki\Session\Session
* 46a565d6b0 Avoid false "added in both Session and $_SESSION" when value is null
* c00d0b5d94 Log backtrace for "User::loadFromSession called before the end of Setup.php"
* 4eeff5b559 Use $wgSecureCookie to decide whether to actually mark secure cookies as 'secure'
* 7491b52f70 Call session_cache_limiter() before starting a session
* 2c34aeea72 SessionManager: Abstract forceHTTPS cookie setting
* 9aa53627a5 Ignore auth cookies with value 'deleted'
* 43f904b51a SessionManager: Kill getPersistedSessionId()
* 50c5256352 SessionManager: Add SessionBackend::setProviderMetadata()
* f640d40315 SessionManager: Notify AuthPlugin when auto-creating accounts
* 70b05d1ac1 Add checks of $wgEnableBotPasswords in more places
* bfed32eb78 Do not raise a PHP warning when session write fails
* 722a7331ad Only check LoggedOut timestamp on the user loaded from session
* 4f5057b84b SessionManager: Change behavior of getSessionById()
* 66e82e614e Fix typo in [[MediaWiki:Botpasswords-editexisting/en]]
* f9fd9516d9 Add "bot passwords"
* d7716f1df0 Add missing argument for wfDebugLog
* a73c5b7395 Add SessionManager
Change-Id: I2389a8133e25ab929e9f27f41fa9a05df8147a50
The test in I3a426b92892f4c00cab33a13f6a717751120367c would have prevented
the need for this commit :P
Follow up: I2b875326a0eb1e6d1f4bc758b8ac97b8f9324c4e
Change-Id: Id5efc5a5b43a1795c133f64025ea04f3e2d159cb
Move ZhConversion.php and Names.php to languages/data and make them both
expose their data as static class variables instead of in the local
scope. This means that the autoloader can be used to load the data,
which is efficient and secure. This also makes additional request-local
caching of the arrays unnecessary.
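Illustrative shape of the data classes after the move (class placement and
member names here only sketch the pattern, not the exact result):

// languages/data/Names.php
class Names {
    public static $names = array(
        'en' => 'English',
        'de' => 'Deutsch',
        // ...
    );
}

// Callers can now rely on the autoloader instead of require-ing the file:
$languageNames = Names::$names;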
Change-Id: Iafb96ac4165d0965fcb9a69f1d0a91139ea9790c
To be used by things that do not care whether they are
passed a Title or a TitleValue, so long as it has a
dbkey, namespace, optional fragment and text.
Example uses include:
LinkBatch::addObj
MediaWikiPageLinkRenderer methods
MediaWikiTitleCodec methods
PageLinkRenderer methods
TitleFormatter methods
Change-Id: I2aa1a82129bb01155924a8e6a403c7a47391432f
User keeps most of its token-related methods because anon edit tokens
are special. Login and createaccount tokens are completely moved.
Change-Id: I524218fab7e2d78fd24482ad364428e98dc48bdf
Maybe someone added it by hand :) It will add some noise to
other changes, where autoload.php is generated with
maintenance/generateLocalAutoload.php
Change-Id: If064fcc1fe87165df72eaf368117e62a51b24e95
None of this works and it's been long begging for a mercy kill.
All it does is waste contributor time on updating deprecations
in the dead code. I imagine we wouldn't reuse much of this
code if we're ever going to reimplement it.
Bug: T119336
Change-Id: Ibd26a4bea621857aac77823017e9be9b7dc52cca
Imported RandomRootPage extension as SpecialRandomrootpage, including
its aliases and localization messages.
Bug: T109809
Change-Id: I7252ae9f4a8f1822b023cc4f0d3a732af48d84d3
Bot passwords are something like OAuth-lite, or Google's application
passwords: the bot can use API action=login to log in with the special
username and password, and will then be allowed to use the API with a
restricted set of rights.
This is intended to provide an easy migration path for legacy bots and
for bots on wikis without OAuth, since AuthManager is going to greatly
complicate non-interactive authentication. If OAuth is available, an
owner-only consumer would be a better choice.
Bug: T121113
Change-Id: Iaa4015e00edbfbfaedcc8b2d27a2d3fd25009159
Depends-On: I7e15331efb162275c4116bcae61f19d6b884cbe3
This also adds code to User to allow SessionProviders to apply the grant
restrictions without needing to hook UserGetRights.
Change-Id: Ida2b686157aab7c8240d6a7a5a5046374ef86d52
SessionManager is a general-purpose session management framework, rather
than the cookie-based sessions that PHP wants to provide us.
While fallback is provided for using $_SESSION and other PHP session
management functions, they should be avoided in favor of using
SessionManager directly.
For proof-of-concept extensions, see OAuth change Ib40b221 and
CentralAuth change I27ccabdb.
Bug: T111296
Change-Id: Ic1ffea74f3ccc8f93c8a23b795ecab6f06abca72
Only works for the currently logged-in user (I'm not sure how that works
with bots, if OAuth will do that correctly or whatever) but will help us
do some neat stuff with tools that use the stash - including resuming
uploads.
Bug: T85561
Change-Id: I215ac6936185563f4c7b42a4bced65e4b096fd15
importTextFiles.php can be used to import pages from text files
containing wikitext.
Also, added $userObj to WikiRevision so that it can accept a User
object instead of just a username.
This is a GCI task.
Change-Id: I20eaf2005bdd3d041f55d8c0b108f001c064d638
Into a new MediaWikiPageNameNormalizer.
The code has been copied over almost 1:1; I only
left the PHPUnit test special handling in MediaWikiSite.
Change-Id: I008cadd29a2aa1f21098339b895c35a100959b04
Use the Maintenance class's new $orderedOptions and support for
passing options multiple times. This allows for option "chaining".
The BackupDumper and TextPassDumper classes now extend Maintenance, but
should continue to function as before. The public function processArgs()
has been removed and replaced by processOptions(), which takes no
parameters. It is unlikely that users of these classes were calling
processArgs.
Inheritors of these classes that overrode processOption() will now need to
override processOptions() and use Maintenance::getOption() and friends.
The maintenance/backupTextPass.inc file has been deleted. Users should
include maintenance/dumpTextPass.php instead.
Bug: T122587
Change-Id: I2473ee119c185d1b2b00ac4b1e70ee8a6cafe4a3
The autocomplete search allows special pages to define the list of
subpages to be suggested. Use user names on the following special pages,
because these special pages can be called with a user name as subpage.
Special:Block
Special:Contributions
Special:DeletedContributions
Special:Emailuser
Special:Listfiles
Special:Unblock
Special:Userrights
This makes it easier to navigate to these special pages with a prefilled
user name field.
Hidden user names are never shown, because the suggestions are cached
and shared between privileged and non-privileged users.
Change-Id: I7db575bf66caaa5136489ed99f1655673b55adaf
Moved classes in Export.php to separate files in the new directory
includes/export/ and updated autoload.php with the new locations.
Bug: T122531
Change-Id: Idd3bba5a85d65c952f2ff503bea2ca76624c9b7f
These scripts interact with keys that used to be set by ParserCache.php.
However, this hasn't been the case for a long time now. They use wfIncrStats(),
which is configured by $wgStatsdServer.
Change-Id: Id6a62aec57801085ed684af9362a96eca0914e92
This is a low level catch-all net for huge updates that still slip
through. Features that let users add/remove arbitrarily many rows
to lists of arbitrary size can easily result in high lag due to
strange usage patterns or deliberate attacks.
Also removed duplicate 'autochange-username' JSON key.
Bug: T95501
Change-Id: I58a91ca23cae528ef1954d2d78c8f0a90681983e
* DeferrableUpdate classes can implement MergeableUpdate.
Duplicate updates will be merged via the merge() method.
* Make SquidUpdate support merge() so that duplicate URL
purges are now caught across the entire pre-send request
execution.
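As an illustration of the pattern (hypothetical update class; SquidUpdate's
actual merge logic may differ in detail):

class UrlPurgeUpdate implements DeferrableUpdate, MergeableUpdate {
    /** @var string[] */
    private $urls;

    public function __construct( array $urls ) {
        $this->urls = $urls;
    }

    public function merge( MergeableUpdate $update ) {
        /** @var UrlPurgeUpdate $update */
        // Fold the duplicate update into this one so only a single purge runs
        $this->urls = array_unique( array_merge( $this->urls, $update->urls ) );
    }

    public function doUpdate() {
        // Issue one combined CDN purge for all collected URLs
    }
}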
Change-Id: Idffdd3e71d89e4a0f28281e65a881113caae497c
* PRESEND/POSTSEND constants can now be used in addUpdate()
and addCallableUpdate() to control when the update runs.
This is useful for updates that may report errors the client
should see or to just get a head start on queued or pubsub
based updates like CDN purges. The OutputPage::output() method
can easily take a few hundred milliseconds.
* Removed some argument b/c code from doUpdates().
* Also moved DeferrableUpdate to a separate file.
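For example (the callback body and the update class name are placeholders):

// Run after the response has been flushed to the client
DeferredUpdates::addCallableUpdate( function () {
    // e.g. push a job or send CDN purges
}, DeferredUpdates::POSTSEND );

// Run before output is sent, so errors can still be shown to the client
DeferredUpdates::addUpdate( new SomeDeferrableUpdate(), DeferredUpdates::PRESEND );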
Change-Id: I9831fe890f9f68f9ad8c4f4bba6921a8f29ba666
* Recursive link updates no longer mention any category changes.
It's hard to avoid either duplicate mentioning of changes or
confusing explicit and automatic category changes.
* LinksUpdate no longer handles this logic, but rather WikiPage
decides to spawn this update when needed in doEditUpdates().
* Fix race conditions with calculating category deltas. Do not
rely on the link tables for the read used to determine these
writes, as they may be out-of-date due to slave lag. Using the
master would still not be good enough since that would assume
FIFO and serialized job execution, which is not guaranteed.
Use the parser output of the relevant revisions to determine
the RC rows. If 3 users quickly edit a page's categories, the
old way could misattribute who actually changed what.
* Make sure RC rows are inserted in an order that matches that
of the corresponding revisions.
* Better avoid mentioning time-based (parser functions) category
changes so they don't get attributed to the next editor.
* Also wait for slaves between RC row insertions if there were
many category changes (in theory it could be well over 10K rows).
* Using a separate job better separates concerns as LinksUpdate
should not have to care about recent changes updates.
* Added more docs to $wgRCWatchCategoryMembership.
Bug: T95501
Change-Id: I5863e7d7483a4fd1fa633597af66a0088ace4c68
Anything that wants to be "central" right now has to depend on
CentralAuth, and then either can't work without CentralAuth or has to
branch all over the place based on whether CentralAuth is present. Most
of the time all it really needs is a mapping from local users to central
user IDs and back or the ability to query whether the local user is
attached on some other wiki, so let's make an interface for that in
core.
See I52aa0460 for an example implementation (CentralAuth), and Ibd192e29
for an example use (OAuth).
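A sketch of intended usage (assuming the new core interface is CentralIdLookup
with a factory() accessor; treat the method names as illustrative):

$lookup = CentralIdLookup::factory();

// Map the local user to a central user ID (0 if there is none)
$centralId = $lookup->centralIdFromLocalUser( $user );

// Check whether this local user is attached to the central account
$attached = $lookup->isAttached( $user );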
Bug: T111302
Change-Id: I49568358ec35fdfd0b9e53e441adabded5c7b80f
MediaWiki:Foo.json and User:Foo/bar.json pages now have a default
content model of JSON, but existing pages using those names will be set
to defaults of wikitext.
The content models of those pages are now set to "json", unless a page has
invalid syntax, in which case it will be set to "wikitext".
For convenience, the script is automatically run as part of update.php.
Bug: T108663
Change-Id: I1412937ccea8e65dba58580beec79cbf2286ae01
It's a dropdown select with the ability to add custom options, or a
text field with input suggestions, whichever you prefer.
This is meant to be a replacement for HTMLSelectOrOtherField and
HTMLAutoCompleteSelectField.
In regular HTML mode, it uses the HTML5 `<datalist>` element with no
custom JavaScript. This is supported by a wide range of browsers
(IE 10+, modern Firefox and Chrome, Opera 12+).
In OOUI mode, it uses a ComboBoxInputWidget.
Depends on: I14b40884f185fb4e5
Bug: T118119
Change-Id: I954d3d24ed4efe90be9596a1bd9586ba3aee1e23
The detail information about the revision count and the interwiki
title is moved into log params. To show the new information, an
ImportLogFormatter was written, which changes the key for new log items.
This allows the detail information to be shown in the user's language.
It also decouples the user-supplied data from the detail information.
Change-Id: Iaa57da0fc3e20c33c94fd850cd02db96fdb32dac
Follows-up f36b73e96c, which moved these classes to libs/objectcache.
* Fix wrong @throws in MemcachedPeclBagOStuff.
* Fix wrong @returns in MemcachedBagOStuff::getClient().
* Rename MWMemcached to MemcachedClient.
* Remove mention of 'memcached.php', which doesn't exist anymore.
Change-Id: I34dbc859be4778cea489fd2344f233f30452605c
SiteObject and SiteArray were deprecated in 1.21 and
I can't find any use of them anywhere else in Gerrit.
Change-Id: Iff3ba8a60ac9566998cce828c4034066fdefe804
* Add `Timing`, an interface which mimics the W3C User Timing API.
It provides a canonical way to store and retrieve markers (timestamps)
and measures (timestamps + duration).
* As the initial use-case, use it to record 'requestShutdown'.
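For instance (marker names are made up; this assumes the request context
exposes the Timing object via getTiming()):

$timing = RequestContext::getMain()->getTiming();

$timing->mark( 'myTaskStart' );
// ... do some work ...
$timing->mark( 'myTaskEnd' );

$measure = $timing->measure( 'myTask', 'myTaskStart', 'myTaskEnd' );
// $measure['duration'] should hold the elapsed time between the two marks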
Change-Id: I36b29162ffcc091406df025463b0e2797e52f19a
Also consistently use self:: instead of BagOStuff:: for constants
referenced within the BagOStuff class.
Change-Id: I20fde9fa5cddcc9e92fa6a02b05dc7effa846742
References to ./composer.lock are changed to ./vendor/composer/installed.json,
and dependency information is now read from the installed.json file.
A new ComposerInstalled class has been declared, and its
getInstalledDependencies() method is now used to read the data.
Bug: T106247
Change-Id: Ic216577bb19b4fc5832ba003fcbbe9195d707b41
Also removed a few MW dependencies from MemcachedPeclBagOStuff.
It still uses an IP class method, so it has to stay for now.
Change-Id: I8c5c83046c58fb58091d6ce11b2385208262460f
This uses a non-standard output format and requires a custom
collector that WMF no longer maintains or uses.
Change-Id: I41a68f7061465417fbdc5ca41f8eb6e1f99f1111
I was initially going to refactor out the error message into Import.php,
but it quickly became apparent that WikiImporter's error handling needs a
LOT of work. In particular, to localise the error message into the user's
language is sadly non-trivial.
Although not used currently, the MWUnknownContentModelException subclass
will help with error handling improvements in the future.
Bug: T49270
Change-Id: I9f53c9d6a8a2ea842cb2ba94d4131e10a8b08f5d
Add a simple class to `libs/` for memoizing functions by caching return values
in APC. I decided not to make this an external library just yet because I see
this as potentially a part of a larger functional programming library.
Doesn't use APCBagOStuff for two reasons: (1) avoid dependency on MediaWiki
code; (2) ability to pass the &$success reference parameter to apc_fetch,
to distinguish between cache misses and cached false values.
Use this in ResourceLoaderFileModule to cache CSSMin::remap.
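The core idea, sketched outside the new class (the helper function and key
scheme are illustrative, not the class's actual API):

function memoizedCall( $callable, array $args ) {
    $key = 'memoized:' . md5( serialize( array( $callable, $args ) ) );

    // The &$success out-parameter distinguishes a miss from a cached false
    $value = apc_fetch( $key, $success );
    if ( !$success ) {
        $value = call_user_func_array( $callable, $args );
        apc_store( $key, $value, 3600 );
    }
    return $value;
}

// e.g. memoizedCall( 'CSSMin::remap', array( $css, $localDir, $remoteDir ) );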
Change-Id: I00a47983a2583655d4631ecc9c6ba17597e36b5f
* Simplify the debug log call and use queries group
* Remove $wgDebugDumpSqlLength, as profiler output
already has shortened query strings (one can use
profiling without DBO_DEBUG)
* Removed $wgDebugDBTransactions as BEGIN/COMMIT queries already show in the logs
* Removed PostgresTransactionState as it was only used for
$wgDebugDBTransactions handling
* This cuts down on lots of global variable usage
Change-Id: I185adb1694441d074dea965960429b4910727620
* Most callers gracefully check wfReadOnly(),
but fail in case they don't. This also catches
foreign DBs which might slip through the cracks.
* Also remove useless wfDebug() call around
mDoneWrites check as write queries show in
the logs anyway.
Change-Id: I560ebd19c4eb2b3a040d4331702346440617cfaa
This gives static method callers the option
to use methods like Database::factory() instead
of having to use the uglier DatabaseBase::factory().
Change-Id: I61800626b71ad2803a897df060059dbaf8778679
MWCryptRand already has some useful utility functions wrapping PHP's
hash() and hash_hmac(). Let's make them public so we can use them from
other code.
But since "MWCryptRand" isn't really a good place for hashing functions,
let's move them to "MWCryptHash" instead.
Change-Id: I7542c719ac72beba7b0f6aa170bdb4c69fa6beab
* These jobs should only be constructed via the relevant Content object,
e.g. the result of enqueueUpdate() being called on a DataUpdate
returned by Content::getSecondaryUpdates().
* Also modified LinksDeletionUpdate to support a $pageId parameter.
* LinksDeletionUpdate can now be enqueued to a DeleteLinksJob.
Change-Id: I650dcf0bd172ede0d61357ec158a4704ae1f2033
The class only contains two dependencies upon MediaWiki (ObjectCache &
wfGlobalCacheKey) which are suitable for inclusion in the utils
directory.
Change-Id: I85b4c763be2670c40f26d93e75cedcb68eaa7987
Caching the output of a LESS compiler is tricky, because a LESS file may
include additional LESS files via @imports, in which case the cache needs
to vary as the contents of those files vary (and not just the contents of
the primary LESS file).
To solve this, we first introduce a utility class, FileContentsHasher. This
class is essentially a smart version of md5_file() -- given one or more file
names, it computes a hash digest of their contents. It tries to avoid
re-reading files by caching the hash digest in APC and re-using it as long as
the files' mtimes have not changed. This is the same approach I used in
I5ceb8537c.
Next, we use this class in ResourceLoaderFileModule in the following way:
whenever we compile a LESS file, we cache the result as an associative array
with the following keys:
* `files` : the list of files whose contents influenced the compiled CSS.
* `hash` : a hash digest of the combined contents of those files.
* `css` : the CSS output of the compiler itself.
Before using a cached value, we verify that it is still current by asking
FileContentsHasher for a hash of the combined contents of all referenced files,
and we compare that against the value of the `hash` key of the cached entry.
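In code, the validation step looks roughly like this (the cache handle and key
are illustrative):

$cached = $cache->get( $cacheKey );
if ( is_array( $cached )
    && FileContentsHasher::getFileContentsHash( $cached['files'] ) === $cached['hash']
) {
    return $cached['css']; // every referenced file is unchanged, reuse the CSS
}
// Otherwise recompile the LESS file and store a fresh files/hash/css entry.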
Bug: T112035
Change-Id: I1ff61153ddb95ed17e543bd4af7dd13fa3352861
* Updates can now declare themselves as having enqueueUpdate()
as an alternative to doUpdate(). This lets more expensive
or slave lag producing updates use the job queue if desired.
* Added a $mode flag to DataUpdate::runUpdates() to prefer
pushing jobs over calling doUpdate().
* Made page deletions defer deletion updates when possible.
Bug: T95501
Change-Id: Ic6f50f92768089ba0fbc223b8d178f5a91512959
This allows a logging channel to be configured to write
directly to kafka. Logs can be serialized either to json
blobs or the more compact apache avro format.
The Kafka handler for Monolog needs a list of one or more
kafka servers to query cluster metadata from. This should be
able to use any monolog formatter, although some like
JsonFormatter require you to disable formatBatch, as the Kafka
protocol encodes each record independently. This requires the
nmred/kafka-php library,
version >= 1.3.0.
Adds a new formatter which serializes to the apache avro
format. This is a compact binary format which uses pre-
defined schemas. This initial implementation is very simple
and takes the plain schemas as a constructor argument.
Adds a new option to MonologSpi to wrap handlers in a
BufferHandler. This doesn't flush until the request shuts
down and prevents any network requests in the logger from
adding latency to web requests.
Related mediawiki/vendor update: Ibfe4bd2036ae8e998e2973f07bd9a6f057691578
The necessary config is something like:
array(
    'loggers' => array(
        'CirrusSearchRequests' => array(
            'handlers' => array( 'kafka' ),
        ),
    ),
    'handlers' => array(
        'kafka' => array(
            'factory' => '\\MediaWiki\\Logger\\Monolog\\KafkaHandler::factory',
            'args' => array( 'localhost:9092' ),
            'formatter' => 'avro',
            'buffer' => true,
        ),
    ),
    'formatters' => array(
        'avro' => array(
            'class' => '\\MediaWiki\\Logger\\Monolog\\AvroFormatter',
            'args' => array(
                array(
                    'CirrusSearchRequests' => array(
                        'type' => 'record',
                        'name' => 'CirrusSearchRequests',
                        'fields' => array( ... )
                    ),
                ),
            ),
        ),
    ),
)
Bug: T106256
Change-Id: I6ee744b3e5306af0bed70811b558a543eed22840
This adds a "requires" property to extension.json, which extensions and
skins can use to indicate which versions of MediaWiki core they support.
The hacky wfUseMW() is now deprecated in favor of this.
Rather than writing our own version constraint and parser library, we
can re-use composer's, which was recently split out into a separate
library named "composer/semver" for this patch.
Any syntax accepted by composer[1] is available for usage here. Test
cases have been provided to demonstrate how versions are parsed. For now
it is recommended that people stick to expressing compatibility with
stable versions (e.g. ">= 1.26").
This patch does not support requiring specific MediaWiki core WMF
branches, since those do not follow the standard semver format that
composer parses. If we are unable to parse $wgVersion, all checking will
be skipped and reported as compatible.
[1] https://getcomposer.org/doc/01-basic-usage.md#package-versions
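In extension.json this looks something like (the extension name is a
placeholder):

{
    "name": "MyExtension",
    "requires": {
        "MediaWiki": ">= 1.26"
    }
}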
Bug: T99084
Change-Id: I7785827216e16c596356d0ae42d6b30f3f179f10
* Split tidy implementations into a class hierarchy
* Bring all tidy configuration into a single associative array and
deprecate the old configuration.
* Remove $wgAlwaysUseTidy
This is preparatory to replacement of Tidy (T89331). I used the name
"Raggett" for things relating to Dave Raggett's Tidy, since if we use
"tidy" to mean the new abstract system as well as Raggett's tidy, it
gets confusing.
Change-Id: I77af1a16cbbb47fc226d05fb9aad56c58e8910b5
Recreated the form with OOUI widgets. This required putting together
ComplexTitleInputWidget to handle the split namespace+title field on
this page.
Bug: T86865
Change-Id: Ice69df851137e3454ae2c9f4c75494b18cf8a75a
Complete the 'librarization' of IPSet by replacing the code in core with a
dependency on the external library.
Change-Id: I789b4fb42ee1da44ea3d8e1db551b047e11a439e
mw.ForeignApi is an extension of mw.Api, automatically handling
everything required to communicate with another MediaWiki wiki via
cross-origin requests (CORS).
Authentication-related MediaWiki extensions may extend it further to
ensure that the user authenticated on the current wiki will be
automatically authenticated on the foreign one. A CentralAuth
implementation is provided in I0fd05ef8b9c9db0fdb59c6cb248f364259f80456.
Bug: T66636
Change-Id: Ic20b9682d28633baa87d22e6e9fb71ce507da58d
* Refactor NamespaceInputWidget into two widgets: NamespaceInputWidget
and ComplexNamespaceInputWidget. The former is now only the dropdown
(and inherits from DropdownInputWidget), the latter is the dropdown
plus two checkboxes.
* Change ComplexNamespaceInputWidget configuration to take nested config
for `invert`, `associated`, and `namespace`, rather than require
parameters like `invertName` and so on for every combination.
* Implement standalone JavaScript versions of both widgets (previously
mw.widgets.NamespaceInputWidget could only be created via infusion
of the PHP widget).
Bug: T99256
Bug: T106138
Bug: T109559
Change-Id: Ie2fee6d035339ceb934fca991675480db3d630d1
Though this script was run on Wikimedia sites years ago (see T8399), there
are enough reasons to doubt it will be run again:
* There is a hardcoded maximum page_id value, which would have to be
changed or removed before reuse.
* It scans the entire page table in a single SELECT query and stores all
values of page_id and page_latest in a PHP array, which might not be
feasible on a large wiki.
* It writes directly to slaves. In contrast, the manual page for
pt-table-sync (from Percona Toolkit) says in general, "[...] it always
makes the changes on the replication master, never the replication slave
directly. This is in general the only safe way to bring a replica back
in sync [...]".
* It only works on the page/revision/text tables. In contrast, pt-table-sync
can work on any table having a primary key.
* It does try to detect whether revisions are missing on the master (instead
of on the slave). However, this won't work because of a bug introduced in
r91243 / bb1df74f87 (it actually queries the master twice and puts the
second result set in $slaveIDs).
Change-Id: I85c98821af308abf7dde8068d7cbca17d06b1362
Migrate the move-protect log as the first sub type of the protection log,
because it does not have complex log parameters that would need special
handling/migration.
It also keeps the gerrit change smaller and hopefully makes review
easier.
The other sub types of the protection log will be migrated in a later
patch set.
This allows use of gender on Special:Log. Old message is kept for use
in IRC. A test was added to ensure an unchanged IRC message.
Bug: T47988
Change-Id: I57b3bd8a7dc823acdbb56520d2364f5542283373
This is a set of classes written for Echo to simplify writing
maintenance scripts that iterate over an entire table and update
some of those rows.
This has shown to be reusable elsewhere, especially the BatchRowIterator
class, and will be useful to have generally available in core. The Echo
classes are all prefixed with the Echo name, so there won't be any
conflict if both are installed.
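A rough usage sketch (the table and columns are just an example; method names
are assumed to carry over unchanged from the Echo classes):

$it = new BatchRowIterator( wfGetDB( DB_SLAVE ), 'page', 'page_id', 500 );
$it->setFetchColumns( array( 'page_id', 'page_namespace', 'page_title' ) );

foreach ( $it as $batch ) {
    foreach ( $batch as $row ) {
        // Update each row here, e.g. via a BatchRowUpdate writer
    }
    wfWaitForSlaves(); // keep replication lag in check between batches
}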
Change-Id: I64c1751106caf34f41af799dbaf8794115537f06
Add a Monolog Formatter class that uses
MWExceptionHandler::getRedactedTraceAsString when outputting stack
traces.
Bug: T107440
Change-Id: Ic580c137e27aac95435f7b073a18cf61820b172f