The `supportsContentModel` method is really querying Parsoid for the
set of content models it supports, so it makes sense to put it in the
Parsoid-specific SiteConfig service.
This is part of the work to deprecate and remove ParsoidOutputAccess.
Change-Id: I81eb2df8cef93ede95361a4e03185b3d58e5b84b
This patch adds the MediaWiki\Content namespace declaration to
JsonContent and establishes a class alias marked as deprecated
since version 1.43.
Bug: T353458
Change-Id: I44abb1ab5bd1fabf9886dc1457e241d7cae068bc
This patch adds the MediaWiki\Content namespace declaration to
TextContent and establishes a class alias marked as deprecated
since version 1.43.
Bug: T353458
Change-Id: Ic251b1ddfcf6db9c85cb54cddf912aa827d2bc3a
This patch adds the Wikimedia\FileBackend namespace declaration to
FileBackend and establishes a class alias marked as deprecated
since version 1.43.
Bug: T353458
Change-Id: Id897687b1d679fd7d179e3a32e617aae10ebff33
This commit replaces some of the uses of getErrorsArray(),
getWarningsArray(), getErrorsByType(), and getErrors().
In many cases the code becomes shorter and clearer.
Follow-up to Ibc4ce11594cf36ce7b2495d2636ee080d3443b04.
Change-Id: Id0ebeac26ae62231edb48458dbd2e13ddcbd0a9e
This patch adds the MediaWiki\Json namespace declaration to
FormatJson and establishes a class alias marked as deprecated
since version 1.43.
Bug: T353458
Change-Id: I5e1311e4eb7a878a7db319b725ae262f40671c32
This patch introduces a new namespace declaration, MediaWiki\Xml,
adds Xml and XmlSelect to it, and establishes class aliases marked
as deprecated since version 1.43.
Bug: T353458
Change-Id: I45cccd540b6e15f267d3ab588a064fbeb719d921
Creates two metrics, one emitting milliseconds for
backwards-compatibility and another emitting seconds, aligned with
Prometheus' upstream recommendations.
Update test.
Bug: T359382
Change-Id: Id5a22937b60a209b6ba46633879551d24cf93a45
Similar to what was done to the CA script, incorporating:
* a09e50656c22f486a2169318d3bc020592c47ffe
* da2d6db2c028b94c3eb63331c9035dbf22140c9e
* 087fca71e3d05d891d74ba36738b3347f45b95f5
* 224fcb473a9326e524b3f7dc0eae36cd4db61501
* 1be0586281881c8c9e485501fe5c28ab060bcc3f
Change-Id: I80a7046a4a0c6a699645aced225362baeb6cd732
* Deprecate and stop using $wgBlockTargetMigrationStage. Remove
block_target migration code.
* Make the $schema parameters to DatabaseBlockStore methods default to
SCHEMA_BLOCK. Avoid passing these parameters where possible.
* Remove cleanupBlocks.php
* Deprecate DatabaseBlock static methods which try to present the old
schema for b/c.
Bug: T362133
Change-Id: I845bad8cc09a4528fef46a6f9d490ebdec881d99
Migrate from ipblocks to block/block_target and drop the ipblocks
table. Update tests.
In PostgresUpdater, change some schema update functions to skip field
updates if the table doesn't exist, by analogy with
DatabaseUpdater::modifyField.
Bug: T346293
Change-Id: Icf91b35f7f729cead7c800429653eb30731762a1
Prior to this patch, userOptions.php was missing
the LIMIT SQL clause, which meant it was trying
to delete all rows within a single SQL query.
This patch adds the missing LIMIT, and userOptions.php
now actually batches when --delete is used.
Bug: T364311
Change-Id: Icc9311febbdda4dbda5280b466cd5483c578a7b1
Wikidata is at 51.3% of its maximum value (see [1]), so we still have a
comfortable amount of time to make this change, but it will inevitably be
necessary and there is no point in postponing this change into the
future. The autoincrement value will not get smaller. ¯\_(ツ)_/¯
Also since rc only stores stuff for 30 days, the table is not that
big.
[1] - https://grafana.wikimedia.org/d/79S1Hq9Mz/wikidata-reliability-metrics?viewPanel=29&orgId=1
Bug: T63111
Change-Id: Icf3dc9815814ef73aa6a39f1c221a349e6b76872
* Drop default value from rev_actor and rev_comment_id
* Make rev_id a bigint
Bug: T215466
Depends-On: I88318d7bcc063bc86a56eeb5f00048ea6e81964b
Change-Id: Id0a3d920e8b2dc8643fa3c0341b34ab3ed5761dc
Why:
* The generateSitemap.php script currently generates a sitemap
using pages from the namespaces in the $wgSitemapNamespaces,
or if this is not defined then all namespaces that currently
have pages.
* However, being able to specify what namespaces should be used
instead of using the site config is useful in the case of
generating sitemaps ad-hoc and/or if that config is not set.
What:
* If the '--namespaces' argument is provided, then use it instead
of checking for the site config or looking for all namespaces
that are defined on the wikis.
* Use the namespaces provided in the namespace argument over the
site config. If no namespaces are provided in the command line
arguments, then check the site config.
Bug: T19748
Change-Id: If4a393605201be00200833c36b522bb34fcb651d
Writing joins with the join() function of the SelectQueryBuilder is
easier to read than listing the tables and putting the join condition
in the WHERE clause.
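As a rough sketch of the before/after shape (table, column, and
variable names here are illustrative, not taken from this change):

```
// Before: both tables listed, join condition mixed into WHERE.
$res = $dbr->newSelectQueryBuilder()
	->select( [ 'page_title', 'rev_timestamp' ] )
	->tables( [ 'page', 'revision' ] )
	->where( [ 'rev_page = page_id', 'page_namespace' => 0 ] )
	->caller( __METHOD__ )
	->fetchResultSet();

// After: the join condition lives in join(), WHERE keeps only real filters.
$res = $dbr->newSelectQueryBuilder()
	->select( [ 'page_title', 'rev_timestamp' ] )
	->from( 'page' )
	->join( 'revision', null, 'rev_page = page_id' )
	->where( [ 'page_namespace' => 0 ] )
	->caller( __METHOD__ )
	->fetchResultSet();
```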
Change-Id: I4d52f73c1c07f68cd0d06f36f8e5696d0533d238
DBConnRef is internal; using the DatabaseSqlite class directly for this
special maintenance script seems valid.
Change-Id: I82c0085b9953367557f32c3ebb8dea3aacc9c1ae
In the following test cases in MaintenanceRunnerTest.php:
yield 'extension script path, using prefix'
=> [ 'FakeExtension:fakeScript.php', FakeScript::class ];
// NOTE: assumes the class has been loaded by the previous test case!
yield 'extension class, using prefix'
=> [ 'FakeExtension:FakeScript', FakeScript::class ];
…the noted assumption did not quite hold on Windows.
After including …/fakeScript.php, the class was indeed loaded. The
next case would use the loaded class, but before it did that, it
first tried to include …/FakeScript.php - which doesn't exist, but
is the same as …/fakeScript.php on case-insensitive filesystems,
so the file was loaded twice, causing "Fatal error: Cannot declare
class MediaWiki\Extension\FakeExtension\Maintenance\FakeScript,
because the name is already in use in …\fakeScript.php on line 0".
Re-order some code in MaintenanceRunner, so that it tries to use the
existing class (or autoload it) before it tries to include the file.
Change-Id: Ic7ed7139bbede48097df59338c82688081688c3b
* Update the maintenance Makefile to point to npm run doc and drop the custom file
* Drop sync references to the eg-iframe system, dropped in 5a3922a4a
* Drop a file from OOUI only imported for said eg-iframe system
Bug: T138401
Change-Id: Ic34c028ef6b43e2ba3dc6f215b6a1e7d94d97e0a
According to the dictionary, "per" (or more conventionally "as per")
means "according to". Refer OED "per" sense II.3.a. For example:
"No value was passed, so return null, as per default".
In this sentence, we are not specifying the default, we are referring
to the default. This correct usage of "per default" was used nowhere
in MediaWiki core as far as I can see.
Instead we have "per default" being used to mean "by default", that is,
giving the value to use when no explicit value was specified.
In OED, the phrase "by default" is blessed with its own section just
for computing usage:
"P.1.e. Computing. As an option or setting adopted automatically by a
computer program whenever an alternative is not specified by the user
or programmer. Cf. sense I.7a."
There are highly similar pre-computing usages of the same phrase,
whereas the phrase "per default" is not mentioned.
As a matter of style, I think "per default" should not be used even
when it is strictly correct, since the common incorrect usage makes it
ambiguous and misleading.
Change-Id: Ibcccc65ead864d082677b472b34ff32ff41c60ae
Why:
- We don't want to allow unlimited acquisition of temp account names.
These should be rate limited in a similar way to how we limit the
creation of temp accounts.
What:
- Provide a TempAccountNameAcquisitionThrottle, and use it in the
acquireName() method
- Set a default that is 10 times the limit of
TempAccountNameCreationThrottle
Depends-On: If660aad1d0f04f366414084aff3f88484a19d416
Bug: T343101
Change-Id: I99d5973498a89ac227847de5837c0a8e895c28fb
Also type-hint for IReadableDatabase
Depends-On: I15fc617eae8bc18c911525ea382e99e82b40011a
Depends-On: Ib77f8f409b48115684396bf920428adb075c2820
Change-Id: I09d07ba11e8cd6d288c1ed5ddea89b654e24cbb2
Remove checks for false from IDatabase::select() as this is not
possible; a DBQueryError is thrown instead (documented since
efda8cd3 / I056b7148).
Use IResultWrapper::numRows() to check for an empty IResultWrapper.
This ignores includes/libs/rdbms, as QUERY_SILENCE_ERRORS is an internal
option to get false from this function.
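A minimal sketch of the pattern being cleaned up (table and field names
are illustrative):

```
// Before: guarding against a false return that cannot happen anymore.
$res = $dbr->select( 'page', [ 'page_id' ], [ 'page_namespace' => 0 ], __METHOD__ );
if ( $res === false || $res->numRows() === 0 ) {
	return;
}

// After: only the emptiness check remains.
$res = $dbr->select( 'page', [ 'page_id' ], [ 'page_namespace' => 0 ], __METHOD__ );
if ( !$res->numRows() ) {
	return;
}
```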
Change-Id: I4b2fc26ca0e68612f6beadc01e68097a74962c84
Commit 676fcf43 changed various substr() calls to str_starts_with(),
except in some cases it adopted `$str[0]` or `$str[-1]` notation
instead. The cases in question:
* includes/Request/PathRouter.php
This one was safe since `$path` is already presumed to be non-empty
earlier in the same function via `$path[0]`.
* maintenance/generateSitemap.php
This one is not safe as the string may be empty, leading to a
warning for undefined offsets in some cases.
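A minimal sketch of the failure mode:

```
$str = '';

// Reading $str[0] or $str[-1] from an empty string raises an
// "Uninitialized string offset" warning and evaluates to "".
$unsafe = ( $str[0] === '/' );

// str_starts_with() handles the empty string without a warning.
$safe = str_starts_with( $str, '/' ); // false
```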
Bug: T361379
Change-Id: I008f7b390fa08a813a0e44e8c29671e705db64f0
Based on CI output, and what `doxygen -s -u` does automatically.
Diff minimised to only differences that fix warnings. Verbose
comments and copies of new default values are left uncommitted.
* HTML_TIMESTAMP=YES -> TIMESTAMP=YES
Renamed in 1.9.7.
https://github.com/doxygen/doxygen/commit/27933ab863
* FORMULA_TRANSPARENT=YES
DOT_TRANSPARENT=YES
No longer configurable since 1.9.5.
https://github.com/doxygen/doxygen/commit/8b7822cb67
* DOT_FONTNAME=Helvetica -> DOT_COMMON_ATTR
DOT_FONTSIZE=10 -> DOT_COMMON_ATTR
Renamed in 1.9.5.
https://github.com/doxygen/doxygen/commit/4bfd8297
Matches default.
https://doxygen.nl/manual/config.html#cfg_dot_common_attr
> The default value is: labelfontname=Helvetica,labelfontsize=10.
* DIA_PATH
MSCFILE_DIRS
Moved to a different section by `doxygen -s -u`.
Change-Id: I6354dc34af3aba6c0132f31fd290154ae54942c4
Follows-up:
* I696f1e7ede (60f106e112): Apply `vendor` and `node_modules` exclude
patterns even when `--file` is used to generate docs for an extension
or a subset of core.
* I9313457796 (eaf491b4f4): Use faster and upstream supported EXCLUDE
instead of EXCLUDE_PATTERNS for skipping whole top-level dirs.
Neither of these addressed the fact that this `if` condition is not
actually possible to reach, because `explode()` always returns an
array with at least one item, e.g. `explode( ',', '' ) == [ '' ]`.
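For illustration (the variable names are made up):

```
// explode() always yields at least one element, even for an empty string:
var_dump( explode( ',', '' ) ); // array(1) { [0]=> string(0) "" }

// ... so a guard like this can never be false:
$excludePatterns = explode( ',', $input );
if ( count( $excludePatterns ) ) {
	// always reached, even when $input is ''
}
```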
Test plan:
* `php maintenance/mwdocgen.php --output ./docs/ --version whatever`
* Review the contents of the mentioned `/tmp/MWDocGen-*` file that is
auto-generated.
Before:
```
INPUT = ""
…
EXCLUDE = cache images
```
After:
```
INPUT =
…
EXCLUDE = cache images extensions skins
```
Bug: T317451
Change-Id: I871b321fa9fdb2d763fba532ab6248b0fc4b1701
Added a flag (--by-id) to maintenance/deleteBatch.php that allows users to pass page IDs instead of titles.
Bug: T357019
Change-Id: I599f9fcc8d4cfd9e667717d5a575b05990141fb2
* Prefer the MW_INSTALL_PATH constant where possible.
* For input and output, there is no need to specify full paths,
only to strip the prefix again afterwards.
See also Doxyfile in any repo that is not mediawiki/core where we
already set the path relative to the project directory, e.g.
"src/" as input and "docs" as output.
<https://codesearch.wmcloud.org/search/?q=INPUT%7COUTPUT&files=Doxyfile>
Test Plan:
* `php maintenance/mwdocgen.php --file docs,includes/libs/objectcache/`
Change-Id: I4119161accdc665d0e4d0d4e49d0c42c68f8e2a2
"Deny from all" is deprecated; the replacement syntax has been
available since Apache 2.4 (originally released in 2012).
See <https://httpd.apache.org/docs/2.4/howto/access.html>.
Bug: T360850
Change-Id: I825053ccefe34f6ca4e04af5ad2601f79e4d51a7
The RandomPage alias was dropped in an earlier commit, so
group the release note entry in with these.
Change-Id: I207d7463ced1a1bb8b8ac749eba175fc0037a217
Don't go into an infinite loop if the batch contains only rows that are
skipped due to ipb_user === 0.
Change-Id: I2b8398bcc78267e80313ecc6e64033805d83fdef
When copying blocks from ipblocks to the new block table, skip rows for
which the relevant ID already exists in the block table. In write-both
mode, it is expected that some blocks will have already been copied.
This also allows the script to be run multiple times on a wiki, so that
it can be used to clean up the current situation in production.
Bug: T355034
Change-Id: I54e65adef685bfc7d4f63853cd50ca0f55e2ecdb
The idea is similar to the one behind TempUser\Pattern::toLikeValue(),
which has been effectively deprecated by allowing multiple patterns.
Change-Id: Iddb284bff21355deb8ceaa6925d3c2aea34d727e
Why:
- The pathway to creating a regular account is different from that of a
  temporary account. It makes sense to rate limit creation of these
two types of accounts at different rates.
What:
- Add a TempAccountCreationThrottle config with a restrictive default
that matches the existing production configuration of
AccountCreationThrottle (6 creations per day)
- Update resetAuthenticationThrottle.php to support resetting the temp
account creation throttle
- For now, not adding an equivalent hook for account creation throttle's
ExemptFromAccountCreationThrottle
Bug: T357777
Change-Id: Ibf36ac96f717107bace6f0a3326f79ed129a1dfe
The TranslationAliasesDirs configuration allows defining translatable
aliases in JSON files. The value should be a name or names of folders
that contain files with localized aliases. Each language should
have a separate file.
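Assuming the setting is exposed as a global like other *Dirs settings
(the directory path below is purely illustrative), configuration would
look roughly like:

```
// LocalSettings.php or extension setup code; the path is an example only.
$wgTranslationAliasesDirs = [ __DIR__ . '/extensions/MyExtension/aliases' ];
```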
Currently, it supports defining special page aliases but in the
future can be extended to support magic words and namespace aliases.
The patch adds a script: ConvertExtensionsMessagesToTranslationAlias
that can be used to convert existing ExtensionMessagesFiles to the new
format.
Bug: T89947
Change-Id: Ief16a48a8dc8742854f67301791aa2a0b0531116
When replication is broken to a host, its lag is returned as `false`,
which was converted to an integer and thus yielded 0. That hides the
fact that replication is broken, especially when all other replicating
hosts have sub-second lag.
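A minimal sketch of the cast pitfall (variable name illustrative):

```
$lag = false; // what a broken replica reports

// A blind integer cast hides the breakage as "0 seconds of lag".
echo (int)$lag; // 0

// Checking for false first keeps the broken state visible.
echo $lag === false ? 'replication is broken' : "$lag seconds";
```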
Give a meaningful message when replication is broken, same as we did for
lag.php in Ibc88d2b86384a68b3cf6fe0d9739144247534821.
Bug: T358484
Change-Id: I321516cf054f23940e0f0944832b57a387434721
Test the output of maintenance/lag.php in both one-off and continuous
reporting modes (when it is passed -r). The test coverage gives the
basis to later adjust the script output, notably when replication is
broken (which prints the misleading "false").
Bug: T358484
Change-Id: Ib2e5597e5cd3b9b1b71c1c61c12814be5c5e7091
TestSetup uses CheckComposerLockUpToDate to check if the composer
dependencies are up to date. CheckComposerLockUpToDate uses wfMessage to
generate localized output, which will fail since TestSetup runs before
MediaWikiServices is initialized.
Change-Id: I995a1cb01abcde7ebe2282b82d33cb50c31587a6
Update the Chinese conversion table routinely to fix bugs reported at
https://zh.wikipedia.org/wiki/Wikipedia:字词转换/修复请求.
It is only data changes and only works for Chinese WikiProjects.
Change-Id: I71591d77028d818dc20e04b9be4508e74c127f83
As part of using conditional defaults in Echo, database
rows matching the current defaults should be deleted. This
is easiest to do with a maintenance script that iterates through
all rows for that option
While this is slow, there is no other way to process conditional
defaults. The slowness should not be an issue, because the script
will be run as a one-off cleanup action.
Bug: T355367
Change-Id: Id60856f9942a06dc494e539f488ce3789353f88a
Updating name & email addresses for Brooke Vibber.
Re-ran updateCredits.php, so there are some new entries in
there as well.
There are a couple of files in resources/libs that will have to
be changed upstream to keep tests happy, I will do patches
later. :D
Change-Id: I2f2e75d3fa42e8cf6de19a8fbb615bac28efcd54
After exhaustive research, we concluded that iwl_prefix_from_title is
not used, and even if it is actually used, other indexes provide enough
cardinality.
This table is about to grow quite large in Commons, let's avoid making
it bigger than it needs to be.
Bug: T343131
Change-Id: I89e40dff384291968d2465e4109a3d212ae2f8c7
If it's WRITE_BOTH, that means the old fields also need updating, but
since it's READ_NEW, the fields are set to linktarget fields. That is
wrong.
Bug: T350431
Change-Id: I6512d8b119861ff7e6016202e8c36da0536a9b26
Set the render ID for each parse stored into cache so that we are able
to identify a specific parse when there are dependencies (for example
in an edit based on that parse). This is recorded as a property added
to the ParserOutput, not the parent CacheTime interface. Even though
the render ID is /related/ to the CacheTime interface, CacheTime is
also used directly as a parser cache key, and the UUID should not be
part of the lookup key.
In general we are trying to move the location where these cache
properties are set as early as possible, so we check at each location
to ensure we don't overwrite a previously-set value. Eventually we
can convert most of these checks into assertions that the cache
properties have already been set (T350538). The primary location for
setting cache properties is the ContentRenderer.
Moved setting the revision timestamp into ContentRenderer as well, as
it was set along the same code paths. An extra parameter was added to
ContentRenderer::getParserOutput() to support this.
Added merge code to ParserOutput::mergeInternalMetaDataFrom() which
should ensure that cache time, revision, timestamp, and render id are
all set properly when multiple slots are combined together in MCR.
In order to ensure the render ID is set on all codepaths we needed to
plumb the GlobalIdGenerator service into ContentRenderer, ParserCache,
ParserCacheFactory, and RevisionOutputCache. Eventually (T350538) it
should only be necessary in the ContentRenderer.
Bug: T350538
Bug: T349868
Followup-To: Ic9b7cc0fcf365e772b7d080d76a065e3fd585f80
Change-Id: I72c5e6f86b7f081ab5ce7a56f5365d2f75067a78
This table has eight indexes plus the PK, but only around 1000 rows.
Even if it needed these indexes (which it doesn't), at that size they
would be useless. Looking at the code, the only potentially useful
index is the one on site_global_key. These indexes show up in the
report of unused indexes in the db, and I checked with Fandom (which
might benefit from an index on this table) and they said they don't
use the sites table.
Bug: T342856
Change-Id: I06b3db0f33bd35bfa68f4b418d8c2f4b9b988409
- Add getRightDescriptionHtml() to return HTML; this handles cases
  where the message contains wikitext
- Use the new method getRightDescriptionHtml() in createBotPassword
Bug: T312819
Change-Id: If3b9bce2f02806572cc6cc1194a07cb7d5b8d6da
LinksMigration::getLinksConditions() returns 1=0 when there is no
lt_id in the linktarget table.
Acquire a new ID instead of hitting a fatal SQL error, as an update
using 1=0 is invalid.
Bug: T341993
Follow-up: I01c3a545248b06d1f73aa99dd898f60767482c8f
Change-Id: I264d366adb4588cbcfb14d52c4a56a05edeef8c1
The idea here was obviously to speed up the process by skipping files
that don't contain the substring "@deprecated" anywhere. Only these
files are parsed and traversed – which can be expensive.
The problem is that PHP's strpos() function never returns -1. It
returns false.
This patch doesn't change what the script does. It just runs faster.
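The gist of the fix, sketched:

```
$content = "/** @deprecated since 1.43 */\nclass Foo {}";

// Buggy pre-filter: strpos() never returns -1, so this was always true
// and every file got parsed and traversed.
if ( strpos( $content, '@deprecated' ) !== -1 ) {
	// expensive parse ...
}

// Intended pre-filter: compare against false
// (strictly, because offset 0 is a valid match position).
if ( strpos( $content, '@deprecated' ) !== false ) {
	// expensive parse ...
}
```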
Change-Id: I95a5a0fd3e024ec4132f53d770e3f61031d81250
Why:
* The user_is_temp column exists in the user table to allow finding
temporary users when reading from the DB directly.
* This column was added in f283c0e990,
but this was not written to until
6e68107b3a which was several months
later and after a release version was branched.
* As such, we need a maintenance script to populate the user_is_temp
column for wikis that have enabled temporary account creation.
What:
* Add a maintenance script named populateUserIsTemp.php which
populates the user_is_temp column by looking for rows in the
user table that have user_is_temp as 0 and have user_name match
at least one temporary account match pattern. These rows are then
updated to have user_is_temp as 1.
* This script will be added to update.php in a future commit, so that
betawikis can have this run manually.
* Create unit tests and an integration test for this maintenance
script.
Bug: T355181
Change-Id: I6223496d7aee65e3ab207fe86e386b01bef8b388
* Change `$services->getDBLoadBalancerFactory()->waitForReplication()`
to `$this->waitForReplication()`
* Change various complicated expressions to `$this->getReplicaDB()`
and `$this->getPrimaryDB()`
* Remove unused variables
Change-Id: Ia857be54938a32bb6288dcdf695a35cd38761c3c
Found via (?<!IDBAccessObject)::READ_
We are planning to deprecate and remove implementing IDBAccessObject
interface just to use the constants.
Bug: T354194
Change-Id: I89d442fa493b8e5332ce118e5bf13f13b8dd3477
A long time ago, when we changed collation way more often and the HDD was
the norm for databases, indexing on cl_collation helped us speed up the
updates and minimize user impact. That's not the case anymore and on top
of that, we now have the feature of copying the table via setting
--target-table and there is no need to run the schema change super fast.
Also, categorylinks table is quite large (in Commons it has reached
210GB, comparable to enwiki's revision table) and an extra index like
that is quite taxing on the infra.
So let's just do whatever other scripts do, go through all rows in batches
and take advantage of cl_from index instead. This is similar to what
migrateLinksTable does.
Bug: T342854
Change-Id: Ie4dd91ee29308c980ec0b9b7ee684cb175ffca43
The shell.php environment may be useful for testing and debugging even
without a configured MediaWiki instance. Allow it to run without
LocalSettings.php.
Change-Id: I738679384ee8065826f05148829cd04aa9f52efd
Clarify that this column is for external tools that only have
access to the database, and that it should not be used within
MediaWiki.
This is to avoid there being two different ways to look up if a
user is temporary. Parsing the name is preferred within MediaWiki,
because it can be done directly from the client and it corresponds
to how other user types are determined (e.g. external users).
The need for this column indicates technical debt, specifically
the need for a user type system, discussed in T336176. It is
considered temporary and replaceable by whatever system is
designed.
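For reference, the preferred in-process check is name-based, roughly:

```
use MediaWiki\MediaWikiServices;

// Within MediaWiki, ask UserNameUtils rather than reading user_is_temp.
$userNameUtils = MediaWikiServices::getInstance()->getUserNameUtils();
if ( $userNameUtils->isTemp( $user->getName() ) ) {
	// temporary account
}
```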
Bug: T333223
Change-Id: I94ec08daf78e76273ca055d21f5cea85021490c4
And start using them instead of wfGetDB(), LB/LBF connection methods or
worse, $this->getDB().
$this->getDB() reuses the database object regardless of whether you're
asking for a replica or a primary, which can return a replica where a
primary is wanted and the other way around.
Bug: T330641
Change-Id: I9e2cf85ca277022284fc26b9f37db57bd12aaa81
The --delete mode of userOptions script checks whether
the --old argument was passed via `if ( $old )`. Unfortunately,
when someone does `--old ''` or similar, that check evaluates to false,
and the up_value condition is not added, resulting in more rows
being deleted than intended.
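A minimal sketch of the truthiness pitfall and one possible guard (the
variable names are illustrative, and this is not necessarily the exact
fix applied here):

```
// Before: an empty string is falsy, so `--old ''` skipped the condition.
$old = $this->getOption( 'old' );
if ( $old ) {
	$conds['up_value'] = $old;
}

// Checking whether the option was passed keeps `--old ''` meaningful.
if ( $this->hasOption( 'old' ) ) {
	$conds['up_value'] = $this->getOption( 'old' );
}
```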
Bug: T355310
Change-Id: Ia479bc12808560700a5f404abb503e58aa729063
Why:
* There is a need to update the generation and match pattern on
WMF wikis to a new format that includes the year and starts with
`~`. As such, the 'matchPattern' key needs to be updated.
* Removing the old 'matchPattern' from the wgAutoCreateTempUser
config currently leaves existing temporary accounts as no longer
  recognised as temporary accounts.
* Instead, the 'matchPattern' needs to be able to take an array of
string patterns so that old patterns can still be used for matching.
What:
* Update the MainConfigSchema to indicate that 'matchPattern' in the
wgAutoCreateTempUser config can be an array of strings.
* Add TempUserConfig::getMatchPatterns and deprecate TempUserConfig::
getMatchPattern. This is needed because ::getMatchPattern was typed
to only ever return one Pattern, which is no longer the case with this
config change.
* Update the RealTempUserConfig to support multiple patterns defined in
the 'matchPattern' key. The RealTempUserConfig::getMatchPattern method
returns the pattern or first pattern if multiple are defined to allow
time for existing usages of this deprecated method to be updated.
* Update the RealTempUserConfig to rely on other methods instead of checking
object property values where possible (e.g. use ::isEnabled instead of
checking $this->enabled) to allow easier unit testing.
* Update UserSelectQueryBuilder and ChangesListSpecialPage to use TempUserConfig
::getMatchPatterns instead of ::getMatchPattern.
* Update mediawiki.util/util.js to be able to parse the 'matchPattern' value
when it is an array of strings.
* Update maintenance/userOptions.php to use ::getMatchPatterns instead of
::getMatchPattern.
* Add and update unit and integration tests for the new code, as well as
expanding coverage over existing code that was indirectly affected.
Bug: T354619
Change-Id: I3763daefe4dc7c76370bd934fb20452591c9c762
Why:
Since T354417 added support for --old to be the default value
for an option, userOptions.php can run into temporary accounts.
Because temporary accounts don't have user options, they always
have the default setting, so userOptions.php can now encounter them.
What:
Avoid changing user options for temporary accounts by filtering
them out in the query builder template.
Note:
I am aware of I122f001ab24e879a573b19468d642b8f579d1024,
which deprecates getMatchPattern, but I am uploading this
as a separate patch, because I need userOptions.php to work
in Wikimedia beta ASAP.
Bug: T355204
Change-Id: I3fdceccb9019aabd42934e44033bdd0190fa00e8
Introduce LockFileChecker, which is used to check whether
composer-installed dependencies (no-dev) are up to date.
Bug: T283389
Change-Id: I0d56f235604d5c856bae5d170230f8c7ca0729c6
Prior to this patch, userOptions.php couldn't be used
to change an option's value from its default to a new value,
since the script strictly expected $from to equal up_value
in user_properties, which never happens if $from is the default
value.
This patch fixes this by doing a LEFT JOIN and examining
missing user_properties rows (ie. cases when up_value is NULL).
For each such case, $from is compared with the default value
for the given user and if it matches, the option is changed.
To ensure this querying does not take long time when it is not
actually required, the new behaviour only applies
when --old-is-default is passed.
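Roughly, the shape of the new query (not the script's exact builder
code; $option stands for the option name being changed):

```
$res = $dbr->newSelectQueryBuilder()
	->select( [ 'user_id', 'user_name' ] )
	->from( 'user' )
	->leftJoin( 'user_properties', null, [
		'up_user = user_id',
		'up_property' => $option,
	] )
	->where( [ 'up_value' => null ] ) // no stored row: user is on the default
	->caller( __METHOD__ )
	->fetchResultSet();
// For each row, compare $from against the per-user default in PHP.
```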
Bug: T354417
Change-Id: I95781588e9b494ef479790368e557c9182bdb6f8
The old ipblocks table is left untouched.
The data is normalized to match the new schema, including resolving
any data integrity issues with ipb_parent_block_id (T282890).
Data is not validated as it is copied. Existing logic could be used from
DatabaseBlockStore, but it performs other non-idempotent actions, and
would probably slow down the script.
The default batch size is 500 instead of the typical 1000 because two
INSERTs are performed for each ipblock.
Bug: T350361
Bug: T282890
Change-Id: I526bb9b8febc5c1cb6a56b9a1044dedcf99c2224
$from is not the first ID, but the old up_value.
We also need to check that both the first ID and the last ID
are undefined, not only the first one.
Change-Id: I559f06055c8e76ccc3d320ef4fbc6350859de062
Why:
Sometimes, it is necessary to have different behavior
for newly registered and existing users. For example,
this happens in the Echo or GrowthExperiments extensions.
As of now, this behavior is implemented by inserting
user_properties rows in onLocalUserCreated.
Over time, this results in a significant number of rows
inserted, which contributes to the user_properties table bloat,
which is already overly large (cf. T54777). This patch makes it
possible to remove such rows by supporting conditional defaults
for user properties.
What:
Add support for conditional defaults of user properties. This can be
configured via `ConditionalUserOptions` config option.
Bug: T321527
Change-Id: I1549c3137e66801c85e03e46427e27da333d68e2
Maintenance::finalSetup should have access to a SettingsBuilder so it
can manipulate config settings without resorting to global variables.
MaintenanceRunner will always provide a SettingsBuilder when calling
this method, so implementations should be able to rely on always getting
one.
The $settings parameter was introduced as optional in order to maintain
backwards compatibility with implementations that did not declare the
parameter. But these should all have been fixed since.
Depends-On: I8a3699b13bfb4dc15f3bed562731ed9d525651cc
Change-Id: I334a103e02fd905faafc43c7c5b95996bc91fd18
The wfParseUrl function is deprecated as of MediaWiki 1.39 and has been
replaced with the UrlUtils::parse method provided by the UrlUtils class.
List of affected classes:
- deleteSelfExternals
- UserMailer
Change-Id: I5e36ee80e5c30e95b79bf45e7b26860cb2668d56
This adds support for ES6 and ES7 syntax to user scripts, thus
matching the wikimedia/minify library.
Bug: T75714
Depends-On: I43d4619a32e37eb42e1aaa55a1f602962609c52b
Depends-On: If3b2b4a75013baeaa0d9b92cd10dfb06e5534153
Change-Id: Ie309e761f8b20640f7c0e85def0a3d1ccc8a658e
Why:
* Part of a temporary user name is generated from an index that
increments, which is stored in the database.
* As specified in T345855, the index will be restarted each year.
* Also specified in T345855, the year will be included in
generated temporary user names.
What:
* Since the year must be included in the name in order to avoid
naming conflicts if the index is restarted each year, both are
implemented together and controlled by a single config.
* Add a new config option that, when true, restarts the name
  generation index at the start of each year and adds the year into
  the user name: $wgAutoCreateTempUser['serialProvider']['useYear']
  (see the example after this list)
* Add a uas_year column to the user_autocreate_serial table, which
is unique in combination with uas_shard, so the index can be
stored for each shard/year combination.
* The year is added into the username just after the prefix, as
specified in T345855. This is based on research that having the
year near the start of the name aids understanding that the
names are not IP addresses. The position of the year within the
name is therefore not configurable (though whether to include
it is). See T345855 for the research.
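For reference, enabling this would look like:

```
// LocalSettings.php
$wgAutoCreateTempUser['serialProvider']['useYear'] = true;
```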
Bug: T349494
Bug: T349501
Depends-On: I6b3c640a4e74f52fd4a4f46de5a2cbe80fe3b665
Change-Id: If51acb3f4efa361ce36d919c862a52501a5a7d24
It used PHP 5.6, which is outdated for use with MediaWiki.
Use the quickstarts described in DEVELOPERS.md for supported versions
instead.
Change-Id: I1001d50ecd60d4bc8c836e2aa22ffb57c1b2b859
Why:
- PHPUnit errors when running locally (T353873)
What:
- Add Maintenance namespace to script
- Make uppercase to match convention for other classes
Bug: T353458
Bug: T353873
Change-Id: I3d2200ee9b53f45d39e7f7b143f1128b2d855849
Error 1064: You have an error in your SQL syntax; check the manual that
corresponds to your MariaDB server version for the right syntax to use
near 'FROM `revision` WHERE rev_page = 42 LIMIT 1' at line 1
Function: AttachLatest::execute
Query: SELECT MAX(rev_timestamp FROM `revision` WHERE rev_page = 42
LIMIT 1
Follow-Up: Idfe83b900de3fbc6a251a5a78d8af1a4cd88970f
Change-Id: Ibdb4bdc68f10f87c53ae648674c1c1729e2ba2b2
For readability. Allowed since PHP 7.4.
I searched for integer literals of 6 or more digits, and also changed
some nearby smaller numbers for consistency.
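For example:

```
// Same value, separator added purely for readability (PHP >= 7.4).
$old = 1000000;
$new = 1_000_000;
var_dump( $old === $new ); // bool(true)
```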
Bug: T353205
Change-Id: I8518e04889ba8fd52e0f9476a74f8e3e1454b678
Pass Authority to WikiImporter constructor, instead of looking at the
user from RequestContext::getMain(), and skipping this check if
$wgCommandLineMode is true.
Maintenance scripts now use UltimateAuthority, to match the original
intent of skipping permission checks, see 2ed55f42 / r96311.
The Authority parameter to WikiImporterFactory::getWikiImporter() is
optional for now for backwards-compatibility. It should become
required later after deprecation.
Change-Id: Iea1d03dcdcbda2f9a9adbff1b0d319efd22c4d86
Changes to the use statements done automatically via script
Addition of missing use statements and changes to docs done manually
Change-Id: I443aada1c18c8628b02671aa9fd6f441961e5c2e
Why:
- We want developers to have DevelopmentSettings loaded by default
What:
- Define a new `--with-developmentsettings` argument for install.php,
  and set it to true in the `composer mw-install:sqlite` invocation
For now, this option is not supported in the web installer.
Bug: T347347
Change-Id: Icba2d614fd1349463fb745ef31f53a3b3834e5ad
Support migration stages when reading and writing blocks.
I tried to set it up for an easy next stage, in which support for the
old schema is removed. I tried to avoid factoring out shared code
between the two schemas, so that the old schema cases can simply be
deleted without the need to revert unnecessary abstractions.
However, I added HideUserUtils to factor out ipb_deleted queries. Code
review showed that this was already quite complex, with multiple
approaches to the problem, so it benefits from refactoring even without
the schema abstraction.
HideUserUtils is a service rather than a standalone class to support
unit tests, since unit tests do not allow global config access. When
the migration stage config is removed, it will be a service with no
constructor parameters -- an unnecessary abstraction which should
ideally be resolved at that time.
When interpreting result rows, it is possible to share code by using
field aliases. But when constructing WHERE conditions, the actual field
names need to be used, so the migration is more intrusive in
ApiQueryBlocks and SpecialBlockList, where complex conditions are used.
Bug: T346293
Bug: T51504
Bug: T349883
Change-Id: I408acf7a57b0100fe18c455fc13141277a598925
This reverts commit 8b1532d4dd.
Reason for revert: Because of recent changes to the script,
it should be safe to use it again.
Bug: T350443
Change-Id: I56593542bb0676792251d7e966b31be69be437bb
IDatabase::affectedRows does not return the affected rows for UPDATE
IGNORE as mentioned in the docs; it returns the matching rows, making
IDatabase::affectedRows unusable for determining whether the update has
updated the row or not. Revert its usage in namespaceDupes.php
(01a548cc).
Change-Id: Ia3b7cd034731da9c7067a11a3bfa51c81a4c5530
Limit the deletion of key conflicts for the links table to
$wgUpdateRowsPerQuery rows.
Also use IDatabase::factorConds to build the condition.
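Roughly how factorConds() is meant to be used here (the table and rows
are illustrative, not the script's actual data):

```
// Builds a single condition equivalent to (row1) OR (row2).
$conds = $dbw->factorConds( [
	[ 'pl_from' => 1, 'pl_namespace' => 0, 'pl_title' => 'Foo' ],
	[ 'pl_from' => 2, 'pl_namespace' => 0, 'pl_title' => 'Bar' ],
] );
$dbw->delete( 'pagelinks', [ $conds ], __METHOD__ );
```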
Bug: T350443
Change-Id: I64ec5a01b457a395a1e830c28aabbc2dd3c0f539
These commented-out options have been pending for many years, and
there doesn't seem to be a plan to make them work. They can easily
be re-added if required.
Change-Id: I6aa9bfe4509062b0f204800fcee70ba767fbc517
Limit the update to $wgUpdateRowsPerQuery rows.
The update now is over the primary key.
Bug: T350443
Change-Id: I4286ea35115aab331eaecf71fb3daf03708d3d38
Limit the update to $wgUpdateRowsPerQuery rows.
The update now is over the primary key.
Also select only the rows that have a different _from_namespace and
need changing.
There is always a _backlinks_namespace index with the _from_namespace
field and all primary key fields to support this.
Bug: T350443
Change-Id: I503da35f71902c5ed57c5244cb3833fc4730ada3
Why:
Temporary accounts (introduced as part of IP Masking)
are supposed to expire 1 year after their registration.
Automatic account expiration can be done via a maintenance
script, which would be periodically executed via cron / systemd.
Make it possible for extensions to provide their own logic
for generating a list of temporary accounts to invalidate.
This is used in CentralAuth to base registration timestamp
on the global registration timestamp.
The default behavior is "temporary accounts do not expire",
given the feature requires a maintenance script to run
periodically, which will not be the case on third party
instances.
What:
* Add `expireAfterDays` to $wgAutoCreateTempUser, controlling
how many days temporary accounts have.
* Add UserSelectQueryBuilder::whereRegisteredTimestamp(),
filtering accounts based on user_registration.
* Add ExpireTemporaryAccounts maintenance script, which is
@stable to extend.
Bug: T344695
Change-Id: If17bf84ee6620c8eb784b7d835682ad5e7afdfcc
I've upgraded Doxygen to 1.9.8 in WMF CI (Ie025bd8a5e9), and among
the bug fixes was https://github.com/doxygen/doxygen/issues/9047,
which makes it possible to use tags that contain dashes in ALIASES.
Change-Id: Ida5fddb89b76445922a87904745eff0a1e299043
namespaceDupes.php lacks limits on delete and update queries. For
large updates and deletes this causes replication lag.
Disable this script until this issue is resolved.
Bug: T350443
Change-Id: I2b578535ff77f3080b4672ce098c24775f08a1e2
All of the fields it's cleaning up have been deleted. The script cleans
up denormalized username fields which were deleted in the actor
migration.
It skips tables for which the name fields don't exist, which is all of
them.
Change-Id: I407a75c85cbd5ff6ab0d6d48d2bf07793e7c0c3e
Replace ->where( array_merge( a, b ) ) with ->where( a )->andWhere( b ).
It's shorter and I find it easier to read.
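For example:

```
// Before
$queryBuilder->where( array_merge( $baseConds, $extraConds ) );

// After
$queryBuilder->where( $baseConds )->andWhere( $extraConds );
```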
Change-Id: I94fef6219b5611659f7a09fd3a555aba001f5339
$cont will be an empty array the first time, which is invalid.
Follow-up to 865002b57c.
Caught by Phan in I0d69ea6788304e8a073b7521a217882be7a92993.
Change-Id: I5c219e70e41c34869c03d6cd1ee6b7f1876a2a22
A few more fairly simple cases that don't quite match the regexp in
I2cfc3070c2a08fc3888ad48a995f7d79198cc336 or required other tweaks.
Change-Id: I5438c777344e9ba07f3b62a452fce9ec63baa48a
This reverts commit c5f4ffd4e6,
re-applies commit b0fe2c4111.
WikiPage::getRedirectTarget() needs to still allow missing rows,
but for a different reason.
Bug: T348881
Change-Id: I6e1fd823fbe140819c28096d5adc41cd15bcc8c0
* Updated ParserOutput to set Parsoid render ids that REST API
functionality expects in ParserOutput objects.
* CacheThresholdTime functionality no longer exists since it was
implemented in ParsoidOutputAccess and ParserOutputAccess doesn't
support it. This is tracked in T346765.
* Enforce the constraint that uncacheable parses are only for fake or
mutable revisions. Updated tests that violated this constraint to
use 'getParseOutput' instead of calling the parse method directly.
* Had to make some changes in ParsoidParser around use of preferredVariant
passed to Parsoid. I also left some TODO comments for future fixes.
T267067 is also relevant here.
PARSOID-SPECIFIC OPTIONS:
* logLinterData: linter data is always logged by default -- removed
support to disable it. Linter extension handles stale lints properly
and it is better to let it handle it rather than add special cases
to the API.
* offsetType: Moved this support to ParsoidHandler as a post-processing
of byte-offset output. This eliminates the need to support this
Parsoid-specific options in the ContentHandler hierarchies.
* body_only / wrapSections: Handled this in HtmlOutputRendererHelper
as a post-processing of regular output by removing sections and
returning the body content only. This does result in some useless
section-wrapping work with Parsoid, but the simplification is probably
worth it. If in the future, we support Parsoid-specific options in
the ContentHandler hierarchy, we could re-introduce this. But, in any
case, this "fragment" flavor option is likely to get moved out of
core into the VisualEditor extension code.
DEPLOYMENT:
* This patch changes the cache key by setting the useParsoid option
in ParserOptions. The parent patch handles this to ensure we don't
encounter a cold cache on deploy.
TESTS:
* Updated tests and mocks to reflect new reality.
* Do we need any new tests?
Bug: T332931
Change-Id: Ic9b7cc0fcf365e772b7d080d76a065e3fd585f80
This reverts commit b0fe2c4111.
Reason for revert: Causing test failures in the UserMerge extension.
Bug: T348881
Change-Id: I35e82df7a7f95150927dc6e4ad68588c3400b63f