wiki.techinc.nl/composer.json

Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
{
"name": "mediawiki/core",
"description": "Free software wiki application developed by the Wikimedia Foundation and others",
"type": "mediawiki-core",
"keywords": [
"mediawiki",
"wiki"
],
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
"homepage": "https://www.mediawiki.org/",
"authors": [
{
"name": "MediaWiki Community",
"homepage": "https://www.mediawiki.org/wiki/Special:Version/Credits"
}
],
"license": "GPL-2.0-or-later",
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
"support": {
"issues": "https://phabricator.wikimedia.org/",
"irc": "irc://irc.libera.chat/mediawiki",
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
"wiki": "https://www.mediawiki.org/"
},
"prefer-stable": true,
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
"require": {
"composer/semver": "3.4.3",
"cssjanus/cssjanus": "2.3.0",
"ext-calendar": "*",
"ext-ctype": "*",
"ext-dom": "*",
"ext-fileinfo": "*",
"ext-iconv": "*",
"ext-intl": "*",
"ext-json": "*",
"ext-libxml": "*",
"ext-mbstring": "*",
"ext-openssl": "*",
"ext-xml": "*",
"ext-xmlreader": "*",
"guzzlehttp/guzzle": "7.9.2",
"justinrainbow/json-schema": "5.3.0",
"liuggio/statsd-php-client": "1.0.18",
"mck89/peast": "1.16.3",
"monolog/monolog": "2.9.3",
"oojs/oojs-ui": "0.51.2",
"pear/mail": "2.0.0",
"pear/mail_mime": "1.10.12",
"pear/net_smtp": "1.12.1",
"php": ">=8.1.0",
"psr/container": "1.1.2",
"psr/http-message": "1.1",
"psr/log": "1.1.4",
WebRequest & RequestFromGlobals: get HTTP headers in one way apache_request_headers() is a vendor-specific function - it got used when present and alternative code paths were exercised otherwise. These preserved certain "special" headers, e.g. Content-Type, only inconsistently. The function getallheaders() is an alias[1] for apache_request_headers() on systems where the latter is present. Alternatively, there is a polyfill (ralouphie/getallheaders) which is already installed in mediawiki-vendor[2] (by virtue of guzzle). Using getallheaders() exclusively, will make sure these "special" headers are consistently available alongside their "regular"[3] peers and helps MediaWiki code focus on its domain. The dependency to ralouphie/getallheaders is made explicit in the same version in which it is currently locked in mediawiki-vendor[4]. This surfaced because the deprecation warning for API POST requests without a Content-Type header, introduced in bba1a0f, appeared in my development system (somewhat dated addshore/mediawiki-docker-dev/) even though the client did a fine job. Interesting implementation detail: While WebRequest keeps track of headers using keys in all upper case, REST RequestFromGlobals does so in all lower case - but both use retrieval logic complementary to their respective approach however. In case of REST RequestFromGlobals this is encapsulated inside of HeaderContainer (setting and retrieving), while WebRequest does all of this by itself. Cf. [5] and [6] [1]: https://www.php.net/manual/en/function.getallheaders.php [2]: https://github.com/wikimedia/mediawiki-vendor/tree/8f2967d/ralouphie/getallheaders [3]: https://www.php.net/manual/en/reserved.variables.server.php#110763 [4]: https://github.com/wikimedia/mediawiki-vendor/blob/8f2967d/composer.lock#L3250 [5]: https://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.2 [6]: https://www.php.net/manual/en/function.apache-request-headers.php#124236 Bug: T245535 Change-Id: Iba52f152e15928473b729a2588c2462e76e85634
2020-03-18 08:49:15 +00:00
"ralouphie/getallheaders": "3.0.3",
"symfony/polyfill-php82": "1.31.0",
"symfony/polyfill-php83": "1.31.0",
"symfony/yaml": "5.4.45",
"wikimedia/assert": "0.5.1",
"wikimedia/at-ease": "3.0.0",
"wikimedia/base-convert": "2.0.2",
"wikimedia/bcp-47-code": "2.0.0",
"wikimedia/cdb": "3.0.0",
"wikimedia/cldr-plural-rule-parser": "2.0.0",
"wikimedia/common-passwords": "0.5.0",
"wikimedia/composer-merge-plugin": "2.1.0",
"wikimedia/html-formatter": "4.1.0",
"wikimedia/ip-utils": "5.0.0",
"wikimedia/json-codec": "3.0.3",
"wikimedia/less.php": "5.1.2",
"wikimedia/minify": "2.9.0",
"wikimedia/normalized-exception": "2.0.0",
"wikimedia/object-factory": "5.0.1",
"wikimedia/parsoid": "0.20.3",
"wikimedia/php-session-serializer": "3.0.0",
"wikimedia/purtle": "2.0.0",
"wikimedia/relpath": "4.0.1",
"wikimedia/remex-html": "4.1.1",
"wikimedia/request-timeout": "2.0.0",
"wikimedia/running-stat": "2.1.0",
"wikimedia/scoped-callback": "5.0.0",
"wikimedia/services": "4.0.0",
"wikimedia/shellbox": "4.1.1",
"wikimedia/utfnormal": "4.0.0",
"wikimedia/timestamp": "4.1.1",
"wikimedia/wait-condition-loop": "2.0.2",
"wikimedia/wrappedstring": "4.0.1",
"wikimedia/xmp-reader": "0.9.4",
"zordius/lightncandy": "1.2.6"
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
},
"require-dev": {
"composer/spdx-licenses": "1.5.8",
"doctrine/dbal": "3.8.4",
"doctrine/sql-formatter": "1.1.3",
"ext-simplexml": "*",
"giorgiosironi/eris": "^0.14.0",
"hamcrest/hamcrest-php": "^2.0",
"johnkary/phpunit-speedtrap": "^4.0",
"mediawiki/mediawiki-codesniffer": "45.0.0",
"mediawiki/mediawiki-phan-config": "0.14.0",
"mediawiki/minus-x": "1.1.3",
"nikic/php-parser": "^5.5.0",
"php-parallel-lint/php-console-highlighter": "1.0.0",
"php-parallel-lint/php-parallel-lint": "1.4.0",
"phpunit/phpunit": "9.6.19",
"psy/psysh": "^0.12.3",
"seld/jsonlint": "1.10.2",
"wikimedia/alea": "1.0.0",
"wikimedia/langconv": "^0.4.2",
"wikimedia/testing-access-wrapper": "^3.0.0",
"wmde/hamcrest-html-matchers": "^1.0.0"
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
},
"replace": {
"symfony/polyfill-ctype": "1.99",
"symfony/polyfill-intl-grapheme": "1.17.1",
"symfony/polyfill-intl-normalizer": "1.17.1",
"symfony/polyfill-mbstring": "1.99",
"symfony/polyfill-php80": "1.99",
"symfony/polyfill-php81": "1.99"
},
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
"suggest": {
"ext-apcu": "Faster web responses overall.",
"ext-bcmath": "Increased performance of some operations. Required especially on 32 bit machines. This or ext-gmp are needed for scrambling Temporary Accounts.",
"ext-curl": "Faster HTTP services, e.g. when using InstantCommons, Swift, or Etcd.",
"ext-exif": "Enable processing of EXIF information in file uploads.",
"ext-gd": "Enable thumbnails for file uploads.",
"ext-gmp": "Increased performance of some operations. Required especially on 32 bit machines. This or ext-bcmath are needed for scrambling Temporary Accounts.",
"ext-igbinary": "Enables use of igbinary for serialisation.",
"ext-imagick": "Enables use of imagemagick for image manipulation.",
"ext-memcached": "Enables use of Memcached for caching purposes.",
"ext-mysqli": "Enable the MySQL and MariaDB database type for MediaWiki.",
"ext-pdo": "Enable the SQLite database type for MediaWiki.",
"ext-pgsql": "Enable the PostgreSQL database type for MediaWiki.",
"ext-posix": "Enable CLI concurrent processing, e.g. for runJobs.php.",
"ext-pcntl": "Enable CLI concurrent processing, e.g. for runJobs.php and rebuildLocalisationCache.php.",
"ext-readline": "Enable CLI history and autocomplete, e.g. for eval.php and other REPLs.",
"ext-redis": "Enables use of Redis for caching purposes.",
"ext-sockets": "Enable CLI concurrent processing, e.g. for rebuildLocalisationCache.php.",
"ext-wikidiff2": "Faster text difference engine.",
"ext-zlib": "Enable use of GZIP compression, e.g. for SqlBagOStuff (ParserCache), $wgCompressRevisions, or $wgUseFileCache."
},
"autoload": {
"psr-4": {
"MediaWiki\\Composer\\": "includes/composer"
}
},
"autoload-dev": {
"files": [
"vendor/hamcrest/hamcrest-php/hamcrest/Hamcrest.php",
"vendor/wmde/hamcrest-html-matchers/src/functions.php"
]
},
"scripts": {
"mw-install:sqlite": "@php maintenance/run.php install --server=http://localhost:4000 --dbtype sqlite --with-developmentsettings --dbpath cache/ --scriptpath= --pass adminpassword MediaWiki Admin",
"serve": [
"Composer\\Config::disableProcessTimeout",
"@putenv MW_LOG_DIR=logs",
"@putenv MW_LOG_STDERR=1",
"@putenv PHP_CLI_SERVER_WORKERS=8",
"@php -S 127.0.0.1:4000"
],
"lint": "parallel-lint --exclude node_modules --exclude vendor",
"phan": "phan -d . --long-progress-bar",
"phpcs": "phpcs -p -s --cache",
"fix": [
"minus-x fix .",
"phpcbf"
],
"pre-install-cmd": "MediaWiki\\Composer\\VersionChecker::onEvent",
"pre-update-cmd": "MediaWiki\\Composer\\VersionChecker::onEvent",
"post-install-cmd": "MediaWiki\\Composer\\ComposerVendorHtaccessCreator::onEvent",
"post-update-cmd": "MediaWiki\\Composer\\ComposerVendorHtaccessCreator::onEvent",
"releasenotes": "@phpunit --group ReleaseNotes",
"test": [
"@lint .",
"@phpcs .",
"minus-x check ."
],
"test-some": [
"@lint",
"@phpcs"
],
"phpunit": "phpunit",
"phpunit:unit": "@phpunit --colors=always --testsuite=core:unit,extensions:unit,skins:unit",
"phpunit:integration": "@phpunit --colors=always --testsuite=core:integration,extensions:integration,skins:integration",
"phpunit:coverage": "@phpunit --testsuite=core:unit --exclude-group Dump,Broken",
"phpunit:coverage-edit": "MediaWiki\\Composer\\ComposerPhpunitXmlCoverageEdit::onEvent",
"phpunit:entrypoint": "@phpunit",
"phpunit:prepare-parallel:extensions": [
"MediaWiki\\Composer\\PhpUnitSplitter\\PhpUnitXmlManager::listTestsNotice",
"@phpunit --list-tests-xml=tests-list-extensions.xml --testsuite=extensions",
"MediaWiki\\Composer\\PhpUnitSplitter\\PhpUnitXmlManager::splitTestsListExtensions"
],
"phpunit:prepare-parallel:default": [
"MediaWiki\\Composer\\PhpUnitSplitter\\PhpUnitXmlManager::listTestsNotice",
"@phpunit --list-tests-xml=tests-list-default.xml",
"MediaWiki\\Composer\\PhpUnitSplitter\\PhpUnitXmlManager::splitTestsListDefault"
],
"phpunit:prepare-parallel:split-file": "MediaWiki\\Composer\\PhpUnitSplitter\\PhpUnitXmlManager::splitTestsCustom",
"phpunit:parallel:custom-groups": "MediaWiki\\Composer\\ComposerLaunchParallel::launchTestsCustomGroups",
"phpunit:parallel:database": "MediaWiki\\Composer\\ComposerLaunchParallel::launchTestsDatabase",
"phpunit:parallel:databaseless": "MediaWiki\\Composer\\ComposerLaunchParallel::launchTestsDatabaseless",
"maintenance": "@php maintenance/run.php"
},
"config": {
"optimize-autoloader": true,
"prepend-autoloader": false,
"allow-plugins": {
"wikimedia/composer-merge-plugin": true,
"composer/installers": true,
"dealerdirect/phpcodesniffer-composer-installer": true
}
},
"extra": {
"merge-plugin": {
"include": [
"composer.local.json"
],
"merge-dev": false
}
Revert "Make it possible to install extensions using Composer" This reverts commit d6e69d774. MediaWiki extensions are by definition part of the MediaWiki software ecosystem, and could therefore be managed using a specialized solution; The same is not true of PHP packages at large: we're not in a position to change how the PHP community at large manages dependencies. To the extent that we have a choice, we should use interfaces like composer.json to solve for the problem of integration with the outside world. Change Ib125bea00 made the opposite choice, compromising our ability to express how MediaWiki relates to external software components in exchange for superficial gains in convenience of managing MediaWiki extensions. (I consider the gains superficial because they do not leverage the fact that extensions share MediaWiki's code -- a property that should be exploited to provide an extension management solution that is MediaWiki-aware, providing, for example, a uniform configuration management interface.) The cost of that change are manifest in bug 64597: we lost the ability to express a dependency on PHPUnit in the way that the PHPUnit upstream recommends. The problem that change Ib125bea00 set out to solve is that modifying composer.json to express dependencies on extensions means having a local diff that conflicts with the code in version control. This issue is actually discussed by the Composer documentation. In short, Composer's developers recommend that if your production environment uses a cvs to version code, you should maintain your own clone of the upstream repository and version your composer.json that way. More detailed discussion of these issues can be found at <https://getcomposer.org/doc/05-repositories.md#vcs> and <https://getcomposer.org/doc/faqs/why-can't-composer-load-repositories-recursively.md>. I also note that the Composer documentation makes many references to monolog as an exemplary Composer project, and it too maintains a composer.json in the repository root. Change-Id: I3e7c668ee32401e731120cfa9f96986fd8fde8f4
2014-05-11 10:00:35 +00:00
}
}
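
The "wikimedia/composer-merge-plugin" entry under "extra" merges an optional composer.local.json from the same directory, so a site can declare additional dependencies, or pull in the composer.json files of installed extensions and skins, without editing this file. A minimal sketch of such a composer.local.json (the include globs are illustrative, following the pattern of the composer.local.json-sample shipped with MediaWiki core):

{
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/*/composer.json",
                "skins/*/composer.json"
            ]
        }
    }
}

Because "merge-dev" is set to false above, only the non-dev sections of any merged file are applied.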