{
	"name": "mediawiki/core",
	"description": "Free software wiki application developed by the Wikimedia Foundation and others",
	"keywords": ["mediawiki", "wiki"],
	"homepage": "https://www.mediawiki.org/",
	"authors": [
		{
			"name": "MediaWiki Community",
			"homepage": "https://www.mediawiki.org/wiki/Special:Version/Credits"
		}
	],
	"license": "GPL-2.0+",
	"support": {
		"issues": "https://bugs.mediawiki.org/",
		"irc": "irc://irc.freenode.net/mediawiki",
		"wiki": "https://www.mediawiki.org/"
	},
	"require": {
		"composer/semver": "1.2.0",
		"cssjanus/cssjanus": "1.1.2",
		"ext-iconv": "*",
		"liuggio/statsd-php-client": "1.0.18",
		"mediawiki/at-ease": "1.1.0",
		"oojs/oojs-ui": "0.14.1",
		"oyejorge/less.php": "1.7.0.9",
		"php": ">=5.3.3",
		"psr/log": "1.0.0",
		"wikimedia/assert": "0.2.2",
		"wikimedia/base-convert": "1.0.1",
		"wikimedia/cdb": "1.3.0",
		"wikimedia/cldr-plural-rule-parser": "1.0.0",
		"wikimedia/composer-merge-plugin": "1.3.0",
		"wikimedia/ip-set": "1.0.1",
		"wikimedia/relpath": "1.0.3",
		"wikimedia/running-stat": "1.1.0",
		"wikimedia/utfnormal": "1.0.3",
		"wikimedia/wrappedstring": "2.0.0",
		"zordius/lightncandy": "0.21"
	},
	"require-dev": {
		"jakub-onderka/php-parallel-lint": "0.9",

Implement extension registration from an extension.json file

Introduces wfLoadExtension()/wfLoadSkin(), which should be used in
LocalSettings.php rather than require-ing a PHP entry point.
Extensions and skins add an "extension.json" or "skin.json" file
in their root directory, which contains all the information
typically present in PHP entry point files (classes to autoload,
special pages, API modules, etc.). A full schema can be found at
docs/extension.schema.json, and a script to validate these files
against the schema is provided. An additional script is provided
to convert typical PHP entry point files into their JSON
equivalents.

The basic flow of loading an extension is:
* Get the ExtensionRegistry singleton instance.
* ExtensionRegistry takes a filename and reads the file, or
  fetches the parsed JSON from APC if possible.
* The JSON is run through a Processor instance, which registers
  things with the appropriate global settings.
* The output of the Processor is cached in APC if possible.
* The extension/skin is marked as loaded in the
  ExtensionRegistry, and a callback function is executed if one
  was specified.

For ideal performance, a batch loading method is also provided:
* The absolute path of the JSON file is queued in the
  ExtensionRegistry instance.
* When loadFromQueue() is called, it constructs a hash unique to
  the members of the current queue and checks whether the queue
  has been cached in APC. If not, it processes each file
  individually and combines the result of each Processor into
  one giant array, which is cached in APC.
* The giant array then sets various global settings, defines
  constants, and calls callbacks.

To invalidate the cached processed info, the mtime of each JSON
file is checked by default. However, that can be slow if you have
a large number of extensions, so you can set $wgExtensionInfoMTime
to the mtime of one file and `touch` it whenever you update your
extensions.

Change-Id: I7074b65d07c5c7d4e3f1fb0755d74a0b07ed4596
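
As a rough illustration of the flow described above, a
LocalSettings.php might switch from PHP entry points to
registration like this (a minimal sketch; the extension and skin
names are examples, and wfLoadExtensions() is the batch variant
that goes through the loadFromQueue() path):

    <?php
    // Old style: require a PHP entry point directly.
    // require_once "$IP/extensions/Cite/Cite.php";

    // New style: read and register extensions/Cite/extension.json
    // via the ExtensionRegistry.
    wfLoadExtension( 'Cite' );

    // Batch form: queue several extension.json files so they are
    // processed (and cached in APC) together by loadFromQueue().
    wfLoadExtensions( array( 'Cite', 'ParserFunctions' ) );

    // Skins use the parallel helper, which reads skin.json.
    wfLoadSkin( 'Vector' );

    // Optional: skip per-file mtime checks by pointing
    // $wgExtensionInfoMTime at a single file you `touch` whenever
    // you update your extensions (the path here is hypothetical).
    $wgExtensionInfoMTime = filemtime( "$IP/extensions/.touched" );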
"justinrainbow/json-schema": "~1.3",
|
2015-09-26 22:00:39 +00:00
|
|
|
"mediawiki/mediawiki-codesniffer": "0.4.0",
|
2015-10-27 22:53:03 +00:00
|
|
|
"monolog/monolog": "~1.17.2",
|
2015-09-22 10:27:29 +00:00
|
|
|
"nmred/kafka-php": "0.1.4",
|
2015-09-27 05:16:10 +00:00
|
|
|
"phpunit/phpunit": "3.7.37",
|
|
|
|
|
"wikimedia/avro": "1.7.7"
|
2014-05-11 10:00:35 +00:00
|
|
|
},
	"suggest": {
		"ext-apc": "Local data and opcode cache",
		"ext-fileinfo": "Improved mime magic detection",
		"ext-intl": "ICU integration",
		"ext-mbstring": "Multibyte string support",
		"ext-wikidiff2": "Diff accelerator",
		"monolog/monolog": "Flexible debug logging system",

Produce monolog messages through kafka+avro

This allows a logging channel to be configured to write directly
to Kafka. Logs can be serialized either to JSON blobs or to the
more compact Apache Avro format.

The Kafka handler for monolog needs a list of one or more Kafka
servers to query cluster metadata from. It should work with any
monolog formatter, although some, like JsonFormatter, require
formatBatch to be disabled, since the Kafka protocol prefers to
encode each record independently. This requires the
nmred/kafka-php library, version >= 1.3.0.

Adds a new formatter which serializes to the Apache Avro format.
This is a compact binary format which uses predefined schemas.
This initial implementation is very simple and takes the plain
schemas as a constructor argument.

Adds a new option to MonologSpi to wrap handlers in a
BufferHandler. This doesn't flush until the request shuts down,
and prevents any network requests in the logger from adding
latency to web requests.

Related mediawiki/vendor update: Ibfe4bd2036ae8e998e2973f07bd9a6f057691578

The necessary config is something like:

array(
    'loggers' => array(
        'CirrusSearchRequests' => array(
            'handlers' => array( 'kafka' ),
        ),
    ),
    'handlers' => array(
        'kafka' => array(
            'factory' => '\\MediaWiki\\Logger\\Monolog\\KafkaHandler::factory',
            'args' => array( 'localhost:9092' ),
            'formatter' => 'avro',
            'buffer' => true,
        ),
    ),
    'formatters' => array(
        'avro' => array(
            'class' => '\\MediaWiki\\Logger\\Monolog\\AvroFormatter',
            'args' => array(
                array(
                    'CirrusSearchRequests' => array(
                        'type' => 'record',
                        'name' => 'CirrusSearchRequests',
                        'fields' => array( ... )
                    ),
                ),
            ),
        ),
    ),
)

Bug: T106256
Change-Id: I6ee744b3e5306af0bed70811b558a543eed22840
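
For context, the array above is the structure MonologSpi
consumes. A minimal sketch of wiring it up and logging to the
channel (assuming the usual $wgMWLoggerDefaultSpi mechanism;
$spiConfig stands in for the array above, and the message and
context fields are placeholders):

    <?php
    use MediaWiki\Logger\LoggerFactory;

    // Install the Monolog-backed logger provider, passing the
    // config array shown above as $spiConfig.
    $wgMWLoggerDefaultSpi = array(
        'class' => '\\MediaWiki\\Logger\\MonologSpi',
        'args' => array( $spiConfig ),
    );

    // Events on the 'CirrusSearchRequests' channel then go
    // through the buffered Kafka handler and the Avro formatter,
    // flushing at request shutdown.
    $logger = LoggerFactory::getInstance( 'CirrusSearchRequests' );
    $logger->info( 'search request', array( 'tookMs' => 32 ) );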
"nmred/kafka-php": "Send debug log events to kafka",
|
2015-08-14 18:04:47 +00:00
|
|
|
"pear/mail": "Mail sending support",
|
2015-08-14 18:58:04 +00:00
|
|
|
"pear/mail_mime": "Mail sending support",
|
"pear/mail_mime-decode": "Mail sending support",
|
|
|
|
|
"wikimedia/avro": "Binary serialization format used with kafka"
|
2014-09-23 19:31:30 +00:00
|
|
|
},
|
|
|
|
|
"autoload": {
|
|
|
|
|
"psr-0": {
|
|
|
|
|
"ComposerHookHandler": "includes/composer"
|
|
|
|
|
}
|
|
|
|
|
},
|
|
|
|
|
"scripts": {
|
2015-01-06 18:47:25 +00:00
|
|
|
"lint": "parallel-lint --exclude vendor",
|
2015-06-29 21:35:38 +00:00
|
|
|
"phpcs": "phpcs -p $PHPCS_ARGS",
|
2015-09-27 05:16:10 +00:00
|
|
|
"pre-install-cmd": "ComposerHookHandler::onPreInstall",
|
|
|
|
|
"pre-update-cmd": "ComposerHookHandler::onPreUpdate",
|
2015-01-06 18:47:25 +00:00
|
|
|
"test": [
|
|
|
|
|
"composer lint",
|
|
|
|
|
"composer phpcs"
|
2015-09-27 05:16:10 +00:00
|
|
|
]
|
2014-12-02 17:19:18 +00:00
|
|
|
},
|
|
|
|
|
"config": {
|
2015-09-27 05:16:10 +00:00
|
|
|
"optimize-autoloader": true,
|
|
|
|
|
"prepend-autoloader": false
|
2014-12-30 23:42:32 +00:00
|
|
|
},
|
|
|
|
|
"extra": {
|
|
|
|
|
"merge-plugin": {
|
|
|
|
|
"include": [
|
|
|
|
|
"composer.local.json"
			],
			"merge-dev": false
		}
	}
}
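
The merge-plugin block above pulls a sibling composer.local.json
into this file at install time (with "merge-dev" disabled, so a
local require-dev section would be ignored), letting site
administrators add dependencies without editing this file. A
minimal sketch of such a file (the package name is just an
example):

    {
        "require": {
            "mediawiki/semantic-media-wiki": "~2.1"
        }
    }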