use or update a custom session provider if needed.
* Deprecated APIEditBeforeSave hook in favor of EditFilterMergedContent.
* The 'UploadVerification' hook is deprecated. Use 'UploadVerifyFile' instead.
+* SiteConfiguration::isLocalVHost() was removed (deprecated since 1.25).
== Compatibility ==
This document is intended to provide useful advice for parties seeking to
-redistribute MediaWiki to end users. It's targeted particularly at maintainers
+redistribute MediaWiki to end users. It's targeted particularly at maintainers
for Linux distributions, since it's been observed that distribution packages of
-MediaWiki often break. We've consistently had to recommend that users seeking
+MediaWiki often break. We've consistently had to recommend that users seeking
support use official tarballs instead of their distribution's packages, and
-this often solves whatever problem the user is having. It would be nice if
+this often solves whatever problem the user is having. It would be nice if
this could change.
== Background: why web applications are different ==
MediaWiki is intended to be usable on any web host that provides support for
-PHP and a database. Many users of low-end shared hosting have very limited
+PHP and a database. Many users of low-end shared hosting have very limited
access to their machine: often only FTP access to some subdirectory of the web
-root. Support for these users entails several restrictions, such as:
+root. Support for these users entails several restrictions, such as:
- 1) We cannot require installation of any files outside the web root. Few of
+ 1) We cannot require installation of any files outside the web root. Few of
our users have access to directories like /usr or /etc.
2) We cannot require the ability to run any utility on the command line.
Many shared hosts have exec() and similar PHP functions disabled.
- 3) We cannot assume that the software has write access anywhere useful. The
+ 3) We cannot assume that the software has write access anywhere useful. The
user account that MediaWiki (including its installer) runs under is often
different from the account the user used to upload the files, and we might be
restricted by PHP settings such as safe mode or open_basedir.
Since anything that works on cheap shared hosting will work if you have shell
or root access too, MediaWiki's design is based around catering to the lowest
-common denominator. Although we support higher-end setups as well (like
+common denominator. Although we support higher-end setups as well (like
Wikipedia!), the way many things work by default is tailored toward shared
-hosting. These defaults are unconventional from the point of view of normal
+hosting. These defaults are unconventional from the point of view of normal
(non-web) applications -- they might conflict with distributors' policies, and
they certainly aren't ideal for someone who's installing MediaWiki as root.
== Directory structure ==
Because of constraint (1) above, MediaWiki does not conform to normal
-Unix filesystem layout. Hopefully we'll offer direct support for standard
+Unix filesystem layout. Hopefully we'll offer direct support for standard
layouts in the future, but for now *any change to the location of files is
-unsupported*. Moving things and leaving symlinks will *probably* not break
+unsupported*. Moving things and leaving symlinks will *probably* not break
anything, but it is *strongly* advised not to try any more intrusive changes to
-get MediaWiki to conform more closely to your filesystem hierarchy. Any such
+get MediaWiki to conform more closely to your filesystem hierarchy. Any such
attempt will almost certainly result in unnecessary bugs.
The standard recommended location to install MediaWiki, relative to the web
-root, is /w (so, e.g., /var/www/w). Rewrite rules can then be used to enable
-"pretty URLs" like /wiki/Article instead of /w/index.php?title=Article. (This
+root, is /w (so, e.g., /var/www/w). Rewrite rules can then be used to enable
+"pretty URLs" like /wiki/Article instead of /w/index.php?title=Article. (This
is the convention Wikipedia uses.) In theory, it should be possible to enable
the appropriate rewrite rules by default, if you can reconfigure the web
-server, but you'd need to alter LocalSettings.php too. See
+server, but you'd need to alter LocalSettings.php too. See
<https://www.mediawiki.org/wiki/Manual:Short_URL> for details on short URLs.
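For instance, the LocalSettings.php side of that setup might look like the
following sketch (the matching rewrite rule itself belongs in the web server
configuration, as the manual page above describes):

  $wgScriptPath  = '/w';        // Where index.php, api.php etc. actually live
  $wgArticlePath = '/wiki/$1';  // Public "pretty URL" shape, e.g. /wiki/Article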
If you really must mess around with the directory structure, note that the
following files and directories must be web-accessible for everything to work
correctly:
* api.php, img_auth.php, index.php, load.php, opensearch_desc.php, thumb.php,
- profileinfo.php, redirect.php, trackback.php. These are the entry points for
- normal usage. This list may be incomplete and is subject to change.
+ profileinfo.php. These are the entry points for normal usage. This list may be
+ incomplete and is subject to change.
* mw-config/index.php: Used for web-based installation (sets up the database,
prompts for the name of the wiki, etc.).
- * images/: Used for uploaded files. This could be somewhere else if
+ * images/: Used for uploaded files. This could be somewhere else if
$wgUploadDirectory and $wgUploadPath are changed appropriately.
* skins/*/: Subdirectories of skins/ contain CSS and JavaScript files that
- must be accessible to web browsers. The PHP files and Skin.sample in skins/
- don't need to be accessible. This could be somewhere else if
+ must be accessible to web browsers. The PHP files and Skin.sample in skins/
+ don't need to be accessible. This could be somewhere else if
$wgStyleDirectory and $wgStylePath are changed appropriately.
* extensions/: Many extensions include CSS and JavaScript files in their
- extensions directory, and will break if they aren't web-accessible. Some
+ extensions directory, and will break if they aren't web-accessible. Some
   extensions might also provide additional entry points, at least in principle.
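For instance, relocating the upload directory mentioned above might look like
this in LocalSettings.php (paths here are placeholders):

  $wgUploadDirectory = '/srv/mediawiki/images'; // New filesystem location
  $wgUploadPath      = '/mediawiki-images';     // Matching web-visible URL path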
But all files should keep their position relative to the web-visible
-installation directory no matter what. If you must move includes/ somewhere in
-/usr/share, provide a symlink from /var/www/w. If you don't, you *will* break
-something. You have been warned.
+installation directory no matter what. If you must move includes/ somewhere in
+/usr/share, provide a symlink from /var/www/w. If you don't, you *will* break
+something. You have been warned.
== Configuration ==
-MediaWiki is configured using LocalSettings.php. This is a PHP file that's
+MediaWiki is configured using LocalSettings.php. This is a PHP file that's
generated when the user visits mw-config/index.php to install the software, and
-which the user can edit by hand thereafter. It's just a plain old PHP file,
-and can contain any PHP statements. It usually sets global variables that are
+which the user can edit by hand thereafter. It's just a plain old PHP file,
+and can contain any PHP statements. It usually sets global variables that are
used for configuration, and includes files used by any extensions.
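For illustration, a hypothetical fragment of such a file might read:

  $wgSitename = 'Example Wiki';                 // Name shown in the interface
  $wgDBtype   = 'mysql';                        // Database backend
  $wgDBname   = 'examplewiki';                  // Database to use
  require_once "$IP/extensions/Cite/Cite.php";  // Load an extension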
-Distributors can easily add extra statements to the autogenerated
-LocalSettings.php by changing mw-config/overrides.php (see that file for details
-and examples).
+Distributors can easily change the installer's behavior, including the
+generated LocalSettings.php, by placing overrides in the mw-config/overrides
+directory. Doing that is highly preferable to modifying MediaWiki code
+directly. See mw-config/overrides/README for more details and examples.
There's a new maintenance/install.php script that can be used to perform an
install from the command line.
A few configuration defaults are only guessed by the installer and may need to
be set more intelligently:
* $wgEmergencyContact: An e-mail address that can be used to contact the wiki
- administrator. By default, "wikiadmin@ServerName".
+ administrator. By default, "wikiadmin@ServerName".
* $wgPasswordSender: The e-mail address to use when sending password e-mails.
By default, "MediaWiki Mail <apache@ServerName>".
(with ServerName guessed from the HTTP request)
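A distributor or administrator can override the guessed values in
LocalSettings.php, e.g. (addresses here are placeholders):

  $wgEmergencyContact = 'webmaster@example.org';
  $wgPasswordSender   = 'wiki@example.org';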
Upgrading the files without also upgrading the database schema will result in
an inconsistent wiki that may produce blank pages (PHP errors) when new features
using the changed schema are used.
-Since MediaWiki 1.17 it is possible to upgrade using the installer by providing
+Since MediaWiki 1.17 it is possible to upgrade using the web installer by providing
an arbitrary secret value stored as $wgUpgradeKey in LocalSettings (older versions
needed to rename LocalSettings.php in order to upgrade using the installer).
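The key is just an arbitrary shared secret, e.g. (the value is a placeholder):

  $wgUpgradeKey = 'c0e8263e18fe'; // Entered into the web installer to unlock upgrades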
== Documentation ==
MediaWiki's official documentation is split between two places: the source
-code, and <https://www.mediawiki.org/>. The source code documentation is written
+code, and <https://www.mediawiki.org/>. The source code documentation is written
exclusively by developers, and so is likely to be reliable (at worst,
-outdated). However, it can be pretty sparse. mediawiki.org documentation is
+outdated). However, it can be pretty sparse. mediawiki.org documentation is
often much more thorough, but it's maintained by a wiki that's open to
anonymous edits, so its quality is sometimes sketchy -- don't assume that
anything there is officially endorsed!
== Upstream ==
MediaWiki is a project hosted and led by the Wikimedia Foundation, the
-not-for-profit charity that operates Wikipedia. Wikimedia employs the lead
+not-for-profit charity that operates Wikipedia. Wikimedia employs the lead
developer and several other paid developers, but commit access is given out
-liberally and there are multiple very active volunteer developers as well. A
+liberally and there are multiple very active volunteer developers as well. A
list of developers can be found at <https://www.mediawiki.org/wiki/Developers>.
-MediaWiki's bug tracker is at <https://bugzilla.wikimedia.org>. However, most
-developers follow the bug tracker little or not at all. The best place to
-post if you want to get developers' attention is the wikitech-l mailing list
-<https://lists.wikimedia.org/mailman/listinfo/wikitech-l>. Posts to wikitech-l
-will inevitably be read by multiple experienced MediaWiki developers. There's
+MediaWiki's bug tracker is at <https://phabricator.wikimedia.org>. However, the
+best place to post if you want to get developers' attention is often the
+wikitech-l mailing list
+<https://lists.wikimedia.org/mailman/listinfo/wikitech-l>. Posts to wikitech-l
+will inevitably be read by multiple experienced MediaWiki developers. There's
also an active IRC chat at <irc://irc.freenode.net/mediawiki>, where there are
usually several developers at reasonably busy times of day.
-Unfortunately, we don't have a very good system for patch review. Patches
-should be submitted on Bugzilla (as unified diffs produced with "svn diff"
-against the latest trunk revision), but many patches languish without review
-until they bitrot into uselessness. You might want to get a developer to
-commit to reviewing your patch before you put too much effort into it.
-Reasonably straightforward patches shouldn't be too hard to get accepted if
-there's an interested developer, however -- posting to Bugzilla and then
-dropping a note on wikitech-l if nobody responds is a good tactic.
+Our Git repositories are hosted at <https://gerrit.wikimedia.org>; see
+<https://www.mediawiki.org/wiki/Gerrit> for more information. Patches should
+be submitted there. If you know which developers are best suited to review your
+patch, add them as reviewers; otherwise, ask on IRC to speed up review.
All redistributors of MediaWiki should be subscribed to mediawiki-announce
-<https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce>. It's
-extremely low-traffic, with an average of less than one post per month. All
+<https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce>. It's
+extremely low-traffic, with an average of less than one post per month. All
new releases are announced here, including critical security updates.
== Useful software to install ==
* APC (Alternative PHP Cache), XCache, or similar: Will greatly speed up the
execution of MediaWiki, and all other PHP applications, at some cost in
- memory usage. Will be used automatically for the most part.
- * clamav: Can be used for virus scanning of uploaded files. Enable with
+ memory usage. Will be used automatically for the most part.
+ * clamav: Can be used for virus scanning of uploaded files. Enable with
"$wgAntivirus = 'clamav';".
- * DjVuLibre: Allows processing of DjVu files. To enable this, set
+ * DjVuLibre: Allows processing of DjVu files. To enable this, set
"$wgDjvuDump = 'djvudump'; $wgDjvuRenderer = 'ddjvu'; $wgDjvuTxt = 'djvutxt';".
- * HTML Tidy: Fixes errors in HTML at runtime. Can be enabled with
+ * HTML Tidy: Fixes errors in HTML at runtime. Can be enabled with
"$wgUseTidy = true;".
- * ImageMagick: For resizing images. "$wgUseImageMagick = true;" will enable
- it. PHP's GD can also be used, but ImageMagick is preferable.
- * Squid: Can provide a drastic speedup and a major cut in resource
- consumption, but enabling it may interfere with other applications. It might
- be suitable for a separate mediawiki-squid package. For setup details, see:
- <https://www.mediawiki.org/wiki/Manual:Squid_caching>
+ * ImageMagick: For resizing images. "$wgUseImageMagick = true;" will enable
+ it. PHP's GD can also be used, but ImageMagick is preferable.
+ * HTTP cache such as Varnish or Squid: Can provide a drastic speedup and a
+ major cut in resource consumption, but enabling it may interfere with other
+ applications. It might be suitable for a separate package. For setup details, see:
+ - <https://www.mediawiki.org/wiki/Manual:Varnish_caching>
+ - <https://www.mediawiki.org/wiki/Manual:Squid_caching>
* rsvg or other SVG rasterizer: ImageMagick can be used for SVG support, but
- is not ideal. Wikipedia (as of the time of this writing) uses rsvg. To
+ is not ideal. Wikipedia (as of the time of this writing) uses rsvg. To
enable, set "$wgSVGConverter = 'rsvg';" (or other as appropriate).
- * texvc: Included with MediaWiki. Instructions for compiling and
- installing it are in the math/ directory.
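Collected in one place, the settings quoted in the list above might appear in
LocalSettings.php as follows (enable only what is actually installed):

  $wgAntivirus      = 'clamav'; // Virus-scan uploaded files with clamav
  $wgUseTidy        = true;     // Clean up output HTML with HTML Tidy
  $wgUseImageMagick = true;     // Resize images with ImageMagick instead of GD
  $wgSVGConverter   = 'rsvg';   // Rasterize SVGs with rsvg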
-MediaWiki uses some standard GNU utilities as well, such as diff and diff3. If
+MediaWiki uses some standard GNU utilities as well, such as diff and diff3. If
these are present in /usr/bin or some other reasonable location, they will be
configured automatically on install.
-MediaWiki also has a "job queue" that handles background processing. Because
+MediaWiki also has a "job queue" that handles background processing. Because
shared hosts often don't provide access to cron, the job queue is run on every
-page view by default. This means the background tasks aren't really done in
-the background. Busy wikis can set $wgJobRunRate to 0 and run
-maintenance/runJobs.php periodically out of cron. Distributors probably
+page view by default. This means the background tasks aren't really done in
+the background. Busy wikis can set $wgJobRunRate to 0 and run
+maintenance/runJobs.php periodically out of cron. Distributors probably
shouldn't set this up as a default, however, since the extra cron job is
unnecessary overhead for a little-used wiki.
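For a busy wiki, that setup might look like this (the cron schedule and path
are examples only):

  $wgJobRunRate = 0; // Don't run jobs on page views...
  // ...and instead process the queue from cron, e.g.:
  //   */5 * * * * php /var/www/w/maintenance/runJobs.php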
== Web server configuration ==
MediaWiki includes several .htaccess files to restrict access to some
-directories. If the web server is not configured to support these files, and
+directories. If the web server is not configured to support these files, and
the relevant directories haven't been moved someplace inaccessible anyway (e.g.
symlinked in /usr/share with the web server configured to not follow symlinks),
then it might be useful to deny web access to those directories in the web
server configuration.
static function srcSet( array $urls ) {
$candidates = [];
foreach ( $urls as $density => $url ) {
- // Cast density to float to strip 'x'.
- $candidates[] = $url . ' ' . (float)$density . 'x';
+ // Cast density to float to strip 'x', then back to string to serve
+ // as array index.
+ $density = (string)(float)$density;
+ $candidates[$density] = $url;
}
+
+ // Remove duplicates that are the same as a smaller value
+ ksort( $candidates, SORT_NUMERIC );
+ $candidates = array_unique( $candidates );
+
+ // Append density info to the url
+ foreach ( $candidates as $density => $url ) {
+ $candidates[$density] = $url . ' ' . $density . 'x';
+ }
+
return implode( ", ", $candidates );
}
}
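For illustration, the deduplication above behaves as follows (this mirrors the
test cases added later in this patch):

  echo Html::srcSet( [
      '1'   => 'small.png',
      '1.5' => 'large.png',
      '2'   => 'large.png', // Duplicate of the 1.5x entry, dropped by array_unique()
  ] );
  // => "small.png 1x, large.png 1.5x"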
return Http::request( 'POST', $url, $options, $caller );
}
- /**
- * Check if the URL can be served by localhost
- *
- * @param string $url Full url to check
- * @return bool
- */
- public static function isLocalURL( $url ) {
- global $wgCommandLineMode, $wgLocalVirtualHosts;
-
- if ( $wgCommandLineMode ) {
- return false;
- }
-
- // Extract host part
- $matches = [];
- if ( preg_match( '!^http://([\w.-]+)[/:].*$!', $url, $matches ) ) {
- $host = $matches[1];
- // Split up dotwise
- $domainParts = explode( '.', $host );
- // Check if this domain or any superdomain is listed as a local virtual host
- $domainParts = array_reverse( $domainParts );
-
- $domain = '';
- $countParts = count( $domainParts );
- for ( $i = 0; $i < $countParts; $i++ ) {
- $domainPart = $domainParts[$i];
- if ( $i == 0 ) {
- $domain = $domainPart;
- } else {
- $domain = $domainPart . '.' . $domain;
- }
-
- if ( in_array( $domain, $wgLocalVirtualHosts ) ) {
- return true;
- }
- }
- }
-
- return false;
- }
-
/**
* A standard user-agent we can use for external requests.
* @return string
// Otherwise, fallback to $wgHTTPProxy if this is not a machine
// local URL and proxies are not disabled
- if ( Http::isLocalURL( $this->url ) || $this->noProxy ) {
+ if ( self::isLocalURL( $this->url ) || $this->noProxy ) {
$this->proxy = '';
} else {
$this->proxy = Http::getProxy();
}
}
+ /**
+ * Check if the URL can be served by localhost
+ *
+ * @param string $url Full url to check
+ * @return bool
+ */
+ private static function isLocalURL( $url ) {
+ global $wgCommandLineMode, $wgLocalVirtualHosts;
+
+ if ( $wgCommandLineMode ) {
+ return false;
+ }
+
+ // Extract host part
+ $matches = [];
+ if ( preg_match( '!^https?://([\w.-]+)[/:].*$!', $url, $matches ) ) {
+ $host = $matches[1];
+ // Split up dotwise
+ $domainParts = explode( '.', $host );
+ // Check if this domain or any superdomain is listed as a local virtual host
+ $domainParts = array_reverse( $domainParts );
+
+ $domain = '';
+ $countParts = count( $domainParts );
+ for ( $i = 0; $i < $countParts; $i++ ) {
+ $domainPart = $domainParts[$i];
+ if ( $i == 0 ) {
+ $domain = $domainPart;
+ } else {
+ $domain = $domainPart . '.' . $domain;
+ }
+
+ if ( in_array( $domain, $wgLocalVirtualHosts ) ) {
+ return true;
+ }
+ }
+ }
+
+ return false;
+ }
+
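	// Illustrative only: with a configuration such as
	//   $wgLocalVirtualHosts = [ 'wikipedia.org' ];
	// the superdomain walk above makes isLocalURL( 'http://en.wikipedia.org/wiki/Foo' )
	// return true, while an unlisted host like 'http://example.com/' does not.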
/**
* Set the user agent
* @param string $UA
return $multi ? $res : current( $res );
}
- /**
- * Returns true if the given vhost is handled locally.
- *
- * @deprecated since 1.25; check if the host is in $wgLocalVirtualHosts instead.
- * @param string $vhost
- * @return bool
- */
- public function isLocalVHost( $vhost ) {
- return in_array( $vhost, $this->localVHosts );
- }
-
/**
* Merge multiple arrays together.
* On encountering duplicate keys, merge the two, but ONLY if they're arrays.
*/
public function updateNotificationTimestamp( User $editor, LinkTarget $target, $timestamp ) {
$dbw = $this->getConnection( DB_MASTER );
- $res = $dbw->select( [ 'watchlist' ],
- [ 'wl_user' ],
+ $uids = $dbw->selectFieldValues(
+ 'watchlist',
+ 'wl_user',
[
'wl_user != ' . intval( $editor->getId() ),
'wl_namespace' => $target->getNamespace(),
'wl_title' => $target->getDBkey(),
'wl_notificationtimestamp IS NULL',
- ], __METHOD__
+ ],
+ __METHOD__
);
+ $this->reuseConnection( $dbw );
- $watchers = [];
- foreach ( $res as $row ) {
- $watchers[] = intval( $row->wl_user );
- }
-
+ $watchers = array_map( 'intval', $uids );
if ( $watchers ) {
// Update wl_notificationtimestamp for all watching users except the editor
$fname = __METHOD__;
- $dbw->onTransactionIdle(
- function () use ( $dbw, $timestamp, $watchers, $target, $fname ) {
+ DeferredUpdates::addCallableUpdate(
+ function () use ( $timestamp, $watchers, $target, $fname ) {
global $wgUpdateRowsPerQuery;
+ $dbw = $this->getConnection( DB_MASTER );
+
$watchersChunks = array_chunk( $watchers, $wgUpdateRowsPerQuery );
foreach ( $watchersChunks as $watchersChunk ) {
$dbw->update( 'watchlist',
}
}
$this->uncacheLinkTarget( $target );
+
+ $this->reuseConnection( $dbw );
}
);
}
- $this->reuseConnection( $dbw );
-
return $watchers;
}
# Parse content; note that HTML generation is only needed if we want to cache the result.
$content = $page->getContent( Revision::RAW );
- $enableParserCache = $this->getConfig()->get( 'EnableParserCache' );
- $p_result = $content->getParserOutput(
- $title,
- $page->getLatest(),
- $popts,
- $enableParserCache
- );
-
- # Logging to better see expensive usage patterns
- if ( $forceRecursiveLinkUpdate ) {
- LoggerFactory::getInstance( 'RecursiveLinkPurge' )->info(
- "Recursive link purge enqueued for {title}",
- [
- 'user' => $this->getUser()->getName(),
- 'title' => $title->getPrefixedText()
- ]
+ if ( $content ) {
+ $enableParserCache = $this->getConfig()->get( 'EnableParserCache' );
+ $p_result = $content->getParserOutput(
+ $title,
+ $page->getLatest(),
+ $popts,
+ $enableParserCache
);
- }
-
- # Update the links tables
- $updates = $content->getSecondaryDataUpdates(
- $title, null, $forceRecursiveLinkUpdate, $p_result );
- DataUpdate::runUpdates( $updates );
-
- $r['linkupdate'] = true;
- if ( $enableParserCache ) {
- $pcache = ParserCache::singleton();
- $pcache->save( $p_result, $page, $popts );
+ # Logging to better see expensive usage patterns
+ if ( $forceRecursiveLinkUpdate ) {
+ LoggerFactory::getInstance( 'RecursiveLinkPurge' )->info(
+ "Recursive link purge enqueued for {title}",
+ [
+ 'user' => $this->getUser()->getName(),
+ 'title' => $title->getPrefixedText()
+ ]
+ );
+ }
+
+ # Update the links tables
+ $updates = $content->getSecondaryDataUpdates(
+ $title, null, $forceRecursiveLinkUpdate, $p_result );
+ DataUpdate::runUpdates( $updates );
+
+ $r['linkupdate'] = true;
+
+ if ( $enableParserCache ) {
+ $pcache = ParserCache::singleton();
+ $pcache->save( $p_result, $page, $popts );
+ }
}
} else {
$error = $this->parseMsg( [ 'actionthrottledtext' ] );
'be-tarask' => [ "Ё" ],
'cy' => [ "Ch", "Dd", "Ff", "Ng", "Ll", "Ph", "Rh", "Th" ],
'en' => [],
- 'fa' => [ "آ", "ء", "ه" ],
+ // RTL, let's put each letter on a new line
+ 'fa' => [
+ "آ",
+ "ء",
+ "ه",
+ "ا",
+ "و"
+ ],
'fi' => [ "Å", "Ä", "Ö" ],
'fr' => [],
'hu' => [ "Cs", "Dz", "Dzs", "Gy", "Ly", "Ny", "Ö", "Sz", "Ty", "Ü", "Zs" ],
*
* @param IDatabase $dbw
* @param integer $pageId
- * @return ScopedCallback|null Returns null on failure
+ * @param string $why One of (job, atomicity)
+ * @return ScopedCallback
* @throws RuntimeException
* @since 1.27
*/
- public static function acquirePageLock( IDatabase $dbw, $pageId ) {
- $scopedLock = $dbw->getScopedLockAndFlush(
- "LinksUpdate:pageid:$pageId",
- __METHOD__,
- 15
- );
+ public static function acquirePageLock( IDatabase $dbw, $pageId, $why = 'atomicity' ) {
+ $key = "LinksUpdate:$why:pageid:$pageId";
+ $scopedLock = $dbw->getScopedLockAndFlush( $key, __METHOD__, 15 );
if ( !$scopedLock ) {
- throw new RuntimeException( "Could not acquire lock on page #$pageId." );
+ throw new RuntimeException( "Could not acquire lock '$key'." );
}
return $scopedLock;
}
$pageId = $this->params['pageId'];
+
+ // Serialize links updates by page ID so they see each others' changes
+ $scopedLock = LinksUpdate::acquirePageLock( wfGetDB( DB_MASTER ), $pageId, 'job' );
+
if ( WikiPage::newFromID( $pageId, WikiPage::READ_LATEST ) ) {
// The page was restored somehow or something went wrong
$this->setLastError( "deleteLinks: Page #$pageId exists" );
* @return bool
*/
protected function runForTitle( Title $title ) {
+ $stats = MediaWikiServices::getInstance()->getStatsdDataFactory();
+
$page = WikiPage::factory( $title );
$page->loadPageData( WikiPage::READ_LATEST );
+
+ // Serialize links updates by page ID so they see each others' changes
+ $scopedLock = LinksUpdate::acquirePageLock( wfGetDB( DB_MASTER ), $page->getId(), 'job' );
+ // Get the latest ID *after* acquirePageLock() flushed the transaction.
+ // This is used to detect edits/moves after loadPageData() but before the scope lock.
+ // This works around the chicken/egg problem of determining the scope lock key.
+ $latest = $title->getLatestRevID( Title::GAID_FOR_UPDATE );
+
if ( !empty( $this->params['triggeringRevisionId'] ) ) {
// Fetch the specified revision; lockAndGetLatest() below detects if the page
// was edited since and aborts in order to avoid corrupting the link tables
$revision = Revision::newFromTitle( $title, false, Revision::READ_LATEST );
}
- $stats = MediaWikiServices::getInstance()->getStatsdDataFactory();
-
if ( !$revision ) {
$stats->increment( 'refreshlinks.rev_not_found' );
$this->setLastError( "Revision not found for {$title->getPrefixedDBkey()}" );
return false; // just deleted?
- } elseif ( !$revision->isCurrent() || $revision->getPage() != $page->getId() ) {
- // If the revision isn't current, there's no point in doing a bunch
- // of work just to fail at the lockAndGetLatest() check later.
+ } elseif ( $revision->getId() != $latest || $revision->getPage() !== $page->getId() ) {
+ // Do not clobber newer updates with older ones. If all jobs were FIFO and
+ // serialized, it would be OK to update links based on older revisions since it
+ // would eventually get to the latest. Since that is not the case (by design),
+ // only update the link tables to a state matching the current revision's output.
$stats->increment( 'refreshlinks.rev_not_current' );
$this->setLastError( "Revision {$revision->getId()} is not current" );
return false;
}
}
- $latestNow = $page->lockAndGetLatest();
- if ( !$latestNow || $revision->getId() != $latestNow ) {
- // Do not clobber over newer updates with older ones. If all jobs where FIFO and
- // serialized, it would be OK to update links based on older revisions since it
- // would eventually get to the latest. Since that is not the case (by design),
- // only update the link tables to a state matching the current revision's output.
- $stats->increment( 'refreshlinks.rev_cas_failure' );
- $this->setLastError( "page_latest changed from {$revision->getId()} to $latestNow" );
- return false;
- }
-
DataUpdate::runUpdates( $updates );
InfoAction::invalidateCache( $title );
}
// Additional densities for responsive images, if specified.
- if ( !empty( $this->responsiveUrls ) ) {
- $attribs['srcset'] = Html::srcSet( $this->responsiveUrls );
+ // Any URL that is the same as the src URL is excluded.
+ $responsiveUrls = array_diff( $this->responsiveUrls, [ $this->url ] );
+ if ( !empty( $responsiveUrls ) ) {
+ $attribs['srcset'] = Html::srcSet( $responsiveUrls );
}
Hooks::run( 'ThumbnailBeforeProduceHTML', [ $this, &$attribs, &$linkAttribs ] );
return false;
}
- $title = $this->mTitle;
- wfGetDB( DB_MASTER )->onTransactionIdle( function() use ( $title ) {
- // Invalidate the cache in auto-commit mode
- $title->invalidateCache();
- } );
-
+ $this->mTitle->invalidateCache();
// Send purge after above page_touched update was committed
DeferredUpdates::addUpdate(
- new CdnCacheUpdate( $title->getCdnUrls() ),
+ new CdnCacheUpdate( $this->mTitle->getCdnUrls() ),
DeferredUpdates::PRESEND
);
],
'mediawiki.content.json' => [
'position' => 'top',
- 'styles' => 'resources/src/mediawiki/mediawiki.content.json.css',
+ 'styles' => 'resources/src/mediawiki/mediawiki.content.json.less',
],
'mediawiki.confirmCloseWindow' => [
'scripts' => [
+++ /dev/null
-/*!
- * CSS for styling HTML-formatted JSON Schema objects
- *
- * @file
- * @author Munaf Assaf <massaf@wikimedia.org>
- */
-
-.mw-json {
- border-collapse: collapse;
- border-spacing: 0;
- font-style: normal;
-}
-
-.mw-json th,
-.mw-json td {
- border: 1px solid #808080;
- font-size: 16px;
- padding: 0.5em 1em;
-}
-
-.mw-json .value,
-.mw-json-single-value {
- background-color: #dcfae3;
- font-family: monospace, monospace;
- white-space: pre-wrap;
-}
-
-.mw-json-single-value {
- background-color: #eee;
-}
-
-.mw-json-empty {
- background-color: #fff;
- font-style: italic;
-}
-
-.mw-json tr {
- margin-bottom: 0.5em;
- background-color: #eee;
-}
-
-.mw-json th {
- background-color: #fff;
- font-weight: normal;
-}
-
-.mw-json caption {
- /* For stylistic reasons, suppress the caption of the outermost table */
- display: none;
-}
-
-.mw-json table caption {
- color: #808080;
- display: inline-block;
- font-size: 10px;
- font-style: italic;
- margin-bottom: 0.5em;
- text-align: left;
-}
--- /dev/null
+/*!
+ * CSS for styling HTML-formatted JSON Schema objects
+ *
+ * @file
+ * @author Munaf Assaf <massaf@wikimedia.org>
+ */
+
+.mw-json {
+ border-collapse: collapse;
+ border-spacing: 0;
+ font-style: normal;
+}
+
+.mw-json th,
+.mw-json td {
+ border: 1px solid #808080;
+ font-size: 16px;
+ padding: 0.5em 1em;
+}
+
+.mw-json .value,
+.mw-json-single-value {
+ background-color: #dcfae3;
+ font-family: monospace, monospace;
+ white-space: pre-wrap;
+}
+
+.mw-json-single-value {
+ background-color: #eee;
+}
+
+.mw-json-empty {
+ background-color: #fff;
+ font-style: italic;
+}
+
+.mw-json tr {
+ margin-bottom: 0.5em;
+ background-color: #eee;
+}
+
+.mw-json th {
+ background-color: #fff;
+ font-weight: normal;
+}
+
+.mw-json caption {
+ /* For stylistic reasons, suppress the caption of the outermost table */
+ display: none;
+}
+
+.mw-json table caption {
+ color: #808080;
+ display: inline-block;
+ font-size: 10px;
+ font-style: italic;
+ margin-bottom: 0.5em;
+ text-align: left;
+}
<li>b</li>
</ul>
!! end
+
+!! test
+Thumbnail output
+!! wikitext
+[[File:Thumb.png|thumb]]
+!! html/php+tidy
+<div class="thumb tright">
+<div class="thumbinner" style="width:137px;"><a href="/wiki/File:Thumb.png" class="image"><img alt="Thumb.png" src="http://example.com/images/e/ea/Thumb.png" width="135" height="135" class="thumbimage" /></a>
+<div class="thumbcaption">
+<div class="magnify"><a href="/wiki/File:Thumb.png" class="internal" title="Enlarge"></a></div>
+</div>
+</div>
+</div>
+!! end
'1x.png 1x, 1_5x.png 1.5x, 2x.png 2x',
'pixel depth keys may omit a trailing "x"'
],
+ [
+ [ '1' => 'small.png', '1.5' => 'large.png', '2' => 'large.png' ],
+ 'small.png 1x, large.png 1.5x',
+ 'omit larger duplicates'
+ ],
+ [
+ [ '1' => 'small.png', '2' => 'large.png', '1.5' => 'large.png' ],
+ 'small.png 1x, large.png 1.5x',
+ 'omit larger duplicates in irregular order'
+ ],
];
}
public function testUpdateNotificationTimestamp_watchersExist() {
$mockDb = $this->getMockDb();
$mockDb->expects( $this->once() )
- ->method( 'select' )
+ ->method( 'selectFieldValues' )
->with(
- [ 'watchlist' ],
- [ 'wl_user' ],
+ 'watchlist',
+ 'wl_user',
[
'wl_user != 1',
'wl_namespace' => 0,
'wl_notificationtimestamp IS NULL'
]
)
- ->will(
- $this->returnValue( [
- $this->getFakeRow( [ 'wl_user' => '2' ] ),
- $this->getFakeRow( [ 'wl_user' => '3' ] )
- ] )
- );
- $mockDb->expects( $this->once() )
- ->method( 'onTransactionIdle' )
- ->with( $this->isType( 'callable' ) )
- ->will( $this->returnCallback( function( $callable ) {
- $callable();
- } ) );
+ ->will( $this->returnValue( [ '2', '3' ] ) );
$mockDb->expects( $this->once() )
->method( 'update' )
->with(
public function testUpdateNotificationTimestamp_noWatchers() {
$mockDb = $this->getMockDb();
$mockDb->expects( $this->once() )
- ->method( 'select' )
+ ->method( 'selectFieldValues' )
->with(
- [ 'watchlist' ],
- [ 'wl_user' ],
+ 'watchlist',
+ 'wl_user',
[
'wl_user != 1',
'wl_namespace' => 0,
->will(
$this->returnValue( [] )
);
- $mockDb->expects( $this->never() )
- ->method( 'onTransactionIdle' );
$mockDb->expects( $this->never() )
->method( 'update' );
$this->getFakeRow( [ 'wl_notificationtimestamp' => '20151212010101' ] )
) );
$mockDb->expects( $this->once() )
- ->method( 'select' )
+ ->method( 'selectFieldValues' )
->will(
- $this->returnValue( [
- $this->getFakeRow( [ 'wl_user' => '2' ] ),
- $this->getFakeRow( [ 'wl_user' => '3' ] )
- ] )
+ $this->returnValue( [ '2', '3' ] )
);
- $mockDb->expects( $this->once() )
- ->method( 'onTransactionIdle' )
- ->with( $this->isType( 'callable' ) )
- ->will( $this->returnCallback( function( $callable ) {
- $callable();
- } ) );
$mockDb->expects( $this->once() )
->method( 'update' );
$this->assertFalse( $testXML->wellFormed );
}
+ /**
+ * Verify we check for recursive entity DoS
+ *
+ * (If the DoS isn't properly handled, the test runner will probably go OOM...)
+ */
+ public function testRecursiveEntity() {
+ $xml = <<<'XML'
+<?xml version="1.0" encoding="utf-8"?>
+<!DOCTYPE foo [
+ <!ENTITY test "&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;">
+ <!ENTITY a "&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;">
+ <!ENTITY b "&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;">
+ <!ENTITY c "&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;&d;">
+ <!ENTITY d "&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;&e;">
+ <!ENTITY e "&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;&f;">
+ <!ENTITY f "&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;&g;">
+ <!ENTITY g "-00000000000000000000000000000000000000000000000000000000000000000000000-">
+]>
+<foo>
+<bar>&test;</bar>
+</foo>
+XML;
+ $check = XmlTypeCheck::newFromString( $xml );
+ $this->assertFalse( $check->wellFormed );
+ }
+
/**
* @covers XMLTypeCheck::processingInstructionHandler
*/
}
/**
- * @covers ExtensionProcessor::extractConfig
+ * @covers ExtensionProcessor::extractConfig1
*/
public function testExtractConfig() {
$processor = new ExtensionProcessor;