Optimizations and performance

Aka webperf, web perf

respecting the medium

Loading, parsing, rendering, etc.

We don't want humans waiting on computers. We want computers waiting on humans.

— Gregory Szorc

Monitoring is your bank telling you you're overdrawn.

Observability is the ability to tell you're running out of money because you're spending too much money on chocolates, cakes and sweets because you've recorded data on what you spent your money on throughout the month.

— Liz Fong-Jones (方禮真) on Twitter

Progressive JPEGs decode slower than Baseline ones. [..] decoding a progressive JPEG takes about 3.3× as long as a baseline one. (I would still absolutely recommend using progressive, because they feel a lot faster than their baseline counterparts.)

Base64 Encoding & Performance, Part 2: Gathering Data – CSS Wizardry – Web Performance Optimisation

See also Images, Content Security Policy

Keep DOMs small and avoid CSS descendant selectors; reading layout properties (offsetTop, etc.) triggers layout
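For example (a minimal sketch; the .item class is illustrative), interleaving layout reads and writes forces a synchronous layout on every iteration, while batching reads before writes avoids it:

// Anti-pattern: each iteration writes a style then reads offsetTop,
// forcing the browser to recompute layout on every loop turn.
document.querySelectorAll(".item").forEach(item => {
	item.style.width = item.offsetTop + "px";
});

// Better: batch all the reads, then all the writes (at most one relayout).
const items = Array.from(document.querySelectorAll(".item"));
const tops = items.map(item => item.offsetTop); // reads only
items.forEach((item, i) => {
	item.style.width = tops[i] + "px"; // writes only
});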

Take into account traffic, processing time, etc.

Tools (audit, checklist, benchmark, best practices, etc.):

Control your content

Organization

Impacts

To measure impacts (KPIs: bounce rate, conversion rate), use A/B testing

Mobile

Perceived performance

Aka feedback first, skeleton UI

Reduce bytes

http://www.haratetsuo.com/wp-content/themes/haratetsuo2018_cms_v2/images/ico/arrow.svg The 30MB SVG image is a simple arrow! It's served uncompressed (gzipped would be 24MB and brotli would be 3.8MB). It contains 82 base64 encoded JPG images (one per image size). There are only 10 unique base64 images encoded in the file, so a lot of repetition...

Paul Calvano on Twitter: "Exactly! The 30MB SVG image is a simple arrow!… https://t.co/ci2hz7WpLI"

The fastest byte is a byte not sent.

— Ilya Grigorik http://chimera.labs.oreilly.com/books/1230000000545

Reducing page weight can produce misleading feedback, such as an apparent increase in page load time:

When I was at Google, someone told me a story about a time that “they” completed a big optimization push only to find that measured page load times increased. When they dug into the data, they found that the reason load times had increased was that they got a lot more traffic from Africa after doing the optimizations. The team’s product went from being unusable for people with slow connections to usable, which caused so many users with slow connections to start using the product that load times actually increased.

— Dan Luu in Most of the web really sucks if you have a slow connection

Blocking, deliberately or unintentionally, devices or connection capabilities (e.g. by browser version) will give misleading feedback like "users don't use old browsers" (because they can't) or "users don't have slow connections" (because the page can't load in a decent time). See also Page Weight Matters.

Always reduce before encrypting (aka always compress before encoding), not the other way around:

  • minification:

    • optimize/reduce without requiring a reverse operation

    • remove unnecessary data (duplicates, unused tags, selectors)

    • merge/round similar data, reduce precision (0.2345 -> .23)

    • http://cssstats.com/

    • https://isellsoap.github.io/specificity-visualizer/

  • compact/compress (requires processing to recover an operable state): pre-gzip. See Precompress and Content encoding

  • use different compression algorithm, or a better tool/algorithm implementation (like zopfli for deflate)

    Note: some frameworks read the zlib.output_compression value to set the Content-Encoding header to gzip (set zlib.output_compression = Off in php.ini)

    Example: compress CSS with optimized dictionary-based algorithms (like gzip uses Huffman tables): blocks, selectors, media queries, properties, values. See [Compression & Minification](CSS#Compression & Minification)

  • split combined HTTP header values to benefit from HPACK (HTTP/2). Instead of:

    set-cookie: key1=value1; key2=value2; domain=.example.com; expires=Fri, 21-Dec-2018 22:15:42 GMT; path=/

    Use:

    set-cookie: key1=value1; domain=.example.com; expires=Fri, 21-Dec-2018 22:15:42 GMT; path=/
    set-cookie: key2=value2; domain=.example.com; expires=Fri, 21-Dec-2018 22:15:42 GMT; path=/
  • use the right format (if supported, or a fallback):

    • a binary representation of CSS (really useful vs. gzip?)

    • use an image as a data container (data encoded as colors, but could be lossy)

    • transparent video: WebM or side by side channels (RGB + A as RGB) videos. See Alpha

    • transparent image: instead of PNG use WebP (or JPEG-XR) or combined formats. See [Alpha compression](Image#Alpha compression)

    • animated image: use APNG or Animated WebP or looped video instead of GIF

      <video autoplay loop muted playsinline poster="original.jpg">
      	<source type="video/webm" src="original.webm">
      	<img src="original.gif">
      </video>

      See [Animated image](Image#Animated image). See also Image sequence

    • Use a video with only 1 frame as an image (ex: WebP); can also be done with H.264/H.265, see [Use video codec](Image#Use video codec)

    • Use code/lib to animate elements instead of video or sprites: https://github.com/bodymovin/bodymovin

    • Don’t use JPEG-XR on the Web (JPEG-XRs are decoded on "the software-side on the CPU" by Internet Explorer and Microsoft Edge, the only browsers that support it)

  • HTML: remove comments, spaces, optional closing tags, attributes, chars like ", entities (when not required), etc.

  • CSS: remove comments, prefixed properties, useless selectors, regroup using shorthand properties (like background for: background-image, etc.), etc.

  • SVG:

  • JavaScript: optimize by removing dead code (tree shaking), uglifying (rename variables to shorter names, inline functions, resolve static expressions), etc.

  • Images:

    • image compression options (if applicable, cf. formats)

      • [Image - Compression and optimisation](Image#Compression and optimisation) and [Image - Mixed formats](Image#Mixed formats)

      • [Video - Compression and optimisation](Video#Compression and optimisation)

      • Compression by downscaling technique

      Use other (experimental) formats like BPG or FLIF

      To investigate: store vectors using SWF, decompress on the fly (to SVG) with a ServiceWorker; decompress biJPEG to PNG or WebP with OffscreenCanvas

    • use the right dimensions (ex.: thumbnails)

  • 3D models: drop leading zeroes (-0.5 -> -.5) and change the scale of the model

  • fonts:

  • other formats:

Reduce requests

  • embed images in CSS using data URI, encoded with URI encoding or base64 (could have a negative impact on low end devices)

  • JS and CSS inlined in HTML in <script> and <style> tags

  • CSS/HTML in JavaScript variables

  • HTML prerendered (HTML + JS + data)

  • bundle JS and/or CSS, image spriting https://github.com/clyfish/zerosprites http://yostudios.github.io/Spritemapper/

  • add the right header(s) for server cache strategy (see Cache)

Note: HTTP/2 multiplexing and push are not always a good alternative, because compression is more efficient on larger chunks of data. Musings on HTTP/2 and Bundling | CSS-Tricks

Reduce latency

Measure latency:

Use CDN

Use a popular TLD to benefit from DNS resolver caches:

First packet size

Initial TCP window

Minimize ATF (above-the-fold) content size: The first TCP connection isn’t able to fully utilize a connection’s bandwidth on the first roundtrip, which means the number of packets it can send is very limited. In order to render your ATF content it needs to be 148 kb or less — 5 SEO Guidelines for Web Developers

Congestion window: the initial cwnd size is 10 segments; 10 packets of 1460 bytes ≈ 14.6 KB.

TODO: where does "148 kb" come from?

Fast Open

Keep-Alive

Use Keep-Alive connections (more useful for HTTP/1.x connections than HTTP/2):

<IfModule mod_headers.c>
	Header set Connection keep-alive
</IfModule>

IPv6

Edge side includes

A CDN or caching proxy aggregates different resource fragments with caching

Cookie-less dedicated domains

Reduce latency server side.

For static resources (which don't require cookies), use a dedicated domain.

With HTTP/2 and its header compression, this is no longer useful.

Multiple domains for static resources

Aka domain sharding, domain sharing

Allows multiple parallel requests.

Ex: static1.example.com and static2.example.com

Not advised if you use HTTP/2 or HTTPS

Serve progressive HTML document

Serve the document using chunked encoding.

Note: browsers often have a buffer of 4096 bytes

  1. Send the HTTP headers

  2. the HTML head (title, metas, scripts, styles, etc.)

  3. first part of the HTML body

  4. (*n) send chunks with <p class="progress">Progress: XX%...</p> to show progression

  5. send chunk <p class="progress">Complete</p>

  6. send chunk <style>.progress{display: none}</style>

  7. send the remaining part of the HTML document
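A minimal Node.js sketch of this technique (markup and timings are illustrative, not the exact chunks above):

const http = require("http");

http.createServer((req, res) => {
	// No Content-Length header: Node switches to Transfer-Encoding: chunked
	res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
	res.write("<!DOCTYPE html><html><head><meta charset=\"utf-8\"><title>Demo</title></head><body>");

	let progress = 0;
	const timer = setInterval(() => {
		progress += 25;
		res.write(`<p class="progress">Progress: ${progress}%...</p>`);
		if (progress >= 100) {
			clearInterval(timer);
			res.write('<p class="progress">Complete</p>');
			res.write("<style>.progress{display: none}</style>");
			res.end("<main>rest of the document</main></body></html>");
		}
	}, 500); // simulates a slow backend producing the rest of the page
}).listen(8080);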

Precompress

See Content encoding

For all text formats: CSS, SVG, JavaScript

Could also be used for generic data (other formats that don't already use compression), or if the compression mode in those formats can be disabled (ex: uncompressed PNG + Content encoding)
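A minimal Node.js sketch of a build-time precompression step (file paths are illustrative); the web server then only has to pick the .gz or .br file matching Accept-Encoding:

const fs = require("fs");
const zlib = require("zlib");

const files = ["dist/style.css", "dist/app.js", "dist/icon.svg"]; // illustrative paths

for (const file of files) {
	const source = fs.readFileSync(file);
	// Maximum gzip compression (zopfli would be smaller still, but slower)
	fs.writeFileSync(file + ".gz", zlib.gzipSync(source, { level: 9 }));
	// Brotli at maximum quality: fine at build time, too slow for on-the-fly compression
	fs.writeFileSync(file + ".br", zlib.brotliCompressSync(source, {
		params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },
	}));
}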

Serve precompressed files with nginx

With nginx ngx_http_gzip_static_module and ngx_brotli (nginx core has no brotli static module)

Serve precompressed files with Apache

Multiplexed / Pipelining

HTTP/1.1 has pipelining, but it is not well supported (by proxies, etc.). HTTP/2 is multiplexed.

TLS

Use dedicated servers

Load balancer, localized CDN, etc.

Cache

To control static resource versions, use a checksum instead of a build number, which means you only download a new copy when the content actually changes (see ETag).

Use a forever cache (Cache-Control: immutable) for static resources.
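A minimal sketch of checksum-based naming, assuming a build step (file names are illustrative): the hash, and therefore the URL, only changes when the content changes, so the file can be cached forever:

const crypto = require("crypto");
const fs = require("fs");
const path = require("path");

function hashedName(file) {
	const hash = crypto.createHash("sha256")
		.update(fs.readFileSync(file))
		.digest("hex")
		.slice(0, 8);
	const { dir, name, ext } = path.parse(file);
	return path.join(dir, `${name}.${hash}${ext}`); // e.g. dist/style.3f2a1b7c.css
}

// The HTML then references the hashed name, served with
// Cache-Control: public, max-age=31536000, immutable
fs.copyFileSync("dist/style.css", hashedName("dist/style.css"));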

Cached

max-age or Expires headers; Header append Cache-Control "public"; Header append Cache-Control "immutable" avoids revalidation checks (304s)

.htaccess:

# Requires mod_expires to be enabled.
<IfModule mod_expires.c>
	# Enable expirations.
	ExpiresActive On

	# Cache all files for 2 weeks after access (A).
	ExpiresDefault A1209600

	<FilesMatch \.php$>
	  # Do not allow PHP scripts to be cached unless they explicitly send cache
	  # headers themselves. Otherwise all scripts would have to overwrite the
	  # headers set by mod_expires if they want another caching behavior. This may
	  # fail if an error occurs early in the bootstrap process.
	  ExpiresActive Off
	</FilesMatch>
</IfModule>

PHP:

header('Expires: '.gmdate('D, d M Y H:i:s \G\M\T', time() + (60 * 60 * 24 * 14)));// 2 weeks

Varnish script to handle Vary: User-Agent by creating device classes to reduce cache variations.

# vcl_recv
if (req.http.host ~ "^example.com$") {
	if (req.http.User-Agent ~ "MSIE\s[1-10]\." || req.http.User-Agent ~ "Edge\/[1-10]\." || req.http.User-Agent ~ "Trident\/.*rv:[1-10]\." || req.http.User-Agent ~ "Firefox\/[1-41]\." || req.http.User-Agent ~ "Safari\/[1-8]\." || req.http.User-Agent ~ "Version\/[1-8]\.")
	{
		# Set header for "old browser"
		set req.http.X-UA-Device = "old";
	}
	elsif (req.http.User-Agent ~ "facebookexternalhit\/" || req.http.User-Agent ~ "Facebot" || req.http.User-Agent ~ "Pinterest\/")
	{
		# Set header for "bot"
		set req.http.X-UA-Device = "bot";
	}
	else
	{
		# Set header for "recent browser"
		set req.http.X-UA-Device = "new";
	}
}

# vcl_fetch
if (req.http.host ~ "^example.com$") {
	set beresp.http.X-UA-Device = req.http.X-UA-Device;

	# Vary by X-UA-Device, so varnish will keep distinct object copies by X-UA-Device value
	if (beresp.http.Vary)
	{
		set beresp.http.Vary = beresp.http.Vary + ",X-UA-Device";
	}
	else
	{
		set beresp.http.Vary = "X-UA-Device";
	}

	# Remove User-Agent from Vary (provide by App)
	if (beresp.http.Vary ~ "User-Agent") {
		set beresp.http.Vary = regsub(beresp.http.Vary, ",? *User-Agent *", "");
		set beresp.http.Vary = regsub(beresp.http.Vary, "^, *", "");
		if (beresp.http.Vary == "") {
			unset beresp.http.Vary;
		}
	}
}

Cacheable chunks

Split resources into cacheable subresources

Not cached

Prevent the back button from showing a cached page (e.g. after logout)

HTTP/1.1 200 OK
Cache-Control: no-cache, no-store, must-revalidate
Expires: 0
HTTP/1.0 200 OK
Pragma: no-cache
Cache-Control: max-age=0

Note: Cache-Control: no-cache is for HTTP/1.1 where Pragma: no-cache is for HTTP/1.0. See http - Difference between Pragma and Cache-control headers? - Stack Overflow

header('Expires: ' . gmdate('D, d M Y H:i:s', time()) . ' GMT');
//header('ETag: "' . sha1(time()) . '"');

Cache partitioning

Reduce processing

Relayout, repaint, reflow

See Relayout, repaint, reflow

Reduce CPU/GPU usage

Ex., code markup:

Other:

Control loading

Load based on the device or network capabilities:

78KB for the first data (total, can be compressed) to ensure an optimal experience on LTE, for under 200ms delivery (determined empirically; 75% of handsets will load it in 200ms or less): MobilePerf Insights: Why LTE Has Slowed by 50% in the US This Year - Twin Prime

Full raw data vs. data + computations (ex: BMP vs PNG)

Use cache

  • use font-display property for fonts

  • use fallback font

  • download only diff of images (like videos using relative compression + keyframes): render in canvas

  • progressive data (JPEG, Video). It's even better with HTTP2 multiplexing

  • adaptive bitrate with DASH https://gist.github.com/ddennedy/16b7d0c15843829b4dc4

"preload" metadata like size, duration, dominant color / thumbnail, LOD, deepth, etc.

The simple summary is

  • preload: when you use on the same page

  • prefetch: for future use (next page)

<img> vs background-image

Buffering

Aka dynamic buffering

Resource hint

See also dns-prefetch, preconnect, prerender

See Server Push

<link rel="preload" as="style" href="style.css">
<link rel="prefetch" href="style.css">

Preload: load this resource (priority from high to low based on the value of the as attribute).
Prefetch: load this resource after all other resources (very low priority); the browser will prefetch the resource when it is idle, for a future navigation.

Preload will be started immediately and prefetch will be started only after the main page finishes loading.

Preload should be rarely used, and only for small media files (< 5 MB). It's often used as a band-aid for an underlying issue. See The cost of preload

Don't use both preload and prefetch at the same time: <link rel="preload prefetch" as="style" href="style.css"> because they don't have the same purpose. This potentially creates 2 requests.

Link: </styles/style.css>; rel=preload; as=style
<link rel="preload" as="style" href="style.css">
<meta http-equiv="Link" content="</styles/style.css>; rel=preload; as=style">

Using JavaScript, but prefer the link / header version:

<script>
var res = document.createElement("link");
res.rel = "preload";
res.as = "style";
res.href = "styles/other.css";
document.head.appendChild(res);
</script>

Use <link rel="preload" href="styles.css" as="style">, works for fetch, audio, font, image, script, style, track and video. See Fetch Standard

Use the type attribute to specify the format of the resource. It's useful to let the browser choose the supported format, but it should be carefully defined (a browser supporting all available choices will preload them all). If you want to preload a font available in both WOFF and WOFF2, declare only the latter (font/woff2), because browsers that support preload also support this more recent format.

Note about mandatory as attribute and valid value:

Reactive prefetch: Google mobile search is getting faster - to be exact, 100-150 milliseconds fa...

// <a href="https://example.com/">Example</a>
// Will start to load the following resource even if the browser didn't start loading the main resource
// It's like Link preload header but before the browser start the connection to the host
myExampleLink.addEventListener("click", event => {
	for(const src of [
		"https://example.com/style.css",
		"https://example.com/script.js",
		"https://example.com/image.jpg",
	]){
		const hint = document.createElement("link");
		hint.rel = "prefetch";
		hint.href = src;
		document.head.appendChild(hint);
	}
});

Preload ASAP

As soon as possible, use UDP priming (a pre-request, like HEAD) to let the server prepare the response

Load lists by fragments: 1+1+X (better than X alone or 3+7+X), like streaming

Download priority

  • HTML (highest)

  • CSS (highest)

  • images (low or medium)

  • XHR/Fetch (high)

  • fonts (highest)

  • scripts: low (async, defer or type module), medium (<script src="script.js"></script>) or high (<script src="script.js"></script> before an image)

Note: this is how Chrome prioritize resources

On demand

Aka lazyload, LOD, deferred loading

Use <noscript> tags, or handle it with a Service Worker (replace all <img> with a placeholder). See <noscript> and search engines

Live streaming, or start playing a video before the file is completely generated:

<video>
	<source src="video.m3u8" type="application/x-mpegURL"><!-- HTTP Live Streaming, required by Safari in that case: if the media size is not known -->
	<source src="video.mpd" type="application/dash+xml"><!-- MPEG-DASH -->
	<source src="video.mp4" type="video/mp4"><!-- Use chunk encoding -->
	<!-- Or use JS to use MSE API for streaming protocols like DASH -->
</video>

Image lazyload

Use loading="lazy" attribute for images (and iframes)
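Native lazy loading can be feature-detected; a minimal sketch assuming a data-src based markup (not the noscript placeholder approach described below):

if ("loading" in HTMLImageElement.prototype) {
	// Native support: copy data-src to src and let the browser decide when to load
	document.querySelectorAll("img[data-src]").forEach(img => {
		img.src = img.dataset.src;
	});
} else {
	// No native support: fall back to an IntersectionObserver based lazyload,
	// e.g. the placeholder technique described below
}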

Use a placeholder element, or at least an SVG in a data URI for the src attribute, to prevent reflow

  1. placeholder shouldn't be visible if script is not available (inlined CSS hides the placeholder)

  2. placeholder should have role="img" and aria-label set with image alt (will be set by the JS)

  3. should work with complex image content (e.g. <figure>)

TODO: support all replaced elements: video, audio and iframe. Use a weakmap or additional property to store the replacement fragment (parse noscript HTML only one time)

<style>
.img,
.img-placeholder{
   display: block;
   /* padding-top of 56.25% is the equivalent of a 16/9 ratio of the width. See CSS#Vertical%20percentages */
   padding-top: 56.25%;
   background: grey;
   position: relative;
}
/* image fallback text (alt attribute) */
.img::before{
   content: attr(alt);
   display: flex;
   position: absolute;
   top: 0;
   left: 25%;
   height: 100%;
   width: 50%;
   align-items: center;
   justify-content: center;
}
</style>
<!--
Don't display placeholders if script is disabled
Can be declared in a linked (external) stylesheet but it's not recommended
-->
<style id="noscript-style">.img-placeholder{display: none;}</style>

<script>
document.getElementById("noscript-style").remove();
document.addEventListener("DOMContentLoaded", event => {
   // documents without browsing context don't load resources (images will not be loaded)
   const doc = document.implementation.createHTMLDocument("");
   // Note: longdesc couldn't be directly replaced by aria-describedby or aria-details
   const filterOutAttrs = "alt class crossorigin ismap longdesc referrerpolicy sizes src srcset usemap".split(" ");
   // Upgrade placeholder with all required attributes (role=img and aria-label) and remove noscript tags
   document.querySelectorAll(".img-placeholder").forEach(placeholder => {
   	// The noscript tag is the placeholder's next sibling (see the markup below)
   	const noscript = placeholder.nextElementSibling;
   	doc.body.innerHTML = noscript.textContent;
   	const img = doc.querySelector("img");
   	// Copy the img attributes (except the filtered-out ones) to the placeholder
   	Array.from(img.attributes).filter(attr => !filterOutAttrs.includes(attr.name)).forEach(attr => placeholder.setAttribute(attr.name, attr.value));
   	// We don't need XML namespace here (HTML5 don't support it). Else use placeholder.setAttributeNS(attr.namespace, attr.localName, attr.value)
   	placeholder.setAttribute("role", "img");
   	placeholder.setAttribute("aria-label", img.alt);
   	placeholder.dataset.srcdoc = noscript.textContent;// store raw HTML for later use
   	// Remove the noscript tag
   	noscript.remove()
   })
});
</script>

<span class="img-placeholder" style="background: grey;"></span>
<noscript><img class="img" src="cat.jpg" alt="A photography of a cat"></noscript>

<figure>
   <span class="img-placeholder" style="background: brown;"></span>
   <noscript><img class="img" srcset="dog_2x.jpg 2x" src="dog.jpg" alt="A photography of a dog"></noscript>
   <figcaption>Fig1. A dog</figcaption>
</figure>

<span class="media-placeholder" style="background: red;"></span>
<noscript><video class="media" src="fish.mp4">A video of a fish</video></noscript>

After a user action or intersection observation (scroll → visible in the viewport), replace the placeholder by the final HTML:

  1. Read document visibility document.visibilityState === "visible" || document.visibilityState === "prerender"

  2. Listen for document visibility changes document.addEventListener("visibilitychange", documentVisibilityChange); if hidden, store which placeholder(s) must be replaced then wait for the document to be visible

  3. Use the IntersectionObserver API to observe placeholders currently in the DOM (see the sketch after this list)

  4. Use MutationObserver to handle DOM modifications (to observe later attached placeholders or disconnect detached placeholders). See DOM mutation

  5. Use matchMedia API to match print media (to load all images)

    const mediaQuery = window.matchMedia("print");
    function mediaQueryChange(){
    	if(!mediaQuery.matches){
    		return;
    	}
    	mediaQuery.removeListener(mediaQueryChange);
    	/*replace placeholder*/
    }
    mediaQueryChange();
    mediaQuery.addListener(mediaQueryChange);
  6. Replace placeholders: placeholder.replaceWith(document.createRange().createContextualFragment(placeholder.dataset.srcdoc));

    const range = document.createRange();
    range.selectNode(placeholder.parentNode);
    placeholder.replaceWith(range.createContextualFragment(placeholder.dataset.srcdoc));// Note: createContextualFragment() is supported by IE11+
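A minimal IntersectionObserver sketch covering steps 3 and 6, assuming the .img-placeholder markup and data-srcdoc attribute set up earlier (visibility, MutationObserver and print handling are left out):

const observer = new IntersectionObserver((entries, obs) => {
	entries.forEach(entry => {
		if (!entry.isIntersecting) {
			return;
		}
		const placeholder = entry.target;
		obs.unobserve(placeholder);
		// Step 6: swap the placeholder for the real markup stored in data-srcdoc
		const range = document.createRange();
		range.selectNode(placeholder.parentNode);
		placeholder.replaceWith(range.createContextualFragment(placeholder.dataset.srcdoc));
	});
}, { rootMargin: "200px" }); // start loading a bit before the image enters the viewport

document.querySelectorAll(".img-placeholder").forEach(p => observer.observe(p));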

If the IntersectionObserver API is not supported:

  • use the polyfill with caution: it impacts performance (it uses getBoundingClientRect() and getComputedStyle()) and adds weight to load. Additionally, the existing polyfills are not spec compliant and often don't handle visibility, non-visible parent overflow or cropping, CSS changes, etc.

  • or fall back to a scroll listener, setInterval() with a not-too-small delay and getBoundingClientRect()

  • or load all lazyloaded images; this was the default behavior before lazyload was implemented

See also:

Progressive load

Or partial load

For images (works better with progressive images), in Edge Workers (Service Workers for CDN):

  1. receive the client request (for document, images, etc.)

  2. send the first 521B/1kB (image headers / metadata such as dimensions) so the browser can do the layout as soon as possible

  3. wait 20ms to let the client process CSS, JS and other critical resources

  4. send the first 15% of the resource (~15% contains the progressive scan bytes of the image)

  5. wait for other resources (for the same client/requested document/referrer) to let the browser render the first layers, before it receives the rest of the resources

  6. send the rest
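A rough sketch of the idea with a Cloudflare-Workers-style API (byte counts and delays are illustrative, and only the initial pause after the first kilobyte is implemented):

// Edge worker sketch: forward the upstream response, but stream it in stages
addEventListener("fetch", event => {
	event.respondWith(handle(event.request));
});

const wait = ms => new Promise(resolve => setTimeout(resolve, ms));

async function handle(request) {
	const upstream = await fetch(request);
	const { readable, writable } = new TransformStream();
	streamInStages(upstream.body, writable); // don't await: stream while responding
	return new Response(readable, { status: upstream.status, headers: upstream.headers });
}

async function streamInStages(body, writable) {
	const reader = body.getReader();
	const writer = writable.getWriter();
	let sent = 0;
	let pausedAfterHeader = false;
	while (true) {
		const { done, value } = await reader.read();
		if (done) {
			break;
		}
		await writer.write(value);
		sent += value.byteLength;
		if (!pausedAfterHeader && sent >= 1024) {
			pausedAfterHeader = true;
			await wait(20); // let the client process critical resources first
		}
	}
	await writer.close();
}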

Another solution:

<link rel="stylesheet" href="styles.css" media="print" onload="onload=null;media='all'">
<noscript><link rel="stylesheet" href="styles.css"></noscript>

See also:

<link rel="preload" href="styles.css" as="style" onload="onload=null;rel='stylesheet'">
<noscript><link rel="stylesheet" href="styles.css"></noscript>

But this still blocks the DOM parser in a few browsers (IE11, Firefox 36). See https://github.com/scottjehl/css-inapplicable-load#the-bad Can be used to load fonts (inlined in CSS). Or use font preload

See <noscript> and search engines

Critical path

Aka blocking dependencies, critical resources

To find blocking points / bottlenecks / critical rendering path

A critical request is one that contains an asset that is essential to the content within the users viewport

Ben Schwarz

Example: the hero image

Optimise for the critical rendering path, get everything at the top of the page in view as fast as possible. Then lazy load the rest.

Composite metric examples (based on what the user cares about):

Which metric is best?

Anatomy of a webpage

Aka wireframe of a webpage

Note: always render the document server side. All the document content must be delivered. JavaScript (client side) can be used to enhance the experience (loading, UI elements, etc.). See Progressive Enhancement

Have things (critical content) before 1000ms. See responsiveness

Anatomy/wireframe of a webpage:

<!--
HTTP header `Link` dns-prefetch, preconnect and preload for critical style, script and content
Critical styles and scripts should be in the [Initial TCP window](#initial-tcp-window)
Inlined scripts below, can be files if pushed with HTTP/2 server push before the HTML document
-->
<!DOCTYPE html>
<html lang="en">
	<head>
		<!--
		Page format
		Before any other resource and content
		-->
		<meta charset="utf-8"><!-- First, must be within the first 1024 bytes. Required or use a BOM or charset parameter in Content-Type header -->
		<meta http-equiv="..." content="...">
		<base ...><!-- if base is used, but not recommended -->

		<!--
		Meta viewport and title
		On top
		-->
		<meta name="viewport" content="width=device-width, initial-scale=1">
		<title>...</title><!-- Limited to 80 chars max -->

		<!--
		Prerequisite scripts
		Device detection (media queries, etc.) or page location redirection that doesn't require displaying anything
		-->
		<script>/*inlined script*/</script>

		<!--
		Critical script
		Script for critical content: display a loader, handle critical content interaction/layout while the non critical content is loading.
		Before critical styles to not block JS execution
		See [Critical rendering path](#critical-rendering-path)
		-->
		<script>/*inlined script*/</script>
		<script src="..."></script><!-- Only if pushed with HTTP/2.0 server push before the HTML document -->

		<!--
		Critical styles
		Style for critical content: above the fold (main nav, header, hero image, article) or show a loader, while the rest is loading.
		Shouldn't be used to display aside content: comments, commercial components/ads, popular content, related content, sharing widgets, etc.
		Do not use @import, it's not recommended as Chrome has an issue with it https://bugs.chromium.org/p/chromium/issues/detail?id=1224629
		See [Critical rendering path](#critical-rendering-path)
		-->
		<style>/*inlined style*/</style>
		<link rel="stylesheet" href="..."><!-- Only if pushed with HTTP/2 server push -->
		<style>.icon{width:1em;height:1em}</style><!-- Use to prevent inlined SVG icons be unscaled -->

		<!--
		Resources hint (dns-prefetch, preconnect and preload)
		Can be set as Link header
		-->
		<link rel="preconnect dns-prefetch" href="..."><!-- more than one relationship can be used, here dns-prefetch is used as a fallback. Note: this can create multiple network call -->
		<!-- Preload only if the resources must start loading before the critical content, like the hero image. See "For non critical content" section -->
		<link rel="preload" href="..." as="style"><!-- for images, fonts, etc. Attributes type="" and media="" can also be used -->
		<!-- Examples: -->
		<link rel="preload" href="font.woff" as="font" type="font/woff" media="min-width: 768px" crossorigin>
		<link rel="preload" href="font.woff2" as="font" type="font/woff2" media="min-width: 768px" crossorigin>

		<!--
		For non critical content
		Non blocking resource (styles and scripts)
		See also [Non-blocking stylesheet](#non-blocking-stylesheet) and [Blocking resources](#blocking-resources)
		-->
		<script src="" async></script>
		<script src="" defer></script>
		<script src="" type="module"></script>
		<link rel="stylesheet" href="/path/to/my.css" media="print" onload="media='all'"><!-- /path/to/my.css have a previous preload link tag -->

		<!--
		Metadata, manifest, document relationship, future navigation hint (prefetch, next, prerender), other SEO related tags, Open Graph, etc.
		Some Open Graph fetcher read only first 32k of the page: [Barry Pollard auf Twitter: "So unless your HEAD is more than 32k (in which case have a word with yourself will you!) you should be fine.… "](https://web.archive.org/web/20210211161623/https://twitter.com/tunetheweb/status/1359898874115145729)
		Other things that should be in the header
		See https://developer.mozilla.org/en-US/docs/Web/HTML/Link_types
		See https://developer.mozilla.org/en-US/docs/Web/HTTP/Link_prefetching_FAQ
		-->
		<meta name="..." content="...">
		<meta property="..." content="...">
		<link rel="alternate" type="application/rss+xml" title="..." href="..."><!-- RSS -->
		<link rel="alternate" type="application/json" title="..." href="...">
		<link rel="canonical" href="...">
		<script type="application/json">...</script><!-- store data for machines, will be fetched by JavaScript after defer or DOMContentLoaded events -->
		<script type="application/ld+json">...</script>
		<link rel="icon" type="image/svg+xml" sizes="any" href="/favicon.svg"><!-- Favicon, prefer SVG and fallback to PNG. See https://realfavicongenerator.net/faq -->
	</head>
	<body>
		<!--
		Critical content
		-->
		<img src="..." alt="">
		<svg><symbol>...</symbol></svg><!-- inlined SVG icons (symbols) required by critical content. Note: HTML5 spec doesn't allow `svg` element to be in `head` -->

		<!--
		Non critical content
		See [Image lazyload](#image-lazyload)
		-->
		<link rel="stylesheet" href="..."><!-- only if use HTTP/2.0 (allowed in body) https://html.spec.whatwg.org/multipage/links.html#body-ok https://jakearchibald.com/2016/link-in-body/#a-simpler-better-way -->
		<style id="noscript-style">.media-placeholder{display: none;}</style><!-- can be placed anywhere before in the body or the header. ("[can be used] in the body, where flow content is expected.") https://www.w3.org/TR/html52/document-metadata.html#the-style-element
		<script>/*progressive enhancement of media placeholder script*/</script><!-- can be in the footer -->
		<span class="media-placeholder" style="background: grey;"></span><noscript><img class="media" src="cat.jpg" alt="A photography of a cat"></noscript>
		<span class="media-placeholder" style="background: black;"></span><noscript><video class="media" src="dog.mp4">A video of a dog play with a ball</video></noscript>
		<p>...</p>

		<!--
		Non critical scripts and styles
		For tracking, ads, etc.
		See [Blocking resources](#blocking-resources)
		Could be in the section "For non critical content" inside noscript in the header
		-->
		<script src="" defer></script>
		<!-- No inlined scripts here, or only if small and supporting an async API, something like: `window.cmd=window.cmd||[];cmd.push(() => console.log("do something"))` -->
	</body>
</html>

See also Critical rendering path and Blocking resources

Use HTTP2:

<?php
// Is HTTP/2.x
if(strpos($_SERVER['SERVER_PROTOCOL'], 'HTTP/2') === 0){
	header('Link: </styles/critical.css>; rel=preload; as=style');
	echo '<link href="/styles/critical.css" rel="stylesheet">';
	// script 1, script 2, script 3...
}else{
	echo '<style>';
	include('styles/critical.css');
	echo '</style>';
	// script bundle
}
// then the method used to load async /styles/content.css
?>

Blocking resources

Browser requests the HTML document → begins parsing and constructing the DOM → discovers CSS/JS → waits for the CSS response → constructs the CSSOM → combines CSSOM and DOM into the render tree → font requests are dispatched after the render tree indicates which font variants are needed to render the specified text on the page.

— Slide 16 of To push, or not to push?! by Patrick Hamann

Blocking parser and blocking rendering

A friendly reminder that any <link rel="stylesheet"> in your <head> will block first paint until all of them are done downloading.

Stylesheets are blocking resources.

Synchronous scripts in body block rendering

Defer and async script

Prefer defer over async for regular scripts.

These scripts shouldn't contain document.write() (blocking)

  • Defer: use the script as a non-blocking script. Executed when the HTML document is done being parsed (aka DOM Interactive, performance.timing.domInteractive)
  • Async: use the script as a non-blocking script. Executed as soon as possible, doesn't respect the order in the document (out-of-order execution)
  • At the bottom: use the script as a blocking script. Executed after all previous blocking resources have been loaded and parsed (before the document load event). Prefer defer or async instead

Non-blocking stylesheet

Aka async stylesheet

<link rel="preload" href="/path/to/my.css" as="style"><!-- optional, set fetch at highest priority -->
<link rel="stylesheet" href="/path/to/my.css" media="print" onload="media='all'">

Deprecated:

<link rel="preload" href="styles.css" as="style" onload="rel='stylesheet';onload=0">
<noscript><link rel="stylesheet" href="styles.css"></noscript>

Reduce media memory usage

For images and videos

By drawing downscaled images when smaller images are required.

  • https://github.com/PixelsCommander/CanvasImage
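A minimal sketch of the idea (the .downscale class and target width are illustrative): draw the decoded image into a smaller canvas and use that instead, so the full-resolution bitmap doesn't need to stay around:

function downscale(img, maxWidth) {
	const scale = Math.min(1, maxWidth / img.naturalWidth);
	const canvas = document.createElement("canvas");
	canvas.width = Math.round(img.naturalWidth * scale);
	canvas.height = Math.round(img.naturalHeight * scale);
	canvas.getContext("2d").drawImage(img, 0, 0, canvas.width, canvas.height);
	return canvas;
}

// Replace each marked <img> with a canvas holding only the downscaled pixels
document.querySelectorAll("img.downscale").forEach(img => {
	const replace = () => img.replaceWith(downscale(img, 480));
	img.complete ? replace() : img.addEventListener("load", replace);
});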

Compressed textures

Use compressed textures and/or fewer bytes per channel (requires canvas).

See Decompression using GPU

See [Texture format](Texture format)

Major compressed texture formats support (from WebGL Stats 02/2016):

  1. WEBGL_compressed_texture_s3tc 84.2%

  2. WEBGL_compressed_texture_etc1 13.1%

  3. WEBGL_compressed_texture_pvrtc 7.4%

  4. WEBGL_compressed_texture_atc 2.4%

  5. WEBGL_compressed_texture_astc (Mali GPU, ex.: Galaxy S6)

Watch the max texture size gl.getParameter(gl.MAX_TEXTURE_SIZE). Prefer using 4096 × 4096

Error: WebGL: compressedTexImage2D: the maximum texture size for level 0 is 4096

WebGL DDS + texture atlas 4096 requires bufferData/texImage2D + checking for OUT_OF_MEMORY errors

RGBA4444 = 16 bit but not related to paletted; use context.texImage2D(context.TEXTURE_2D, 0, context.RGBA, context.RGBA, context.UNSIGNED_SHORT_4_4_4_4, data); where data must be a Uint16Array (4 bits per channel, something like a Uint4ClampedArray) with each channel clamped to 0xF (not directly compatible with a PNG16 image)

Note: In WebGL 2, PALETTE8_RGBA8_OES can be used with compressedTexImage2D() to upload paletted colors (like from PNG8): https://www.opengl.org/registry/specs/OES/OES_compressed_paletted_texture.txt
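A minimal sketch of uploading a compressed texture with fallback detection (the DXT5 data here is a dummy 4×4 block; real data would come from a DDS file or similar):

const gl = document.createElement("canvas").getContext("webgl");
const s3tc = gl && gl.getExtension("WEBGL_compressed_texture_s3tc");

// Illustrative data: one 4×4 DXT5 block is exactly 16 bytes
const width = 4, height = 4;
const dxt5Data = new Uint8Array(16);

if (s3tc && width <= gl.getParameter(gl.MAX_TEXTURE_SIZE)) {
	const texture = gl.createTexture();
	gl.bindTexture(gl.TEXTURE_2D, texture);
	gl.compressedTexImage2D(gl.TEXTURE_2D, 0, s3tc.COMPRESSED_RGBA_S3TC_DXT5_EXT, width, height, 0, dxt5Data);
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
} else {
	// Fall back to another compressed format (ETC1, PVRTC, ASTC...) or an uncompressed texture
}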

Performance reporting

Aka client timing

See JavaScript Performance API (ex: window.performance.getEntriesByType("resource")) and navigator.sendBeacon()
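A minimal client-timing sketch (the /perf endpoint is illustrative), sending resource timings when the page becomes hidden:

document.addEventListener("visibilitychange", () => {
	if (document.visibilityState !== "hidden") {
		return;
	}
	const report = {
		navigation: performance.getEntriesByType("navigation").map(entry => entry.toJSON()),
		resources: performance.getEntriesByType("resource").map(entry => ({
			name: entry.name,
			duration: entry.duration,
			transferSize: entry.transferSize,
		})),
	};
	// sendBeacon queues the request even while the page is being unloaded
	navigator.sendBeacon("/perf", JSON.stringify(report));
});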

AMP

CSS is limited to 50KB and only inline, custom JS is not allowed (other than via an iframe method), and all images are guaranteed to be lazy loading. [...] Technically correct AMP pages will perform very similar to any non-horrible web page. [...] Part of that answer you can probably guess: the cache is simply very fast. It’s hard to compete with a Google-class CDN. I imagine thousands of servers strategically placed worldwide on the best connections available. [...] on Google Search on mobile [...] Pretty much anything that page needs to render is preloaded, whether you actually open it not. [...]

The AMP page, which we all believe to be super fast and optimized for slow mobiles because it is AMP, isn’t that fast. Its true speed comes from preloading.

AMP: the missing controversy – Ferdy Christant

Cache:

Others links:

Third parties webperf

See also Third parties
