Added logging, changed some directory structure

This commit is contained in:
2018-01-13 21:33:40 -05:00
parent f079a5f067
commit 8e72ffb917
73656 changed files with 35284 additions and 53718 deletions

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2015 Google, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

# [![NPM version][npm-image]][npm-url] [![Build Status][travis-image]][travis-url] [![Dependency Status][daviddm-image]][daviddm-url]
# Service Worker Precache
Service Worker Precache is a module for generating a service worker that
precaches resources. It integrates with your build process. Once configured, it
detects all your static resources (HTML, JavaScript, CSS, images, etc.) and
generates a hash of each file's contents. Information about each file's URL and
versioned hash are stored in the generated service worker file, along with logic
to serve those files cache-first, and automatically keep those files up to date
when changes are detected in subsequent builds.
Serving your local static resources cache-first means that you can get all the
crucial scaffolding for your web app—your App Shell—on the screen without having
to wait for any network responses.
The module can be used in JavaScript-based build scripts,
like those written with [`gulp`](http://gulpjs.com/), and it also provides a
[command-line interface](#command-line-interface). You can use the module
directly, or if you'd prefer, use one of the [wrappers](#wrappers-and-starter-kits)
around `sw-precache` for specific build environments, like
[`webpack`](https://webpack.github.io/).
It can be [used alongside](sw-precache-and-sw-toolbox.md) the [`sw-toolbox`](https://github.com/GoogleChrome/sw-toolbox)
library, which works well when following the App Shell + dynamic content model.
The full documentation is in this README, and the
[getting started guide](GettingStarted.md) provides a quicker jumping off point.
To learn more about the internals of the generated service worker, you can read
[this deep-dive](https://medium.com/@Huxpro/how-does-sw-precache-works-2d99c3d3c725)
by [Huang Xuan](https://twitter.com/Huxpro).
# Table of Contents
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
- [Install](#install)
- [Usage](#usage)
- [Overview](#overview)
- [Example](#example)
- [Considerations](#considerations)
- [Command-line interface](#command-line-interface)
- [Runtime Caching](#runtime-caching)
- [API](#api)
- [Methods](#methods)
- [generate(options, callback)](#generateoptions-callback)
- [write(filePath, options, callback)](#writefilepath-options-callback)
- [Options Parameter](#options-parameter)
- [cacheId [String]](#cacheid-string)
- [clientsClaim [Boolean]](#clientsclaim-boolean)
- [directoryIndex [String]](#directoryindex-string)
- [dontCacheBustUrlsMatching [Regex]](#dontcachebusturlsmatching-regex)
- [dynamicUrlToDependencies [Object&#x27e8;String,Buffer,Array&#x27e8;String&#x27e9;&#x27e9;]](#dynamicurltodependencies-objectstringbufferarraystring)
- [handleFetch [boolean]](#handlefetch-boolean)
- [ignoreUrlParametersMatching [Array&#x27e8;Regex&#x27e9;]](#ignoreurlparametersmatching-arrayregex)
- [importScripts [Array&#x27e8;String&#x27e9;]](#importscripts-arraystring)
- [logger [function]](#logger-function)
- [maximumFileSizeToCacheInBytes [Number]](#maximumfilesizetocacheinbytes-number)
- [navigateFallback [String]](#navigatefallback-string)
- [navigateFallbackWhitelist [Array&#x27e8;RegExp&#x27e9;]](#navigatefallbackwhitelist-arrayregexp)
- [replacePrefix [String]](#replaceprefix-string)
- [runtimeCaching [Array&#x27e8;Object&#x27e9;]](#runtimecaching-arrayobject)
- [skipWaiting [Boolean]](#skipwaiting-boolean)
- [staticFileGlobs [Array&#x27e8;String&#x27e9;]](#staticfileglobs-arraystring)
- [stripPrefix [String]](#stripprefix-string)
- [stripPrefixMulti [Object]](#stripprefixmulti-object)
- [templateFilePath [String]](#templatefilepath-string)
- [verbose [boolean]](#verbose-boolean)
- [Wrappers and Starter Kits](#wrappers-and-starter-kits)
- [CLIs](#clis)
- [Starter Kits](#starter-kits)
- [Recipes for writing a custom wrapper](#recipes-for-writing-a-custom-wrapper)
- [Acknowledgements](#acknowledgements)
- [Future of Service Worker tooling](#future-of-service-worker-tooling)
- [License](#license)
<!-- END doctoc generated TOC please keep comment here to allow auto update -->
## Install
Local build integration:
```sh
$ npm install --save-dev sw-precache
```
Global command-line interface:
```sh
$ npm install --global sw-precache
```
## Usage
### Overview
1. **Make sure your site is served using HTTPS!**
Service worker functionality is only available on pages that are accessed via HTTPS.
(`http://localhost` will also work, to facilitate testing.) The rationale for this restriction is
outlined in the
["Prefer Secure Origins For Powerful New Features" document](http://www.chromium.org/Home/chromium-security/prefer-secure-origins-for-powerful-new-features).
2. **Incorporate `sw-precache` into your `node`-based build script.** It should
work well with either `gulp` or `Grunt`, or other build scripts that run on
`node`. In fact, we've provided examples of both in the `demo/` directory. Each
build script in `demo` has a function called `writeServiceWorkerFile()` that
shows how to use the API. Both scripts generate fully-functional JavaScript code
that takes care of precaching and fetching all the resources your site needs to
function offline. There is also a [command-line interface](#command-line-interface)
available, for those using alternate build setups.
3. **Register the service worker JavaScript.** The JavaScript that's generated
needs to be registered as the controlling service worker for your pages. This
technically only needs to be done from within a top-level "entry" page for your
site, since the registration includes a [`scope`](https://slightlyoff.github.io/ServiceWorker/spec/service_worker/index.html#service-worker-registration-scope)
which will apply to all pages underneath your top-level page. [`service-worker-registration.js`](/demo/app/js/service-worker-registration.js) is a sample
script that illustrates the best practices for registering the generated service
worker and handling the various [lifecycle](https://slightlyoff.github.io/ServiceWorker/spec/service_worker/index.html#service-worker-state.1) events.
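The registration step above can be sketched in a few lines. This is a minimal version (the demo's `service-worker-registration.js` additionally handles lifecycle events), and the `/service-worker.js` path is an assumption about where your build writes the generated file:

```js
// Minimal registration sketch. The '/service-worker.js' path is an
// assumption -- use whatever path your build step writes the worker to.
function registerServiceWorker() {
  // No-op in browsers (or environments) without service worker support.
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return null;
  }
  // Registering from a top-level page gives the worker a scope covering
  // the whole site.
  return navigator.serviceWorker.register('/service-worker.js');
}
```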
### Example
The project's [sample `gulpfile.js`](/demo/gulpfile.js) illustrates the full use of sw-precache
in context. (Note that the sample gulpfile.js is the one in the `demo` folder,
not the one in the root of the project.) You can run the sample by cloning this
repo, using [`npm install`](https://docs.npmjs.com/) to pull in the
dependencies, changing to the `demo/` directory, running `` `npm bin`/gulp serve-dist ``, and
then visiting http://localhost:3000.
There's also a [sample `Gruntfile.js`](/demo/Gruntfile.js) that shows service worker generation in
Grunt, though it doesn't run a server on localhost.
Here's a simpler gulp example for a basic use case. It assumes your site's resources are located under
`app` and that you'd like to cache *all* your JavaScript, HTML, CSS, and image files.
```js
gulp.task('generate-service-worker', function(callback) {
var swPrecache = require('sw-precache');
var rootDir = 'app';
swPrecache.write(`${rootDir}/service-worker.js`, {
staticFileGlobs: [rootDir + '/**/*.{js,html,css,png,jpg,gif,svg,eot,ttf,woff}'],
stripPrefix: rootDir
}, callback);
});
```
This task will create `app/service-worker.js`, which your client pages need to
[register](https://slightlyoff.github.io/ServiceWorker/spec/service_worker/#navigator-service-worker-register) before it can take control of your site's
pages. [`service-worker-registration.js`](/demo/app/js/service-worker-registration.js) is a
ready-to-use script to handle registration.
### Considerations
- Service worker caching should be considered a progressive enhancement. If you follow the model of
conditionally registering a service worker only if it's supported (determined by
`if ('serviceWorker' in navigator)`), you'll get offline support on browsers with service workers;
on browsers without service worker support, the offline-specific code will simply never be called.
There's no overhead or breakage for older browsers if you add `sw-precache` to your build.
- **All** resources that are precached will be fetched by a service worker running in a separate
thread as soon as the service worker is installed. You should be judicious in what you list in the
`dynamicUrlToDependencies` and `staticFileGlobs` options, since listing files that are non-essential
(large images that are not shown on every page, for instance) will result in browsers downloading
more data than is strictly necessary.
- Precaching doesn't make sense for all types of resources (see the previous
point). Other caching strategies, like those outlined in the [Offline Cookbook](https://developers.google.com/web/fundamentals/instant-and-offline/offline-cookbook/), can be used in
conjunction with `sw-precache` to provide the best experience for your users. If
you do implement additional caching logic, put the code in a separate JavaScript
file and include it using the `importScripts()` method.
- `sw-precache` uses a [cache-first](http://jakearchibald.com/2014/offline-cookbook/#cache-falling-back-to-network) strategy, which results in a copy of
any cached content being returned without consulting the network. A useful
pattern to adopt with this strategy is to display a toast/alert to your users
when there's new content available, and give them an opportunity to reload the
page to pick up that new content (which the service worker will have added to
the cache, and will be available at the next page load). The sample [`service-worker-registration.js`](/demo/app/js/service-worker-registration.js) file [illustrates](https://github.com/GoogleChrome/sw-precache/blob/7688ee8ccdaddd9171af352384d04d16d712f9d3/demo/app/js/service-worker-registration.js#L51)
the service worker lifecycle event you can listen for to trigger this message.
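The "new content available" pattern from the last point can be sketched like this. The names `watchForUpdates` and `onNewContent` are illustrative, not part of any API; `registration` is the value resolved by `navigator.serviceWorker.register()`, and `onNewContent` stands in for whatever toast/alert UI your app uses:

```js
// Sketch of detecting that updated content has been added to the cache.
function watchForUpdates(registration, onNewContent) {
  registration.addEventListener('updatefound', function() {
    var newWorker = registration.installing;
    newWorker.addEventListener('statechange', function() {
      // 'installed' while an existing worker is controlling the page means
      // fresh content is in the cache and will be used on the next load.
      if (newWorker.state === 'installed' &&
          navigator.serviceWorker.controller) {
        onNewContent();
      }
    });
  });
}
```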
### Command-line interface
For those who would prefer not to use `sw-precache` as part of a `gulp` or
`Grunt` build, there's a [command-line interface](cli.js) which supports the
[options listed](#options-parameter) in the API, provided via flags or an
external JavaScript configuration file.
**Warning:** When using `sw-precache` "by hand", outside of an automated build process, it's your
responsibility to re-run the command each time there's a change to any local resources! If `sw-precache`
is not run again, the previously cached local resources will be reused indefinitely.
Sensible defaults are assumed for options that are not provided. For example, if you are inside
the top-level directory that contains your site's contents, and you'd like to generate a
`service-worker.js` file that will automatically precache all of the local files, you can simply run
```sh
$ sw-precache
```
Alternatively, if you'd like to only precache `.html` files that live within `dist/`, which is a
subdirectory of the current directory, you could run
```sh
$ sw-precache --root=dist --static-file-globs='dist/**/*.html'
```
**Note:** Be sure to use quotes around parameter values that have special meanings
to your shell (such as the `*` characters in the sample command line above,
for example).
Finally, there's support for passing complex configurations using `--config <file>`.
Any of the options from the file can be overridden via a command-line flag.
We strongly recommend passing it an external JavaScript file defining config via
[`module.exports`](https://nodejs.org/api/modules.html#modules_module_exports).
For example, assume there's a `path/to/sw-precache-config.js` file that contains:
```js
module.exports = {
staticFileGlobs: [
'app/css/**.css',
'app/**.html',
'app/images/**.*',
'app/js/**.js'
],
stripPrefix: 'app/',
runtimeCaching: [{
urlPattern: /this\.is\.a\.regex/,
handler: 'networkFirst'
}]
};
```
That file could be passed to the command-line interface, while also setting the
`verbose` option, via
```sh
$ sw-precache --config=path/to/sw-precache-config.js --verbose
```
This provides the most flexibility, such as providing a regular expression for
the `runtimeCaching.urlPattern` option.
We also support passing in a JSON file for `--config`, though this provides
less flexibility:
```json
{
"staticFileGlobs": [
"app/css/**.css",
"app/**.html",
"app/images/**.*",
"app/js/**.js"
],
"stripPrefix": "app/",
"runtimeCaching": [{
"urlPattern": "/express/style/path/(.*)",
"handler": "networkFirst"
}]
}
```
## Runtime Caching
It's often desirable, even necessary, to use precaching and runtime caching together. You may have seen our [`sw-toolbox`](https://github.com/GoogleChrome/sw-toolbox) tool, which handles runtime caching, and wondered how to use them together. Fortunately, `sw-precache` handles this for you.
The `sw-precache` module has the ability to include the `sw-toolbox` code and configuration alongside its own configuration. Using the `runtimeCaching` configuration option in `sw-precache` ([see below](#runtimecaching-arrayobject)) is a shortcut that accomplishes what you could do manually by importing `sw-toolbox` in your service worker and writing your own routing rules.
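As a sketch, a combined configuration might look like this (the glob, URL pattern, and choice of handler are illustrative assumptions):

```js
// Precache static files; handle API requests at runtime, network-first.
var options = {
  staticFileGlobs: ['app/**/*.{js,html,css}'],
  runtimeCaching: [{
    urlPattern: /\/api\//,   // illustrative pattern
    handler: 'networkFirst'  // one of sw-toolbox's built-in handlers
  }]
};
```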
## API
### Methods
The `sw-precache` module exposes two methods: `generate` and `write`.
#### generate(options, callback)
`generate` takes in [options](#options), generates a service worker
from them and passes the result to a callback function, which must
have the following interface:
`callback(error, serviceWorkerString)`
In the 1.x releases of `sw-precache`, this was the default and only method
exposed by the module.
Since 2.2.0, `generate()` also returns a
[`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).
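For example, the `Promise` form can be used like this (a sketch that assumes `sw-precache` is installed as a local dependency; the glob is illustrative):

```js
var swPrecache = require('sw-precache');

// Promise form, available since 2.2.0; resolves with the generated
// service worker code as a string.
swPrecache.generate({staticFileGlobs: ['app/**/*.html']})
  .then(function(serviceWorkerString) {
    // Hand the string off to the rest of your build pipeline.
    console.log('Generated ' + serviceWorkerString.length + ' bytes');
  })
  .catch(console.error);
```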
#### write(filePath, options, callback)
`write` takes in [options](#options), generates a service worker from them,
and writes the service worker to a specified file. This method always
invokes `callback(error)`. If no error occurred, the `error` parameter will
be `null`.
Since 2.2.0, `write()` also returns a [`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).
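For example, the `Promise` form of `write()` can be used like this (a sketch assuming `sw-precache` is installed; the path and globs are illustrative):

```js
var swPrecache = require('sw-precache');

swPrecache.write('app/service-worker.js', {
  staticFileGlobs: ['app/**/*.{js,html,css}'],
  stripPrefix: 'app'
}).then(function() {
  console.log('Service worker written.');
}).catch(function(error) {
  console.error('Generation failed:', error);
});
```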
### Options Parameter
Both the `generate()` and `write()` methods take the same options.
#### cacheId [String]
A string used to distinguish the caches created by different web applications that are served off
of the same origin and path. While serving completely different sites from the same URL is not
likely to be an issue in a production environment, it avoids cache-conflicts when testing various
projects all served off of `http://localhost`. You may want to set it to, e.g., the `name`
property from your `package.json`.
_Default_: `''`
#### clientsClaim [Boolean]
Controls whether or not the generated service worker will call
[`clients.claim()`](https://developer.mozilla.org/en-US/docs/Web/API/Clients/claim)
inside the `activate` handler.
Calling `clients.claim()` allows a newly registered service worker to take
control of a page immediately, instead of having to wait until the next page
navigation.
_Default_: `true`
#### directoryIndex [String]
Sets a default filename to return for URLs formatted like directory paths (in
other words, those ending in `'/'`). `sw-precache` will take that translation
into account and serve the contents of a relative `directoryIndex` file when
there's no other match for a URL ending in `'/'`. To turn off this behavior,
set `directoryIndex` to `false` or `null`. To override this behavior for one
or more URLs, use the `dynamicUrlToDependencies` option to explicitly set up
mappings between a directory URL and a corresponding file.
_Default_: `'index.html'`
#### dontCacheBustUrlsMatching [Regex]
It's very important that the requests `sw-precache` makes to populate your cache
result in the most up-to-date version of a resource at a given URL. Requests
that are fulfilled with out-of-date responses (like those found in your
browser's HTTP cache) can end up being read from the service worker's cache
indefinitely. Jake Archibald's [blog post](https://jakearchibald.com/2016/caching-best-practices/#a-service-worker-can-extend-the-life-of-these-bugs)
provides more context about this problem.
In the interest of avoiding that scenario, `sw-precache` will, by default,
append a cache-busting parameter to the end of each URL it requests when
populating or updating its cache. Developers who are explicitly doing "the right
thing" when it comes to setting HTTP caching headers on their responses might
want to opt out of this cache-busting. For example, if all of your static
resources already include versioning information in their URLs (via a tool like
[`gulp-rev`](https://github.com/sindresorhus/gulp-rev)), and are served with
long-lived HTTP caching headers, then the extra cache-busting URL parameter
is not needed, and can be safely excluded.
`dontCacheBustUrlsMatching` gives you a way of opting in to skipping the cache
busting behavior for a subset of your URLs (or all of them, if a catch-all value
like `/./` is used).
If set, then the [pathname](https://developer.mozilla.org/en-US/docs/Web/API/HTMLHyperlinkElementUtils/pathname)
of each URL that's prefetched will be matched against this value.
If there's a match, then the URL will be prefetched as-is, without an additional
cache-busting URL parameter appended.
Note: Prior to `sw-precache` v5.0.0, `dontCacheBustUrlsMatching` matched against
the entire request URL. As of v5.0.0, it only matches against the URL's
[pathname](https://developer.mozilla.org/en-US/docs/Web/API/HTMLHyperlinkElementUtils/pathname).
_Default_: not set
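For example, if your build revisions filenames by inserting a content hash (as `gulp-rev` does), a pattern along these lines opts those files out of cache busting; the exact regular expression is an assumption about your naming scheme:

```js
// Assumes revisioned names like /js/main-0123456789.js: a hyphen plus
// eight or more hex characters before the file extension.
var dontCacheBustUrlsMatching = /-[0-9a-f]{8,}\./;

dontCacheBustUrlsMatching.test('/js/main-0123456789.js'); // true
dontCacheBustUrlsMatching.test('/js/main.js');            // false
```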
#### dynamicUrlToDependencies [Object&#x27e8;String,Buffer,Array&#x27e8;String&#x27e9;&#x27e9;]
Maps a dynamic URL string to an array of all the files that URL's contents
depend on. E.g., if the contents of `/pages/home` are generated server-side via
the templates `layout.jade` and `home.jade`, then specify `'/pages/home':
['layout.jade', 'home.jade']`. The MD5 hash used to determine whether
`/pages/home` has changed will depend on the hashes of both `layout.jade` and
`home.jade`.
An alternative value for the mapping is supported as well. You can specify
a string or a Buffer instance rather than an array of file names. If you use this option, then the
hash of the string/Buffer will be used to determine whether the URL used as a key has changed.
For example, `'/pages/dynamic': dynamicStringValue` could be used if the contents of
`/pages/dynamic` changes whenever the string stored in `dynamicStringValue` changes.
_Default_: `{}`
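Both mapping styles can be mixed in a single configuration; the URLs and file names below are illustrative:

```js
var dynamicUrlToDependencies = {
  // Array form: the URL's hash depends on the listed files' contents.
  '/pages/home': ['views/layout.jade', 'views/home.jade'],
  // String form: the hash of the string itself determines freshness.
  '/pages/dynamic': 'server-rendered content, v1'
};
```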
#### handleFetch [boolean]
Determines whether the `fetch` event handler is included in the generated
service worker code. It is useful to set this to `false` in development builds,
to ensure that features like live reload still work. Otherwise, the content
would always be served from the service worker cache.
_Default_: `true`
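A common pattern is to key this option off the build environment, so development builds precache but never intercept fetches; the `NODE_ENV` convention is an assumption about your setup:

```js
// Only intercept fetches in production builds, so live reload keeps
// working during development.
var options = {
  handleFetch: process.env.NODE_ENV === 'production'
};
```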
#### ignoreUrlParametersMatching [Array&#x27e8;Regex&#x27e9;]
`sw-precache` finds matching cache entries by doing a comparison with the full request URL. It's
common for sites to support URL query parameters that don't affect the site's content and should
be effectively ignored for the purposes of cache matching. One example is the
[`utm_`-prefixed](https://support.google.com/analytics/answer/1033867) parameters used for tracking
campaign performance. By default, `sw-precache` will ignore `key=value` when `key` matches _any_ of
the regular expressions provided in this option.
To ignore all parameters, use `[/./]`. To take all parameters into account when matching, use `[]`.
_Default_: `[/^utm_/]`
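Conceptually, the generated service worker drops matching `key=value` pairs before looking up a URL in the cache. This standalone sketch mirrors that behavior (it is not sw-precache's internal code):

```js
// Drop query parameters whose keys match any regex in ignoreList.
function stripIgnoredParameters(url, ignoreList) {
  var parts = url.split('?');
  if (parts.length < 2) {
    return url;
  }
  var kept = parts[1].split('&').filter(function(pair) {
    var key = pair.split('=')[0];
    return !ignoreList.some(function(regex) { return regex.test(key); });
  });
  return kept.length ? parts[0] + '?' + kept.join('&') : parts[0];
}

stripIgnoredParameters('/index.html?utm_source=news&page=2', [/^utm_/]);
// -> '/index.html?page=2'
```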
#### importScripts [Array&#x27e8;String&#x27e9;]
Writes calls to [`importScripts()`](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/basic_usage#Importing_scripts_and_libraries)
to the resulting service worker to import the specified scripts.
_Default_: `[]`
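For example, to pull in a separate file containing your custom caching logic (the filename is an assumption):

```js
var options = {
  // Each entry becomes an importScripts() call in the generated worker.
  importScripts: ['js/runtime-caching.js']
};
```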
#### logger [function]
Specifies a callback function for logging which resources are being precached and
a precache size. Use `function() {}` if you'd prefer that nothing is logged.
Within a `gulp` script, it's recommended that you use [`gulp-util`](https://github.com/gulpjs/gulp-util) and pass in `gutil.log`.
_Default_: `console.log`
#### maximumFileSizeToCacheInBytes [Number]
Sets the maximum allowed size for a file in the precache list.
_Default_: `2097152` (2 megabytes)
#### navigateFallback [String]
Sets an HTML document to use as a fallback for URLs not found in the `sw-precache` cache. This
fallback URL needs to be cached via `staticFileGlobs` or `dynamicUrlToDependencies`; otherwise, it
won't work.
```js
// via staticFileGlobs
staticFileGlobs: ['/shell.html']
navigateFallback: '/shell.html'
// via dynamicUrlToDependencies
dynamicUrlToDependencies: {
'/shell': ['/shell.hbs']
},
navigateFallback: '/shell'
```
This comes in handy when used with a web application that performs client-side URL routing
using the [History API](https://developer.mozilla.org/en-US/docs/Web/API/History). It allows any
arbitrary URL that the client generates to map to a fallback cached HTML entry. This fallback entry
ideally should serve as an "application shell" that is able to load the appropriate resources
client-side, based on the request URL.
**Note:** This is **not** intended to be used to route failed navigations to a
generic "offline fallback" page. The `navigateFallback` page is used whether the
browser is online or offline. If you want to implement an "offline fallback",
then using an approach similar to [this example](https://googlechrome.github.io/samples/service-worker/custom-offline-page/)
is more appropriate.
_Default_: `''`
#### navigateFallbackWhitelist [Array&#x27e8;RegExp&#x27e9;]
Works to limit the effect of `navigateFallback`, so that the fallback only
applies to requests for URLs with paths that match at least one
[`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp).
This option is useful if you want to fall back to the cached App Shell for
certain specific subsections of your site, but not have that behavior apply
to all of your site's URLs.
For example, if you would like to have `navigateFallback` only apply to
navigation requests to URLs whose path begins with `/guide/`
(e.g. `https://example.com/guide/1234`), the following configuration could be
used:
```js
navigateFallback: '/shell',
navigateFallbackWhitelist: [/^\/guide\//]
```
If set to `[]` (the default), the whitelist will be effectively bypassed, and
`navigateFallback` will apply to all navigation requests, regardless of URL.
_Default_: `[]`
#### replacePrefix [String]
Replaces a specified string at the beginning of path URLs at runtime. Use this
option when you are serving static files from a different directory at runtime
than you are at build time. For example, if your local files are under
`dist/app/` but your static asset root is at `/public/`, you'd strip `dist/app/`
and replace it with `/public/`.
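Conceptually, each matching local file path has the prefix stripped and the replacement prepended before it becomes a cache URL. A minimal sketch of that transformation (illustrative only; the real logic lives in the library's generate step):

```javascript
var stripPrefix = 'dist/app/';
var replacePrefix = '/public/';

// Turn a build-time file path into the URL the service worker caches.
function toRuntimeUrl(localPath) {
  return localPath.indexOf(stripPrefix) === 0 ?
      replacePrefix + localPath.slice(stripPrefix.length) :
      localPath;
}

console.log(toRuntimeUrl('dist/app/css/main.css')); // -> /public/css/main.css
```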
_Default_: `''`
#### runtimeCaching [Array&#x27e8;Object&#x27e9;]
Configures runtime caching for dynamic content. If you use this option, the `sw-toolbox`
library configured with the caching strategies you specify will automatically be included in
your generated service worker file.
Each `Object` in the `Array` needs a `urlPattern`, which is either a
[`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp)
or a string, following the conventions of the `sw-toolbox` library's
[routing configuration](https://googlechrome.github.io/sw-toolbox/api.html#main). Also required is
a `handler`, which should be either a string corresponding to one of the
[built-in handlers](https://googlechrome.github.io/sw-toolbox/api.html#handlers) under the `toolbox.` namespace, or a function corresponding to your custom
[request handler](https://googlechrome.github.io/sw-toolbox/api.html#handlers).
Optionally, `method` can be added to specify one of the [supported HTTP methods](https://googlechrome.github.io/sw-toolbox/api.html#expressive-approach) (_default: `'get'`_). There is also
support for `options`, which corresponds to the same options supported by a
[`sw-toolbox` handler](https://googlechrome.github.io/sw-toolbox/api.html#handlers).
For example, the following defines runtime caching behavior for two different URL patterns. It uses a
different handler for each, and specifies a dedicated cache with maximum size for requests
that match `/articles/`:
```js
runtimeCaching: [{
urlPattern: /^https:\/\/example\.com\/api/,
handler: 'networkFirst'
}, {
urlPattern: /\/articles\//,
handler: 'fastest',
options: {
cache: {
maxEntries: 10,
name: 'articles-cache'
}
}
}]
```
The [`sw-precache` + `sw-toolbox` explainer](sw-precache-and-sw-toolbox.md) has
more information about how and why you'd use both libraries together.
_Default_: `[]`
#### skipWaiting [Boolean]
Controls whether or not the generated service worker will call
[`skipWaiting()`](https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerGlobalScope/skipWaiting)
inside the `install` handler.
By default, when there's an update to a previously installed service worker,
the new service worker delays activation and stays in a `waiting` state until
all pages controlled by the old service worker are unloaded. Calling
`skipWaiting()` allows a newly registered service worker to bypass the
`waiting` state.
When `skipWaiting` is `true`, the new service worker's `activate` handler will
be called immediately, and any out-of-date cache entries from the previous
service worker will be deleted. Keep this in mind if you rely on older
cached resources being available throughout the page's lifetime, because, for
example, you [defer the loading of some resources](https://github.com/GoogleChrome/sw-precache/issues/180)
until they're needed at runtime.
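For example, to opt out and retain the default waiting behavior, so that older cached resources remain available until all controlled pages are closed:

```js
skipWaiting: false
```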
_Default_: `true`
#### staticFileGlobs [Array&#x27e8;String&#x27e9;]
An array of one or more string patterns that will be passed in to
[`glob`](https://github.com/isaacs/node-glob).
All files matching these globs will be automatically precached by the generated service worker.
You'll almost always want to specify something for this.
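For example (patterns are illustrative; adjust them to your project's layout):

```js
staticFileGlobs: [
  'app/index.html',
  'app/js/**/*.js',
  'app/css/**/*.css',
  'app/images/**/*.{png,svg,jpg}'
]
```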
_Default_: `[]`
#### stripPrefix [String]
Removes a specified string from the beginning of path URLs at runtime. Use this
option when there's a discrepancy between a relative path at build time and
the same path at run time. For example, if all your local files are under
`dist/app/` and your web root is also at `dist/app/`, you'd strip that prefix
from the start of each local file's path in order to get the correct relative
URL.
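For example (paths are illustrative):

```js
staticFileGlobs: ['dist/app/**/*.html'],
stripPrefix: 'dist/app'
// The local file dist/app/index.html is precached under the URL /index.html
```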
_Default_: `''`
#### stripPrefixMulti [Object]
Maps multiple strings to be stripped and replaced at the beginning of URL paths at runtime.
Use this option when you have multiple discrepancies between relative paths at build time and
the same paths at run time.
If `stripPrefix` and `replacePrefix` are not equal to `''`, they are automatically added to this option.
```js
stripPrefixMulti: {
'www-root/public-precached/': 'public/',
'www-root/public/': 'public/'
}
```
_Default_: `{}`
#### templateFilePath [String]
The path to the [Lodash](https://lodash.com/docs#template) template used to
generate `service-worker.js`. If you need to add additional functionality to the
generated service worker code, it's recommended that you use the
[`importScripts`](#importscripts-arraystring) option to include extra JavaScript rather than
using a different template. But if you do need to change the basic generated
service worker code, please make a copy of the [original template](https://github.com/googlechrome/sw-precache/blob/master/service-worker.tmpl),
modify it locally, and use this option to point to your template file.
_Default_: `service-worker.tmpl` (in the directory that this module lives in)
#### verbose [Boolean]
Determines whether there's log output for each individual static/dynamic resource that's precached.
Even if this is set to false, there will be a final log entry indicating the total size of all
precached resources.
_Default_: `false`
## Wrappers and Starter Kits
While it's possible to use the `sw-precache` module's API directly within any
JavaScript environment, several wrappers have been developed by members of the
community tailored to specific build environments. They include:
- [`sw-precache-webpack-plugin`](https://www.npmjs.com/package/sw-precache-webpack-plugin)
- [`sw-precache-brunch`](https://www.npmjs.com/package/sw-precache-brunch)
- [`grunt-sw-precache`](https://www.npmjs.com/package/grunt-sw-precache)
- [`exhibit-builder-sw-precache`](https://www.npmjs.com/package/exhibit-builder-sw-precache)
There are also several starter kits or scaffolding projects that incorporate
`sw-precache` into their build process, giving you a full service worker out of
the box. They include:
### CLIs
- [`polymer-cli`](https://github.com/Polymer/polymer-cli)
- [`create-react-pwa`](https://github.com/jeffposnick/create-react-pwa)
### Starter Kits
- [`react-redux-universal-hot-example`](https://github.com/bertho-zero/react-redux-universal-hot-example)
- [Polymer Starter Kit](https://github.com/polymerelements/polymer-starter-kit)
- [Web Starter Kit](https://github.com/google/web-starter-kit)
### Recipes for writing a custom wrapper
While ready-to-use wrappers aren't available for every environment, this list contains some recipes for integrating `sw-precache` into your workflow:
- [Gradle wrapper for offline JavaDoc](https://gist.github.com/TimvdLippe/4c39b99e3b0ffbcdd8814a31e2969ed1)
- [Brunch starter for Phoenix Framework](https://gist.github.com/natecox/b19c4e08408a5bf0d4cf4d74f1902260)
## Acknowledgements
Thanks to [Sindre Sorhus](https://github.com/sindresorhus) and
[Addy Osmani](https://github.com/addyosmani) for their advice and code reviews.
[Jake Archibald](https://github.com/jakearchibald) was kind enough to review the service worker logic.
## Future of Service Worker tooling
Both sw-precache and sw-toolbox are **actively maintained** and we plan to continue supporting them. A large number of [production](https://medium.com/dev-channel/progressive-web-app-libraries-in-production-b52cad37d34#.16kxwhu92) Progressive Web Apps are successfully using them today and we are happy to review issues or PRs related to either project.
In parallel, we are working on the next generation of Service Worker tooling over in [Workbox](https://github.com/GoogleChrome/workbox). This new work is more modular and will enable a number of libraries with additional capabilities to be built. We will announce further plans around the roadmap for this work in the future.
## License
Copyright © 2017 Google, Inc.
Licensed under the [Apache License, Version 2.0](LICENSE) (the "License");
you may not use this file except in compliance with the License. You may
obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

#!/usr/bin/env node
/**
* Copyright 2015 Google Inc. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/* eslint-env node */
/* eslint no-nested-ternary: 0 */
'use strict';
var meow = require('meow');
var path = require('path');
var pkg = require('./package.json');
var swPrecache = require('./');
var updateNotifier = require('update-notifier');
updateNotifier({pkg: pkg}).notify();
function setDefaults(cli, configFileFlags) {
var compositeFlags = cli.flags;
compositeFlags.root = compositeFlags.root || configFileFlags.root || './';
if (compositeFlags.root.lastIndexOf('/') !== compositeFlags.root.length - 1) {
compositeFlags.root += '/';
}
compositeFlags.stripPrefix = compositeFlags.stripPrefix ||
configFileFlags.stripPrefix || compositeFlags.root;
compositeFlags.stripPrefixMulti = compositeFlags.stripPrefixMulti ||
configFileFlags.stripPrefixMulti || {};
compositeFlags.swFile = compositeFlags.swFile || configFileFlags.swFile ||
'service-worker.js';
compositeFlags.swFilePath = compositeFlags.swFilePath ||
configFileFlags.swFilePath ||
path.join(compositeFlags.root, compositeFlags.swFile);
compositeFlags.cacheId = compositeFlags.cacheId ||
configFileFlags.cacheId || cli.pkg.name;
compositeFlags.dynamicUrlToDependencies =
compositeFlags.dynamicUrlToDependencies ||
configFileFlags.dynamicUrlToDependencies;
compositeFlags.directoryIndex = compositeFlags.directoryIndex ||
configFileFlags.directoryIndex;
compositeFlags.navigateFallback = compositeFlags.navigateFallback ||
configFileFlags.navigateFallback;
compositeFlags.navigateFallbackWhitelist =
compositeFlags.navigateFallbackWhitelist ||
configFileFlags.navigateFallbackWhitelist;
compositeFlags.staticFileGlobs = compositeFlags.staticFileGlobs ||
configFileFlags.staticFileGlobs;
if (compositeFlags.staticFileGlobs) {
if (typeof compositeFlags.staticFileGlobs === 'string') {
compositeFlags.staticFileGlobs = [compositeFlags.staticFileGlobs];
}
} else {
compositeFlags.staticFileGlobs = [compositeFlags.root + '/**/*.*'];
}
compositeFlags.ignoreUrlParametersMatching =
compositeFlags.ignoreUrlParametersMatching ||
configFileFlags.ignoreUrlParametersMatching;
if (compositeFlags.ignoreUrlParametersMatching &&
typeof compositeFlags.ignoreUrlParametersMatching === 'string') {
compositeFlags.ignoreUrlParametersMatching =
compositeFlags.ignoreUrlParametersMatching.split(',').map(function(s) {
return new RegExp(s);
});
}
compositeFlags.importScripts = compositeFlags.importScripts ||
configFileFlags.importScripts;
if (compositeFlags.importScripts &&
typeof compositeFlags.importScripts === 'string') {
compositeFlags.importScripts = compositeFlags.importScripts.split(',');
}
compositeFlags.runtimeCaching = compositeFlags.runtimeCaching ||
configFileFlags.runtimeCaching;
compositeFlags.templateFilePath = compositeFlags.templateFilePath ||
configFileFlags.templateFilePath;
compositeFlags.maximumFileSizeToCacheInBytes =
compositeFlags.maximumFileSizeToCacheInBytes ||
configFileFlags.maximumFileSizeToCacheInBytes;
// We can't just use ||, since compositeFlags.skipWaiting might be false.
compositeFlags.skipWaiting = ('skipWaiting' in compositeFlags) ?
compositeFlags.skipWaiting :
(('skipWaiting' in configFileFlags) ?
configFileFlags.skipWaiting : undefined);
// We can't just use ||, since compositeFlags.clientsClaim might be false.
compositeFlags.clientsClaim = ('clientsClaim' in compositeFlags) ?
compositeFlags.clientsClaim :
(('clientsClaim' in configFileFlags) ?
configFileFlags.clientsClaim : undefined);
compositeFlags.dontCacheBustUrlsMatching =
compositeFlags.dontCacheBustUrlsMatching ||
configFileFlags.dontCacheBustUrlsMatching;
return compositeFlags;
}
var cli = meow({
help: 'Options from https://github.com/GoogleChrome/sw-precache#options ' +
'are accepted as flags.\nAlternatively, use --config <file>, where ' +
'<file> is the path to a JavaScript file that defines the same ' +
'options via module.exports.\n' +
'When both a config file and command line option is given, the ' +
'command line option takes precedence.'
});
// If the --config option is used, then read the options from an external
// JSON configuration file. Options from the --config file can be overwritten
// by any command line options.
var configFileFlags = cli.flags.config ?
require(path.resolve(cli.flags.config)) : {};
var options = setDefaults(cli, configFileFlags);
swPrecache.write(options.swFilePath, options, function(error) {
if (error) {
console.error(error.stack);
process.exit(1);
}
console.log(options.swFilePath,
'has been generated with the service worker contents.');
});

/**
* Copyright 2015 Google Inc. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/* eslint-env node,worker */
'use strict';
var URL = require('dom-urls');
module.exports = {
stripIgnoredUrlParameters: function(originalUrl,
ignoreUrlParametersMatching) {
var url = new URL(originalUrl);
// Remove the hash; see https://github.com/GoogleChrome/sw-precache/issues/290
url.hash = '';
url.search = url.search.slice(1) // Exclude initial '?'
.split('&') // Split into an array of 'key=value' strings
.map(function(kv) {
return kv.split('='); // Split each 'key=value' string into a [key, value] array
})
.filter(function(kv) {
return ignoreUrlParametersMatching.every(function(ignoredRegex) {
return !ignoredRegex.test(kv[0]); // Return true iff the key doesn't match any of the regexes.
});
})
.map(function(kv) {
return kv.join('='); // Join each [key, value] array into a 'key=value' string
})
.join('&'); // Join the array of 'key=value' strings into a string with '&' in between each
return url.toString();
},
addDirectoryIndex: function(originalUrl, index) {
var url = new URL(originalUrl);
if (url.pathname.slice(-1) === '/') {
url.pathname += index;
}
return url.toString();
},
isPathWhitelisted: function(whitelist, absoluteUrlString) {
// If the whitelist is empty, then consider all URLs to be whitelisted.
if (whitelist.length === 0) {
return true;
}
// Otherwise compare each path regex to the path of the URL passed in.
var path = (new URL(absoluteUrlString)).pathname;
return whitelist.some(function(whitelistedPathRegex) {
return path.match(whitelistedPathRegex);
});
},
createCacheKey: function(originalUrl, paramName, paramValue,
dontCacheBustUrlsMatching) {
// Create a new URL object to avoid modifying originalUrl.
var url = new URL(originalUrl);
// If dontCacheBustUrlsMatching is not set, or if we don't have a match,
// then add in the extra cache-busting URL parameter.
if (!dontCacheBustUrlsMatching ||
!(url.pathname.match(dontCacheBustUrlsMatching))) {
url.search += (url.search ? '&' : '') +
encodeURIComponent(paramName) + '=' + encodeURIComponent(paramValue);
}
return url.toString();
},
// When passed a redirected response, this will create a new, "clean" response
// that can be used to respond to a navigation request.
// See https://bugs.chromium.org/p/chromium/issues/detail?id=669363&desc=2#c1
cleanResponse: function(originalResponse) {
// If this is not a redirected response, then we don't have to do anything.
if (!originalResponse.redirected) {
return Promise.resolve(originalResponse);
}
// Firefox 50 and below doesn't support the Response.body stream, so we may
// need to read the entire body to memory as a Blob.
var bodyPromise = 'body' in originalResponse ?
Promise.resolve(originalResponse.body) :
originalResponse.blob();
return bodyPromise.then(function(body) {
// new Response() is happy when passed either a stream or a Blob.
return new Response(body, {
headers: originalResponse.headers,
status: originalResponse.status,
statusText: originalResponse.statusText
});
});
}
};

/**
* Copyright 2015 Google Inc. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/* eslint-env node */
'use strict';
var crypto = require('crypto');
var defaults = require('lodash.defaults');
var externalFunctions = require('./functions.js');
var fs = require('fs');
var glob = require('glob');
var mkdirp = require('mkdirp');
var path = require('path');
var prettyBytes = require('pretty-bytes');
var template = require('lodash.template');
var util = require('util');
require('es6-promise').polyfill();
// This should only change if there are breaking changes in the cache format used by the SW.
var VERSION = 'v3';
function absolutePath(relativePath) {
return path.resolve(process.cwd(), relativePath);
}
function getFileAndSizeAndHashForFile(file) {
var stat = fs.statSync(file);
if (stat.isFile()) {
var buffer = fs.readFileSync(file);
return {
file: file,
size: stat.size,
hash: getHash(buffer)
};
}
return null;
}
function getFilesAndSizesAndHashesForGlobPattern(globPattern, excludeFilePath) {
return glob.sync(globPattern.replace(path.sep, '/')).map(function(file) {
// Return null if we want to exclude this file, and it will be excluded in
// the subsequent filter().
return excludeFilePath === absolutePath(file) ?
null :
getFileAndSizeAndHashForFile(file);
}).filter(function(fileAndSizeAndHash) {
return fileAndSizeAndHash !== null;
});
}
function getHash(data) {
var md5 = crypto.createHash('md5');
md5.update(data);
return md5.digest('hex');
}
function escapeRegExp(string) {
return string.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}
function generateRuntimeCaching(runtimeCaching) {
return runtimeCaching.reduce(function(prev, curr) {
var line;
if (curr.default) {
line = util.format('\ntoolbox.router.default = toolbox.%s;',
curr.default);
} else {
var urlPattern = curr.urlPattern;
if (typeof urlPattern === 'string') {
urlPattern = JSON.stringify(urlPattern);
}
if (!(urlPattern instanceof RegExp ||
typeof urlPattern === 'string')) {
throw new Error(
'runtimeCaching.urlPattern must be a string or RegExp');
}
line = util.format('\ntoolbox.router.%s(%s, %s, %s);',
// Default to setting up a 'get' handler.
curr.method || 'get',
// urlPattern might be a String or a RegExp. sw-toolbox supports both.
urlPattern,
// If curr.handler is a string, then assume it's the name of one
// of the built-in sw-toolbox strategies.
// E.g. 'networkFirst' -> toolbox.networkFirst
// If curr.handler is something else (like a function), then just
// include its body inline.
(typeof curr.handler === 'string' ? 'toolbox.' : '') + curr.handler,
// Default to no options.
stringifyToolboxOptions(curr.options));
}
return prev + line;
}, '');
}
function stringifyToolboxOptions(options) {
options = options || {};
var str = JSON.stringify(options);
if (options.origin instanceof RegExp) {
str = str.replace(/("origin":)\{\}/, '$1' + options.origin);
}
if (options.successResponses instanceof RegExp) {
str = str.replace(/("successResponses":)\{\}/,
'$1' + options.successResponses);
}
return str;
}
function generate(params, callback) {
return new Promise(function(resolve, reject) {
params = defaults(params || {}, {
cacheId: '',
clientsClaim: true,
directoryIndex: 'index.html',
dontCacheBustUrlsMatching: null,
dynamicUrlToDependencies: {},
handleFetch: true,
ignoreUrlParametersMatching: [/^utm_/],
importScripts: [],
logger: console.log,
maximumFileSizeToCacheInBytes: 2 * 1024 * 1024, // 2MB
navigateFallback: '',
navigateFallbackWhitelist: [],
replacePrefix: '',
skipWaiting: true,
staticFileGlobs: [],
stripPrefix: '',
stripPrefixMulti: {},
templateFilePath: path.join(
path.dirname(fs.realpathSync(__filename)), '..', 'service-worker.tmpl'),
verbose: false
});
if (!Array.isArray(params.ignoreUrlParametersMatching)) {
params.ignoreUrlParametersMatching = [params.ignoreUrlParametersMatching];
}
var relativeUrlToHash = {};
var cumulativeSize = 0;
params.stripPrefixMulti[params.stripPrefix] = params.replacePrefix;
params.staticFileGlobs.forEach(function(globPattern) {
var filesAndSizesAndHashes = getFilesAndSizesAndHashesForGlobPattern(
globPattern, params.outputFilePath);
// The files returned from glob are sorted by default, so we don't need to sort here.
filesAndSizesAndHashes.forEach(function(fileAndSizeAndHash) {
if (fileAndSizeAndHash.size <= params.maximumFileSizeToCacheInBytes) {
// Strip the prefix to turn this into a relative URL.
var relativeUrl = fileAndSizeAndHash.file
.replace(
new RegExp('^(' + Object.keys(params.stripPrefixMulti)
.map(escapeRegExp).join('|') + ')'),
function(match) {
return params.stripPrefixMulti[match];
})
.replace(path.sep, '/');
relativeUrlToHash[relativeUrl] = fileAndSizeAndHash.hash;
if (params.verbose) {
params.logger(util.format('Caching static resource "%s" (%s)',
fileAndSizeAndHash.file,
prettyBytes(fileAndSizeAndHash.size)));
}
cumulativeSize += fileAndSizeAndHash.size;
} else {
params.logger(
util.format('Skipping static resource "%s" (%s) - max size is %s',
fileAndSizeAndHash.file, prettyBytes(fileAndSizeAndHash.size),
prettyBytes(params.maximumFileSizeToCacheInBytes)));
}
});
});
Object.keys(params.dynamicUrlToDependencies).forEach(function(dynamicUrl) {
var dependency = params.dynamicUrlToDependencies[dynamicUrl];
var isString = typeof dependency === 'string';
var isBuffer = Buffer.isBuffer(dependency);
if (!Array.isArray(dependency) && !isString && !isBuffer) {
throw Error(util.format(
'The value for the dynamicUrlToDependencies.%s ' +
'option must be an Array, a Buffer, or a String.',
dynamicUrl));
}
if (isString || isBuffer) {
cumulativeSize += dependency.length;
relativeUrlToHash[dynamicUrl] = getHash(dependency);
} else {
var filesAndSizesAndHashes = dependency
.sort()
.map(function(file) {
try {
return getFileAndSizeAndHashForFile(file);
} catch (e) {
// Provide some additional information about the failure if the file is missing.
if (e.code === 'ENOENT') {
params.logger(util.format(
'%s was listed as a dependency for dynamic URL %s, but ' +
'the file does not exist. Either remove the entry as a ' +
'dependency, or correct the path to the file.',
file, dynamicUrl
));
}
// Re-throw the exception unconditionally, since this should be treated as fatal.
throw e;
}
});
var concatenatedHashes = '';
filesAndSizesAndHashes.forEach(function(fileAndSizeAndHash) {
// Let's assume that the response size of a server-generated page is roughly equal to the
// total size of all its components.
cumulativeSize += fileAndSizeAndHash.size;
concatenatedHashes += fileAndSizeAndHash.hash;
});
relativeUrlToHash[dynamicUrl] = getHash(concatenatedHashes);
}
if (params.verbose) {
if (isString) {
params.logger(util.format(
'Caching dynamic URL "%s" with dependency on user-supplied string',
dynamicUrl));
} else if (isBuffer) {
params.logger(util.format(
'Caching dynamic URL "%s" with dependency on user-supplied buffer',
dynamicUrl));
} else {
params.logger(util.format(
'Caching dynamic URL "%s" with dependencies on %j',
dynamicUrl, dependency));
}
}
});
var runtimeCaching;
var swToolboxCode;
if (params.runtimeCaching) {
runtimeCaching = generateRuntimeCaching(params.runtimeCaching);
var pathToSWToolbox = require.resolve('sw-toolbox/sw-toolbox.js');
swToolboxCode = fs.readFileSync(pathToSWToolbox, 'utf8')
.replace('//# sourceMappingURL=sw-toolbox.js.map', '');
}
// It's very important that running this operation multiple times with the same input files
// produces identical output, since we need the generated service-worker.js file to change if
// the input files changes. The service worker update algorithm,
// https://w3c.github.io/ServiceWorker/#update-algorithm, relies on detecting even a single
// byte change in service-worker.js to trigger an update. Because of this, we write out the
// cache options as a series of sorted, nested arrays rather than as objects whose serialized
// key ordering might vary.
var relativeUrls = Object.keys(relativeUrlToHash);
var precacheConfig = relativeUrls.sort().map(function(relativeUrl) {
return [relativeUrl, relativeUrlToHash[relativeUrl]];
});
params.logger(util.format(
'Total precache size is about %s for %d resources.',
prettyBytes(cumulativeSize), relativeUrls.length));
fs.readFile(params.templateFilePath, 'utf8', function(error, data) {
if (error) {
if (callback) {
callback(error);
}
return reject(error);
}
var populatedTemplate = template(data)({
cacheId: params.cacheId,
clientsClaim: params.clientsClaim,
// Ensure that anything false is translated into '', since this will be treated as a string.
directoryIndex: params.directoryIndex || '',
dontCacheBustUrlsMatching: params.dontCacheBustUrlsMatching || false,
externalFunctions: externalFunctions,
handleFetch: params.handleFetch,
ignoreUrlParametersMatching: params.ignoreUrlParametersMatching,
importScripts: params.importScripts ?
params.importScripts.map(JSON.stringify).join(',') : null,
// Ensure that anything false is translated into '', since this will be treated as a string.
navigateFallback: params.navigateFallback || '',
navigateFallbackWhitelist:
JSON.stringify(params.navigateFallbackWhitelist.map(function(regex) {
return regex.source;
})),
precacheConfig: JSON.stringify(precacheConfig),
runtimeCaching: runtimeCaching,
skipWaiting: params.skipWaiting,
swToolboxCode: swToolboxCode,
version: VERSION
});
if (callback) {
callback(null, populatedTemplate);
}
resolve(populatedTemplate);
});
});
}
function write(filePath, params, callback) {
return new Promise(function(resolve, reject) {
function finish(error, value) {
if (error) {
reject(error);
} else {
resolve(value);
}
if (callback) {
callback(error, value);
}
}
mkdirp.sync(path.dirname(filePath));
// Keep track of where we're outputting the file to ensure that we don't
// pick up a previously written version in our new list of files.
// See https://github.com/GoogleChrome/sw-precache/issues/101
params.outputFilePath = absolutePath(filePath);
generate(params).then(function(serviceWorkerFileContents) {
fs.writeFile(filePath, serviceWorkerFileContents, finish);
}, finish);
});
}
module.exports = {
generate: generate,
write: write
};
if (process.env.NODE_ENV === 'swprecache-test') {
module.exports.generateRuntimeCaching = generateRuntimeCaching;
}

{
"_args": [
[
"sw-precache@5.2.0",
"C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\torrent-project"
]
],
"_from": "sw-precache@5.2.0",
"_id": "sw-precache@5.2.0",
"_inBundle": false,
"_integrity": "sha512-sKctdX+5hUxkqJ/1DM88ubQ+QRvyw7CnxWdk909N2DgvxMqc1gcQFrwL7zpVc87wFmCA/OvRQd0iMC2XdFopYg==",
"_location": "/react-scripts/sw-precache",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
"raw": "sw-precache@5.2.0",
"name": "sw-precache",
"escapedName": "sw-precache",
"rawSpec": "5.2.0",
"saveSpec": null,
"fetchSpec": "5.2.0"
},
"_requiredBy": [
"/react-scripts/sw-precache-webpack-plugin"
],
"_resolved": "https://registry.npmjs.org/sw-precache/-/sw-precache-5.2.0.tgz",
"_spec": "5.2.0",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\torrent-project",
"author": {
"name": "Jeff Posnick",
"email": "jeffy@google.com",
"url": "https://jeffy.info"
},
"bin": {
"sw-precache": "cli.js"
},
"bugs": {
"url": "https://github.com/googlechrome/sw-precache/issues"
},
"dependencies": {
"dom-urls": "^1.1.0",
"es6-promise": "^4.0.5",
"glob": "^7.1.1",
"lodash.defaults": "^4.2.0",
"lodash.template": "^4.4.0",
"meow": "^3.7.0",
"mkdirp": "^0.5.1",
"pretty-bytes": "^4.0.2",
"sw-toolbox": "^3.4.0",
"update-notifier": "^1.0.3"
},
"description": "Generates a service worker to cache your local App Shell resources.",
"devDependencies": {
"del": "^2.2.2",
"eslint": "^3.15.0",
"eslint-config-google": "^0.7.1",
"express": "^4.14.1",
"gh-pages": "^0.12.0",
"grunt": "^1.0.1",
"gulp": "^3.9.1",
"gulp-doctoc": "^0.1.4",
"gulp-eslint": "^3.0.1",
"gulp-load-plugins": "^1.5.0",
"gulp-mocha": "^3.0.1",
"gulp-replace": "^0.5.4",
"gulp-util": "^3.0.8",
"jade": "^1.11.0",
"mocha": "^3.2.0",
"node-fetch": "^1.6.3",
"run-sequence": "^1.2.2"
},
"engines": {
"node": ">=4.0.0"
},
"files": [
"cli.js",
"lib",
"service-worker.tmpl"
],
"homepage": "https://github.com/googlechrome/sw-precache",
"keywords": [
"caching",
"offline",
"precaching",
"service-worker",
"serviceworker",
"appshell",
"pwa"
],
"license": "Apache-2.0",
"main": "lib/sw-precache.js",
"name": "sw-precache",
"repository": {
"type": "git",
"url": "git+https://github.com/googlechrome/sw-precache.git"
},
"scripts": {
"doctoc": "doctoc",
"test": "gulp test lint"
},
"version": "5.2.0"
}

/**
* Copyright 2016 Google Inc. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
// DO NOT EDIT THIS GENERATED OUTPUT DIRECTLY!
// This file should be overwritten as part of your build process.
// If you need to extend the behavior of the generated service worker, the best approach is to write
// additional code and include it using the importScripts option:
// https://github.com/GoogleChrome/sw-precache#importscripts-arraystring
//
// Alternatively, it's possible to make changes to the underlying template file and then use that as the
// new base for generating output, via the templateFilePath option:
// https://github.com/GoogleChrome/sw-precache#templatefilepath-string
//
// If you go that route, make sure that whenever you update your sw-precache dependency, you reconcile any
// changes made to this original template file with your modified copy.
// This generated service worker JavaScript will precache your site's resources.
// The code needs to be saved in a .js file at the top-level of your site, and registered
// from your pages in order to be used. See
// https://github.com/googlechrome/sw-precache/blob/master/demo/app/js/service-worker-registration.js
// for an example of how you can register this script and handle various service worker events.
/* eslint-env worker, serviceworker */
/* eslint-disable indent, no-unused-vars, no-multiple-empty-lines, max-nested-callbacks, space-before-function-paren, quotes, comma-spacing */
'use strict';
var precacheConfig = <%= precacheConfig %>;
var cacheName = 'sw-precache-<%= version %>-<%= cacheId %>-' + (self.registration ? self.registration.scope : '');
<% if (handleFetch) { %>
var ignoreUrlParametersMatching = [<%= ignoreUrlParametersMatching %>];
<% } %>
<% Object.keys(externalFunctions).sort().forEach(function(functionName) {%>
var <%- functionName %> = <%= externalFunctions[functionName] %>;
<% }); %>
var hashParamName = '_sw-precache';
var urlsToCacheKeys = new Map(
precacheConfig.map(function(item) {
var relativeUrl = item[0];
var hash = item[1];
var absoluteUrl = new URL(relativeUrl, self.location);
var cacheKey = createCacheKey(absoluteUrl, hashParamName, hash, <%= dontCacheBustUrlsMatching %>);
return [absoluteUrl.toString(), cacheKey];
})
);
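The `createCacheKey` helper used above is defined elsewhere in the library and inlined into the generated file; it is what appends the content hash as a cache-busting URL parameter. A minimal illustrative sketch of that behavior (names and the exact opt-out check are assumptions, not the library's verbatim implementation):

```javascript
// Sketch: build a cache key by appending a hash parameter to the URL,
// unless the URL already matches the dontCacheBustUrlsMatching pattern
// (i.e. the content hash is assumed to already be in the file name,
// like main.abc123.js).
function createCacheKey(originalUrl, paramName, paramValue,
                        dontCacheBustUrlsMatching) {
  const url = new URL(originalUrl);
  if (!dontCacheBustUrlsMatching ||
      !url.pathname.match(dontCacheBustUrlsMatching)) {
    // The URL 'search' setter normalizes the leading '?', so we only
    // need to manage the '&' separator here.
    url.search += (url.search ? '&' : '') +
      encodeURIComponent(paramName) + '=' + encodeURIComponent(paramValue);
  }
  return url.toString();
}

console.log(createCacheKey('https://example.com/app.js',
                           '_sw-precache', 'abcd1234', null));
// https://example.com/app.js?_sw-precache=abcd1234
```

Because the hash changes whenever the file's content changes, each revision gets a distinct cache key, which is what lets the install handler below skip re-downloading unchanged resources.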
function setOfCachedUrls(cache) {
return cache.keys().then(function(requests) {
return requests.map(function(request) {
return request.url;
});
}).then(function(urls) {
return new Set(urls);
});
}
self.addEventListener('install', function(event) {
event.waitUntil(
caches.open(cacheName).then(function(cache) {
return setOfCachedUrls(cache).then(function(cachedUrls) {
return Promise.all(
Array.from(urlsToCacheKeys.values()).map(function(cacheKey) {
          // If this cache key isn't already present in the cache, fetch it and add it.
if (!cachedUrls.has(cacheKey)) {
var request = new Request(cacheKey, {credentials: 'same-origin'});
return fetch(request).then(function(response) {
// Bail out of installation unless we get back a 200 OK for
// every request.
if (!response.ok) {
throw new Error('Request for ' + cacheKey + ' returned a ' +
'response with status ' + response.status);
}
return cleanResponse(response).then(function(responseToCache) {
return cache.put(cacheKey, responseToCache);
});
});
}
})
);
});
}).then(function() {
<% if (skipWaiting) { %>
// Force the SW to transition from installing -> active state
return self.skipWaiting();
<% } %>
})
);
});
self.addEventListener('activate', function(event) {
var setOfExpectedUrls = new Set(urlsToCacheKeys.values());
event.waitUntil(
caches.open(cacheName).then(function(cache) {
return cache.keys().then(function(existingRequests) {
return Promise.all(
existingRequests.map(function(existingRequest) {
if (!setOfExpectedUrls.has(existingRequest.url)) {
return cache.delete(existingRequest);
}
})
);
});
}).then(function() {
<% if (clientsClaim) { %>
return self.clients.claim();
<% } %>
})
);
});
<% if (handleFetch) { %>
self.addEventListener('fetch', function(event) {
if (event.request.method === 'GET') {
// Should we call event.respondWith() inside this fetch event handler?
// This needs to be determined synchronously, which will give other fetch
// handlers a chance to handle the request if need be.
var shouldRespond;
// First, remove all the ignored parameters and hash fragment, and see if we
// have that URL in our cache. If so, great! shouldRespond will be true.
var url = stripIgnoredUrlParameters(event.request.url, ignoreUrlParametersMatching);
shouldRespond = urlsToCacheKeys.has(url);
// If shouldRespond is false, check again, this time with 'index.html'
// (or whatever the directoryIndex option is set to) at the end.
var directoryIndex = '<%= directoryIndex %>';
if (!shouldRespond && directoryIndex) {
url = addDirectoryIndex(url, directoryIndex);
shouldRespond = urlsToCacheKeys.has(url);
}
// If shouldRespond is still false, check to see if this is a navigation
// request, and if so, whether the URL matches navigateFallbackWhitelist.
var navigateFallback = '<%= navigateFallback %>';
if (!shouldRespond &&
navigateFallback &&
(event.request.mode === 'navigate') &&
isPathWhitelisted(<%= navigateFallbackWhitelist %>, event.request.url)) {
url = new URL(navigateFallback, self.location).toString();
shouldRespond = urlsToCacheKeys.has(url);
}
// If shouldRespond was set to true at any point, then call
// event.respondWith(), using the appropriate cache key.
if (shouldRespond) {
event.respondWith(
caches.open(cacheName).then(function(cache) {
return cache.match(urlsToCacheKeys.get(url)).then(function(response) {
if (response) {
return response;
}
throw Error('The cached response that was expected is missing.');
});
}).catch(function(e) {
// Fall back to just fetch()ing the request if some unexpected error
// prevented the cached response from being valid.
console.warn('Couldn\'t serve response for "%s" from cache: %O', event.request.url, e);
return fetch(event.request);
})
);
}
}
});
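The fetch handler above first normalizes the request URL with `stripIgnoredUrlParameters`, another helper inlined from elsewhere in the library. As a rough sketch of what that normalization involves (this version is illustrative, not the library's verbatim code): drop the hash fragment and remove any query parameters whose names match one of the `ignoreUrlParametersMatching` regexes, so that e.g. analytics parameters don't cause a cache miss.

```javascript
// Sketch: normalize a URL for cache lookup by removing the fragment and
// any query parameters whose names match one of the supplied regexes.
function stripIgnoredUrlParameters(originalUrl, ignoreUrlParametersMatching) {
  const url = new URL(originalUrl);
  url.hash = '';
  const kept = [...url.searchParams.entries()].filter(([name]) =>
    !ignoreUrlParametersMatching.some((regex) => regex.test(name)));
  url.search = kept
    .map(([n, v]) => `${encodeURIComponent(n)}=${encodeURIComponent(v)}`)
    .join('&');
  return url.toString();
}

console.log(stripIgnoredUrlParameters(
  'https://example.com/?utm_source=x&page=2#top', [/^utm_/]));
// keeps page=2; drops utm_source and the fragment
```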
<% if (swToolboxCode) { %>
// *** Start of auto-included sw-toolbox code. ***
<%= swToolboxCode %>
// *** End of auto-included sw-toolbox code. ***
<% } %>
<% if (runtimeCaching) { %>
// Runtime cache configuration, using the sw-toolbox library.
<%= runtimeCaching %>
<% } %>
<% } %>
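The `directoryIndex` fallback in the fetch handler relies on an `addDirectoryIndex` helper, also inlined from elsewhere in the library. A minimal sketch of the idea (illustrative, not the library's exact code): if the URL's path ends in a slash, append the configured index file name so that a request for `/docs/` can match the precached `/docs/index.html` entry.

```javascript
// Sketch: append the directory index (e.g. 'index.html') to URLs whose
// path ends in '/', leaving all other URLs untouched.
function addDirectoryIndex(absoluteUrl, index) {
  const url = new URL(absoluteUrl);
  if (url.pathname.slice(-1) === '/') {
    url.pathname += index;
  }
  return url.toString();
}

console.log(addDirectoryIndex('https://example.com/docs/', 'index.html'));
// https://example.com/docs/index.html
```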
<% if (importScripts) { %>
importScripts(<%= importScripts %>);
<% } %>