` line to each URL in the sitemap. Each value in the `priorityMap` array corresponds to the depth of the URL being added: the priority value given to a URL equals `priorityMap[depth - 1]`. If a URL's depth is greater than the length of the `priorityMap` array, the last value in the array is used. Valid values range from `1.0` down to `0.0`.

Example:

```javascript
[1.0, 0.8, 0.6, 0.4, 0.2, 0]
```

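The depth-to-priority lookup described above (with the last value reused for deeper URLs) can be sketched as a small helper; `priorityForDepth` is a hypothetical name for illustration, not part of this package's API:

```javascript
// Hypothetical helper mirroring the documented lookup rule:
// priority = priorityMap[depth - 1], falling back to the last
// entry when the URL is deeper than the map is long.
function priorityForDepth(priorityMap, depth) {
  const index = Math.min(depth - 1, priorityMap.length - 1);
  return priorityMap[index];
}

const priorityMap = [1.0, 0.8, 0.6, 0.4, 0.2, 0];
priorityForDepth(priorityMap, 1); // => 1.0
priorityForDepth(priorityMap, 9); // => 0 (deeper than the map, last value reused)
```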
### userAgent

Type: `string`
Default: `Node/SitemapGenerator`

Change the default crawler user agent.

## Events

The Sitemap Generator emits several events which can be listened to.

### `add`

Triggered when the crawler successfully adds a resource to the sitemap. Passes the URL as an argument.

```javascript
generator.on('add', (url) => {
  // log url
});
```

### `done`

Triggered when the crawler has finished and the sitemap is created.

```javascript
generator.on('done', () => {
  // sitemaps created
});
```

### `error`

Emitted when an error occurs while fetching a URL. Passes an object with the HTTP status code, a message, and the URL as an argument.

```javascript
generator.on('error', (error) => {
  console.log(error);
  // => { code: 404, message: 'Not found.', url: 'http://example.com/foo' }
});
```

### `ignore`

Triggered when a URL matches a disallow rule in the `robots.txt` file or a meta robots noindex tag is present. The URL will not be added to the sitemap. Passes the ignored URL as an argument.

```javascript
generator.on('ignore', (url) => {
  // log ignored url
});
```

## FAQ

**Does this work with React, Angular, ...?**

This package doesn't care which frameworks or technologies you use under the hood. The only requirement is that your URLs return valid HTML. Server-side rendering (SSR) is therefore required for single-page apps, as no JavaScript is executed.

**Where should I put this code?**

That is basically up to you. You can execute this code manually and upload your sitemap by hand, or you can put it on your server and run it periodically to keep your sitemap up to date.

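One way to run it periodically is a plain `setInterval` around a caller-supplied factory; `createGenerator` here is a placeholder for however you construct your generator instance, not part of this package:

```javascript
const DAY_MS = 24 * 60 * 60 * 1000;

// Rebuild the sitemap immediately, then again on a fixed interval.
// `createGenerator` is a placeholder: it should return an object with
// a start() method, e.g. a configured SitemapGenerator instance.
function scheduleRegeneration(createGenerator, intervalMs = DAY_MS) {
  const run = () => createGenerator().start();
  run(); // build once right away
  return setInterval(run, intervalMs); // then rebuild on a schedule
}
```

Clearing the returned timer with `clearInterval` stops the schedule; a system cron job invoking a small script achieves the same effect without keeping a Node process alive.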
**Should I use this package or the CLI?**

The CLI should cover most common use cases and offers several options to tweak its behavior. For more advanced use cases where you need fine-grained control over what the crawler fetches, use this package and its programmatic API.

## License

[MIT](https://github.com/lgraubner/sitemap-generator/blob/master/LICENSE) © [Lars Graubner](https://larsgraubner.com)