├── .github ├── CONTRIBUTING.rst └── workflows │ ├── ci.yaml │ └── rebase_checker.yaml ├── .gitignore ├── COPYRIGHT ├── LICENSE ├── README.md ├── _static ├── README.md ├── cc-by_large.svg ├── cc-by_small.svg ├── development │ ├── github_pr_comment_areas.png │ └── jenkins_ci.png └── lsst-logo-dark.svg ├── conf.py ├── gen2tutorialdeprecation.txt ├── getting-started ├── coaddition.rst ├── data-setup.rst ├── dc2-guide.rst ├── dcr-guide.rst ├── display.rst ├── ds9-screenshot.jpg ├── index.rst ├── multiband-analysis.png ├── multiband-analysis.rst ├── photometry.rst ├── single_frame.png ├── singleframe.rst └── uber-cal.rst ├── index.rst ├── install ├── demo.rst ├── docker.rst ├── git-lfs.rst ├── index.rst ├── lsstinstall.rst ├── lsstsw.rst ├── package-development.rst ├── prereqs.rst ├── setup.rst └── top-level-packages.rst ├── known-issues.rst ├── metrics.rst ├── middleware ├── faq.rst └── index.rst ├── releases ├── README.md ├── data-products │ ├── v20_0_0.rst │ └── v21_0_0.rst ├── index.rst ├── tickets │ ├── v14_0.rst │ ├── v15_0.rst │ ├── v16_0.rst │ ├── v17_0.rst │ ├── v18_0_0.rst │ ├── v18_1_0.rst │ ├── v19_0_0.rst │ ├── v20_0_0.rst │ ├── v21_0_0.rst │ ├── v22_0_0.rst │ ├── v23_0_0.rst │ ├── v24_0_0.rst │ ├── v24_1_0.rst │ ├── v25_0_0.rst │ ├── v26_0_0.rst │ ├── v27_0_0.rst │ └── v28_0_0.rst ├── v11_0.rst ├── v12_0.rst ├── v12_0_qserv_dax.rst ├── v13_0.rst ├── v13_0_qserv_dax.rst ├── v13_0_sui.rst ├── v14_0.rst ├── v15_0.rst ├── v16_0.rst ├── v17_0.rst ├── v18_0_0.rst ├── v18_1_0.rst ├── v19_0_0.rst ├── v20_0_0.rst ├── v21_0_0.rst ├── v22_0_0.rst ├── v23_0_0.rst ├── v24_0_0.rst ├── v24_1_0.rst ├── v25_0_0.rst ├── v26_0_0.rst ├── v27_0_0.rst └── v28_0_0.rst ├── requirements.txt ├── tasks.rst └── ups └── pipelines_lsst_io.table /.github/CONTRIBUTING.rst: -------------------------------------------------------------------------------- 1 | ##################################### 2 | Documentation contribution guidelines 3 | ##################################### 4 | 5 | 
External contributions 6 | ====================== 7 | 8 | This is an open source project and LSST welcomes contributions from the community. 9 | Feel free to create a GitHub issue to report a problem or even submit a pull request. 10 | Issues and pull requests are triaged into LSST's ticketing system, and we'll be able to resolve those issues or merge those pull requests for you. 11 | 12 | Project contributors 13 | ==================== 14 | 15 | If you're a member of the LSST Project, and have push access to this repository, please follow `LSST DM's Development Workflow `__, particularly including the branching conventions. 16 | 17 | Content guidelines 18 | ================== 19 | 20 | Most of this project's content is written in reStructuredText. 21 | Please see the `DM ReStructuredText Style Guide `__ for information on how to format things like headers, lists, tables, images, and code samples. 22 | 23 | Please write one sentence per line (as opposed to hard-wrapping text to a specific line width). 24 | This makes Git diffs and pull requests easier to use. 25 | 26 | Use sentence case for headlines. 27 | 28 | The `Stack section of the DM Developer Guide `__ has more information on how to create content for pipelines.lsst.io. 29 | 30 | Building the documentation 31 | ========================== 32 | 33 | pipelines.lsst.io_ is automatically built and deployed with each ``lsst_distrib`` release. 34 | As a contributor, you don't need to worry about updating the published site after you've merged updates to the documentation. 35 | 36 | If you're writing new content, it's useful to be able to preview your changes. 37 | Depending on whether you're building pipelines.lsst.io_ as a whole, or just a single package, you can follow one of these tutorials to build and test your documentation changes: 38 | 39 | - `Building single-package documentation locally `__. 40 | - `Building the pipelines.lsst.io site locally `__. 41 | - `Building pipelines.lsst.io with Jenkins `__. 
42 | 43 | **Tip:** The Jenkins-based method enables you to publish a preview of the pipelines.lsst.io site based on your ticket branch, which is useful for code reviews. 44 | This method only works for branches on the `lsst/pipelines_lsst_io `__ repository itself, though. 45 | 46 | Reference documentation for the build commands: 47 | 48 | - `stack-docs `__: used to build https://pipelines.lsst.io from the `lsst/pipelines_lsst_io `__ repository itself. 49 | - `package-docs `__: used to build documentation for single packages from their ``doc/`` directories. 50 | 51 | *Background:* `Documenteer `__ is the build tool for LSST's Sphinx-based documentation (like this project). 52 | To get a sense of how this all works, you can read the `Overview of the Stack documentation system `__. 53 | 54 | Getting help with contributions 55 | =============================== 56 | 57 | Whether you have general questions about contributing to pipelines.lsst.io_, or need help with a specific piece of documentation that you're contributing, you can get help in a couple of different ways: 58 | 59 | - Ask a question in `#dm-docs `__ on Slack. 60 | - Ask a question in the `Data Management category `__ on the LSST Community forum. 61 | 62 | ..
_pipelines.lsst.io: https://pipelines.lsst.io 63 | -------------------------------------------------------------------------------- /.github/workflows/ci.yaml: -------------------------------------------------------------------------------- 1 | name: Empty check for branch protection 2 | 3 | on: 4 | - push 5 | - pull_request 6 | 7 | jobs: 8 | 9 | null_check: 10 | runs-on: ubuntu-latest 11 | steps: 12 | - uses: actions/checkout@v2 13 | -------------------------------------------------------------------------------- /.github/workflows/rebase_checker.yaml: -------------------------------------------------------------------------------- 1 | --- 2 | name: Check that 'main' is not merged into the development branch 3 | 4 | on: pull_request 5 | 6 | jobs: 7 | call-workflow: 8 | uses: lsst/rubin_workflows/.github/workflows/rebase_checker.yaml@main 9 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .pyvenv 2 | _build 3 | modules 4 | packages 5 | py-api 6 | _static/display_firefly 7 | _static/pipe_base 8 | _static/verify 9 | -------------------------------------------------------------------------------- /COPYRIGHT: -------------------------------------------------------------------------------- 1 | Copyright 2015-2018 Association of Universities for Research in Astronomy 2 | Copyright 2017-2018 University of Washington 3 | Copyright 2017 California Institute of Technology 4 | Copyright 2016 The Board of Trustees of the Leland Stanford Junior University, through SLAC National Accelerator Laboratory 5 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # LSST Science Pipelines Documentation 2 | 3 | **https://pipelines.lsst.io** 4 | 5 | This repository, combined with content from the `doc/` directories of individual LSST 
Science Pipelines packages, is what you see at [pipelines.lsst.io](https://pipelines.lsst.io). 6 | 7 | [pipelines.lsst.io](https://pipelines.lsst.io) is automatically built by LSST's Jenkins CI for each `lsst_distrib` release (major, weekly, and daily releases): 8 | 9 | - The main site, https://pipelines.lsst.io, tracks the latest major release. 10 | - The latest weekly release is published at https://pipelines.lsst.io/v/weekly/ 11 | - The latest daily release is published at https://pipelines.lsst.io/v/daily/ 12 | 13 | You can find links for all editions of this documentation by visiting https://pipelines.lsst.io/v. 14 | 15 | ## Contributing to pipelines.lsst.io 16 | 17 | To learn how to contribute to the [pipelines.lsst.io](https://pipelines.lsst.io) documentation by reporting issues or contributing pull requests, see the [CONTRIBUTING](./.github/CONTRIBUTING.rst) file. 18 | 19 | ## Related projects and resources 20 | 21 | - [LSST Community forum](https://community.lsst.org) 22 | - [DM Developer Guide](https://developer.lsst.io) 23 | 24 | ## Licensing 25 | 26 | The documentation in this repository is licensed under the [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/). 27 | See the included [LICENSE](./LICENSE) file. 28 | 29 | See the included [COPYRIGHT](./COPYRIGHT) file for copyrights. 30 | -------------------------------------------------------------------------------- /_static/README.md: -------------------------------------------------------------------------------- 1 | This `_static` directory is used for assets such as CSS and images. 
2 | -------------------------------------------------------------------------------- /_static/cc-by_large.svg: -------------------------------------------------------------------------------- (SVG markup not reproduced in this dump; the file is the large Creative Commons BY license badge.) -------------------------------------------------------------------------------- /_static/cc-by_small.svg: -------------------------------------------------------------------------------- (SVG markup not reproduced in this dump; the file is the small Creative Commons BY license badge.) -------------------------------------------------------------------------------- /_static/development/github_pr_comment_areas.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lsst/pipelines_lsst_io/46dbbb5eaa65cbb16bb3b6a7150fd396ab60282c/_static/development/github_pr_comment_areas.png -------------------------------------------------------------------------------- /_static/development/jenkins_ci.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lsst/pipelines_lsst_io/46dbbb5eaa65cbb16bb3b6a7150fd396ab60282c/_static/development/jenkins_ci.png -------------------------------------------------------------------------------- /_static/lsst-logo-dark.svg: -------------------------------------------------------------------------------- 1 | lsst-logo-dark -------------------------------------------------------------------------------- /conf.py: -------------------------------------------------------------------------------- 1 | """Sphinx configurations for pipelines_lsst_io.
2 | 3 | These configurations are centrally defined in Documenteer 4 | (https://github.com/lsst-sqre/documenteer). Documentation: 5 | https://documenteer.lsst.io/pipelines/configuration.html 6 | """ 7 | 8 | from documenteer.conf.pipelines import * 9 | 10 | project = "LSST Science Pipelines" 11 | html_theme_options["logotext"] = project 12 | html_title = project 13 | html_short_title = project 14 | 15 | # Patch EUPS tag substitutions 16 | rst_epilog = """ 17 | 18 | .. |eups-tag| replace:: v28_0_2 19 | .. |eups-tag-mono| replace:: ``v28_0_2`` 20 | .. |eups-tag-bold| replace:: **v28_0_2** 21 | """ 22 | 23 | # Patch EUPS and Git tag context for Jinja templating 24 | jinja_contexts = { 25 | "default": { 26 | "release_eups_tag": "v28_0_2", 27 | "release_git_ref": "28.0.2", 28 | "version": "v28_0_2", 29 | "release": "v28_0_2", 30 | "scipipe_conda_ref": "28.0.2", 31 | "pipelines_demo_ref": "28.0.2", 32 | } 33 | } 34 | 35 | jira_uri_template = "https://ls.st/{ticket}" 36 | 37 | # needed for pipe_base 38 | intersphinx_mapping['networkx'] = ('https://networkx.org/documentation/stable/', None) 39 | -------------------------------------------------------------------------------- /gen2tutorialdeprecation.txt: -------------------------------------------------------------------------------- 1 | .. warning:: 2 | 3 | These tutorials are based on the deprecated Generation 2 command-line task and Butler (`lsst.daf.persistence.Butler`). 4 | New tutorials for Generation 3 pipeline tasks and `lsst.daf.butler.Butler` are coming soon. 5 | -------------------------------------------------------------------------------- /getting-started/coaddition.rst: -------------------------------------------------------------------------------- 1 | .. 2 | Brief: 3 | This tutorial is geared towards beginners to the Science Pipelines software. 4 | Our goal is to guide the reader through a small data processing project to show what it feels like to use the Science Pipelines. 
5 | We want this tutorial to be kinetic; instead of getting bogged down in explanations and side-notes, we'll link to other documentation. 6 | Don't assume the user has any prior experience with the Pipelines; do assume a working knowledge of astronomy and the command line. 7 | 8 | .. _getting-started-tutorial-coaddition: 9 | 10 | ################################################ 11 | Getting started tutorial part 5: coadding images 12 | ################################################ 13 | 14 | In this part of the :ref:`tutorial series ` you will combine the individual exposures produced by the ``singleFrame`` pipeline (from :doc:`part 2 `) into deeper coadds (mosaic images). 15 | The dataset that defines how images are reprojected for coaddition is called a **skymap**. 16 | The example repository has a skymap that we can use for this purpose. 17 | We will "warp" (reproject) images into that skymap. 18 | Then, you will coadd the warped images together into deep images. 19 | 20 | Set up 21 | ====== 22 | 23 | Pick up your shell session where you left off in :doc:`part 4 `. 24 | For convenience, start in the top directory of the example git repository. 25 | 26 | .. code-block:: bash 27 | 28 | cd $RC2_SUBSET_DIR 29 | 30 | The ``lsst_distrib`` package also needs to be set up in your shell environment. 31 | See :doc:`/install/setup` for details on doing this. 32 | 33 | About skymaps 34 | ============== 35 | 36 | Before you get started, let's talk about **skymaps.** 37 | 38 | A skymap is a tiling of the celestial sphere, and is used as the coordinate system for the final coadded image. 39 | A skymap is composed of one or more **tracts**. 40 | Those tracts contain smaller regions called **patches**. 41 | Both tracts and patches overlap their neighbors. 42 | 43 | Each tract has a different world coordinate system (WCS), but the WCSs of the patches within a given tract are just linearly-offset versions of the same WCS.
44 | 45 | There are two general categories of skymaps: 46 | 47 | 1. Whole-sky skymaps. 48 | 2. Skymaps based on the bounding box of a set of input exposures. 49 | 50 | Though the HSC dataset you are working with is small, it is drawn from a much larger dataset, so we include the skymap computed for the full dataset; it is of the first (whole-sky) type. 51 | Using a skymap that describes the full sky has the benefit that you can compare directly with data products from larger data processing runs since the tracts and patches will align exactly. 52 | 53 | Warping images onto the skymap 54 | =============================== 55 | 56 | Before assembling the coadded image, you need to *warp* the exposures created by the ``singleFrame`` pipeline onto the pixel grids of patches described in the skymap. 57 | You can use the ``makeDirectWarp`` pipeline for this. 58 | 59 | The data to process can be specified with a dataset query passed to the ``-d`` argument, short for ``--data-query``. 60 | The example command below only processes a subset of the data. 61 | The remaining sections only use a few patches that have coverage from all the input visits, so downsampling in the coadd generation phase can save time. 62 | All the data can be coadded by simply leaving off the ``-d`` switch and its arguments. 63 | Remember that when specifying the output ``tract`` and ``patch`` information, you must also specify a valid skymap. 64 | In this case, the ``hsc_rings_v1`` skymap, from a larger HSC run, is used. 65 | The filtering we will do here is: 66 | 67 | .. code-block:: bash 68 | 69 | -d "tract = 9813 AND skymap = 'hsc_rings_v1' AND patch in (38, 39, 40, 41)" 70 | 71 | The above dataset query has been tailored to work with the `example dataset`_ described in the first tutorial. 72 | 73 | The ``patch in (38, 39, 40, 41)`` clause selects patches 38, 39, 40 and 41 in the skymap. 74 | All five bands for each of the patches will be processed.
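The integer patch ids used in data queries index the tract's rectilinear grid of patches. As a standalone sketch of that arithmetic (plain Python, no LSST imports; the grid width of 9 patches per row is an assumed illustrative value, and ids are assumed to increase fastest along x — real code should query the skymap itself, as in the Butler example in this tutorial):

```python
# Illustrative only: convert a sequential patch id to its (x, y) position
# in a tract's patch grid, and back. The 9-patches-per-row grid width is
# an assumption for this sketch, not a value read from the Butler.

def patch_id_to_index(patch_id, patches_per_row=9):
    """Return the (x, y) grid position for a sequential patch id."""
    return (patch_id % patches_per_row, patch_id // patches_per_row)

def patch_index_to_id(x, y, patches_per_row=9):
    """Return the sequential patch id for an (x, y) grid position."""
    return y * patches_per_row + x

print(patch_id_to_index(41))    # → (5, 4)
print(patch_index_to_id(5, 4))  # → 41
```

The skymap's own accessors are the authoritative source for this mapping; the sketch above is only meant to show why an id like 41 and an index like (5, 4) refer to the same patch.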
75 | You can retrieve the skymap and interrogate it using the various member functions. 76 | 77 | .. code-block:: python 78 | 79 | from lsst.daf.butler import Butler 80 | butler = Butler('SMALL_HSC') 81 | skymap = butler.get('skyMap', skymap='hsc_rings_v1', collections='HSC/RC2/defaults') 82 | tractInfo = skymap.generateTract(9813) 83 | patch = tractInfo[41] 84 | patch.getIndex() 85 | --> (5, 4) 86 | 87 | The data queries will typically use the integer id for the patches, but you can use code like that above to find out the x and y indices into the tract's rectilinear grid of patches. 88 | In this case, the patch with id 41 is located at the position (5, 4) in tract 9813. 89 | 90 | To warp the images, you will use the ``pipetask run`` command again. 91 | This time you will specify the ``makeDirectWarp`` subset and an appropriate output collection. 92 | This example uses ``coadds`` as the output collection. 93 | 94 | .. code-block:: bash 95 | 96 | pipetask run --register-dataset-types \ 97 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 98 | -i u/$USER/source_calibration,u/$USER/gbdes,u/$USER/fgcm \ 99 | -o u/$USER/warps \ 100 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#makeDirectWarp \ 101 | -d "skymap = 'hsc_rings_v1' AND tract = 9813 AND patch in (38, 39, 40, 41)" 102 | 103 | Note that warping requires the outputs of both ``gbdes`` and ``FGCM``, so both of those collections need to be specified as inputs. 104 | Again, this will warp all calibrated exposures. 105 | If you wish to pare down the data to be processed, you can specify a data query like the one earlier in this section using the ``-d`` argument. 106 | 107 | .. tip:: 108 | 109 | As with the ``singleFrame`` pipeline, warping only needs the data from an input visit and the skymap. 110 | Each warp can be done independently of every other warp. 111 | That means it is a good candidate for running in parallel. 
112 | If you have access to more than one core for processing, specifying the ``-j`` argument will speed up this step. 113 | 114 | 115 | Coadding warped images 116 | ====================== 117 | 118 | Now you will select warped images to include in the coadds using the ``selectDeepCoaddVisits`` task, then assemble the warped images into coadditions for each patch with the ``assembleCoadd`` pipeline. 119 | Note that the two pipelines (``selectDeepCoaddVisits`` and ``assembleCoadd``) can be specified in a single call to ``pipetask`` by providing ``#selectDeepCoaddVisits,assembleCoadd``. 120 | As before, a data query passed with the ``-d`` argument restricts processing to a subset of the data, just as with warping. 121 | In this case the ``-d`` argument could be omitted, since the coaddition process will only find the warped images from the previous command and will thus only produce coadds for those patches. 122 | 123 | Run: 124 | 125 | .. code-block:: bash 126 | 127 | pipetask run --register-dataset-types \ 128 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 129 | -i u/$USER/warps \ 130 | -o u/$USER/coadds \ 131 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#selectDeepCoaddVisits,assembleCoadd \ 132 | -d "skymap = 'hsc_rings_v1' AND tract = 9813 AND patch in (38, 39, 40, 41)" 133 | 134 | .. tip:: 135 | 136 | While coaddition can be done in parallel, each process is more memory intensive than warping because multiple visits from multiple detectors may be put in memory at once. 137 | Still, if you have access to a machine with a fair amount of memory, the ``-j`` option can speed up this step. 138 | 139 | Wrap up 140 | ======= 141 | 142 | In this tutorial, you've warped exposures into a pre-existing skymap, and then coadded the exposures to make deep mosaics. 143 | Here are some key takeaways: 144 | 145 | - Skymaps define the WCS of coadditions.
146 | - Skymaps are composed of tracts, each of which is composed of smaller patches. 147 | - The ``makeDirectWarp`` pipeline warps exposures into the WCSs of the skymap. 148 | - The ``assembleCoadd`` pipeline coadds warped exposures into deep mosaics. 149 | 150 | Continue this tutorial in :doc:`part 6, where you'll measure sources ` in the coadds. 151 | 152 | .. _example dataset: https://github.com/lsst/rc2_subset 153 | -------------------------------------------------------------------------------- /getting-started/data-setup.rst: -------------------------------------------------------------------------------- 1 | .. 2 | Brief: 3 | This tutorial is geared towards new users of the LSST Science Pipelines software. 4 | Our goal is to guide the reader through a small data processing project to show what it feels like to use the Science Pipelines. 5 | We want this tutorial to be kinetic; instead of getting bogged down in explanations and side-notes, we'll link to other documentation. 6 | Don't assume the user has any prior experience with the Pipelines; do assume a working knowledge of astronomy and the command line. 7 | 8 | .. _getting-started-tutorial-data-setup: 9 | 10 | ###################################################################### 11 | Getting started tutorial part 1: setting up the Butler data repository 12 | ###################################################################### 13 | 14 | This hands-on tutorial is intended for anyone getting started with using the LSST Science Pipelines for data processing. 15 | You'll get a feel for setting up a Pipelines environment, working with data repositories, running processing from the command-line, and working with the Pipelines' Python APIs. 16 | Along the way we'll point you to additional documentation. 17 | 18 | The LSST Science Pipelines can process data from several telescopes using LSST's algorithms. 
19 | In this :ref:`tutorial series ` you will calibrate and reduce Hyper Suprime-Cam (HSC) exposures into coadditions and catalogs of objects. 20 | 21 | In this first part of the :ref:`tutorial series ` you'll set up the LSST Science Pipelines software, and obtain the data needed for the remainder of the tutorial. 22 | Along the way, you'll be introduced to the Butler, which is the Pipelines' interface for managing, reading, and writing datasets. 23 | 24 | Install the LSST Science Pipelines 25 | ================================== 26 | 27 | If you haven't already, you'll need to install the LSST Science Pipelines. 28 | We recommend that you install the pre-built binary packages by following the instructions at :doc:`/install/lsstinstall`. 29 | This tutorial was developed using the ``v28_0_0`` tag (which is based on ``w_2024_42``) of the ``lsst_distrib`` EUPS package. 30 | We expect the tutorials to also work with newer versions of the science pipelines, however continuing development will eventually outpace the directions contained here. 31 | 32 | When working with the LSST Science Pipelines, you need to remember to activate the installation and *set up* the package stack in each new shell session. 33 | Follow the instructions :doc:`/install/setup` to do this. 34 | 35 | To make sure the environment is set up properly, you can run: 36 | 37 | .. code-block:: bash 38 | 39 | eups list lsst_distrib 40 | 41 | The line printed out should contain the word ``setup``. 42 | If not, review the :doc:`set up instructions `. 43 | It may simply be that you're working in a brand new shell. 44 | 45 | Downloading the sample HSC data 46 | =============================== 47 | 48 | Sample data for this tutorial comes from the `rc2_subset`_ package. 49 | `rc2_subset`_ contains a small set of Hyper Suprime-Cam (HSC) exposures. 50 | The Science Pipelines provides native integrations for many observatories, including HSC, CFHT/MegaCam, and of course LSST. 
51 | 52 | `rc2_subset`_ is a Git LFS-backed package, so make sure you've :doc:`installed and configured Git LFS for LSST <../install/git-lfs>`. 53 | 54 | .. important:: 55 | 56 | Even if you've used Git LFS before, you do need to :doc:`configure it to work with LSST's servers <../install/git-lfs>`. 57 | 58 | First, clone `rc2_subset`_ using Git: 59 | 60 | .. jinja:: default 61 | 62 | .. code-block:: bash 63 | 64 | git clone -b {{ release_eups_tag }} https://github.com/lsst/rc2_subset 65 | 66 | Then :command:`setup` the package to add it to the EUPS stack: 67 | 68 | .. code-block:: bash 69 | 70 | setup -j -r rc2_subset 71 | 72 | .. tip:: 73 | 74 | The ``-r rc2_subset`` argument is the package's directory path (either absolute or relative). 75 | In this case, it is the relative path to the directory you just cloned. 76 | 77 | The ``-j`` argument means that we're **just** setting up ``rc2_subset`` without affecting other packages. 78 | 79 | Now run: 80 | 81 | .. code-block:: bash 82 | 83 | echo $RC2_SUBSET_DIR 84 | 85 | The ``$RC2_SUBSET_DIR`` environment variable should be the `rc2_subset`_ directory's path. 86 | 87 | Creating a Butler object for HSC data 88 | ========================================= 89 | 90 | In the LSST Science Pipelines you don't directly manage data files. 91 | Instead, you access data through an instance of the **Butler** class. 92 | This gives you flexibility to work with data from different observatories without significantly changing your workflow. 93 | 94 | The Butler manages data in **repositories.** 95 | Butler repositories can be remote (the data are on a server, across a network) or local (the data are on a local filesystem). 96 | In this tutorial you'll create and use a local Butler repository, which is a simple directory. 97 | 98 | The `rc2_subset`_ git repository has a Butler repository contained within it. 99 | To construct a Butler that can manage data in that repository, run the following from a Python prompt: 100 | 101 | ..
code-block:: python 102 | 103 | from lsst.daf.butler import Butler 104 | import os 105 | repo_path = os.path.join(os.environ['RC2_SUBSET_DIR'], 'SMALL_HSC') 106 | butler = Butler(repo_path) 107 | 108 | Now you can explore the repository using the ``registry`` attribute of the Butler you created. For example: 109 | 110 | .. code-block:: python 111 | 112 | registry = butler.registry 113 | for col in registry.queryCollections(): 114 | print(col) 115 | for ref in registry.queryDatasets('raw', collections='HSC/raw/all', instrument='HSC'): 116 | print(ref.dataId) 117 | 118 | Read more about querying datasets :ref:`here `. 119 | 120 | Notes on terminology 121 | ==================== 122 | 123 | First, a coherent set of pixels can have lots of names. 124 | In this set of tutorials, you will run into three. 125 | The term "exposure" refers to a single image. 126 | The camera produces exposures that can be ingested into a data butler. 127 | Once ingested, exposures can be grouped together into "visits" via the ``define-visits`` subcommand to the ``butler`` command line tool. 128 | Visits can be made up of more than one exposure; for example, the LSST baseline plan calls for each visit to be made up of two "snaps". 129 | You will also see mention of ``Exposure``. 130 | This is the name of the Python class, instances of which are used to manipulate pixel data within the Science Pipelines. 131 | The Python class will always be presented capitalized and in monospace. 132 | 133 | Second, different projects use different names for astrophysical bodies and measurements of them. 134 | In this project, "sources" are specific measurements of an astrophysical "object". 135 | The term "object" refers to the astrophysical entity itself. 136 | In other words, there is a unique record for each distinct object seen by the LSST, but multiple source measurements for each time the LSST revisits a particular part of the sky. 137 | 138 | Third, you will see mention of "pipelines".
139 | Formally a ``Pipeline`` is made up of one or more ``PipelineTask`` objects. 140 | These can be further grouped into other pipelines. 141 | You will see reference to "subsets" of a pipeline. 142 | This just means a named set of ``PipelineTasks`` that makes up a part of a larger pipeline, but that can be run independently. 143 | 144 | Fourth, the Butler has a concept of "dataset type". 145 | As discussed in `Organizing and identifying datasets `_, a ``DatasetType`` roughly corresponds to the role its datasets play in a processing pipeline, and a particular pipeline will typically accept particular dataset types as inputs and produce particular dataset types as outputs. 146 | In the context of this getting started tutorial, the most important mappings you will encounter between ``DatasetTypes`` and their corresponding concepts are as follows: a dataset type of ``raw`` corresponds to raw (uncalibrated) detector images, a dataset type of ``calexp`` corresponds to calibrated detector images, and a dataset type of ``deepCoadd`` corresponds to coadded sky images. 147 | 148 | Notes on processing 149 | =================== 150 | 151 | The intention of this set of introductory recipes is to give you a realistic sense of how data are processed using the LSST Science Pipelines. 152 | That includes taking raw images all the way through to coaddition and forced photometry. 153 | Though the starting repository is small, a significant amount of processing needs to be done to produce all the datasets needed for downstream processing. 154 | This means that some steps can be quite time consuming and you should be prepared to wait or perhaps run things overnight if you intend to follow these examples line by line. 
155 | 156 | The most time consuming steps are: 157 | 158 | - Single frame processing: 11 hours 159 | - Warping the images in preparation for coaddition: 90 minutes 160 | - Coaddition: 70 minutes 161 | - Coadd detection, deblending and measurement: 90 minutes 162 | - Forced photometry: 75 minutes 163 | 164 | These timings are all for a single serial thread. 165 | Some steps can be sped up significantly if you have access to more than one core. 166 | For example, to speed up the single frame processing, you can try adding the ``-j4`` argument. 167 | This will attempt to run the processing on 4 cores simultaneously. 168 | 169 | Wrap up 170 | ======= 171 | 172 | In this tutorial, you've set up a Butler repository with the data you'll process in later steps. 173 | Here are some key takeaways: 174 | 175 | - The Butler is the interface between data and LSST Science Pipelines processing tasks. 176 | - Butler repositories can be hosted on different backends, both remote and local. In this case you created a local Butler repository on your computer's filesystem. 177 | - Butler repositories contain raw data, calibrations, and reference catalogs. As you'll see in future tutorials, the Butler repository also contains the outputs of processing tasks. 178 | - If you are interested in creating a butler repository with your own data, the `Community Forum`_ is the right place to search for and ask questions. 179 | 180 | In :doc:`part 2 of this tutorial series ` you will process the HSC data in this newly-created Butler repository into calibrated exposures. 181 | 182 | .. _rc2_subset: https://github.com/lsst/rc2_subset 183 | .. 
_Community Forum: https://community.lsst.org 184 | -------------------------------------------------------------------------------- /getting-started/dcr-guide.rst: -------------------------------------------------------------------------------- 1 | ###################################### 2 | Generating DCR Coadds with LSST ComCam 3 | ###################################### 4 | 5 | This article walks through the building of differential chromatic refraction (DCR) coadds by running the 6 | ``dcrAssembleCoaddTask`` on an example set of ComCam image data. Prior to generating DCR coadds, image 7 | templates must already be imported. 8 | 9 | .. note:: 10 | 11 | This guide assumes the user has access to a shared Butler repository containing data from LSST ComCam via 12 | the `US Data Facility (USDF) `__. This guide further assumes 13 | the user has a recently-built version of ``lsst.distrib`` from the `LSST Science Pipelines 14 | `__ (circa ``w_2025_09`` or later). 15 | 16 | Importing AP Templates 17 | ======================= 18 | For detailed instructions on importing AP Templates, see the `importing good seeing templates 19 | `__ guide for details. 20 | 21 | 22 | Assembling DCR Coadds 23 | ===================== 24 | To generate DCR coadds, use the ``dcrAssembleCoaddTask`` found in the ``drp_tasks`` package. This requires 25 | image templates. The DCR subfilters must be registered prior to creating DCR coadds by following the 26 | `register-dcr-subfilters guide `__. The ``dcrNumSubfilters`` in the config must match the number 27 | of subfilters specified when running ``register-dcr-subfilters``. 28 | 29 | 30 | Creating Your DCR Assemble Coadd Pipeline 31 | ----------------------------------------- 32 | The first step in generating DCR coadds is to make a pipeline that runs the ``dcrAssembleCoaddTask`` from the 33 | LSST ``drp_tasks`` package. A sample ``dcrAssembleCoadd`` pipeline is shown below. 34 | 35 | .. 
code-block:: yaml 36 | 37 | description: A DCR assemble coadd pipeline for LSSTComCam. 38 | instrument: lsst.obs.lsst.LsstComCam 39 | 40 | parameters: 41 | coaddName: dcr 42 | tasks: 43 | DcrAssembleCoaddTask: 44 | class: lsst.drp.tasks.dcr_assemble_coadd.DcrAssembleCoaddTask 45 | config: 46 | effectiveWavelength: 478.5 # in nm 47 | bandwidth: 147.0 # in nm 48 | 49 | Save the pipeline file as ``dcrAssembleCoadd.yaml``. To visualize the pipeline, consider using 50 | ``pipetask build``. For example, 51 | 52 | .. code-block:: bash 53 | 54 | pipetask build -p path/to/your/dcrAssembleCoadd.yaml --pipeline-dot dcrCoadd.dot 55 | dot dcrCoadd.dot -Tpng > dcrCoadd.png 56 | 57 | 58 | Assembling DCR Coadds with ``pipetask run`` 59 | ------------------------------------------- 60 | To generate DCR coadds using ``pipetask run``, make an appropriate output collection name 61 | (``u/USERNAME/OUTPUT-COLLECTION`` in the example below), and then use the following command. 62 | 63 | .. code-block:: bash 64 | 65 | pipetask run -j 4 -b /repo/main -d "skymap='lsst_cells_v1' AND visit in (2024112800132,2024112900230) \ 66 | AND detector in (0,1) AND band='g'" -i path/to/your/templates,LSSTComCam/defaults \ 67 | -o u/USERNAME/OUTPUT-COLLECTION -p path/to/your/dcrAssembleCoadd.yaml 68 | 69 | To run the process in the background and write its output to a logfile, consider prefixing the 70 | ``pipetask run`` command with ``nohup`` and appending ``> OUTFILENAME &`` to it. Once complete, the DCR coadds from 71 | your templates should have been generated. 72 | 73 | 74 | Assembling DCR Coadds with BPS 75 | ------------------------------ 76 | To assemble DCR coadds using the `Batch Processing System (BPS) `__, create a BPS submission file 77 | that runs the ``dcrAssembleCoadd`` pipeline file created above (this is the ``pipelineYaml`` input). 78 | Below is a sample BPS submit file. 79 | 80 | .. 
code-block:: yaml 81 | 82 | pipelineYaml: '/path/to/your/dcrAssembleCoadd.yaml' 83 | 84 | project: ComCam-DCRCoadds 85 | campaign: ComCam-dcr-assembleCoadd 86 | 87 | payload: 88 | payloadName: ComCam-dcr-assembleCoadd 89 | butlerConfig: /sdf/group/rubin/repo/main/butler.yaml 90 | inCollection: path/to/your/templates,LSSTComCam/defaults 91 | dataQuery: "instrument='LSSTComCam' and visit in (2024112800132,2024112900230) and detector in (0,1) and skymap='lsst_cells_v1' and band='g'" 92 | 93 | extraQgraphOptions: "--dataset-query-constraint off" 94 | 95 | provisionResources: true 96 | provisioning: 97 | provisioningMaxWallTime: 1-00:00:00 98 | 99 | Save the file as ``bps_submit_dcrAssembleCoadd.yaml`` and submit it with ``bps submit bps_submit_dcrAssembleCoadd.yaml``. 100 | -------------------------------------------------------------------------------- /getting-started/ds9-screenshot.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lsst/pipelines_lsst_io/46dbbb5eaa65cbb16bb3b6a7150fd396ab60282c/getting-started/ds9-screenshot.jpg -------------------------------------------------------------------------------- /getting-started/index.rst: -------------------------------------------------------------------------------- 1 | .. _getting-started: 2 | 3 | ############################################### 4 | Getting started with the LSST Science Pipelines 5 | ############################################### 6 | 7 | Before you start working with the LSST Science Pipelines, these pages will help you get up and running. 8 | 9 | .. _getting-started-tutorial: 10 | 11 | Getting started tutorials 12 | ========================= 13 | 14 | In this tutorial series we'll process a set of Hyper Suprime-Cam images into deep coadditions and source catalogs. 15 | These tutorials will give you a feeling for data processing and analysis with the Science Pipelines. 16 | 17 | .. toctree:: 18 | :maxdepth: 1 19 | 20 | Part 1: setting up the Butler data repository. 
21 | Part 2: calibrating single frames. 22 | Part 3: displaying exposures and source catalogs. 23 | Part 4: full focal plane calibration. 24 | Part 5: coadding images. 25 | Part 6: measuring sources. 26 | Part 7: analyzing measurement catalogs in multiple bands. 27 | 28 | Data processing guides 29 | ====================== 30 | 31 | .. toctree:: 32 | :maxdepth: 1 33 | 34 | Processing DESC DC2 data with the Alert Production pipeline 35 | -------------------------------------------------------------------------------- /getting-started/multiband-analysis.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lsst/pipelines_lsst_io/46dbbb5eaa65cbb16bb3b6a7150fd396ab60282c/getting-started/multiband-analysis.png -------------------------------------------------------------------------------- /getting-started/photometry.rst: -------------------------------------------------------------------------------- 1 | .. 2 | Brief: 3 | This tutorial is geared towards beginners to the Science Pipelines software. 4 | Our goal is to guide the reader through a small data processing project to show what it feels like to use the Science Pipelines. 5 | We want this tutorial to be kinetic; instead of getting bogged down in explanations and side-notes, we'll link to other documentation. 6 | Don't assume the user has any prior experience with the Pipelines; do assume a working knowledge of astronomy and the command line. 7 | 8 | .. _getting-started-tutorial-measuring-sources: 9 | 10 | ################################################## 11 | Getting started tutorial part 6: measuring sources 12 | ################################################## 13 | 14 | In this step of the :ref:`tutorial series ` you'll measure the coadditions you assembled in :doc:`part 5 ` to build catalogs of stars and galaxies. 15 | This is the measurement strategy: 16 | 17 | 1. :ref:`Detect sources in individual coadd patches `. 18 | 2. 
:ref:`Merge those multi-band source detections into a single detection catalog `. 19 | 3. :ref:`Deblend and measure sources in the individual coadds using the unified detection catalog `. 20 | 4. :ref:`Merge the multi-band catalogs of source measurements to identify the best positional measurements for each source `. 21 | 5. :ref:`Re-measure the coadds in each band using fixed positions (forced photometry) `. 22 | 23 | Set up 24 | ====== 25 | 26 | Pick up your shell session where you left off in :doc:`part 5 `. 27 | For convenience, start in the top directory of the example git repository. 28 | 29 | .. code-block:: bash 30 | 31 | cd $RC2_SUBSET_DIR 32 | 33 | The ``lsst_distrib`` package also needs to be set up in your shell environment. 34 | See :doc:`/install/setup` for details on doing this. 35 | 36 | 37 | Run detection pipeline task 38 | =========================== 39 | 40 | Processing will be done in two blocks with two different pipelines. 41 | The first will do steps 1 through 4 in the introduction. 42 | The end result will be calibrated coadd object measurements and calibrated coadd exposures. 43 | 44 | The following command executes steps 1 through 4 as outlined in the introduction to this tutorial. 45 | The command is executed here, and more information on each of the steps is given in the sections following this code block. 46 | The fifth and final step is executed in :ref:`this section `. 47 | 48 | .. code-block:: bash 49 | 50 | pipetask run --register-dataset-types \ 51 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 52 | -i u/$USER/coadds \ 53 | -o u/$USER/coadd_meas \ 54 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#coadd_measurement \ 55 | -d "skymap = 'hsc_rings_v1' AND tract = 9813 AND patch in (38, 39, 40, 41)" \ 56 | -c detection:detection.thresholdValue=250.0 57 | 58 | .. note:: 59 | 60 | The ``-c`` argument is used to set a configuration parameter. 61 | In this case, the detection threshold is set to 250.0. 
62 | This is a high threshold, but it is used here to reduce the number of detections in the tutorial data. 63 | In a real analysis, you would want to set this to a lower value to detect fainter objects. 64 | 65 | Notice that since this task operates on coadds, we can select the coadds using the ``tract`` and ``patch`` data ID keys. 66 | In past sections, the examples left off the ``-d`` argument in order to process all available data. 67 | This example, however, is selecting just four of the patches for this step. 68 | Some algorithms are sensitive to how images are arranged on the sky. 69 | For example, some algorithms expect multiple images to overlap, or multi-band coverage. 70 | Those four patches have coverage from all 40 visits in the tutorial repository, which means less fine-tuning of the configurations is needed, and we can process these patches just as the large-scale HSC processing is done. 71 | As with previous examples, the outputs will go in a collection placed under a namespace defined by your username. 72 | 73 | .. note:: 74 | 75 | The processing in this part can be quite expensive and take a long time. 76 | You can use the ``-j`` argument to allow the processing to use more cores, if you have access to more than one. 77 | 78 | .. _getting-started-tutorial-detect-coadds: 79 | 80 | Detecting sources in coadded images 81 | ----------------------------------- 82 | 83 | To start, detect sources in the coadded images to take advantage of their depth and high signal-to-noise ratio. 84 | The ``detection`` subset is responsible for producing calibrated measurements from the input coadds. 85 | Detection is done on each band and patch separately. 86 | 87 | The resulting datasets are the ``deepCoadd_det`` detections and the ``deepCoadd_calexp`` calibrated coadd exposures. 88 | 89 | .. 
_getting-started-tutorial-merge-coadd-detections: 90 | 91 | Merging multi-band detection catalogs 92 | ------------------------------------- 93 | 94 | Merging the detections from the multiple bands used to produce the coadds allows later steps to use multi-band information in their processing: e.g. deblending. 95 | The ``mergeDetections`` subset creates a ``deepCoadd_mergeDet`` dataset, which is a consistent table of sources across all filters. 96 | 97 | .. _getting-started-tutorial-measure-coadds: 98 | 99 | Deblending and measuring source catalogs on coadds 100 | -------------------------------------------------- 101 | 102 | Seeded by the ``deepCoadd_mergeDet``, the deblender works on each detection to find the flux in each component. 103 | Because it has information from multiple bands, the deblender can use color information to help it work out how to separate the flux into different components. 104 | See the `SCARLET paper `_ for further reading. 105 | The ``deblend`` subset produces the ``deepCoadd_deblendedFlux`` data product. 106 | 107 | The ``measure`` subset is responsible for measuring object properties on all of the deblended children produced by the deblender. 108 | This produces the ``deepCoadd_meas`` catalog data product with flux and shape measurement information for each object. 109 | You'll see how to access these tables later. 110 | 111 | .. _getting-started-tutorial-merge-coadds: 112 | 113 | Merging multi-band source catalogs from coadds 114 | ---------------------------------------------- 115 | 116 | After measurement, the deblended and measured objects from the individual bands can again be merged into a single catalog. 117 | 118 | Merging the single-band detection catalogs into a single multi-band catalog allows for more complete and consistent multi-band photometry by measuring the same source in multiple bands at a fixed position (the forced photometry method) rather than fitting the source's location individually for each band. 
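The fixed-position measurement idea can be illustrated with a deliberately tiny sketch in plain Python. Everything here (the ``aperture_sum`` function, the images, and the positions) is invented for illustration; this is not the Science Pipelines measurement code:

```python
# Toy sketch of fixed-position ("forced") measurement across bands.
# All names and data here are invented for illustration; this is not
# the Science Pipelines implementation.

def aperture_sum(image, x, y, radius=1):
    """Sum pixel values in a small square aperture centred on (x, y)."""
    total = 0.0
    for row in range(y - radius, y + radius + 1):
        for col in range(x - radius, x + radius + 1):
            total += image[row][col]
    return total

# One tiny 5x5 "image" per band; the same source is present in both.
images = {
    "g": [[0, 0, 0, 0, 0],
          [0, 1, 2, 1, 0],
          [0, 2, 9, 2, 0],
          [0, 1, 2, 1, 0],
          [0, 0, 0, 0, 0]],
    "r": [[0, 0, 0, 0, 0],
          [0, 1, 1, 1, 0],
          [0, 1, 4, 1, 0],
          [0, 1, 1, 1, 0],
          [0, 0, 0, 0, 0]],
}

# Positions fixed by the merged reference catalog (one entry per source).
reference_positions = [(2, 2)]

# Forced measurement: every band is measured at the SAME positions.
forced = {
    band: [aperture_sum(image, x, y) for x, y in reference_positions]
    for band, image in images.items()
}
print(forced)  # {'g': [21.0], 'r': [12.0]}
```

Because the positions are shared, the ``g`` and ``r`` fluxes can be meaningfully combined into colors, which is why a merged reference catalog is built before forced photometry runs.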
119 | 120 | For forced photometry you want to use the best position measurements for each source, which could be from different filters depending on the source. 121 | We call the filter that best measures a source the **reference filter**. 122 | The ``mergeMeasurements`` subset creates a ``deepCoadd_ref`` dataset. 123 | This is the seed catalog for computing forced photometry. 124 | 125 | .. _getting-started-tutorial-forced-coadds: 126 | 127 | Running forced photometry on coadds 128 | =================================== 129 | 130 | Now you have accurate positions for all detected sources in the coadds. 131 | Re-measure the coadds using these fixed source positions (the forced photometry method) to create the best possible photometry of sources in your coadds: 132 | 133 | .. code-block:: bash 134 | 135 | pipetask run --register-dataset-types \ 136 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 137 | -i u/$USER/coadd_meas \ 138 | -o u/$USER/objects \ 139 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#forced_objects \ 140 | -d "skymap = 'hsc_rings_v1' AND tract = 9813 AND patch in (38, 39, 40, 41)" 141 | 142 | As above, this selects just the patches that have full coverage. 143 | 144 | The ``forced_objects`` subset of pipelines does several things: 145 | 146 | 1. Forced photometry on the coadds, resulting in the ``deepCoadd_forced_src`` dataset 147 | 2. Forced photometry on the input single frame calibrated exposures, resulting in the ``forced_src`` dataset 148 | 3. Finally, it combines all object-level forced measurements into a single tract-scale catalog, resulting in the ``objectTable_tract`` dataset 149 | 150 | Wrap up 151 | ======= 152 | 153 | In this tutorial, you've created forced photometry catalogs of sources in coadded images. 154 | Here are some key takeaways: 155 | 156 | - *Forced photometry* is a method of measuring sources in several bandpasses using a common source list. 
157 | 158 | :doc:`Continue this tutorial series in part 7 ` where you will analyze and plot the source catalogs that you've just measured. 159 | -------------------------------------------------------------------------------- /getting-started/single_frame.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lsst/pipelines_lsst_io/46dbbb5eaa65cbb16bb3b6a7150fd396ab60282c/getting-started/single_frame.png -------------------------------------------------------------------------------- /getting-started/uber-cal.rst: -------------------------------------------------------------------------------- 1 | .. 2 | Brief: 3 | This tutorial is geared towards beginners to the Science Pipelines software. 4 | Our goal is to guide the reader through a small data processing project to show what it feels like to use the Science Pipelines. 5 | We want this tutorial to be kinetic; instead of getting bogged down in explanations and side-notes, we'll link to other documentation. 6 | Don't assume the user has any prior experience with the Pipelines; do assume a working knowledge of astronomy and the command line. 7 | 8 | .. _getting-started-tutorial-uber-cal: 9 | 10 | ################################################################ 11 | Getting started tutorial part 4: Using all the data to calibrate 12 | ################################################################ 13 | 14 | In this part of the :ref:`tutorial series ` you will use all of the results from the ``singleFrame`` pipeline to improve the photometric and astrometric calibrations. 15 | When there are overlapping datasets from multiple visits, it's possible to average out effects from the atmosphere and improve the overall calibration. 16 | This is sometimes referred to as ubercalibration. 17 | 18 | Improving the photometric and astrometric calibrations can lead to better coadds down the line, and correspondingly better measurements of objects. 
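To make the averaging idea concrete, here is a toy sketch of solving for per-visit photometric zero-point offsets from the stars that overlapping visits have in common. The data, the names, and the simple pairwise average are invented for illustration; the real FGCM solve is a simultaneous fit over all visits, bands, and atmospheric parameters:

```python
# Toy sketch of "ubercalibration": use stars observed in more than one
# visit to solve for per-visit zero-point offsets.  Invented data and
# names; this is not the FGCM algorithm.

# Instrumental magnitudes, visit -> {star_id: mag}.  Every measurement
# in visit_b is 0.12 mag too faint, and in visit_c 0.05 mag too bright.
visits = {
    "visit_a": {"star1": 18.00, "star2": 19.50, "star3": 17.25},
    "visit_b": {"star2": 19.62, "star3": 17.37, "star4": 20.10},
    "visit_c": {"star1": 17.95, "star4": 19.93, "star5": 21.40},
}

def relative_zeropoint(ref_mags, other_mags):
    """Average magnitude difference over the stars two visits share."""
    shared = set(ref_mags) & set(other_mags)
    return sum(ref_mags[s] - other_mags[s] for s in shared) / len(shared)

# Anchor the solution to visit_a and solve the other visits' offsets.
offsets = {
    name: round(relative_zeropoint(visits["visit_a"], mags), 2)
    for name, mags in visits.items()
    if name != "visit_a"
}
print(offsets)  # {'visit_b': -0.12, 'visit_c': 0.05}
```

With only three visits the overlaps are trivial, but the same principle — many stars, many overlaps, one global solution — is what lets a large survey average out per-visit atmospheric errors.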
19 | 20 | For photometric calibration, the system used is the Forward Global Calibration Method (`FGCM`_). 21 | We use a reference catalog from Pan-STARRS. 22 | 23 | Refined astrometric calibration is provided by an algorithm called ``gbdes``. 24 | For ``gbdes`` we use an astrometric reference catalog derived from the second data release of the `Gaia`_ source catalog. 25 | 26 | It is out of scope to go into the details of the algorithms here, but you will learn how to run them. 27 | Also worth noting is that, in this instance, we will not see much benefit from global calibration since our dataset is small, but for larger datasets the improvement can be substantial. 28 | 29 | Set up 30 | ====== 31 | 32 | Pick up your shell session where you left off in :doc:`part 2 `. 33 | For convenience, start in the top directory of the example git repository. 34 | 35 | .. code-block:: bash 36 | 37 | cd $RC2_SUBSET_DIR 38 | 39 | The ``lsst_distrib`` package also needs to be set up in your shell environment. 40 | See :doc:`/install/setup` for details on doing this. 41 | 42 | FGCM 43 | ==== 44 | 45 | As in :doc:`part 2 ` you will be running pipelines configured to produce the results we need for later steps. 46 | 47 | .. code-block:: bash 48 | 49 | pipetask run --register-dataset-types \ 50 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 51 | -i u/$USER/single_frame \ 52 | -o u/$USER/fgcm \ 53 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#fgcm 54 | 55 | This should look very similar to the command executed in :doc:`part 2 `. 56 | There are three differences: 1) the subset to execute changed from ``singleFrame`` to ``fgcm``, 2) the input is now ``single_frame``, which contains pointers to the inputs to and outputs from ``singleFrame``, and 3) the output collection is now ``fgcm``. 57 | 58 | Note that unlike the ``singleFrame`` pipeline, FGCM must be run on only a single core. 59 | Setting the ``-j`` switch to anything other than ``1`` will result in an error. 
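The essence of what an astrometric fitter like ``gbdes`` does — solving for a mapping from measured detector positions to reference positions — can be shown with a deliberately small least-squares toy. The one-dimensional linear model and the data are invented for illustration; the real fit uses high-order models solved jointly across visits:

```python
# Toy sketch of an astrometric fit: given star positions measured on a
# detector and matched to reference positions (e.g. from Gaia), solve
# x_ref ~ a + b * x_det by least squares.  Invented one-dimensional
# data; this is not the gbdes model.

def fit_linear(x, y):
    """Closed-form least squares for y = a + b * x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
    a = mean_y - b * mean_x
    return a, b

# Matched positions: detector pixels vs. reference coordinates,
# constructed to follow x_ref = 0.101 * x_det + 0.1 exactly.
x_det = [100.0, 400.0, 900.0, 1500.0]
x_ref = [10.2, 40.5, 91.0, 151.6]

a, b = fit_linear(x_det, x_ref)
print(round(a, 3), round(b, 4))  # 0.1 0.101
```

Once ``a`` and ``b`` are known, every detection's position can be corrected: the fit is computed once and then applied everywhere, which is the same pattern as the "Apply the calibrations" step later in this tutorial.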
60 | 61 | gbdes 62 | ======== 63 | 64 | You can run ``gbdes`` in much the same way as you ran FGCM. 65 | Change the subset name and collection name appropriately. 66 | E.g.: 67 | 68 | .. code-block:: bash 69 | 70 | pipetask run --register-dataset-types \ 71 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 72 | -i u/$USER/single_frame \ 73 | -o u/$USER/gbdes \ 74 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#gbdesAstrometricFit 75 | 76 | Note the input collection is the same as you passed to ``FGCM`` since ``gbdes`` doesn't depend on any of the outputs of ``FGCM``. 77 | 78 | Apply the calibrations 79 | ====================== 80 | 81 | Now you will want to apply the calibrations derived by running ``FGCM`` and 82 | ``gbdes`` to the source catalogs using the following (as always, changing 83 | the subset name and collection name appropriately): 84 | 85 | .. code-block:: bash 86 | 87 | pipetask run --register-dataset-types \ 88 | -b $RC2_SUBSET_DIR/SMALL_HSC/butler.yaml \ 89 | -i u/$USER/single_frame,u/$USER/fgcm,u/$USER/gbdes \ 90 | -o u/$USER/source_calibration \ 91 | -p $DRP_PIPE_DIR/pipelines/HSC/DRP-RC2_subset.yaml#source_calibration 92 | 93 | Wrap up 94 | ======= 95 | 96 | In this tutorial, you've computed the improved photometric and astrometric calibration from multiple visits, and applied the calibration to the source catalogs from those visits. 97 | Here are some key takeaways: 98 | 99 | - ``FGCM`` provides improved photometric calibration. 100 | - Astrometric calibration improvements are provided by running ``gbdes``. 101 | - Calibrations can be applied to the visit-level source catalogs by running the ``source_calibration`` subset of tasks. 102 | - Given a pipeline description, e.g. the ``.yaml`` file used here, a subset can be specified, so running multiple steps can be done with very similar command line syntax. 103 | 104 | Continue this tutorial in :doc:`part 5, where you'll warp single frame images and stack them to make coadds `. 105 | 106 | .. 
_FGCM: https://arxiv.org/pdf/1706.01542.pdf 107 | .. _Gaia: https://www.cosmos.esa.int/web/gaia/dr2 108 | -------------------------------------------------------------------------------- /index.rst: -------------------------------------------------------------------------------- 1 | ########################## 2 | The LSST Science Pipelines 3 | ########################## 4 | 5 | The LSST Science Pipelines are designed to enable optical and near-infrared astronomy in the “big data” era. 6 | While they are being developed to process the data for the `Rubin Observatory Legacy Survey of Space and Time (Rubin’s LSST) `_, our command line and programming interfaces can be extended to address any optical or near-infrared dataset. 7 | 8 | This documentation covers version |eups-tag-bold|. 9 | :ref:`Learn what's new `. 10 | You can also find documentation for `other versions `__. 11 | 12 | .. _part-getting-started: 13 | 14 | Getting started 15 | =============== 16 | 17 | If you're new to the LSST Science Pipelines, these step-by-step data processing tutorials will get you up and running. 18 | 19 | Data processing tutorial series (these were developed using the ``w_2021_33`` version of the science pipelines): 20 | 21 | - Part 1 :doc:`Data repositories ` 22 | - Part 2 :doc:`Single frame processing ` 23 | - Part 3 :doc:`Image and catalog display ` 24 | - Part 4 :doc:`Global calibration ` 25 | - Part 5 :doc:`Image coaddition ` 26 | - Part 6 :doc:`Source measurement ` 27 | - Part 7 :doc:`Multi-band catalog analysis `. 28 | 29 | Guide for processing DESC DC2 data in a shared repository using the Alert Production pipeline (this was developed using the ``w_2021_30`` version of the science pipelines): 30 | 31 | - :doc:`Processing DESC DC2 data with the Alert Production pipeline `. 32 | 33 | Join us on the `LSST Community forum `_ to get help and share ideas. 34 | 35 | .. toctree:: 36 | :hidden: 37 | :caption: Getting Started 38 | 39 | getting-started/index 40 | 41 | .. 
_part-installation: 42 | 43 | Installation 44 | ============ 45 | 46 | Recommended installation path: 47 | 48 | - :doc:`Installing with lsstinstall ` 49 | - :doc:`install/setup` 50 | - :doc:`install/top-level-packages` 51 | 52 | Alternative distributions and installation methods: 53 | 54 | - :doc:`install/docker` 55 | - :doc:`Installing from source with lsstsw ` 56 | - `CernVM FS `_ (contributed by CC-IN2P3) 57 | 58 | Related topics: 59 | 60 | - :doc:`Configuring Git LFS for data packages ` 61 | - :doc:`install/package-development` 62 | 63 | To install the LSST Simulation software, such as MAF, please follow the `LSST Simulations documentation `_. 64 | 65 | .. This toctree is hidden to let us curate the section above, but still add the install/ pages to the Sphinx toctree 66 | 67 | .. toctree:: 68 | :hidden: 69 | :caption: Installation 70 | 71 | install/index 72 | 73 | .. _part-frameworks: 74 | 75 | Frameworks 76 | ========== 77 | 78 | .. toctree:: 79 | :maxdepth: 2 80 | 81 | middleware/index 82 | 83 | .. _part-modules: 84 | 85 | Python modules 86 | ============== 87 | 88 | .. module-toctree:: 89 | 90 | Additional C++ API reference documentation is currently available at the `doxygen.lsst.codes `__ site. 91 | 92 | .. _part-packages: 93 | 94 | Packages 95 | ======== 96 | 97 | .. package-toctree:: 98 | 99 | .. _part-release-details: 100 | 101 | External documentation 102 | ====================== 103 | 104 | Several packages which are used by the LSST Science Pipelines are documented separately: 105 | 106 | - `astro_metadata_translator `_ - provides generalized infrastructure for handling metadata extraction for astronomical instrumentation 107 | - `SDM Schemas `_ - contains YAML files representing the Rubin Science Data Model (SDM) Schemas 108 | - `Felis `_ - reads and validates the SDM Schemas YAML files into Python objects 109 | 110 | Release details 111 | =============== 112 | 113 | .. 
toctree:: 114 | :maxdepth: 2 115 | 116 | releases/index 117 | known-issues 118 | metrics 119 | 120 | .. _part-indices: 121 | 122 | Indices 123 | ======= 124 | 125 | .. toctree:: 126 | :maxdepth: 1 127 | :hidden: 128 | 129 | Tasks 130 | 131 | - :doc:`Tasks ` 132 | - :ref:`genindex` 133 | - :ref:`Search ` 134 | 135 | .. _part-citation: 136 | 137 | Citing and acknowledging the LSST Science Pipelines 138 | =================================================== 139 | 140 | If you use the science pipelines in a published work, we request that you cite `Bosch et al. 2018 `_, `2019 `_. 141 | In addition, it is appropriate to include an acknowledgement of the form: 142 | 143 | This paper makes use of LSST Science Pipelines software developed by the `Vera C. Rubin Observatory `_. 144 | We thank the Rubin Observatory for making their code available as free software at `https://pipelines.lsst.io `_. 145 | 146 | For studies that make use of the Data Butler and pipeline execution system, we request that you additionally cite `Jenness et al. 2022 `_. 147 | For studies that make use of the SCARLET source separation framework, we request that you additionally cite `Melchior et al. 2018 `_. 148 | 149 | As the Rubin Observatory is not funded to provide support for the LSST Science Pipelines outside of the project, any support provided by project members will come out of their own free or science time. 150 | A fair way to acknowledge their help and good will may be to offer co-authorship on technical papers describing the derived software (e.g. pipelines). 151 | 152 | .. _part-license: 153 | 154 | License 155 | ======= 156 | 157 | All LSST Science Pipelines code is `free software `_, licensed under the terms of the `GNU General Public Licence, Version 3 `_. 158 | You have the freedom to run, copy, distribute, study, change and improve the software as you see fit within the terms of the GPL v3 license. 
159 | Using, modifying or redistributing the LSST Science Pipelines does not make you subject to the LSST Project Publication Policy (`LPM-162 `_). 160 | This documentation is licensed under the `Creative Commons Attribution Share Alike 4.0 International License (CC-BY-SA 4.0) `_. 161 | 162 | .. _part-more-info: 163 | 164 | More info 165 | ========= 166 | 167 | - Join us on the `LSST Community forum, community.lsst.org `_. 168 | - Fork our code on GitHub at https://github.com/lsst. 169 | - Report issues in `Jira `_. 170 | - Some API documentation, particularly for C++, is currently published separately on a `Doxygen site `_. 171 | - Our `Developer Guide `_ describes the procedures and standards followed by the DM team. 172 | - Learn more about Rubin Observatory Data Management by visiting http://lsst.org/about/dm. 173 | - Contribute to our documentation. This guide is on GitHub at `lsst/pipelines_lsst_io `_. 174 | -------------------------------------------------------------------------------- /install/demo.rst: -------------------------------------------------------------------------------- 1 | ###################################################### 2 | Testing the Science Pipelines installation with a demo 3 | ###################################################### 4 | 5 | This demo will allow you to quickly test your LSST Science Pipelines installation, :doc:`regardless of your installation method `. 6 | 7 | 1. Activate the LSST Science Pipelines 8 | ====================================== 9 | 10 | Remember to first load the LSST Science Pipelines into your shell's environment. 11 | The method depends on how the Science Pipelines were installed: 12 | 13 | - :doc:`lsstinstall ` 14 | - :ref:`lsstsw ` 15 | 16 | 2. Download the demo project 17 | ============================ 18 | 19 | Choose a directory to run the demo in. 20 | For example: 21 | 22 | .. 
code-block:: bash 23 | 24 | mkdir -p demo_data 25 | cd demo_data 26 | 27 | Then download the demo's data (if you aren't running the current stable release, see the note below): 28 | 29 | .. jinja:: default 30 | 31 | .. code-block:: bash 32 | 33 | curl -L https://github.com/lsst/pipelines_check/archive/{{ pipelines_demo_ref }}.tar.gz | tar xvzf - 34 | cd pipelines_check-{{ pipelines_demo_ref }} 35 | 36 | .. caution:: 37 | 38 | The demo's version should match your installed LSST Science Pipelines software. 39 | If you installed from source (with :doc:`lsstsw `) or with a :ref:`newer tag `, you'll likely need to run the latest version of the demo (``main`` branch): 40 | 41 | .. code-block:: bash 42 | 43 | curl -L https://github.com/lsst/pipelines_check/archive/main.tar.gz | tar xvzf - 44 | cd pipelines_check-main 45 | 46 | 3. Run the demo 47 | =============== 48 | 49 | Now set up the processing package and run the demo: 50 | 51 | .. code-block:: bash 52 | 53 | setup -r . 54 | ./bin/run_demo.sh 55 | 56 | 57 | Check that no errors are printed out during the execution. 58 | 59 | The script creates a new Butler data repository in the ``DATA_REPO`` subdirectory containing the raw and calibration data found in the ``input_data`` directory. 60 | It then processes the data using the ``pipetask`` command to execute the ``ProcessCcd`` pipeline. 61 | The outputs from processing are written to the ``demo_collection`` collection. 62 | The input data is a single raw image from Hyper Suprime-Cam, detector 10 of visit 903342. 63 | -------------------------------------------------------------------------------- /install/docker.rst: -------------------------------------------------------------------------------- 1 | .. _docker: 2 | 3 | ################### 4 | Running with Docker 5 | ################### 6 | 7 | LSST provides versioned Docker images containing the Science Pipelines software. 
8 | With Docker, you can quickly install, download, and run the LSST Science Pipelines on any platform without compiling from source. 9 | Docker is an effective and reliable alternative to the :doc:`lsstinstall ` and :doc:`lsstsw `\ -based methods that install LSST software directly on your system. 10 | 11 | If you have issues using the LSST Docker images, reach out on the `LSST Community support forum `_. 12 | 13 | .. _docker-prereqs: 14 | 15 | Prerequisites 16 | ============= 17 | 18 | To download Docker images and run containers, you need Docker's software. 19 | The `Docker Community Edition `_ is freely available for most platforms, including macOS, Linux, and Windows. 20 | 21 | If you haven't used Docker before, you might want to learn more about Docker, images, and containers. 22 | Docker's `Getting Started `_ documentation is a good resource. 23 | 24 | .. _docker-quick-start: 25 | 26 | Quick start 27 | =========== 28 | 29 | This command downloads a current version of the LSST Science Pipelines Docker image (see also :ref:`docker-tags`), starts a container, and opens a prompt: 30 | 31 | .. jinja:: default 32 | 33 | .. code-block:: bash 34 | 35 | docker run -ti lsstsqre/centos:7-stack-lsst_distrib-{{ release_eups_tag }} 36 | 37 | Then in the container's shell, load the LSST environment and activate the ``lsst_distrib`` top-level package: 38 | 39 | .. code-block:: bash 40 | 41 | source /opt/lsst/software/stack/loadLSST.bash 42 | setup lsst_distrib 43 | 44 | This step is equivalent to the :doc:`set up instructions ` for a :doc:`lsstinstall `\ -based installation. 45 | In fact, the images are internally based on :command:`lsstinstall`. 46 | 47 | When you're done with the container, exit from the container's shell: 48 | 49 | .. code-block:: bash 50 | 51 | exit 52 | 53 | This returns you to the original shell on your host system. 
54 | 55 | Next, learn more with these topics: 56 | 57 | - :ref:`docker-mount` 58 | - :ref:`docker-detached` 59 | - :ref:`docker-develop` 60 | - :ref:`docker-tags` 61 | 62 | .. _docker-mount: 63 | 64 | How to mount a host directory into a container 65 | ============================================== 66 | 67 | When you run a Docker container, you're working inside a system that is isolated from your host machine. 68 | The container's filesystem is distinct from your host machine's. 69 | 70 | You can mount a host directory into the container, however. 71 | When you mount a host directory to a container, the data and code that reside on your host filesystem are accessible to the container's filesystem. 72 | This is useful for processing data with the LSST Science Pipelines and even developing packages for the Science Pipelines. 73 | 74 | To mount a local directory, add a ``-v`` (volume) argument to the :command:`docker run` command. 75 | For example: 76 | 77 | .. jinja:: default 78 | 79 | .. code-block:: bash 80 | 81 | docker run -it -v `pwd`:/home/lsst/mnt lsstsqre/centos:7-stack-lsst_distrib-{{ release_eups_tag }} 82 | 83 | The example mounts the current working directory (```pwd```) to the ``/home/lsst/mnt`` directory in the container. 84 | 85 | If you run :command:`ls` from the container's prompt you should see all files in the current working directory of the host filesystem: 86 | 87 | .. code-block:: bash 88 | 89 | ls mnt 90 | 91 | As usual with interactive mode (``docker run -it``), you can ``exit`` from the container's shell to stop the container and return to the host shell: 92 | 93 | .. code-block:: bash 94 | 95 | exit 96 | 97 | .. _docker-detached: 98 | 99 | How to run a container in the background and attach to it 100 | ========================================================= 101 | 102 | The :ref:`docker-quick-start` showed you how to run a container in interactive mode. 103 | In this mode, Docker immediately opens a shell in the new container. 
104 | When you ``exit`` from the shell, the container stops.
105 |
106 | An alternative is to run a container in a detached state.
107 | With a detached container, the container won't stop until you explicitly stop it.
108 |
109 | To get started, run the container with the ``-d`` flag (**detached**):
110 |
111 | .. jinja:: default
112 |
113 | .. code-block:: bash
114 |
115 | docker run -itd --name lsst lsstsqre/centos:7-stack-lsst_distrib-{{ release_eups_tag }}
116 |
117 | You still use the ``-it`` arguments to put the container in interactive mode, even though Docker doesn't immediately open a container prompt for you.
118 |
119 | The ``--name lsst`` argument gives the new container a name.
120 | You can choose whatever name makes sense for your work.
121 | This example uses the name "``lsst``."
122 |
123 | Next, from a shell on your host system (the same shell as before, or even a new shell), open a shell in the container with the :command:`docker exec` command:
124 |
125 | .. code-block:: bash
126 |
127 | docker exec -it lsst /bin/bash
128 |
129 | Your prompt is now a prompt inside the container.
130 |
131 | You can repeat this process, attaching to the container multiple times, to open multiple container shells.
132 |
133 | To close a container shell, type ``exit``.
134 |
135 | Finally, to stop the container entirely, run this command from your host's shell:
136 |
137 | .. code-block:: bash
138 |
139 | docker stop lsst
140 |
141 | And delete the container:
142 |
143 | .. code-block:: bash
144 |
145 | docker rm lsst
146 |
147 | .. _docker-develop:
148 |
149 | How to develop packages inside Docker containers
150 | ================================================
151 |
152 | You can develop code, including LSST Science Pipelines packages, with the LSST Science Pipelines Docker images.
153 | This section summarizes the containerized development workflow.
154 | Refer to :doc:`package-development` for general information.
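The workflow below relies on mounting your current working directory into the container at :file:`/home/lsst/mnt`, so every host path maps to a predictable container path. A small sketch of that mapping (the ``pipe_tasks`` path is only illustrative):

```shell
# Map a host path to its container path under the bind mount used below.
host_dir="$(pwd)"                  # directory mounted into the container
container_mount="/home/lsst/mnt"   # mount point inside the container
rel="pipe_tasks/README.rst"        # a file inside the mounted directory (illustrative)

host_path="${host_dir}/${rel}"
container_path="${container_mount}/${rel}"
echo "${host_path} -> ${container_path}"
```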
155 |
156 | Basic set up
157 | ------------
158 |
159 | These steps show how to run a container and build an LSST Science Pipelines package in it:
160 |
161 | 1. **From the host shell,** clone packages into the current working directory.
162 | For example:
163 |
164 | .. code-block:: bash
165 |
166 | git clone https://github.com/lsst/pipe_tasks
167 |
168 | Any datasets you're working with should be in the current working directory as well.
169 |
170 | 2. **From the host shell,** start the container with the current working directory mounted:
171 |
172 | .. jinja:: default
173 |
174 | .. code-block:: bash
175 |
176 | docker run -itd -v `pwd`:/home/lsst/mnt --name lsst lsstsqre/centos:7-stack-lsst_distrib-{{ release_eups_tag }}
177 |
178 | This starts the container in detached mode so you can open and exit multiple container shells.
179 | Follow the steps in :ref:`docker-detached` to open a shell in the container.
180 |
181 | 3. **From the container's shell,** activate the LSST environment and set up the top-level package:
182 |
183 | .. code-block:: bash
184 |
185 | source /opt/lsst/software/stack/loadLSST.bash
186 | setup lsst_distrib
187 |
188 | 4. **From the container's shell,** change into the directory of the package you cloned and set it up.
189 | For example:
190 |
191 | .. code-block:: bash
192 |
193 | cd mnt/pipe_tasks
194 | setup -r .
195 |
196 | .. note::
197 |
198 | Compared to the :ref:`typical development work `, the :command:`setup` command shown here does not include the ``-t $USER`` argument to tag the development package.
199 | This is because the Docker container doesn't have a ``$USER`` environment variable set by default.
200 | You can still set up and develop the package this way; it just won't be tagged by EUPS.
201 |
202 | 5. **From the container's shell,** build the package.
203 | For example:
204 |
205 | ..
code-block:: bash
206 |
207 | scons -Q -j 6 opt=3
208 |
209 | The containerized development workflow
210 | --------------------------------------
211 |
212 | To develop packages with Docker containers you will use a combination of shells and applications on both the host system and inside the Docker container.
213 |
214 | **On the host system** you will run your own code editors and :command:`git` to develop the package.
215 | This way you don't have to configure an editor or :command:`git` inside the container.
216 | This is why we mount a local directory with the code and data in it.
217 |
218 | **In container shells** you run commands to set up packages (:command:`setup`), compile code (:command:`scons`), test code (:command:`pytest`), and run the Pipelines on data (:command:`processCcd.py`, for example).
219 | Use :command:`docker exec` to open multiple shells in the container (see :ref:`docker-detached`).
220 |
221 | Cleaning up the development container
222 | -------------------------------------
223 |
224 | You can stop and delete the container at any time:
225 |
226 | .. code-block:: bash
227 |
228 | docker stop lsst
229 | docker rm lsst
230 |
231 | In this example, the container is named ``lsst``; substitute the name you gave your container.
232 |
233 | Stopping and deleting a container doesn't affect the data in the local directory you mounted into that container.
234 |
235 | .. _docker-tags:
236 |
237 | Finding images for different LSST Science Pipelines releases
238 | ============================================================
239 |
240 | LSST Science Pipelines Docker images are published as `lsstsqre/centos`_ on Docker Hub.
241 | These images are based on an AlmaLinux_ base image.
242 |
243 | Docker images are versioned with tags, allowing you to run any release of the LSST Science Pipelines software.
244 | The schema of these tags is:
245 |
246 | .. code-block:: text
247 |
248 | <OS version>-stack-<top-level package>-<EUPS tag>
249 |
250 | For example:
251 |
252 | .. jinja:: default
253 |
254 | ..
code-block:: text
255 |
256 | 7-stack-lsst_distrib-{{ release_eups_tag }}
257 |
258 | This tag corresponds to:
259 |
260 | - CentOS 7 operating system.
261 | - ``lsst_distrib`` :doc:`top-level package `.
262 | - ``{{ release_eups_tag }}`` EUPS tag. See :ref:`lsstinstall-other-tags` for an overview of LSST's EUPS tag schema.
263 |
264 | .. note::
265 |
266 | Although the container tag suggests it uses CentOS 7, the underlying image is AlmaLinux 9. The implementation of :jira:`RFC-1037` will introduce proper naming conventions for AlmaLinux.
267 |
268 | You can see what tags are available by browsing `lsstsqre/centos on Docker Hub `_.
269 |
270 | .. seealso::
271 |
272 | See :ref:`lsstinstall-other-tags` for information on the different types of EUPS tags.
273 |
274 | .. _`lsstsqre/centos`: https://hub.docker.com/r/lsstsqre/centos/
275 | .. _AlmaLinux: https://almalinux.org
276 | -------------------------------------------------------------------------------- /install/git-lfs.rst: --------------------------------------------------------------------------------
1 | ######################################################
2 | Configuring Git LFS for downloading LSST data packages
3 | ######################################################
4 |
5 | LSST uses `Git LFS`_ to efficiently store large files in Git repositories.
6 | Typical Science Pipelines installations, like ``lsst_distrib``, *do not* require Git LFS.
7 | However, some tutorials might require Git LFS to clone a specific Git repository that does use Git LFS.
8 | The `testdata_ci_hsc`_ package is one example.
9 | This page describes how to configure Git LFS to work with LSST's servers.
10 |
11 | .. note::
12 |
13 | LSST staff and contributors should follow the `instructions in the Developer Guide`_, specifically `Authenticating for push access`_, for configuring Git LFS with authenticated (push) access.
14 |
15 | ..
_git-lfs-installation:
16 |
17 | Getting Git LFS
18 | ===============
19 |
20 | Git LFS may have been installed with your LSST Science Pipelines installation.
21 |
22 | To check that it's available, run:
23 |
24 | .. code-block:: bash
25 |
26 | git-lfs version
27 |
28 | If it is not installed, or if you want a newer version, you can install Git LFS independently of the LSST Science Pipelines.
29 |
30 | Follow the instructions on the `Git LFS homepage`_ to install Git LFS onto your system, and then follow the first bullet point in "Getting Started."
31 |
32 | If all you need to do is use files that are stored in Git LFS, and you will not need to push changes to them, then
33 |
34 | .. code-block:: bash
35 |
36 | git-lfs install
37 |
38 | is all you need to run after installing the Git LFS binaries. This must be done on each machine from which you will be accessing repositories with Git LFS-stored artifacts.
39 |
40 | If all you need is read-only access to LFS-stored files, you're done. The ``.gitattributes`` file will already have been created appropriately in each LFS-using repository.
41 |
42 | .. _git-lfs-test:
43 |
44 | Try it out
45 | ==========
46 |
47 | Try cloning the `testdata_decam`_ Git repository to test your configuration:
48 |
49 | .. code-block:: bash
50 |
51 | git clone https://github.com/lsst/testdata_decam.git
52 |
53 |
54 |
55 | Push access
56 | ===========
57 |
58 | LSST contributors need to follow some extra steps to authenticate commands that push to the upstream repository on GitHub.
59 | See `Authenticating for push access`_.
60 |
61 | .. _`Git LFS homepage`:
62 | .. _Git LFS: https://git-lfs.github.com/
63 | .. _`Developer Guide for details`:
64 | .. _`instructions in the Developer Guide`: https://developer.lsst.io/git/git-lfs.html
65 | .. _`Authenticating for push access`: https://developer.lsst.io/git/git-lfs.html#git-lfs-auth
66 | .. _`testdata_decam`: https://github.com/lsst/testdata_decam
67 | ..
_`testdata_ci_hsc`: https://github.com/lsst/testdata_ci_hsc 68 | -------------------------------------------------------------------------------- /install/index.rst: -------------------------------------------------------------------------------- 1 | ##################################### 2 | Installing the LSST Science Pipelines 3 | ##################################### 4 | 5 | .. This page acts as an index.html for the install/ path, but the curated contents are on the homepage. 6 | 7 | .. toctree:: 8 | :maxdepth: 1 9 | 10 | prereqs 11 | lsstinstall 12 | setup 13 | top-level-packages 14 | docker 15 | lsstsw 16 | demo 17 | git-lfs 18 | package-development 19 | -------------------------------------------------------------------------------- /install/lsstsw.rst: -------------------------------------------------------------------------------- 1 | .. _install-lsstsw: 2 | 3 | ####################################### 4 | Installation with lsstsw and lsst-build 5 | ####################################### 6 | 7 | This page guides you through installing the LSST Science Pipelines from source with lsstsw and lsst-build. 8 | These are the same tools LSST Data Management uses to build and test the Science Pipelines. 9 | 10 | Since lsstsw presents the Science Pipelines as a directory of Git repositories cloned from `github.com/lsst `__, this installation method can be very convenient for developing Science Pipelines code, particularly when modifying multiple packages at the same time. 11 | Other methods of installing LSST Science Pipelines software are :doc:`lsstinstall ` and :doc:`Docker `. 12 | 13 | If you have issues using lsstsw, here are two ways to get help: 14 | 15 | - Review the :ref:`known installation issues `. 16 | - Ask a question on the `LSST Community support forum `_. 17 | 18 | .. _lsstsw-prerequisites: 19 | 20 | 1. 
Prerequisites 21 | ================ 22 | 23 | The LSST Science Pipelines are developed and tested primarily on AlmaLinux 9, but can be compiled and run on macOS, Debian, Ubuntu, and other Linux distributions. 24 | See :ref:`prereq-platforms` for information about LSST's official reference platform and build reports with other platforms, and follow the instructions under :ref:`system-prereqs` to ensure you have installed the prerequisite software for your platform. 25 | 26 | .. _lsstsw-deploy: 27 | 28 | 2. Deploy lsstsw 29 | ================ 30 | 31 | Begin by choosing a working directory, then deploy ``lsstsw`` into it: 32 | 33 | .. code-block:: bash 34 | 35 | git clone https://github.com/lsst/lsstsw.git 36 | cd lsstsw 37 | ./bin/deploy 38 | source bin/envconfig 39 | 40 | If you are running in a :command:`csh` or :command:`tcsh`, change the last line to: 41 | 42 | .. code-block:: bash 43 | 44 | source bin/envconfig.csh 45 | 46 | For more information about the :command:`deploy` command, see :ref:`lsstsw-about-deploy`. 47 | 48 | .. _lsstsw-rebuild: 49 | 50 | If you intend to use a Git LFS repository, like `testdata_ci_hsc`_ or `afwdata`_, you should :doc:`configure Git LFS ` before you continue. 51 | 52 | 3. Build the Science Pipelines packages 53 | ======================================= 54 | 55 | From the :file:`lsstsw` directory, run: 56 | 57 | .. code-block:: bash 58 | 59 | rebuild -t current lsst_distrib 60 | 61 | Once the ``rebuild`` step finishes, note the build number printed on screen. 62 | It is formatted as "``bNNNN``." 63 | The ``-t current`` argument automatically marks the installed packages with the "current" tag, so that eups will set them up when no version is specified. 64 | The equivalent command to do this manually would be: 65 | 66 | .. code-block:: bash 67 | 68 | eups tags --clone bNNNN current 69 | 70 | Finally, set up the packages with EUPS: 71 | 72 | .. 
code-block:: bash
73 |
74 | setup lsst_distrib
75 |
76 | See :doc:`setup` for more information.
77 |
78 | .. note::
79 |
80 | You can do more with the :command:`build` command, including building from branches of GitHub repositories.
81 | For more information:
82 |
83 | - :ref:`lsstsw-about-rebuild`.
84 | - :ref:`lsstsw-branches`.
85 | - :ref:`lsstsw-rebuild-ref`.
86 |
87 | .. _lsstsw-testing-your-installation:
88 |
89 | 4. Testing your installation (optional)
90 | =======================================
91 |
92 | Once the LSST Science Pipelines are installed, you can verify that the installation works by :doc:`running a demo project `.
93 |
94 | .. _lsstsw-upgrading:
95 |
96 | 5. Upgrading your installation
97 | ==============================
98 |
99 | You can upgrade an lsstsw installation in-place by following these steps from within your :file:`lsstsw/` directory.
100 |
101 | #. Start a shell that has not sourced :file:`envconfig`; you may have to comment out a line in, for example, your ``.bashrc``. The :file:`bin/deploy` script needs to run without an active environment to be able to install a new one.
102 | #. Run :command:`git pull` to download the latest environment definition.
103 | #. Run :file:`bin/deploy` to install that new conda environment.
104 | #. Start a new shell for the final command, to ensure your shell environment is properly configured for the new lsstsw environment, and run ``source lsstsw/bin/envconfig`` if it is not sourced automatically during your shell startup.
105 | #. Run ``rebuild -u -t current lsst_distrib`` to download the latest repos definition file, rebuild the entire Science Pipelines codebase, and mark the installed packages with the EUPS "current" tag.
106 |
107 | If you do not intend to use your older builds in the future, you can remove all of the sub-directories in your :file:`stack/VERSION/` path (where ``VERSION`` is the old environment version) before the upgrade, to save space and reduce the number of EUPS package versions.
108 |
109 | ..
_lsstsw-setup: 110 | 111 | Sourcing the Pipelines in a new shell 112 | ===================================== 113 | 114 | In every new shell session you will need to set up the Science Pipelines environment and EUPS package stack. 115 | 116 | Run these two steps: 117 | 118 | 1. Activate the lsstsw software environment by sourcing the :file:`envconfig` script in lsstsw's :file:`bin` directory: 119 | 120 | .. code-block:: bash 121 | 122 | source bin/envconfig 123 | 124 | If you are running in a :command:`csh` or :command:`tcsh`, run this set up script instead: 125 | 126 | .. code-block:: bash 127 | 128 | source bin/envconfig.csh 129 | 130 | 2. Set up a :doc:`top-level package `: 131 | 132 | .. code-block:: bash 133 | 134 | setup lsst_distrib 135 | 136 | Instead of ``lsst_distrib``, you can set up a different top-level package like ``lsst_apps`` or any individual EUPS package you previously installed. 137 | See :doc:`top-level-packages`. 138 | 139 | .. _lsstsw-next: 140 | 141 | Next steps and advanced topics 142 | ============================== 143 | 144 | - :ref:`lsstsw-about-deploy`. 145 | - :ref:`lsstsw-about-rebuild`. 146 | - :ref:`lsstsw-branches`. 147 | - :ref:`lsstsw-deploy-ref`. 148 | - :ref:`lsstsw-rebuild-ref`. 149 | 150 | .. _lsstsw-about-deploy: 151 | 152 | About the lsstsw deploy script 153 | ------------------------------ 154 | 155 | The ``deploy`` script automates several things to prepare an LSST development environment: 156 | 157 | 1. Installs Miniconda_ and a Python 3 environment specific to this lsstsw workspace, including (another) Git and Git LFS. 158 | 2. Installs EUPS_ into :file:`eups/current/`. 159 | 3. Clones `lsst-build`_, the tool that runs the build process. 160 | 4. Clones versiondb_, a robot-managed Git repository of package dependency information. 161 | 5. Creates an empty stack *installation* directory, :file:`stack/`. 
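Taken together, a freshly deployed workspace looks roughly like this (a sketch only; exact directory names can vary between lsstsw versions):

```text
lsstsw/
├── bin/            # deploy, envconfig, and rebuild entry points
├── miniconda/      # the Conda-based Python environment
├── eups/current/   # EUPS
├── lsst_build/     # clone of lsst-build
├── versiondb/      # clone of the versiondb repository
├── build/          # package repositories cloned by rebuild
└── stack/          # the installed stack
```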
162 |
163 | This environment, including the EUPS, Miniconda, Git, and Git LFS software, is only activated when you source the :file:`bin/envconfig` or :file:`bin/envconfig.csh` scripts in a shell.
164 | Otherwise, lsstsw does not affect the software installed on your computer.
165 |
166 | See also: :ref:`lsstsw-deploy-ref`.
167 |
168 | .. _lsstsw-about-rebuild:
169 |
170 | About the lsstsw rebuild command
171 | --------------------------------
172 |
173 | The :command:`rebuild` command accomplishes the following:
174 |
175 | 1. Clones all Science Pipelines packages from `github.com/lsst `__.
176 | The `repos.yaml`_ file in the https://github.com/lsst/repos repository maps package names to GitHub repositories.
177 |
178 | 2. Runs the Scons-based build process to compile C++, make Pybind11 bindings, and ultimately create the :lmod:`lsst` Python package.
179 | The stack is built and installed into the :file:`stack/` directory inside your :file:`lsstsw/` work directory.
180 |
181 | lsstsw clones repositories using HTTPS (`see repos.yaml `_).
182 | Our guide to `Setting up a Git credential helper `_ will allow you to push new commits up to GitHub without repeatedly entering your GitHub credentials.
183 |
184 | See also: :ref:`lsstsw-rebuild-ref`.
185 |
186 | .. _lsstsw-branches:
187 |
188 | Building from branches
189 | ----------------------
190 |
191 | lsstsw's :command:`rebuild` command enables you to clone and build development branches.
192 |
193 | To build ``lsst_distrib``, but use the Git branch ``my-feature`` when it's available in a package's repository:
194 |
195 | .. code-block:: bash
196 |
197 | rebuild -r my-feature lsst_distrib
198 |
199 | Multiple ticket branches across multiple products can be built in order of priority:
200 |
201 | .. code-block:: bash
202 |
203 | rebuild -r feature-1 -r feature-2 lsst_distrib
204 |
205 | In this example, a ``feature-1`` branch will be used in any package's Git repository where it exists.
206 | A ``feature-2`` branch will be used secondarily in repositories where ``feature-1`` doesn't exist.
207 | Finally, ``lsstsw`` falls back to using the ``main`` branch for repositories that lack both ``feature-1`` and ``feature-2``.
208 |
209 | .. _lsstsw-deploy-ref:
210 |
211 | lsstsw deploy command reference
212 | -------------------------------
213 |
214 | .. program:: deploy
215 |
216 | .. code-block:: text
217 |
218 | usage: deploy.sh [-b] [-h] [-r REF]
219 |
220 | .. option:: -b
221 |
222 | Use bleeding-edge conda packages.
223 |
224 | .. option:: -h
225 |
226 | Print the help message.
227 |
228 | .. option:: -r REF
229 |
230 | Use a particular git ref of the conda packages in scipipe_conda_env.
231 |
232 | .. _lsstsw-rebuild-ref:
233 |
234 | lsstsw rebuild command reference
235 | --------------------------------
236 |
237 | .. program:: rebuild
238 |
239 | .. code-block:: text
240 |
241 | rebuild [-p] [-n] [-u] [-r <ref> [-r <ref> [...]]] [-t <tag>] [product1 [product2 [...]]]
242 |
243 | .. option:: -p
244 |
245 | Prep only.
246 |
247 | .. option:: -n
248 |
249 | Do not run :command:`git fetch` in already-downloaded repositories.
250 |
251 | .. option:: -u
252 |
253 | Update the :file:`repos.yaml` package index from the ``main`` branch of https://github.com/lsst/repos on GitHub.
254 |
255 | .. option:: -r
256 |
257 | Rebuild using the given Git ref.
258 | A Git ref can be a branch name, tag, or commit SHA.
259 | Multiple ``-r`` arguments can be given, in order of priority.
260 |
261 | .. option:: -t
262 |
263 | EUPS tag.
264 |
265 | .. _lsst-build: https://github.com/lsst/lsst_build
266 | .. _versiondb: https://github.com/lsst/versiondb
267 | .. _EUPS: https://github.com/RobertLuptonTheGood/eups
268 | .. _Miniconda: http://conda.pydata.org/miniconda.html
269 | .. _`repos.yaml`: https://github.com/lsst/repos/blob/main/etc/repos.yaml
270 | .. _`testdata_ci_hsc`: https://github.com/lsst/testdata_ci_hsc
271 | ..
_`afwdata`: https://github.com/lsst/afwdata
272 | -------------------------------------------------------------------------------- /install/package-development.rst: --------------------------------------------------------------------------------
1 | #############################################################
2 | Building a package with the installed Science Pipelines stack
3 | #############################################################
4 |
5 | You can build packages alongside an LSST Science Pipelines stack that you have installed with :doc:`lsstinstall `, :doc:`lsstsw `, or :doc:`Docker `.
6 | This page describes how to build and set up packages cloned directly from GitHub.
7 |
8 | .. note::
9 |
10 | If you are developing with the LSST Docker images, refer to :ref:`docker-develop`.
11 | That page describes Docker-specific patterns and complements the general information on this page.
12 |
13 | .. _package-dev-prereq:
14 |
15 | 0. Prerequisites
16 | ================
17 |
18 | Before developing an individual package, you need an installed Science Pipelines stack.
19 | You can install the Pipelines through any of these methods: :doc:`lsstinstall `, :doc:`lsstsw `, or :doc:`Docker `.
20 |
21 | Then set up the Pipelines software in a shell.
22 | See :doc:`setup` for more information.
23 |
24 | .. _package-dev-clone:
25 |
26 | 1. Clone a package
27 | ==================
28 |
29 | Use Git to clone the package you want to work on.
30 | Most LSST packages are available in LSST's GitHub organization (https://github.com/lsst).
31 | The `repos.yaml file in the 'repos' repository `_ also maps package names to repository URLs.
32 |
33 | For example:
34 |
35 | .. code-block:: bash
36 |
37 | git clone https://github.com/lsst/pipe_tasks
38 | cd pipe_tasks
39 |
40 | You can also create a new package repository, though this is beyond this document's scope.
41 |
42 | You will work from a Git branch when developing a package.
43 | The `DM Developer Guide `_ describes the branching workflow that LSST staff use. 44 | 45 | .. note:: 46 | 47 | **Docker users:** Clone the package onto your host filesystem rather than directly into the container by mounting a host directory in the container. 48 | See :ref:`docker-develop`. 49 | 50 | .. warning:: 51 | 52 | **lsstsw users:** The :file:`lsstsw/build` directory already includes clones of Git repositories. 53 | These repositories are reset when you run :ref:`rebuild `, though, so you can potentially lose local changes. 54 | It's usually better to clone and work with Git repositories outside of the :file:`lsstsw` directory. 55 | 56 | .. _package-dev-setup: 57 | 58 | 2. Set up the package 59 | ===================== 60 | 61 | From the package's directory, set up the package itself in the EUPS stack: 62 | 63 | .. code-block:: bash 64 | 65 | setup -r . -t $USER 66 | 67 | .. _package-dev-scons: 68 | 69 | 3. Build the package with Scons 70 | =============================== 71 | 72 | .. code-block:: bash 73 | 74 | scons -Q -j 6 opt=3 75 | 76 | These flags configure Scons: 77 | 78 | - ``-Q``: reduce logging to the terminal. 79 | - ``-j 6``: build in parallel (for example, with '6' CPUs). 80 | - ``opt=3``: build with level 3 optimization. 81 | Use ``opt=0`` (or ``opt=g`` with gcc compilers) for debugging. 82 | 83 | This ``scons`` command will run several targets by default, in sequence: 84 | 85 | 1. ``lib``: build the C++ code and Pybind11 interface layer. 86 | 2. ``python``: install the Python code. 87 | 3. ``tests``: run the unit tests. 88 | 4. ``example``: compile the examples. 89 | 5. ``doc``: compile Doxygen-based documentation. 90 | 6. ``shebang``: convert the ``#!/usr/bin/env`` line in scripts for OS X compatibility (see `DMTN-001 `_). 91 | 92 | You can build a subset of these targets by specifying one explicitly. 93 | For example, to compile C++, build the Python package and run tests: 94 | 95 | .. 
code-block:: bash
96 |
97 | scons -Q -j 6 opt=3 tests
98 |
99 | .. _package-dev-next-steps:
100 |
101 | Next steps
102 | ==========
103 |
104 | By following these steps, you have built a package from source alongside an installed Science Pipelines software stack.
105 | Now when you run the Science Pipelines, your new package will be used instead of the equivalent package provided by the Science Pipelines installation.
106 | Here are some tasks related to maintaining this development software stack:
107 |
108 | - :ref:`package-dev-eups-list`.
109 | - :ref:`package-dev-setup-shell`.
110 | - :ref:`package-dev-unsetup`.
111 |
112 | .. _package-dev-eups-list:
113 |
114 | Reviewing set up packages
115 | -------------------------
116 |
117 | Packages that are *set up* are part of the active Science Pipelines software stack.
118 | You can see what packages are currently set up by running:
119 |
120 | .. code-block:: bash
121 |
122 | eups list -s
123 |
124 | You can also review what version of a single package is set up by running:
125 |
126 | .. code-block:: bash
127 |
128 | eups list <package>
129 |
130 | .. _package-dev-setup-shell:
131 |
132 | Setting up in a new shell
133 | -------------------------
134 |
135 | Whenever you open a new shell you need to set up both the LSST software environment and the LSST software stack.
136 | See :doc:`setup` for the basic procedure.
137 |
138 | In addition to setting up the installed Science Pipelines software, you separately need to set up the development package itself.
139 | You can do this by following the instructions in step :ref:`package-dev-setup`.
140 |
141 | .. _package-dev-unsetup:
142 |
143 | Un-set up the development package
144 | ---------------------------------
145 |
146 | You can un-set up a development package to revert to the installed LSST Science Pipelines distribution.
147 |
148 | To switch from a development package to the released package:
149 |
150 | ..
code-block:: bash
151 |
152 | setup -j -t current <package>
153 |
154 | ``current`` is the default tag normally used for the installed LSST Science Pipelines software stack.
155 |
156 | To un-set up a development package without replacing it:
157 |
158 | .. code-block:: bash
159 |
160 | unsetup -j -t $USER <package>
161 |
162 | This is useful if you are developing a new package that is not part of the installed LSST Science Pipelines software stack.
163 | -------------------------------------------------------------------------------- /install/prereqs.rst: --------------------------------------------------------------------------------
1 | #############
2 | Prerequisites
3 | #############
4 |
5 | This page lists software needed to install and use the LSST Science Pipelines.
6 |
7 | .. _prereq-platforms:
8 |
9 | Platform compatibility
10 | ======================
11 |
12 | The LSST Data Management reference platform is AlmaLinux 9.
13 | This is the platform we officially develop, test, and operate with.
14 |
15 | Besides the reference platform, the Pipelines can run on a variety of other Linux distributions like Ubuntu, and various versions of macOS.
16 |
17 | .. _system-prereqs:
18 |
19 | System prerequisites
20 | ====================
21 |
22 | The Science Pipelines are developed and built using a standard `Conda`_ environment.
23 | This provides (almost; see below) all necessary tools and libraries for building and running the Pipelines.
24 | On installation, you will be given the option of automatically installing that environment.
25 | If you decline, your system must have all of the prerequisites listed below available:
26 |
27 | .. jinja:: default
28 |
29 | - `macOS `_.
30 | - `Linux `_.
31 |
32 | In addition to the Conda environment, the following packages must be installed on the host system.
33 |
34 | .. _Conda: https://conda.io
35 |
36 | AlmaLinux 9
37 | -----------
38 |
39 | On AlmaLinux 9, ``patch`` needs to be installed:
40 |
41 | ..
code-block:: bash
42 |
43 | sudo dnf install patch
44 |
45 | If you wish to follow the instructions for :doc:`lsstsw` (recommended for some developers, but not necessary for most users), you will also need to install git:
46 |
47 | .. code-block:: bash
48 |
49 | sudo dnf install git
50 |
51 | Debian/Ubuntu
52 | -------------
53 |
54 | On Debian and Ubuntu systems, curl and patch are required.
55 | These may be installed as follows:
56 |
57 | .. code-block:: bash
58 |
59 | sudo apt-get update && sudo apt-get install curl patch
60 |
61 | If you wish to follow the instructions for :doc:`lsstsw` (recommended for some developers, but not necessary for most users), you will also need to install git:
62 |
63 | .. code-block:: bash
64 |
65 | sudo apt-get update && sudo apt-get install git
66 |
67 | Tip: in some cases, installing ``xpa-tools`` and starting :command:`xpans` might be necessary to avoid ``xpa``-related errors when displaying images in DS9:
68 |
69 |
70 | .. code-block:: bash
71 |
72 | sudo apt install xpa-tools; xpans
73 |
74 |
75 | macOS
76 | -----
77 |
78 | On macOS systems, please install the Xcode Command Line Tools:
79 |
80 | .. code-block:: bash
81 |
82 | xcode-select --install
83 |
84 |
85 | .. _filesystem-prereqs:
86 |
87 | Filesystem prerequisites
88 | ========================
89 |
90 | Filesystems used for compiling the Stack and hosting output data repositories must support the ``flock`` system call for file locking.
91 | Local filesystems virtually always have this support.
92 | Network filesystems are sometimes mounted without such support to improve performance; the output of the :command:`mount` command may show the ``nolock`` or ``noflock`` option in those cases.
93 |
94 | .. _optional-deps:
95 |
96 | Optional dependencies
97 | =====================
98 |
99 | Some pipeline components use `SAOImage DS9 `_, if available, for image display purposes.
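As a quick sanity check before installing, you can confirm that the command-line tools mentioned on this page are on your ``PATH`` (a sketch; adjust the list of tools for your platform):

```shell
# Report any prerequisite command-line tools that are missing
# (the list of tools checked here is illustrative).
missing=""
for tool in curl patch git; do
    command -v "$tool" >/dev/null 2>&1 || missing="${missing} ${tool}"
done
if [ -n "${missing}" ]; then
    echo "Missing tools:${missing}"
else
    echo "All listed tools found."
fi
```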
100 | -------------------------------------------------------------------------------- /install/setup.rst: --------------------------------------------------------------------------------
1 | .. _setup:
2 |
3 | ###########################################
4 | Setting up installed LSST Science Pipelines
5 | ###########################################
6 |
7 | Whenever you start a new command-line shell, you need to set up the LSST Science Pipelines software before you can use it.
8 |
9 | .. _setup-howto:
10 |
11 | Setting up
12 | ==========
13 |
14 | Setting up the LSST Science Pipelines in a shell is a two-step process:
15 |
16 | 1. Load the LSST environment by sourcing the ``loadLSST`` script in your installation directory:
17 |
18 | .. TODO Use sphinx-tabs here?
19 |
20 | .. code-block:: bash
21 |
22 | source loadLSST.bash # for bash
23 | source loadLSST.csh # for csh
24 | source loadLSST.ksh # for ksh
25 | source loadLSST.zsh # for zsh
26 |
27 | To customize the conda environment used, set the ``LSST_CONDA_ENV_NAME`` environment variable to a conda environment name when sourcing the file.
28 | For other conda environments installed by LSST tools, this name will be the ``rubin-env`` metapackage version prefixed with ``lsst-scipipe-``.
29 |
30 | .. note::
31 |
32 | These instructions are for :doc:`lsstinstall`-based installations.
33 | For ``lsstsw``, follow :ref:`these instructions ` instead.
34 |
35 | To find the ``rubin-env`` conda metapackage version appropriate for a particular Science Pipelines release, see :ref:`release-history` or the release tag files at `https://eups.lsst.codes/stack/src/tags/`_.
36 |
37 |
38 | 2. Set up a top-level package:
39 |
40 | .. code-block:: bash
41 |
42 | setup <package>
43 |
44 | For example, ``setup lsst_apps`` or ``setup lsst_distrib``.
45 | See :doc:`top-level-packages` for more about LSST's top-level packages.
46 |
47 | ..
_setup-list: 48 | 49 | Listing what packages are set up 50 | ================================ 51 | 52 | To see what packages (and their versions) are currently set up: 53 | 54 | .. code-block:: bash 55 | 56 | eups list -s 57 | 58 | To see all packages that are installed, even if not currently set up, run: 59 | 60 | .. code-block:: bash 61 | 62 | eups list 63 | -------------------------------------------------------------------------------- /install/top-level-packages.rst: -------------------------------------------------------------------------------- 1 | ############################# 2 | Top-level packages to install 3 | ############################# 4 | 5 | The LSST Science Pipelines are part of LSST's EUPS_ package stack. 6 | This means that the Science Pipelines software is actually a collection of packages that you install and set up together. 7 | By specifying different **top-level packages** to the :ref:`eups distrib install ` and :doc:`setup ` commands, you can control the size of the software installation or add new capabilities. 8 | 9 | This page describes the common top-level packages that make up the LSST Science Pipelines and related EUPS stacks. 10 | 11 | lsst\_distrib 12 | ============= 13 | 14 | This package provides all of the core Science Pipelines functionality, together with additional measurement algorithms, support for a wider variety of instrumentation, and process execution middleware designed for running pipelines on a cluster. 15 | 16 | Example installation (:ref:`more info `): 17 | 18 | .. code-block:: bash 19 | 20 | eups distrib install lsst_distrib -t 21 | setup lsst_distrib 22 | 23 | .. _EUPS: https://github.com/RobertLuptonTheGood/eups 24 | 25 | lsst\_apps 26 | ========== 27 | 28 | This package provides only the core frameworks and algorithmic code. 29 | It may be convenient when space is at a premium. 30 | 31 | Example installation (:ref:`more info `): 32 | 33 | ..
code-block:: bash 34 | 35 | eups distrib install lsst_apps -t 36 | setup lsst_apps 37 | -------------------------------------------------------------------------------- /known-issues.rst: -------------------------------------------------------------------------------- 1 | .. 2 | Keep these known issues updated to the current state of the software. 3 | 4 | Maintain the existing headers in Installation Issues and simply report "None" 5 | if there are no issues at the moment. 6 | 7 | ############ 8 | Known Issues 9 | ############ 10 | 11 | .. _installation-issues: 12 | 13 | Binary installation issues 14 | ========================== 15 | 16 | Cross Platform 17 | -------------- 18 | 19 | - Detailed numerical results may be sensitive to the exact versions of third party numerical libraries. 20 | If you are not using the :ref:`standard Conda environment ` you may encounter issues when running the :doc:`validation demo `. 21 | 22 | .. _src-installation-issues: 23 | 24 | Source installation issues 25 | ========================== 26 | 27 | .. _installation-issues-cross-platform: 28 | 29 | Cross Platform 30 | -------------- 31 | 32 | - It is not recommended to install using a privileged user, as certain tests might fail during the build process. 33 | - Some packages---in particular ``afw``\ ---require large amounts of RAM to compile. 34 | This is compounded as the system will automatically attempt to parallelize the build, which can cause the build to run extremely slowly or fail altogether. 35 | On machines with less than 8 GB of RAM, disable parallelization by setting ``EUPSPKG_NJOBS=1`` in your environment before running ``eups distrib``. 36 | 37 | - Detailed numerical results may be sensitive to the exact versions of third party numerical libraries. 38 | If you are not using the :ref:`standard Conda environment ` you may encounter issues when running the :doc:`validation demo `. 39 | 40 | ..
_installation-issues-macos: 41 | 42 | macOS specific 43 | -------------- 44 | 45 | - Some old installations of Xcode on Macs create a :file:`/Developer` directory which can interfere with installation. 46 | 47 | .. _other-issues: 48 | 49 | Other issues 50 | ============ 51 | 52 | Many versions of the DS9 image viewer software incorrectly read mask planes in Science Pipelines image files as all zeros. 53 | -------------------------------------------------------------------------------- /metrics.rst: -------------------------------------------------------------------------------- 1 | ############################### 2 | Characterization Metric Reports 3 | ############################### 4 | 5 | Starting from Summer 2015, administrative ("cycle") releases are accompanied by a measurement report characterizing the current performance. 6 | Metrics included in these reports are expected to increase in number and sophistication with subsequent releases. 7 | 8 | - :ref:`Release 28.0.0 `: `DMTR-451 `_ 9 | - :ref:`Release 27.0.0 `: `DMTR-431 `_ 10 | - :ref:`Release 26.0.0 `: `DMTR-421 `_ 11 | - :ref:`Release 25.0.0 `: `DMTR-392 `_ 12 | - :ref:`Release 24.1.0 `: `DMTR-391 `_ 13 | - :ref:`Release 23.0.0 `: `DMTR-351 `_ 14 | - :ref:`Release 22.0.0 `: `DMTR-311 `_ 15 | - :ref:`Release 21.0.0 `: `DMTR-281 `_ 16 | - :ref:`Release 20.0.0 `: `DMTR-251 `_ 17 | - :ref:`Release 19.0.0 `: `DMTR-191 `_ 18 | - :ref:`Release 18.0.0 `: `DMTR-141 `_ 19 | - :ref:`Release 17.0 `: `DMTR-131 `_ 20 | - :ref:`Release 16.0 `: `DMTR-81 `_ 21 | - :ref:`Release 15.0 `: `DMTR-62 `_ 22 | - :ref:`Release 14.0 `: `DMTR-41 `_ 23 | - :ref:`Release 13.0 `: `DMTR-15 `_ 24 | - :ref:`Release 12.0 `: `DMTR-14 `_ 25 | - :ref:`Release 11.0 `: `DMTR-11 `_ 26 | -------------------------------------------------------------------------------- /middleware/index.rst: -------------------------------------------------------------------------------- 1 | ################################### 2 | Data access and pipeline
middleware 3 | ################################### 4 | 5 | .. toctree:: 6 | 7 | faq 8 | -------------------------------------------------------------------------------- /releases/README.md: -------------------------------------------------------------------------------- 1 | # How to document releases 2 | 3 | ## Release notes 4 | 5 | Release notes are published through the `releases/vXX_Y_Z.rst` files; one per release. 6 | In addition, the `releases/tickets/vXX_Y_Z.rst` file lists all the tickets closed in the corresponding release. 7 | 8 | To start release notes for a new release: 9 | 10 | 1. Create a new file in `releases/` based on the template below 11 | 2. Create a new file in `releases/tickets` listing all the tickets closed in this release 12 | 3. Remove the `release-latest` label from the previous latest release 13 | 4. Add an entry for `vXX_Y_Z` to `releases/index.rst` 14 | 15 | ### Template for release notes 16 | 17 | ``` 18 | .. _release-latest: 19 | .. _release-vXX-Y-Z: 20 | 21 | ################################################## 22 | LSST Science Pipelines XX.Y.Z Release (YYYY-MM-DD) 23 | ################################################## 24 | 25 | .. toctree:: 26 | :hidden: 27 | 28 | tickets/vXX_Y_Z 29 | 30 | +-------------------------------------------+------------+ 31 | | Source | Identifier | 32 | +===========================================+============+ 33 | | Git tag | XX.Y.Z | 34 | +-------------------------------------------+------------+ 35 | | :doc:`EUPS distrib ` | vXX\_Y\_Z | 36 | +-------------------------------------------+------------+ 37 | 38 | This release is based on the ``w_YYYY_NN`` weekly build. 39 | 40 | These release notes highlight significant changes to the Science Pipelines codebase which are likely to be of wide interest. 41 | For a complete list of changes made, see :doc:`tickets/vXX_Y_Z`.
42 | 43 | - :ref:`release-vXX-Y-Z-functionality` 44 | - :ref:`release-vXX-Y-Z-interface` 45 | - :ref:`release-vXX-Y-Z-pending-deprecations` 46 | - :ref:`release-vXX-Y-Z-deprecations` 47 | 48 | *See also:* 49 | 50 | .. todo:: 51 | 52 | Insert link to the CMR when it is available. 53 | 54 | .. todo:: 55 | 56 | Insert link to Doxygen documentation when the release is frozen. 57 | 58 | - :doc:`Installation instructions ` 59 | - :doc:`Known issues ` 60 | - `Characterization Metric Report (DMTR-NNN) `_ 61 | - `Doxygen Documentation`__ 62 | 63 | __ http://doxygen.lsst.codes/stack/doxygen/xlink_master_XXXX/ 64 | 65 | 66 | .. _release-vXX-Y-Z-functionality: 67 | 68 | Major New Features 69 | ================== 70 | 71 | .. Insert list of :ref:`..` to individual items 72 | 73 | .. Add items here 74 | 75 | .. _release-vXX-Y-Z-interface: 76 | 77 | Significant Interface Changes 78 | ============================= 79 | 80 | .. Insert list of :ref:`..` to individual items 81 | 82 | .. Add items here 83 | 84 | .. _release-vXX-Y-Z-pending-deprecations: 85 | 86 | Pending Deprecations 87 | ==================== 88 | 89 | .. That is, items that we anticipate deprecating in the next release 90 | .. Insert list of :ref:`..` to individual items 91 | 92 | .. Add items here 93 | 94 | .. _release-vXX-Y-Z-deprecations: 95 | 96 | Deprecations 97 | ============ 98 | 99 | .. That is, items that are deprecated in this release 100 | .. Insert list of :ref:`..` to individual items 101 | 102 | .. Add items here 103 | 104 | 105 | ``` 106 | 107 | Release note items have headers at the `-` symbol level. 108 | 109 | Use the 110 | 111 | ``` 112 | :jirab:`DM-NNNN` 113 | ``` 114 | 115 | role at the end of release note items to indicate and link to the corresponding tickets and RFCs. 116 | If the item is a single paragraph, the Jira link should occur at the end of the paragraph. 117 | If the item has several paragraphs, the Jira link should occur in its own paragraph at the end of the release note item.
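The four numbered steps above can be scripted. A minimal sketch, using the `vXX_Y_Z` placeholder from this README (substitute the real version string; the paths follow this repository's layout):

```shell
# Hypothetical scaffold for a new release's files (steps 1-2 above).
VER="vXX_Y_Z"   # placeholder: substitute the real version string
mkdir -p releases/tickets
touch "releases/${VER}.rst"           # fill in from the release-notes template
touch "releases/tickets/${VER}.rst"   # list the closed tickets here
# Steps 3-4 (moving the release-latest anchor and updating
# releases/index.rst) still need to be done by hand.
```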
118 | -------------------------------------------------------------------------------- /releases/data-products/v20_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v20-0-0-data-products: 2 | 3 | ########################################## 4 | Changes to Data Products in Release 20.0.0 5 | ########################################## 6 | 7 | The notes below highlight significant changes to data products generated by release 20.0.0 of the Science Pipelines when processing DECam (HiTS; `Förster et al., 2016 `_) and/or Hyper Suprime-Cam data using the default configuration. 8 | Results obtained in practice will vary with the pipeline configuration. 9 | 10 | For a more detailed technical discussion of this release, refer to the :ref:`release notes `. 11 | Some items described in those notes may be missing here because they have not yet attained the level of vetting or integration necessary to include them in our standard processing, or simply because they have little or no effect on data products from the perspective of science users consuming them. 12 | 13 | Alert Production Data Products 14 | ============================== 15 | 16 | Serialized Alert Packets 17 | ------------------------ 18 | 19 | ``ap_association`` now serializes alert packets in `Apache Avro`_ format to disk. 20 | As of this release, these alerts do not yet provide all contents specified by :lse:`163` (the Data Products Definition Document) — in particular, they do not include cut-out images. 21 | The :ref:`alert_packet ` package contains the alert schemas as well as a number of utility routines for manipulating alert packets. 22 | :jirab:`DM-24324` 23 | 24 | .. _Apache Avro: https://avro.apache.org 25 | 26 | Fixes to Alard and Lupton Decorrelation 27 | --------------------------------------- 28 | 29 | Some bugs in the Discrete Fourier Transform implementation of the Alard and Lupton decorrelation (:dmtn:`021`) were fixed.
30 | The corresponding improvements to detection thresholding will be most noticeable when processing datasets where the noise in the co-added template is not negligible. 31 | :jirab:`DM-24371` 32 | 33 | Rescale Template Variances during Difference Imaging 34 | ---------------------------------------------------- 35 | 36 | Detection thresholds for difference imaging will be incorrect if the variances of the input images are incorrect. 37 | Pixel covariances introduced to template images during the coaddition and warping process are one source of frequent variance inaccuracy. 38 | A ``doScaleTemplateVariance`` configuration option has been added to :lsst-task:`lsst.pipe.tasks.imageDifference.ImageDifferenceTask`. 39 | If enabled, :lsst-task:`lsst.pipe.tasks.scaleVariance.ScaleVarianceTask` is invoked by :lsst-task:`lsst.pipe.tasks.imageDifference.ImageDifferenceTask` to empirically rescale the template variance before PSF matching and subtraction. 40 | :jirab:`DM-20558` 41 | 42 | Data Release Data Products 43 | ========================== 44 | 45 | Standardized Source Tables 46 | -------------------------- 47 | 48 | The ``src`` catalogs produced by single-frame processing are now rewritten with standardized columns and calibrations applied, producing the new ``sourceTable`` (one per visit+detector) and ``sourceTable_visit`` (one per visit) butler datasets. 49 | These should have a more stable and well-documented schema than the ``src`` catalogs they are derived from, and will evolve towards the complete Source table described in :lse:`163`. 50 | By using a columnar data format (`Apache Parquet `_), access to just a few columns is also considerably faster. 51 | 52 | Similar Parquet tables for objects (``objectTable`` and ``objectTable_tract``) have existed since the last major release, based on the lower-level ``deepCoadd_meas``, ``deepCoadd_ref``, and ``deepCoadd_forced_src`` datasets.
53 | 54 | :jirab:`DM-24062` 55 | 56 | Fixes to Defect Masking 57 | ----------------------- 58 | 59 | Large-area sensor defects were not being masked or interpolated properly in the previous release, because the convolution-like correction for the brighter-fatter effect could expand the affected region beyond our original masks. 60 | This has been addressed by growing the mask region. 61 | :jirab:`DM-23083` 62 | 63 | Sky Sources in Single-Frame Measurement 64 | --------------------------------------- 65 | 66 | Single-frame processing catalogs (including ``src`` and the new ``sourceTable`` and ``sourceTable_visit``) now include "sky sources" -- catalog entries that represent random patches of approximately blank sky (defined as not overlapping any detection footprint) that are measured with exactly the same algorithms as real detections. 67 | These can be used for diagnostics on background subtraction, detection thresholds, noise propagation, and probably many aspects of processing we haven't considered, but it's also easy to accidentally include them in analyses where only real detections should be used. 68 | This can be avoided by filtering rows where the ``sky_source`` flag field is ``True``. 69 | 70 | See `community.lsst.org `_ for more information. 71 | 72 | :jirab:`DM-23078` 73 | 74 | Mitigations for Background-Subtraction Problems in Aperture Fluxes 75 | ------------------------------------------------------------------ 76 | 77 | The quality of background subtractions has long been recognized as a problem in our Hyper Suprime-Cam processing, due to a combination of depth, competing science goals, and poorly understood instrumental features. 78 | One impact of this that we had not appreciated until recently was the degree to which this affected our aperture photometry, especially on single-epoch processing (which happens before some of our more sophisticated background subtraction steps). 
79 | 80 | This release includes two changes to mitigate this problem: 81 | 82 | - Aperture corrections are now derived from a brighter sample of stars, for which the errors due to bad backgrounds are relatively smaller. :jirab:`DM-23071` 83 | - When using ``fgcmcal`` to perform relative photometric calibration, an analytic correction for the background bias is applied to all input photometry, improving the quality of the overall calibration. :jirab:`DM-23036` 84 | -------------------------------------------------------------------------------- /releases/data-products/v21_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v21-0-0-data-products: 2 | 3 | ########################################## 4 | Changes to Data Products in Release 21.0.0 5 | ########################################## 6 | 7 | The notes below highlight significant changes to data products generated by release 21.0.0 of the Science Pipelines when processing DECam (HiTS; `Förster et al., 2016 `_) and/or Hyper Suprime-Cam data using the default configuration. 8 | Results obtained in practice will vary with the pipeline configuration. 9 | 10 | For a more detailed technical discussion of this release, refer to the :ref:`release notes `. 11 | Some items described in those notes may be missing here because they have not yet attained the level of vetting or integration necessary to include them in our standard processing, or simply because they have little or no effect on data products from the perspective of science users consuming them. 12 | 13 | Alert Production Data Products 14 | ============================== 15 | 16 | Data Release Data Products 17 | ========================== 18 | -------------------------------------------------------------------------------- /releases/index.rst: -------------------------------------------------------------------------------- 1 | ..
_release-history: 2 | 3 | ############### 4 | Release History 5 | ############### 6 | 7 | .. toctree:: 8 | :maxdepth: 1 9 | 10 | v28_0_0 11 | v27_0_0 12 | v26_0_0 13 | v25_0_0 14 | v24_1_0 15 | v24_0_0 16 | v23_0_0 17 | v22_0_0 18 | v21_0_0 19 | v20_0_0 20 | v19_0_0 21 | v18_1_0 22 | v18_0_0 23 | v17_0 24 | v16_0 25 | v15_0 26 | v14_0 27 | v13_0 28 | v12_0 29 | v11_0 30 | -------------------------------------------------------------------------------- /releases/tickets/v18_1_0.rst: -------------------------------------------------------------------------------- 1 | ################################### 2 | Tickets Addressed in Release 18.1.0 3 | ################################### 4 | 5 | - :jira:`DM-20664`: Backport issue DM-20506 to 18.1.x branch [daf_base] 6 | -------------------------------------------------------------------------------- /releases/tickets/v24_1_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v24-1-0-tickets: 2 | 3 | ################################### 4 | Tickets Addressed in Release 24.1.6 5 | ################################### 6 | 7 | - :jira:`DM-47113`: Backport fix to prevent NaN values in scarlet lite models to v24 [scarlet] 8 | 9 | ################################### 10 | Tickets Addressed in Release 24.1.5 11 | ################################### 12 | 13 | - :jira:`DM-43843`: Investigate failed blends in HSC PDR4 [meas\_extensions\_scarlet] 14 | 15 | ################################### 16 | Tickets Addressed in Release 24.1.4 17 | ################################### 18 | 19 | - :jira:`DM-41900`: maskStreaks failed on 1 charImage dataId in the w\_2023\_47 RC2 [pipe\_tasks] 20 | 21 | ################################### 22 | Tickets Addressed in Release 24.1.3 23 | ################################### 24 | 25 | - :jira:`DM-36718`: Multi shapelet convolution test is very sensitive [shapelet] 26 | 27 | ################################### 28 | Tickets Addressed in Release 24.1.2 29 |
################################### 30 | 31 | - :jira:`DM-30947`: Reimplement C++ HSM moment measurement algorithms in Python [meas\_extensions\_shapeHSM] 32 | - :jira:`DM-41489`: Reimplement C++ HSM shear measurement algorithms in Python [meas\_extensions\_shapeHSM] 33 | - :jira:`DM-41648`: Write a plugin to output PSF higher order moments in meas catalogs [meas\_extensions\_shapeHSM] 34 | - :jira:`DM-41908`: Speed up HSM plugins in Python [meas\_extensions\_shapeHSM] 35 | - :jira:`DM-41994`: Speed up galsim interface with meas\_extensions\_shapeHSM [meas\_extensions\_shapeHSM] 36 | - :jira:`DM-42170`: Add unit test to verify shapeHSM shear measurements against GalSim's Python layer outputs [meas\_extensions\_shapeHSM] 37 | 38 | ################################### 39 | Tickets Addressed in Release 24.1.1 40 | ################################### 41 | 42 | - :jira:`DM-30535`: Create docs page for ScarletDeblendTask [afw, meas\_extensions\_scarlet] 43 | - :jira:`DM-40186`: Investigate PDR2 detection errors: 983 instances of exited early Insufficient good sky source flux measurements for dynamic threshold calculation [meas\_algorithms] 44 | - :jira:`DM-40451`: Adapt scarlet to conditionally skip bands with partial coverage [afw, meas\_extensions\_scarlet, pipe\_tasks, sdm\_schemas] 45 | - :jira:`DM-40456`: sphgeom is failing a test on macOS Ventura (13) on x86 [sphgeom] 46 | - :jira:`DM-40463`: gaap is failing a test on macOS Ventura [meas\_extensions\_gaap] 47 | - :jira:`DM-40781`: RC2 tract 9813 psf weighted mean map has nans [pipe\_tasks] 48 | - :jira:`DM-40921`: DM-40451 breaks ci\_imsim [meas\_extensions\_scarlet, sdm\_schemas] 49 | - :jira:`DM-40957`: Investigate failed blends in RC2 subset after DM-40451 [meas\_extensions\_scarlet] 50 | - :jira:`DM-41008`: Fix long runtime in forcedPhotCcd with no data footprints [meas\_extensions\_scarlet, pipe\_tasks] 51 | 52 | ################################### 53 | Tickets Addressed in Release 24.1.0 54 | 
################################### 55 | 56 | - :jira:`DM-16724`: Dynamic detection failure due to lack of sky-object measurements is too quiet [meas_algorithms] 57 | - :jira:`DM-23781`: Improve Sky Object Placement [meas_algorithms] 58 | - :jira:`DM-34959`: Replace read\_gpickle and write\_gpickle in GenericWorkflow [ctrl_bps] 59 | - :jira:`DM-35207`: Use final PSF models to determine inputs to coaddition [analysis_drp, ap_pipe, drp_pipe, meas_base, obs_lsst, pipe_tasks, pipelines_check] 60 | - :jira:`DM-36998`: Remove large objects from Piff results by default [meas_extensions_piff] 61 | - :jira:`DM-37249`: Make butler registry compatible with transaction-level connection pooling [daf_butler] 62 | - :jira:`DM-37257`: If brightObjectMask is unavailable proceed making Coadd without [pipe_tasks] 63 | - :jira:`DM-37411`: Add visit-level PSF model robustness metrics [afw, meas_deblender, pipe_tasks, sdm_schemas] 64 | - :jira:`DM-37412`: Refactor ComputeExposureSummaryStats to allow fine-grained updates [afw, pipe_tasks] 65 | - :jira:`DM-37559`: DM-35207 broke ap\_verify [ap_pipe] 66 | - :jira:`DM-37786`: updateVisitSummary failure in some HSC-RC2 visits with w\_2023\_03 [drp_pipe, pipe_base, pipe_tasks] 67 | - :jira:`DM-37954`: Backport bps\_usdf.yaml to v24.0.0 [ctrl_bps_panda] 68 | - :jira:`DM-38065`: Make release notes for middleware v25 [ctrl_bps, ctrl_bps_panda] 69 | - :jira:`DM-38307`: Allow output collection to not be specified [ctrl_bps, ctrl_bps_panda] 70 | - :jira:`DM-38808`: Proper motion correction is wrong for negative epoch shift in ReferenceObjectLoader [jointcal, meas_algorithms] 71 | - :jira:`DM-39101`: Implement a maximum aperture radius for Kron aperture calculations [meas_extensions_photometryKron] 72 | - :jira:`DM-39342`: v24 DRP-Prod.yaml: Ensure FGCM configs handle HSC-I2/R2 and add version w/ DIA pipeline [drp_pipe] 73 | - :jira:`DM-39482`: Correct HSC NB1010 colorterm filtername [ap_pipe, fgcmcal, obs_subaru] 74 | 
-------------------------------------------------------------------------------- /releases/v12_0_qserv_dax.rst: -------------------------------------------------------------------------------- 1 | .. _release-v12-0-qserv-dax: 2 | 3 | ####################################################### 4 | Winter 2016 & X2016 QServ and Data Access Release Notes 5 | ####################################################### 6 | 7 | The 12.0 release of the LSST Science Pipelines includes Qserv release 2016_05. 8 | 9 | - :ref:`release-v12-0-qserv-major-changes` 10 | - :ref:`release-v12-0-qserv-bug-fixes` 11 | - :ref:`release-v12-0-qserv-internal-improvements` 12 | 13 | *See also:* 14 | 15 | - `Qserv 2016_05 documentation `_ 16 | 17 | .. _release-v12-0-qserv-major-changes: 18 | 19 | Major Functionality and Interface Changes 20 | ========================================= 21 | 22 | - :ref:`release-v12-0-qserv-shared-scans` 23 | - :ref:`release-v12-0-qserv-large-results` 24 | - :ref:`release-v12-0-qserv-query-cancellation` 25 | - :ref:`release-v12-0-qserv-query` 26 | - :ref:`release-v12-0-qserv-logging` 27 | - :ref:`release-v12-0-qserv-sqlalchemy` 28 | - :ref:`release-v12-0-qserv-sql-css` 29 | - :ref:`release-v12-0-qserv-multinode-docker` 30 | - :ref:`release-v12-0-qserv-database-delete` 31 | - :ref:`release-v12-0-qserv-mariadb` 32 | - :ref:`release-v12-0-qserv-czar-in-proxy` 33 | - :ref:`release-v12-0-qserv-docs` 34 | 35 | .. _release-v12-0-qserv-shared-scans: 36 | 37 | Shared scan performance improvements 38 | ------------------------------------ 39 | 40 | Qserv's shared scan capability was extensively reworked, including a new scheduler and page-locking memory 41 | support on the workers. Performance was greatly improved. 42 | :jirab:`DM-3755,DM-4677,DM-4697,DM-4807,DM-4943,DM-5313,DM-5514` 43 | 44 | .. 
_release-v12-0-qserv-large-results: 45 | 46 | Robustness with large (multi-gigabyte) result sets 47 | -------------------------------------------------- 48 | 49 | Qserv previously had an issue where dense and highly distributed queries could cause workers to "fire-hose" 50 | the czar, causing it to lock up or fail due to memory and/or CPU exhaustion. Threading and flow control 51 | changes were made on the czar and workers to address this. A memory management issue in the mysql proxy 52 | LUA code was also addressed. 53 | :jirab:`DM-5908,DM-5909,DM-5910,DM-6149` 54 | 55 | .. _release-v12-0-qserv-query-cancellation: 56 | 57 | Query cancellation 58 | ------------------ 59 | 60 | Query cancellation improvements and rework begun in the W16 cycle were completed. Queries in flight are 61 | now canceled robustly on both czar and workers when a user types Ctrl-C to the mysql client. 62 | :jirab:`DM-2699,DM-3562,DM-3564,DM-3946,DM-3945` 63 | 64 | .. _release-v12-0-qserv-query: 65 | 66 | Query coverage 67 | -------------- 68 | 69 | Qserv now correctly handles queries with "where objectId between", and "where objectId in". 70 | :jirab:`DM-2873,DM-2887` 71 | 72 | .. _release-v12-0-qserv-logging: 73 | 74 | Logging improvements 75 | -------------------- 76 | 77 | Qserv log messages now include user-friendly thread IDs and unique query IDs. This improves consumability 78 | of logs for both real users and automated tools. 79 | :jirab:`DM-5314,DM-4755,DM-4756` 80 | 81 | .. _release-v12-0-qserv-sqlalchemy: 82 | 83 | SQLAlchemy client support 84 | ------------------------- 85 | The SQLAlchemy client library makes a few probe queries on connect to assess Unicode support by the engine. 86 | Some of these queries were problematic for the czar. This was addressed and SQLAlchemy can now be used as 87 | an alternative client for Qserv. 88 | :jirab:`DM-4648` 89 | 90 | ..
_release-v12-0-qserv-sql-css: 91 | 92 | SQL-based CSS implementation 93 | ---------------------------- 94 | 95 | Qserv's central shared-state (CSS) meta-data service implementation, formerly based on Zookeeper, was 96 | replaced with a more robust and transactional SQL-based implementation. Dependencies on Zookeeper were 97 | removed from the build. 98 | :jirab:`DM-4003,DM-4138,DM-3192,DM-3574,DM-2733` 99 | 100 | .. _release-v12-0-qserv-multinode-docker: 101 | 102 | Multi-node integration tests via Docker 103 | --------------------------------------- 104 | 105 | A multi-node integration test suite was added, which may be run on a single host via Docker. The multi-node 106 | integration test is integrated with Travis CI, and now runs automatically on commits to all branches 107 | of the LSST Qserv git repo on github. 108 | :jirab:`DM-5218,DM-3985,DM-4295,DM-3910,DM-3922,DM-4395` 109 | 110 | .. _release-v12-0-qserv-database-delete: 111 | 112 | Distributed table and database deletion 113 | --------------------------------------- 114 | 115 | Distributed table and database deletion were implemented. Watcher process (wmgr) does deletion on workers, 116 | and state is synchronized via CSS. 117 | :jirab:`DM-2522,DM-2622,DM-2624,DM-4206,DM-2625` 118 | 119 | .. _release-v12-0-qserv-mariadb: 120 | 121 | Qserv stack now based on MariaDB 122 | -------------------------------- 123 | 124 | Qserv and all associated services and libraries were ported from MySQL to MariaDB. Dependencies on mysql and 125 | mysqlclient were removed from the build. 126 | :jirab:`DM-224,DM-5319,DM-5125,DM-5122,DM-4705,DM-3949,DM-5026` 127 | 128 | .. _release-v12-0-qserv-czar-in-proxy: 129 | 130 | Czar now in-process with mysqlproxy 131 | ----------------------------------- 132 | The Qserv czar was previously wrapped with SWIG then hosted within a Python process which communicated with 133 | mysqlproxy over an XMLRPC interface implemented in Twisted and LUA. 
The czar has been reworked so it is 134 | now directly wrapped to LUA and brought into the mysqlproxy process. This allowed elimination of the XMLRPC 135 | wire protocol and associated code, elimination of several external library dependencies, and removal of all 136 | Python involvement from the proxy/czar process. 137 | :jirab:`DM-4348,DM-5307` 138 | 139 | .. _release-v12-0-qserv-docs: 140 | 141 | Documentation updates 142 | --------------------- 143 | 144 | Qserv user and installation documentation 145 | (`Qserv 2016_05 documentation `_) 146 | was updated/corrected. 147 | :jirab:`DM-5754,DM-4105` 148 | 149 | .. _release-v12-0-qserv-bug-fixes: 150 | 151 | Bug Fixes 152 | ========= 153 | 154 | - :ref:`release-v12-0-qserv-service-timeout` 155 | - :ref:`release-v12-0-qserv-testqdisp` 156 | - :ref:`release-v12-0-qserv-match-tables` 157 | 158 | .. _release-v12-0-qserv-service-timeout: 159 | 160 | Service timeout failure fix 161 | --------------------------- 162 | Qserv services would crash in some instances if left running for several days. The cause was tracked down 163 | to a missing null handle check in a mysql wrapper library, which was provoked when server connections would 164 | timeout. 165 | :jirab:`DM-5594` 166 | 167 | .. _release-v12-0-qserv-testqdisp: 168 | 169 | Intermittent testQdisp unit test failure 170 | ---------------------------------------- 171 | 172 | This was tracked down to a problem with the Executive class mocks used by the unit test. These mocks did 173 | not handle threading during cancellation correctly. 174 | :jirab:`DM-4928` 175 | 176 | .. _release-v12-0-qserv-match-tables: 177 | 178 | Data loader didn't work for match tables 179 | ---------------------------------------- 180 | 181 | The qserv-data-loader.py script was not invoking the correct partitioner for match tables, and was not 182 | passing all required CSS parameters down to the CSS update code. 183 | :jirab:`DM-3656` 184 | 185 | .. 
_release-v12-0-qserv-internal-improvements: 186 | 187 | Build and Code Improvements 188 | =========================== 189 | 190 | - :ref:`release-v12-0-qserv-stream-logs` 191 | - :ref:`release-v12-0-qserv-scons` 192 | - :ref:`release-v12-0-qserv-compilers` 193 | - :ref:`release-v12-0-qserv-style` 194 | - :ref:`release-v12-0-qserv-lib-updates` 195 | - :ref:`release-v12-0-qserv-dead-code` 196 | - :ref:`release-v12-0-qserv-docker` 197 | - :ref:`release-v12-0-qserv-integration-tests` 198 | - :ref:`release-v12-0-qserv-futurize` 199 | - :ref:`release-v12-0-qserv-worker-config` 200 | - :ref:`release-v12-0-qserv-taskmsgfactory2` 201 | - :ref:`release-v12-0-qserv-installation-files` 202 | 203 | .. _release-v12-0-qserv-stream-logs: 204 | 205 | Stream based logging macros 206 | --------------------------- 207 | 208 | Qserv was cut over to using stream based logging macros exclusively, and the boost format style logging 209 | macros (considered harmful) were removed from the LSST log package. A redundant logging wrapping layer 210 | in qserv was also removed. 211 | :jirab:`DM-4616,DM-5204,DM-5202,DM-3037` 212 | 213 | .. _release-v12-0-qserv-scons: 214 | 215 | Build improvements 216 | ------------------ 217 | 218 | Overly verbose build output from scons was greatly reduced. Scons files were reworked to treat shared 219 | libraries consistently, and some latent incorrect shared lib linkages were corrected. Scons files were also 220 | adjusted to avoid unnecessary copying of the source tree into the build tree. 221 | :jirab:`DM-3447,DM-2421,DM-4145,DM-3686,DM-3707` 222 | 223 | .. _release-v12-0-qserv-compilers: 224 | 225 | Compiler support 226 | ---------------- 227 | 228 | Issues were addressed to ensure that qserv builds and passes all unit tests on Linux with gcc 4.8.5 - 5.3.1, 229 | and on MacOSX with XCode 7.3.0. Warnings were addressed wherever possible, and the builds are now largely 230 | warning free except for some warnings produced by third-party library dependencies. 
Warnings generated by 231 | the Eclipse Neon C++ code analyzer were also addressed wherever possible. 232 | :jirab:`DM-3584,DM-3663,DM-3803,DM-3772,DM-3779,DM-3915,DM-4398,DM-4470,DM-4529,DM-4704,DM-5788,DM-6292` 233 | 234 | .. _release-v12-0-qserv-style: 235 | 236 | C++ style and conformance 237 | ------------------------- 238 | 239 | Various small systematic changes were made across the Qserv code base for style consistency. Anonymous 240 | namespaces were moved to the top level of translation units. A single space was added between "if" and 241 | the subsequent parenthesis. ``toString()`` functions were removed in favor of streaming operators. The non-standard 242 | ``uint`` type was replaced with ``unsigned int``. 243 | :jirab:`DM-4753,DM-3888,DM-2452,DM-3805` 244 | 245 | .. _release-v12-0-qserv-lib-updates: 246 | 247 | Library updates 248 | --------------- 249 | 250 | Qserv was rolled forward to scisql 0.3.5, mysqlproxy 0.8.5, and boost 1.60, and the latest changes from 251 | XRootD were incorporated. We also moved from using a forked version of the sphgeom library to following the 252 | tip of the official LSST version. 253 | :jirab:`DM-4938,DM-4786,DM-5394,DM-2178,DM-4092,DM-2334` 254 | 255 | .. _release-v12-0-qserv-dead-code: 256 | 257 | Dead code removal 258 | ----------------- 259 | 260 | Unused worker configuration templates and deprecated czar merging code were removed. Unused objectId 261 | hinting code was removed from the proxy LUA miniParser. 262 | :jirab:`DM-4440,DM-2320,DM-3952` 263 | 264 | .. _release-v12-0-qserv-docker: 265 | 266 | Docker improvements 267 | ------------------- 268 | 269 | Docker container build and deploy scripts continued to be extended, enhanced, and debugged. Scripts are 270 | currently based on shmux, and have been used for administration of multiple Qserv clusters 271 | at both NCSA and IN2P3. 272 | :jirab:`DM-3199,DM-6130,DM-4438,DM-5187,DM-5402,DM-4523,DM-5336` 273 | 274 | ..
_release-v12-0-qserv-integration-tests: 275 | 276 | Integration test improvements 277 | ----------------------------- 278 | 279 | Integration tests were added involving blobs and non-box spatial constraints. Additionally, a facility to 280 | reset the empty chunk list in the czar was added, which greatly streamlines the integration tests. 281 | :jirab:`DM-991,DM-2900,DM-4383` 282 | 283 | .. _release-v12-0-qserv-futurize: 284 | 285 | Modernize Python code in Qserv admin tools 286 | ------------------------------------------ 287 | 288 | Python admin scripts were run through "futurize -1". One print change was made to runQueries.py. 289 | :jirab:`DM-6324` 290 | 291 | .. _release-v12-0-qserv-worker-config: 292 | 293 | Worker configuration files 294 | -------------------------- 295 | 296 | INI-style configuration file support was added for the worker, so that shared scans can be configured 297 | without resorting to environment variables. 298 | :jirab:`DM-5209` 299 | 300 | .. _release-v12-0-qserv-taskmsgfactory2: 301 | 302 | Rename TaskMsgFactory2 303 | ---------------------- 304 | 305 | ``TaskMsgFactory2`` was renamed to ``TaskMsgFactory``. 306 | :jirab:`DM-2060` 307 | 308 | .. _release-v12-0-qserv-installation-files: 309 | 310 | Clean up installation files 311 | --------------------------- 312 | 313 | Directories cfg/ and proxy/ in the Qserv install tree were moved under share/ and lib/ for consistency. 314 | :jirab:`DM-1355` 315 | -------------------------------------------------------------------------------- /releases/v13_0_qserv_dax.rst: -------------------------------------------------------------------------------- 1 | ..
_release-v13-0-qserv-dax: 2 | 3 | ########################################## 4 | Fall 2016 Qserv and Data Access Highlights 5 | ########################################## 6 | 7 | - Query analysis fixes (more robust handling of ORDER BY, a fix for missed usages of the chunk/subchunk secondary index to limit query dispatch to involved chunks). 8 | 9 | - Shared scan improvements (fixes for the "snail scan" long-running outlier query scan, scan table memory locking fixes/improvements). 10 | 11 | - Many containerization improvements (container sizes reduced by removing mariadb unit tests and intermediate compilation products, container-host timezone sync, more robust build and deploy scripts). 12 | 13 | - Database connection management fixes for the wmgr service to enable parallelized usage of the Qserv loader script. 14 | 15 | - XRootD now logs via the LSST log package. 16 | -------------------------------------------------------------------------------- /releases/v13_0_sui.rst: -------------------------------------------------------------------------------- 1 | .. _release-v13-0-sui: 2 | 3 | ########################################### 4 | Fall 2016 Science User Interface Highlights 5 | ########################################### 6 | 7 | - PDAC v1 was deployed at NCSA. 8 | It provides data query and display of SDSS Stripe 82 data processed with the LSST pipeline stack in 2013. 9 | Please see `the PDAC v1 guide `_ for details and an access guide. 10 | 11 | - Finished the Firefly client-side code rewrite in JavaScript using the React/Redux framework. 12 | The binary release is available from https://github.com/Caltech-IPAC/firefly/releases. 13 | 14 | - Major visualization feature improvements: 15 | 16 | - Firefly JavaScript API and Python API improvements, providing more control over Firefly visualization components and features. The Python API, ``firefly_client``, is pip and eups installable. 17 | - Phase folding capabilities for time series data, and light curve plots.
18 | - Overlay layer on images: 19 | 20 | - LSST mask overlay 21 | - User can change overlay symbol shape, size, and color. 22 | 23 | - Charts redesign and expansion of capabilities: 24 | 25 | - Multiple charts of different columns from the same data sets; display an XY 2D plot and a histogram at the same time. 26 | - XY 2D plots with error bars 27 | 28 | - Improved the ``firefly_display`` backend for ``afw.display``, using ``firefly_client``. 29 | - Developed two Jupyter widgets for Firefly. 30 | 31 | -------------------------------------------------------------------------------- /releases/v15_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v15-0: 2 | 3 | ######################### 4 | Release 15.0 (2018-04-06) 5 | ######################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v15_0 11 | 12 | .. warning:: 13 | 14 | This will be the last major release that supports Python 2. 15 | From release v16.0 on, only Python 3 will be explicitly supported. 16 | Minor (bugfix) releases of v15 will continue to support Python 2. 17 | 18 | +-------------------------------------------+------------+ 19 | | Source | Identifier | 20 | +===========================================+============+ 21 | | Git tag | 15.0 | 22 | +-------------------------------------------+------------+ 23 | | :doc:`EUPS distrib ` | v15\_0 | 24 | +-------------------------------------------+------------+ 25 | 26 | This release is based on the ``w_2018_10`` weekly build. 27 | 28 | These release notes highlight significant changes to the Science Pipelines codebase which are likely to be of wide interest. 29 | For a complete list of changes made, see :doc:`tickets/v15_0`. 30 | 31 | If you have questions or comments about this release, visit our `Community Forum `_ for advice.
32 | 33 | - :ref:`release-v15-0-major-changes` 34 | 35 | *See also:* 36 | 37 | - :doc:`Installation instructions <../install/index>` 38 | - :doc:`Known issues ` 39 | - `Characterization Metric Report (DMTR-62) `_ 40 | 41 | .. _release-v15-0-major-changes: 42 | 43 | Major Functionality and Interface Changes 44 | ========================================= 45 | 46 | - :ref:`release-v15-0-diffim` 47 | - :ref:`release-v15-0-mem` 48 | - :ref:`release-v15-0-artifacts` 49 | - :ref:`release-v15-0-skysub` 50 | - :ref:`release-v15-0-wcs` 51 | - :ref:`release-v15-0-firefly` 52 | - :ref:`release-v15-0-compression` 53 | - :ref:`release-v15-0-cmdline` 54 | - :ref:`release-v15-0-prereqs` 55 | 56 | .. _release-v15-0-diffim: 57 | 58 | Image differencing algorithm improvements 59 | ----------------------------------------- 60 | 61 | Algorithms appropriate for differencing in a variety of contexts have been implemented as tasks in the LSST stack. These various algorithms and configurations have been compared. The results are written up in `DMTN-061 `_. This represents a significant step forward in being able to determine the baseline image differencing algorithm. Relevant code changes are in the ``ip_diffim`` `package `_. 62 | 63 | .. _release-v15-0-mem: 64 | 65 | Performance improvements in coaddition 66 | -------------------------------------- 67 | 68 | Now all coaddition algorithms have significantly reduced memory footprints. 69 | 70 | .. _release-v15-0-artifacts: 71 | 72 | Significantly improved artifact rejection in coaddition 73 | ------------------------------------------------------- 74 | 75 | Coaddition algorithms that do artifact clipping can now handle artifacts that overlap from epoch to epoch. ``SafeClipAssembleCoaddTask`` and ``CompareWarpAssembleCoaddTask`` are the two examples. 76 | 77 | .. 
_release-v15-0-skysub: 78 | 79 | Full focal plane sky subtraction 80 | -------------------------------- 81 | 82 | There are now tasks to create and apply models of the sky that extend over the entire field of view. View notes in `the LSST Community forum post `_. 83 | 84 | .. _release-v15-0-wcs: 85 | 86 | Replace all ``Wcs`` classes with the AST-backed ``SkyWcs`` 87 | ---------------------------------------------------------- 88 | 89 | The last release introduced a transform system backed by the `AST `_ package. Since that release, the stack has been converted to using that system in all contexts where a world coordinate system is required. 90 | 91 | .. _release-v15-0-firefly: 92 | 93 | Plotting frontend for Firefly 94 | ----------------------------- 95 | 96 | This release includes, for the first time, the package that allows the LSST plotting abstraction layer to plot directly in the Science Portal plotting tool, Firefly. View ``display_firefly`` on `GitHub `_. 97 | 98 | .. _release-v15-0-compression: 99 | 100 | Lossless compression on by default 101 | ---------------------------------- 102 | 103 | Lossless compression is turned on by default when persisting any image-like data product. See :jira:`RFC-378` and the notes in `this LSST Community forum post `_ and links therein. 104 | 105 | .. _release-v15-0-cmdline: 106 | 107 | Changes to command-line tasks 108 | ----------------------------- 109 | 110 | Command-line tasks now handle clobbering of versions, data, and configs in a more intuitive way. For example, output repositories are now expected to differ from the input repository. This eliminates the need to explicitly turn on clobbering when making multiple runs to different outputs (reruns) when using the same inputs. Additional details are in an `LSST Community forum post `_. 111 | 112 | .. _release-v15-0-prereqs: 113 | 114 | Updated pre-requisites 115 | ---------------------- 116 | 117 | Pre-requisites for installing the science pipelines have been updated.
Notably, ``numpy 1.13``, ``astropy 2.0``, and ``matplotlib 2.0`` are all now required. The baseline version of ``Python`` is now ``Python 3.6``. See the announcement in `this post on the LSST Community forum `_. 118 | -------------------------------------------------------------------------------- /releases/v16_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v16-0: 2 | 3 | ######################### 4 | Release 16.0 (2018-06-28) 5 | ######################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v16_0 11 | 12 | .. warning:: 13 | 14 | :ref:`release-v16-0-py3`. 15 | 16 | +-------------------------------------------+------------+ 17 | | Source | Identifier | 18 | +===========================================+============+ 19 | | Git tag | 16.0 | 20 | +-------------------------------------------+------------+ 21 | | :doc:`EUPS distrib ` | v16\_0 | 22 | +-------------------------------------------+------------+ 23 | 24 | This release is based on the ``w_2018_22`` weekly build. 25 | 26 | These release notes highlight significant changes to the Science Pipelines codebase which are likely to be of wide interest. 27 | For a complete list of changes made, see :doc:`tickets/v16_0`. 28 | 29 | If you have questions or comments about this release, visit our `Community Forum `_ for advice. 30 | 31 | - :ref:`release-v16-0-major-changes` 32 | 33 | *See also:* 34 | 35 | - :doc:`Installation instructions <../install/index>` 36 | - :doc:`Known issues ` 37 | - `Characterization Metric Report (DMTR-81) `_ 38 | 39 | .. _release-v16-0-major-changes: 40 | 41 | Major Functionality and Interface Changes 42 | ========================================= 43 | 44 | - :ref:`release-v16-0-new-geom` 45 | - :ref:`release-v16-0-decam-ingest` 46 | - :ref:`release-v16-0-py3` 47 | - :ref:`release-v16-0-starselector` 48 | - :ref:`release-v16-0-selectimages` 49 | 50 | ..
_release-v16-0-new-geom: 51 | 52 | Reworked “geom” package, replacing much of “afw.geom” 53 | ----------------------------------------------------- 54 | 55 | The ``Angle``, ``Point``, ``Extent``, ``SpherePoint``, ``Box``, ``LinearTransform``, and ``AffineTransform`` primitives have been moved from afw.geom to geom. 56 | Aliases for compatibility purposes remain within afw for this release, but new code should use the geom package. 57 | For further details, refer to `this community.lsst.org post`__. 58 | 59 | __ https://community.lsst.org/t/new-geom-package-replaces-much-of-lsst-afw-geom/2932 60 | 61 | .. _release-v16-0-decam-ingest: 62 | 63 | DECam ingest now defaults to raw image 64 | -------------------------------------- 65 | 66 | ``ingestImagesDecam.py`` (encapsulating ``DecamIngestTask``) now ingests raw images by default instead of “instcals”. 67 | This change means that ``DecamIngestTask`` behaves more like ``IngestTask`` and other ingestion tasks. 68 | For further details, refer to `this community.lsst.org post`__. 69 | 70 | __ https://community.lsst.org/t/ingestimagesdecam-py-default-changed/2915 71 | 72 | .. _release-v16-0-py3: 73 | 74 | Python 2 is no longer supported 75 | ------------------------------- 76 | 77 | This release is no longer tested on or expected to work with Python 2. 78 | For further details, refer to `this community.lsst.org post`__. 79 | 80 | __ https://community.lsst.org/t/python-2-no-longer-supported/2845 81 | 82 | .. _release-v16-0-starselector: 83 | 84 | Improvements to the ``StarSelector`` interface 85 | ---------------------------------------------- 86 | 87 | The call signatures for source and star selectors have been modernized and unified as described in :jira:`RFC-198`. 88 | Some legacy code has not yet adopted the new interface. 89 | 90 | ..
_release-v16-0-selectimages: 91 | 92 | Automatically select the :math:`N` images with the best seeing when building templates 93 | -------------------------------------------------------------------------------------- 94 | 95 | The ``MaxPsfWcsSelectImagesTask`` can now automatically determine the thresholds needed to select images with an appropriate PSF when building templates, rather than having them specified by the user. 96 | For details, refer to :jira:`DM-11953`. 97 | -------------------------------------------------------------------------------- /releases/v18_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v18-0-0: 2 | 3 | ########################### 4 | Release 18.0.0 (2019-07-09) 5 | ########################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v18_0_0 11 | 12 | +-------------------------------------------+------------+ 13 | | Source | Identifier | 14 | +===========================================+============+ 15 | | Git tag | 18.0.0 | 16 | +-------------------------------------------+------------+ 17 | | :doc:`EUPS distrib ` | v18\_0\_0 | 18 | +-------------------------------------------+------------+ 19 | 20 | This release is based on the ``w_2019_23`` weekly build. 21 | 22 | These release notes highlight significant changes to the Science Pipelines codebase which are likely to be of wide interest. 23 | For a complete list of changes made, see :doc:`tickets/v18_0_0`. 24 | 25 | As of this release, we have adopted the “`Semantic Versioning `_” system, version 2.0.0. 26 | In this system, releases are described by a three-part number representing the major (18 in this case), minor (0), and patch (0) versions. 27 | The major version is incremented on incompatible API changes; the minor version when new functionality is added; and the patch version when bug fixes are made. 28 | For further details, refer to :dmtn:`106`.
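The three-part scheme can be illustrated with a short, generic sketch (this is not part of the LSST tooling; the function name is invented for illustration):

```python
def parse_version(version):
    """Split a semantic version string into (major, minor, patch) integers."""
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

# Release 18.0.0: major version 18, minor version 0, patch version 0.
print(parse_version("18.0.0"))  # → (18, 0, 0)
```

Under Semantic Versioning, code written against an 18.x release can expect to keep working across minor and patch releases, but not necessarily across a major-version increase.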
29 | 30 | If you have questions or comments about this release, visit our `Community Forum `_ for advice. 31 | 32 | - :ref:`release-v18-0-0-functionality` 33 | - :ref:`release-v18-0-0-interface` 34 | - :ref:`release-v18-0-0-pending-deprecations` 35 | - :ref:`release-v18-0-0-deprecations` 36 | 37 | *See also:* 38 | 39 | - :doc:`Installation instructions <../install/index>` 40 | - :doc:`Known issues ` 41 | - `Characterization Metric Report (DMTR-141) `_ 42 | - `Doxygen Documentation`__ 43 | 44 | __ http://doxygen.lsst.codes/stack/doxygen/xlink_master_2019_06_08_08.07.58/ 45 | 46 | .. _release-v18-0-0-functionality: 47 | 48 | Major Functionality Changes 49 | =========================== 50 | 51 | - :ref:`release-v18-0-0-ngmix` 52 | - :ref:`release-v18-0-0-fakes` 53 | - :ref:`release-v18-0-0-fgcm` 54 | - :ref:`release-v18-0-0-timeseries` 55 | 56 | .. _release-v18-0-0-ngmix: 57 | 58 | Improved integration with the ngmix package 59 | ------------------------------------------- 60 | 61 | meas_extensions_ngmix, which provides an interface between the LSST codebase and Erin Sheldon's `ngmix`__ package, has been substantially rewritten. 62 | This new package exposes more ngmix functionality, and may act as the basis for future development of the LSST measurement framework. 63 | For more information, refer to :jira:`DM-16268`. 64 | 65 | __ https://github.com/esheldon/ngmix 66 | 67 | .. _release-v18-0-0-fakes: 68 | 69 | New tasks for inserting “fake” sources 70 | -------------------------------------- 71 | 72 | New command-line tasks have been added for inserting simulated sources into single-epoch (:command:`processFakes.py`) and coadded (:command:`insertFakes.py`) data. 73 | For more information, refer to `this community.lsst.org post`__. 74 | 75 | __ https://community.lsst.org/t/new-tasks-for-fake-source-insertion/3722 76 | 77 | ..
_release-v18-0-0-fgcm: 78 | 79 | FGCM can now use reference stars 80 | -------------------------------- 81 | 82 | By default, fgcmcal — LSST's implementation of the Forward Global Calibration Method (`Burke et al., 2018`__) — does not use an external reference catalog, but solves internally to produce a relative calibration. 83 | It has now been upgraded to optionally load a set of reference stars, and use them to produce an absolute calibration. 84 | For more information, refer to :jira:`DM-16702`. 85 | 86 | __ http://adsabs.harvard.edu/abs/2018AJ....155...41B 87 | 88 | .. _release-v18-0-0-timeseries: 89 | 90 | DIAObjects now include basic lightcurve characterization 91 | -------------------------------------------------------- 92 | 93 | DIAObjects generated by the Alert Production system now include a basic set of features extracted from the object's light curve (mean fluxes, best-fit slope, etc.). 94 | While it is hoped that these features are intrinsically useful, they also serve to exercise the machinery that will be used to calculate the full set of features currently being developed in conjunction with the relevant science collaborations. 95 | For more details about this functionality, refer to :jira:`DM-18318`; for information about the ultimate set of features which will be computed, refer to :jira:`DM-11962` and :lse:`163`. 96 | 97 | .. _release-v18-0-0-interface: 98 | 99 | Significant Interface Changes 100 | ============================= 101 | 102 | - :ref:`release-v18-0-0-propertyset` 103 | - :ref:`release-v18-0-0-ap_pipe` 104 | - :ref:`release-v18-0-0-selectors` 105 | - :ref:`release-v18-0-0-calib` 106 | 107 | .. _release-v18-0-0-propertyset: 108 | 109 | Python interface to PropertySet and PropertyList changed 110 | -------------------------------------------------------- 111 | 112 | ``__getitem__``, ``update``, and ``get`` methods have been added to the Python interfaces of `~lsst.daf.base.PropertySet` and `~lsst.daf.base.PropertyList`.
113 | This means that they can be used in the same way as native Python dictionaries. 114 | It is also possible to store the value `None` in a `~lsst.daf.base.PropertySet`. 115 | For more information, refer to `this community.lsst.org post`__ and :jira:`RFC-596`. 116 | 117 | __ https://community.lsst.org/t/changes-to-propertyset-and-propertylist-python-interface/3715 118 | 119 | .. _release-v18-0-0-ap_pipe: 120 | 121 | Alert Production Pipeline command-line interface changed 122 | -------------------------------------------------------- 123 | 124 | The :command:`ap_pipe.py` command will no longer try to create a “prompt products” database when it is executed. 125 | This change makes the pipeline better able to run in different environments, including large-scale testing and operations. 126 | An appropriate database should now be created in advance, either by using the :command:`make_ppdb.py` command, or by configuring the pipeline to use an externally-provided database. 127 | For more information, refer to `this community.lsst.org post`__ and :jira:`RFC-587`. 128 | 129 | __ https://community.lsst.org/t/ap-pipeline-ap-pipe-py-command-line-interface-change/3646 130 | 131 | .. _release-v18-0-0-selectors: 132 | 133 | New configurations for AstrometryTask source and reference selectors 134 | -------------------------------------------------------------------- 135 | 136 | The configuration options for single-frame astrometry (as implemented in :lsst-task:`lsst.meas.astrom.AstrometryTask`) have changed. 137 | This fixes various bugs where (a) selections of the reference catalog were performed only in some modes of operation; and (b) multiple conflicting selections of the source catalog could be performed in some modes of operation. 138 | All obs package defaults have been updated to reflect the new changes; you need only worry about these changes if you have overridden the obs package defaults. 
139 | For more information, refer to `this community.lsst.org post`__ and :jira:`RFC-589`. 140 | 141 | __ https://community.lsst.org/t/new-configurations-for-astrometrytask-source-and-reference-selectors/3661 142 | 143 | .. _release-v18-0-0-calib: 144 | 145 | `lsst.afw.image.Calib` removed 146 | ------------------------------ 147 | 148 | `~lsst.afw.image.Calib`, which provided only a photometric zeropoint per CCD, has been replaced with `~lsst.afw.image.PhotoCalib`, which provides a spatially-varying photometric model. 149 | Some `~lsst.afw.image.Calib` interfaces are supported by `~lsst.afw.image.PhotoCalib`, but full API compatibility is not possible; using the old-style interfaces is deprecated, and they will be removed following this release. 150 | `~lsst.afw.image.PhotoCalib` is able to read files persisted with `~lsst.afw.image.Calib` objects, so backwards compatibility of on-disk data is maintained. 151 | For more information, refer to :jira:`RFC-289` and :jira:`RFC-573`. 152 | 153 | .. _release-v18-0-0-pending-deprecations: 154 | 155 | Pending Deprecations 156 | ==================== 157 | 158 | These packages/functions will be deprecated in the next major release. 159 | 160 | - :ref:`release-v18-0-0-deprecate-gen2` 161 | - :ref:`release-v18-0-0-deprecate-lsstsim` 162 | - :ref:`release-v18-0-0-deprecate-afwGeom` 163 | 164 | .. _release-v18-0-0-deprecate-gen2: 165 | 166 | Upcoming removal of “Generation 2” Middleware 167 | --------------------------------------------- 168 | 169 | The “Generation 3” middleware :ref:`included in the previous release ` is ultimately intended to supplant the current (“Generation 2”) Data Butler and command-line task functionality. 170 | We expect to deliver a final major release supporting the Generation 2 functionality in late calendar year 2019 (likely version 19.0.0, but that remains to be confirmed). 171 | Following that release, the “Generation 2” middleware will be removed from the codebase. 
172 | This will include: 173 | 174 | - The daf_persistence package, to be replaced by daf_butler; 175 | - `lsst.pipe.base.CmdLineTask`, to be replaced by `lsst.pipe.base.PipelineTask`; 176 | - The pipe_drivers and ctrl_pool packages, for which replacements are still in development. 177 | 178 | .. _release-v18-0-0-deprecate-lsstSim: 179 | 180 | Upcoming removal of the obs_lsstSim package 181 | ------------------------------------------- 182 | 183 | The obs_lsst package, :ref:`included in the previous release `, obviates the need for the obs_lsstSim package. 184 | All LSST code is expected to transition to the new system later in summer 2019. 185 | Some work will be required to update old data repositories to the new system. 186 | A final release containing obs_lsstSim will be made in late 2019, after which the package will be retired. 187 | 188 | .. _release-v18-0-0-deprecate-afwGeom: 189 | 190 | Upcoming removal of `lsst.afw.geom` classes that have been relocated to `lsst.geom` 191 | ----------------------------------------------------------------------------------- 192 | 193 | As announced in v16.0 (:ref:`release-v16-0-new-geom`), some primitives have been moved from `afw.geom` to `geom`. 194 | We currently provide aliases for compatibility purposes, but new code should use the `geom` package. 195 | These aliases will be removed after version 19.0.0 is released. 196 | 197 | .. _release-v18-0-0-deprecations: 198 | 199 | Deprecations 200 | ============ 201 | 202 | These packages/functions are deprecated and will not be available in the next major release. 203 | 204 | - :ref:`release-v18-0-0-deprecate-calib` 205 | - :ref:`release-v18-0-0-deprecate-ap-silent` 206 | - :ref:`release-v18-0-0-deprecate-isr` 207 | - :ref:`release-v18-0-0-deprecate-meas_algorithms` 208 | 209 | ..
_release-v18-0-0-deprecate-calib: 210 | 211 | Upcoming removal of `lsst.afw.image.Calib` compatibility API 212 | ------------------------------------------------------------ 213 | 214 | This release includes (partial) backwards compatibility with the now removed (:ref:`see above `) `~lsst.afw.image.Calib` API. 215 | This will be removed before the next release. 216 | 217 | .. _release-v18-0-0-deprecate-ap-silent: 218 | 219 | Upcoming removal of the ``--silent`` argument to :command:`ap_verify.py` 220 | ------------------------------------------------------------------------ 221 | 222 | The ``--silent`` argument formerly disabled the upload of metrics from :command:`ap_verify.py` to `SQuaSH`__. 223 | The capability to upload metrics has been removed from :command:`ap_verify.py` (see :jira:`DM-16536`), but ``--silent`` has been retained as a no-op for compatibility reasons. 224 | It will be removed before the next release. 225 | 226 | .. _release-v18-0-0-deprecate-isr: 227 | 228 | Upcoming removal of `ip_isr` functions from `isrFunctions.py` 229 | ------------------------------------------------------------- 230 | 231 | These functions are replaced by functionality in `lsst.meas.algorithms.Defects`: 232 | 233 | - ``defectListFromFootprintList`` replaced by ``Defects.fromFootPrintList()`` 234 | - ``transposeDefectList`` replaced by ``Defects.transpose()`` 235 | - ``maskPixelsFromDefectList`` replaced by ``Defects.maskPixels()`` 236 | - ``getDefectListFromMask`` replaced by ``Defects.fromMask()`` 237 | 238 | .. _release-v18-0-0-deprecate-meas_algorithms: 239 | 240 | Upcoming removal of `meas_algorithms` functions from `defects.py` 241 | ----------------------------------------------------------------- 242 | 243 | - ``policyToBadRegionList``: policy defect files are no longer supported.
244 | 245 | __ https://squash.lsst.codes 246 | 247 | -------------------------------------------------------------------------------- /releases/v18_1_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v18-1-0: 2 | 3 | ########################### 4 | Release 18.1.0 (2019-08-08) 5 | ########################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v18_1_0 11 | 12 | +-------------------------------------------+------------+ 13 | | Source | Identifier | 14 | +===========================================+============+ 15 | | Git tag | 18.1.0 | 16 | +-------------------------------------------+------------+ 17 | | :doc:`EUPS distrib ` | v18\_1\_0 | 18 | +-------------------------------------------+------------+ 19 | 20 | This is a minor feature release, based on the :ref:`release-v18-0-0`, and requested by :jira:`RFC-618`. 21 | 22 | It contains two changes relative to the :ref:`release-v18-0-0`: 23 | 24 | - `lsst.daf.base.PropertySet` now supports storing unsigned 64-bit integers. 25 | For more information, refer to :jira:`DM-20506` and :jira:`DM-20664`. 26 | - The version number of the eigen package has changed from 3.3.4.lsst1-1-g221959b to 3.3.7. 27 | There are no changes to the contents of this package; the version included with release 18.0.0 was incorrectly labelled. 28 | 29 | There are no changes to interfaces or deprecations in this release. 30 | 31 | If you have questions or comments about the LSST Science Pipelines, visit our community forum for advice: https://community.lsst.org/. 32 | -------------------------------------------------------------------------------- /releases/v22_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v22-0-0: 2 | 3 | ########################### 4 | Release 22.0.0 (2021-07-09) 5 | ########################### 6 | 7 | .. 
toctree:: 8 | :hidden: 9 | 10 | tickets/v22_0_0 11 | 12 | +-------------------------------------------+------------+ 13 | | Source | Identifier | 14 | +===========================================+============+ 15 | | Git tag | 22.0.0 | 16 | +-------------------------------------------+------------+ 17 | | :doc:`EUPS distrib ` | v22\_0\_0 | 18 | +-------------------------------------------+------------+ 19 | | rubin-env version | 0.4.3 | 20 | +-------------------------------------------+------------+ 21 | 22 | This release is based on the ``w_2021_14`` weekly build. 23 | The bug fix from :jirab:`DM-29907` was backported. 24 | 25 | The notes below highlight significant technical changes to the Science Pipelines codebase in this release. 26 | For a complete list of changes made, see :doc:`tickets/v22_0_0`. 27 | 28 | The `Characterization Metric Report (DMTR-311) `_ describes the scientific performance of this release in terms of a standard set of performance metrics. 29 | 30 | If you have questions or comments about this release, visit our `community forum `_ for advice. 31 | 32 | - :ref:`release-v22-0-0-functionality` 33 | - :ref:`release-v22-0-0-interface` 34 | - :ref:`release-v22-0-0-pending-deprecations` 35 | - :ref:`release-v22-0-0-deprecations` 36 | 37 | *See also:* 38 | 39 | - :doc:`Installation instructions ` 40 | - :doc:`Known issues ` 41 | - `Doxygen Documentation`__ 42 | 43 | __ http://doxygen.lsst.codes/stack/doxygen/xlink_master_2021_04_01_08.31.19/ 44 | 45 | 46 | .. _release-v22-0-0-functionality: 47 | 48 | Major New Features 49 | ================== 50 | 51 | - :ref:`release-v22-0-0-gen3` 52 | - :ref:`release-v22-0-0-meas_extensions_piff` 53 | - :ref:`release-v22-0-0-scarlet` 54 | - :ref:`release-v22-0-0-dustmaps` 55 | - :ref:`release-v22-0-0-faro` 56 | 57 | .. _release-v22-0-0-gen3: 58 | 59 | Generation 3 Middleware 60 | ----------------------- 61 | Major improvements have been made to the Generation 3 middleware.
Some highlights include: 62 | 63 | * Usability improvements to `~lsst.pipe.base.PipelineTask` including URI support. 64 | * Additional :doc:`butler command-line commands `, including 65 | 66 | * ``butler prune-datasets``, 67 | * ``butler query-dimension-records``, 68 | * and ``butler associate``. 69 | 70 | * Allow `lsst.daf.butler.Butler.get` to support dimension record values, such as exposure observing day or detector name, in the data ID. 71 | * :doc:`pipetask run ` can now execute a subset of a graph. This allows a single graph file to be created for an entire workflow and then only part of it to be executed, which is necessary for large-scale workflow execution. 72 | 73 | At the time of this release, :ref:`Gen 2 middleware is now deprecated `. 74 | 75 | 76 | .. _release-v22-0-0-meas_extensions_piff: 77 | 78 | Initial integration of Piff 79 | --------------------------- 80 | `Piff `_ has been added to rubin-env as a third-party package. 81 | A new package, `meas_extension_piff`_ (included in ``lsst_distrib``), integrates Piff with the pipelines. 82 | Tasks can be configured to use Piff PSF models instead of the current default, PSFex. 83 | :jirab:`RFC-755` 84 | 85 | .. _meas_extension_piff: https://github.com/lsst/meas_extensions_piff 86 | 87 | .. _release-v22-0-0-scarlet: 88 | 89 | Coadd measurement now uses the Scarlet deblender by default 90 | ----------------------------------------------------------- 91 | `Scarlet ` is now the default deblender used for measurements on coadds. 92 | The configuration for single-frame measurement has not changed. 93 | Scarlet produces additional flags for filtering duplicates, as it handles isolated sources differently: ``deblend_parentNPeaks``, ``deblend_parentNChild``. 94 | The flag ``detect_isPrimary`` is still populated and is the recommended column. 95 | See the `community.lsst.org post `_ for more information. 96 | :jirab:`RFC-745` 97 | 98 | .. 
_release-v22-0-0-dustmaps: 99 | 100 | Dust Maps 101 | --------- 102 | The `dustmaps_cachedata `_ package has been added to the pipelines, and the `dustmaps `_ module is a third-party dependency. 103 | :jirab:`RFC-752` 104 | 105 | .. _release-v22-0-0-faro: 106 | 107 | Faro package for scientific performance metrics 108 | ----------------------------------------------- 109 | A package for measuring scientific performance metrics, :ref:`faro `, was added. 110 | It implements the Gen3 versions of the tasks in :ref:`validate_drp `. See :ref:`release-v22-0-0-deprecate-validate-drp`. 111 | This package was used to generate the Characterization Report for this release. 112 | :jirab:`RFC-753, DM-28351` 113 | 114 | 115 | 116 | .. _release-v22-0-0-interface: 117 | 118 | Significant Interface Changes 119 | ============================= 120 | 121 | - :ref:`release-v22-0-0-filterLabel` 122 | - :ref:`release-v22-0-0-remove-obs_ctio0m9` 123 | - :ref:`release-v22-0-0-remove-metric-in-commonMetrics.py` 124 | 125 | 126 | .. _release-v22-0-0-filterLabel: 127 | 128 | Replace afw.image.Filter with FilterLabel 129 | ----------------------------------------- 130 | Filter information is now stored as a `FilterLabel `_, replacing the existing ``lsst.afw.image.Filter``. 131 | This class replaces ``Filter``'s system of names, canonical names, and aliases with just two names: a band (e.g., “g” or “r”) and a physical filter (e.g., “g DECam SDSS c0001 4720.0 1520.0” or “HSC-R2”). 132 | Note that not all ``FilterLabel`` objects have both a band and a physical filter, especially during the transition period, so please program defensively. 133 | See `community.lsst.org `__ for details. 134 | :jirab:`RFC-730` 135 | 136 | .. _release-v22-0-0-remove-obs_ctio0m9: 137 | 138 | Removal of the obs_ctio0m9 package 139 | ---------------------------------- 140 | 141 | The ``obs_ctio0m9`` camera package has been removed. 142 | :jirab:`RFC-729, DM-26867, DM-26868` 143 | 144 | 145 | .. 
_release-v22-0-0-remove-metric-in-commonMetrics.py: 146 | 147 | Removal of metric configurations in commonMetrics.py 148 | ---------------------------------------------------- 149 | 150 | The :code:`metric` field has been replaced by :code:`connections.package` and 151 | :code:`connections.metric`. 152 | 153 | 154 | .. _release-v22-0-0-pending-deprecations: 155 | 156 | Pending Deprecations 157 | ==================== 158 | 159 | These tasks will be deprecated in the next major release. 160 | 161 | - :ref:`release-v22-0-0-deprecate-decamRawIngestTask` 162 | 163 | .. _release-v22-0-0-deprecate-decamRawIngestTask: 164 | 165 | Deprecated DecamRawIngestTask and MegaPrimeRawIngestTask 166 | ----------------------------------------------------------------- 167 | MegaPrime and DECam no longer require a specialist Gen3 ingest task. 168 | Please use the default `~lsst.obs.base.RawIngestTask`. 169 | Both `~lsst.obs.decam.DecamRawIngestTask` and `~lsst.obs.cfht.MegaPrimeRawIngestTask` will be removed after v23. 170 | 171 | .. _release-v22-0-0-deprecations: 172 | 173 | Deprecations 174 | ============ 175 | 176 | These packages/functions are deprecated and will not be available in the next major release. 177 | 178 | - :ref:`release-v22-0-0-deprecate-gen2` 179 | - :ref:`release-v22-0-0-deprecate-Filter` 180 | - :ref:`release-v22-0-0-deprecate-validate-drp` 181 | - :ref:`release-v22-0-0-deprecate-configurations-in-fgcmFitCycle.py` 182 | - :ref:`release-v22-0-0-deprecate-configurations-in-psfexStarSelector.py` 183 | 184 | .. _release-v22-0-0-deprecate-gen2: 185 | 186 | Deprecated Generation 2 Middleware 187 | ---------------------------------- 188 | Generation 2 middleware (Gen2) is no longer being developed and should not be used for new code. 189 | Gen2 infrastructure code (within e.g., :ref:`pipe_tasks `, :ref:`pipe_base `, :ref:`obs_base `, `daf_persistence `_, and obs packages) will no longer be maintained after January 1, 2022, and may be removed at any point afterward. 
190 | The CI package ``ci_hsc_gen2`` will continue to be run, and the tasks it checks will be maintained until we remove the Gen2 infrastructure code in 2022. 191 | Following this release, we will begin to drop Gen2 pipelines from our verification packages (e.g., ``ap_verify``) where a functional and validated Gen3 pipeline exists. 192 | 193 | .. _release-v22-0-0-deprecate-Filter: 194 | 195 | Removal of ``lsst.afw.image.Filter`` 196 | ------------------------------------ 197 | The `lsst.afw.image.Filter` class has been replaced with :ref:`filterLabel `, marked as deprecated, and will be removed after this release. Method names that contain ``Filter``, such as ``getFilter``, have been replaced with ``FilterLabel`` equivalents (e.g., ``getFilterLabel``). 198 | :jirab:`RFC-730` 199 | 200 | .. _release-v22-0-0-deprecate-validate-drp: 201 | 202 | Removal of validate_drp 203 | ----------------------- 204 | The algorithms implemented in :ref:`validate_drp ` were ported as-is to run in :ref:`faro `. 205 | All future development of scientific performance metrics will be carried out in faro. 206 | :ref:`validate_drp ` will be removed after this release. 207 | 208 | 209 | .. _release-v22-0-0-deprecate-doApplyUberCal-in-forcedPhotCcd.py: 210 | 211 | Deprecated doApplyUberCal in forcedPhotCcd.py 212 | --------------------------------------------- 213 | 214 | The field :code:`doApplyUberCal` is deprecated. 215 | Use :code:`doApplyExternalPhotoCalib` and :code:`doApplyExternalSkyWcs` instead. 216 | It will be removed before the 23.0.0 release. 217 | :jirab:`DM-23352` 218 | 219 | 220 | .. 
_release-v22-0-0-deprecate-configurations-in-isrTask.py: 221 | 222 | Deprecated configurations in isrTask.py 223 | --------------------------------------- 224 | 225 | The following configurations will be removed before the 23.0.0 release: 226 | 227 | * :code:`overscanFitType` 228 | * :code:`overscanOrder` 229 | * :code:`overscanNumSigmaClip` 230 | * :code:`overscanIsInt` 231 | 232 | Please configure overscan via the :code:`OverscanCorrectionConfig` interface. 233 | :jirab:`DM-23396` 234 | 235 | .. _release-v22-0-0-deprecate-configurations-in-fgcmFitCycle.py: 236 | 237 | Deprecated configurations in fgcmFitCycle.py 238 | -------------------------------------------- 239 | 240 | The following configurations are no longer used and will be removed before the 23.0.0 release: 241 | 242 | * :code:`fitFlag` 243 | * :code:`requiredFlag` 244 | * :code:`superStarSubCcd` 245 | * :code:`ccdGraySubCcd` 246 | * :code:`expGrayPhotometricCut` 247 | * :code:`expGrayHighCut` 248 | * :code:`expVarGrayPhotometricCut` 249 | * :code:`aperCorrInputSlopes` 250 | * :code:`sedFudgeFactors`, use :code:`sedSlopeMap` instead 251 | * :code:`sigFgcmMaxEGray`, use :code:`sigFgcmMaxEGrayDict` instead 252 | * :code:`approxThroughput`, use :code:`approxThroughputDict` instead 253 | * :code:`colorSplitIndices`, use :code:`colorSplitBands` instead 254 | * :code:`useRepeatabilityForExpGrayCuts`, use :code:`useRepeatabilityForExpGrayCutsDict` instead 255 | 256 | :jirab:`DM-23699` 257 | 258 | .. 
_release-v22-0-0-deprecate-configurations-in-psfexStarSelector.py: 259 | 260 | Deprecated configurations in psfexStarSelector.py 261 | ------------------------------------------------- 262 | 263 | The following configurations are no longer used and will be removed before the 23.0.0 release: 264 | 265 | * :code:`maxbad` 266 | * :code:`maxbadflag` 267 | -------------------------------------------------------------------------------- /releases/v24_1_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v24-1-0: 2 | 3 | ########################### 4 | Release 24.1.6 (2024-12-02) 5 | ########################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v24_1_0 11 | 12 | +-------------------------------------------+------------+ 13 | | Source | Identifier | 14 | +===========================================+============+ 15 | | Git tag | 24.1.6 | 16 | +-------------------------------------------+------------+ 17 | | :doc:`EUPS distrib ` | v24\_1\_6 | 18 | +-------------------------------------------+------------+ 19 | | rubin-env version | v24\_1\_6 | 20 | +-------------------------------------------+------------+ 21 | 22 | This is a minor feature release, based on the :ref:`release-v24-0-0`, 23 | with 38 additional tickets backported. 24 | 25 | For a complete list of changes made, see :doc:`tickets/v24_1_0`. 26 | 27 | The `Characterization Metric Report (DMTR-391) `_ describes the scientific performance of this release in terms of scientific performance metrics. No characterization report was provided for :ref:`release-v24-0-0`. 28 | 29 | There are no changes to interfaces or deprecations in this release. 30 | 31 | If you have questions or comments about the LSST Science Pipelines, visit our community forum for advice: https://community.lsst.org/. 
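The release tables above pair a Git tag (``24.1.6``) with an EUPS distrib tag (``v24_1_6``); the distrib tag is simply the version with dots replaced by underscores and a leading ``v``. A minimal sketch of that mapping, with the standard install commands from the installation instructions shown as comments (the environment-setup step is assumed and not run here):

```shell
# Derive the EUPS distrib tag from a release version:
# dots become underscores and a "v" is prefixed (24.1.6 -> v24_1_6).
version="24.1.6"
tag="v$(printf '%s' "$version" | tr . _)"
echo "$tag"   # prints v24_1_6

# With a Stack environment already sourced (see the installation
# instructions), the tagged release would typically be installed with:
#   eups distrib install -t "$tag" lsst_distrib
#   setup lsst_distrib -t "$tag"
```

In the table source the underscores are escaped for reStructuredText, which is why the identifier appears as ``v24\_1\_6``.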
32 | -------------------------------------------------------------------------------- /releases/v27_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-v27-0-0: 2 | 3 | ########################### 4 | Release 27.0.0 (2024-06-25) 5 | ########################### 6 | 7 | .. toctree:: 8 | :hidden: 9 | 10 | tickets/v27_0_0 11 | 12 | +-------------------------------------------+------------+ 13 | | Source | Identifier | 14 | +===========================================+============+ 15 | | Git tag | 27.0.0 | 16 | +-------------------------------------------+------------+ 17 | | :doc:`EUPS distrib ` | v27\_0\_0 | 18 | +-------------------------------------------+------------+ 19 | | rubin-env version | 8.0.0 | 20 | +-------------------------------------------+------------+ 21 | 22 | This release is based on the ``w_2024_16`` weekly build with 3 tickets backported. 23 | 24 | The notes below highlight significant technical changes to the Science Pipelines codebase in this release. 25 | For a complete list of changes made, see :doc:`tickets/v27_0_0`. 26 | 27 | The `Characterization Metric Report (DMTR-431) `_ describes the scientific performance of this release in terms of scientific performance metrics. 28 | 29 | If you have questions or comments about this release, visit our `Community forum `_ for advice. 30 | 31 | - :ref:`release-v27-0-0-functionality` 32 | - :ref:`release-v27-0-0-interface` 33 | - :ref:`release-v27-0-0-deprecations` 34 | 35 | *See also:* 36 | 37 | - :doc:`Installation instructions ` 38 | - :doc:`Known issues ` 39 | - `Doxygen Documentation`__ 40 | 41 | __ http://doxygen.lsst.codes/stack/doxygen/xlink_v27.0.0_2024_06_26_00.56.30 42 | 43 | .. 
_release-v27-0-0-functionality: 44 | 45 | Major new features 46 | ================== 47 | 48 | - :ref:`release-v27-0-0-cell-coadd` 49 | - :ref:`release-v27-0-0-source-injection` 50 | - :ref:`release-v27-0-0-obs-fiber-spectrograph` 51 | - :ref:`release-v27-0-0-obs-generic-camera` 52 | 53 | .. _release-v27-0-0-cell-coadd: 54 | 55 | Cell Coadds Package 56 | ------------------- 57 | 58 | The `cell_coadds `_ package includes data structures for defining coadds of astronomical images in small (few arcsecond) cells, in which only input images that fully contain a cell are included. 59 | This helps mitigate problems with PSF discontinuities that are present in traditional coadds. 60 | 61 | Pipeline tasks that use these data structures to actually build cell-based coadds are in development as of this release and may be usable, but they are not stable and are not included in the main DRP pipelines. 62 | 63 | .. _release-v27-0-0-obs-fiber-spectrograph: 64 | 65 | Ingestion of Fiber Spectrograph Data 66 | ------------------------------------ 67 | 68 | The `obs_fiberspectrograph` package adds support for data from fiber spectrographs that Rubin Observatory uses to monitor calibration light sources. 69 | 70 | .. _release-v27-0-0-obs-generic-camera: 71 | 72 | Support for Generic Cameras 73 | ---------------------------- 74 | 75 | The `obs_rubinGenericCamera` package adds support for data from the generic cameras used at Rubin Observatory, including the star trackers, the AuxTel pointing camera, and the all-sky camera. 76 | 77 | .. _release-v27-0-0-source-injection: 78 | 79 | Source Injection 80 | ---------------- 81 | 82 | The `~lsst.source.injection` package (`GitHub `_) contains tools designed to assist in the injection of synthetic sources into scientific imaging at various points in the Rubin pipelines. 83 | Source generation and subsequent source injection are powered by the GalSim software package. 84 | 85 | .. 
_release-v27-0-0-interface: 86 | 87 | Significant interface changes 88 | ============================= 89 | 90 | .. _release-v27-0-0-new-packages: 91 | 92 | New packages added 93 | ------------------ 94 | 95 | * `cell_coadds `_ (see :ref:`above `) 96 | * `obs_fiberspectrograph `_ (see :ref:`above `) 97 | * `obs_rubinGenericCamera `_ (see :ref:`above `) 98 | * `source_injection `_ (see :ref:`above `) 99 | * `scarlet_lite `_, an optimized subset of the original Scarlet package used and maintained by Rubin DM. 100 | 101 | .. _release-v27-0-0-packages-removed: 102 | 103 | Packages removed 104 | ---------------- 105 | 106 | - `scarlet_extensions `_, obsolete scarlet utilities. 107 | 108 | Task and Data Product changes 109 | ----------------------------- 110 | 111 | * The Alert Production pipelines now use the new `~lsst.pipe.tasks.calibrateImage.CalibrateImageTask` for PSF modeling, background subtraction, and initial calibration immediately after ISR, instead of the two-task sequence of `~lsst.pipe.tasks.characterizeImage.CharacterizeImageTask` and `~lsst.pipe.tasks.calibrate.CalibrateTask`. 112 | The new task is leaner and simpler, and it can handle pairs of "snaps", but its output data products are slightly different. 113 | It will be included in the Data Release Production pipelines in a future release. 114 | 115 | .. _release-v27-0-0-deprecations: 116 | 117 | Deprecations 118 | ============ 119 | 120 | These tasks or methods are now deprecated and will be removed in the next major release. 121 | See :jira:`RFC-945` for a clarification on the accelerated deprecation policy. 122 | 123 | - :ref:`release-v27-0-0-deprecated-pipe-tasks` 124 | - :ref:`release-v27-0-0-deprecated-meas-base` 125 | - :ref:`release-v27-0-0-deprecated-configurations` 126 | 127 | .. 
_release-v27-0-0-deprecated-pipe-tasks: 128 | 129 | Deprecations in lsst.pipe.tasks 130 | ------------------------------- 131 | 132 | The Python classes `~lsst.pipe.tasks.setPrimaryFlags.SetPrimaryFlagsTask` and `~lsst.pipe.tasks.setPrimaryFlags.SetPrimaryFlagsConfig` have been moved to `lsst.meas.algorithms`. 133 | 134 | The coaddition tasks `~lsst.pipe.tasks.assembleCoadd.AssembleCoaddTask` and `~lsst.pipe.tasks.assembleCoadd.CompareWarpAssembleCoaddTask` have moved to `lsst.drp.tasks`. 135 | 136 | .. _release-v27-0-0-deprecated-meas-base: 137 | 138 | Deprecations in lsst.meas.base 139 | ------------------------------ 140 | 141 | The plugin `NaiveCentroid` is deprecated. 142 | 143 | .. _release-v27-0-0-deprecated-configurations: 144 | 145 | Deprecated task configurations 146 | ------------------------------ 147 | 148 | In `~lsst.fgcmcal.FgcmFitCycleConfig`, the configuration field ``nCore`` will be removed. Please specify the number of cores with ``pipetask run --cores-per-quantum`` instead. 149 | 150 | In `~lsst.analysis.tools.tasks.CatalogMatchConfig`, ``raColumn`` has been replaced by ``targetRaColumn`` and ``decColumn`` has been replaced by ``targetDecColumn``. 151 | -------------------------------------------------------------------------------- /releases/v28_0_0.rst: -------------------------------------------------------------------------------- 1 | .. _release-latest: 2 | .. _release-v28-0-0: 3 | 4 | ########################### 5 | Release 28.0.2 (2025-03-25) 6 | ########################### 7 | 8 | .. 
toctree:: 9 | :hidden: 10 | 11 | tickets/v28_0_0 12 | 13 | +--------------------------------------------+------------+ 14 | | Source | Identifier | 15 | +============================================+============+ 16 | | Git tag | 28.0.2 | 17 | +--------------------------------------------+------------+ 18 | | :doc:`EUPS distrib ` | v28\_0\_2 | 19 | +--------------------------------------------+------------+ 20 | | rubin-env version | 9.0.0 | 21 | +--------------------------------------------+------------+ 22 | 23 | This release is based on the ``w_2024_42`` weekly build with 5 tickets backported. 24 | 25 | The notes below highlight significant technical changes to the Science Pipelines codebase in this release. 26 | For a complete list of changes made, see :doc:`tickets/v28_0_0`. 27 | 28 | The `Characterization Metric Report (DMTR-451) `_ describes the scientific performance of this release in terms of scientific performance metrics. 29 | 30 | If you have questions or comments about this release, visit our `Community forum `_ for advice. 31 | 32 | - :ref:`release-v28-0-0-functionality` 33 | - :ref:`release-v28-0-0-interface` 34 | - :ref:`release-v28-0-0-deprecations` 35 | 36 | *See also:* 37 | 38 | - :doc:`Installation instructions ` 39 | - :doc:`Known issues ` 40 | - `Doxygen Documentation`__ 41 | 42 | __ http://doxygen.lsst.codes/stack/doxygen/xlink_v28.0.0_2025_01_24_05.17.48 43 | 44 | .. _release-v28-0-0-functionality: 45 | 46 | Major new features 47 | ================== 48 | 49 | - :ref:`release-v28-0-0-lsstinstall` 50 | - :ref:`release-v28-0-0-platform` 51 | - :ref:`release-v28-0-0-middleware` 52 | - :ref:`release-v28-0-0-multiprofit` 53 | - :ref:`release-v28-0-0-analysis-ap` 54 | 55 | .. _release-v28-0-0-lsstinstall: 56 | 57 | Supported installation method 58 | ----------------------------- 59 | 60 | Installation with ``newinstall.sh`` is no longer supported. Please use :doc:`lsstinstall `. 61 | 62 | .. 
_release-v28-0-0-platform: 63 | 64 | Platform compatibility 65 | ---------------------- 66 | 67 | AlmaLinux 9 replaces CentOS 7 as the official development, test, and operations platform. 68 | 69 | .. _release-v28-0-0-middleware: 70 | 71 | Notable middleware improvements 72 | ------------------------------- 73 | 74 | * Added new ``Butler`` interfaces to simplify queries and collection management. See ``Butler.collections``, ``Butler.query_datasets``, ``Butler.query_data_ids``, and ``Butler.query_dimension_records``. The new ``query_*`` methods use a new query system that supports region and RA/Dec queries, and they all support sorting and limits. Migrating away from the ``Butler.registry.query*`` interfaces is recommended. 75 | * There have been many improvements to the ``pipetask report`` command. 76 | 77 | .. _release-v28-0-0-multiprofit: 78 | 79 | Multiprofit source modelling 80 | ----------------------------- 81 | 82 | :doc:`lsst.multiprofit ` adds a Python interface to the MultiProFit astronomical source modelling code and can be used through :doc:`lsst.meas.extensions.multiprofit `, which implements separate tasks for PSF and source model fitting. 83 | 84 | .. _release-v28-0-0-analysis-ap: 85 | 86 | New package analysis_ap 87 | ----------------------- 88 | 89 | Adds scripts and pipelines to compute quality assurance metrics and make plots for studying the alert pipeline output products, with a focus on human-driven analysis of DiaSources and DiaObjects written to the PPDB/APDB. For automatic metric and plot generation on butler outputs during processing runs, see `~lsst.analysis.tools.tasks.AssocDiaSrcDetectorVisitAnalysisTask` and related tasks and pipelines in :doc:`lsst.analysis.tools `. 90 | 91 | .. _release-v28-0-0-interface: 92 | 93 | Significant interface changes 94 | ============================= 95 | 96 | .. 
_release-v28-0-0-new-packages: 97 | 98 | New packages added 99 | ------------------ 100 | 101 | * `analysis_ap `_ (see :ref:`above `). 102 | * `modelfit_parameters `_: A library for defining parameters used in fitting parametric models. 103 | * `gauss2d `_ and `gauss2d_fit `_: Packages for defining and evaluating 2D Gaussian mixtures and images thereof. 104 | * `multiprofit `_ and `meas_extensions_multiprofit `_ (see :ref:`above `). 105 | * `rucio_register `_: Command and API to add Butler specific information to Rucio metadata. 106 | 107 | .. _release-v28-0-0-packages-removed: 108 | 109 | Packages removed 110 | ---------------- 111 | 112 | * `proxmin `_ and `scarlet `_ are obsolete with the adoption of scarlet\_lite. 113 | 114 | .. _release-v28-0-0-deprecations: 115 | 116 | Deprecations 117 | ============ 118 | 119 | - :ref:`release-v28-0-0-deprecated-verify` 120 | - :ref:`release-v28-0-0-deprecated-ip-isr` 121 | - :ref:`release-v28-0-0-deprecated-obs-chft` 122 | - :ref:`release-v28-0-0-deprecated-meas-extensions-shapeHSM` 123 | - :ref:`release-v28-0-0-deprecated-ep-pipe` 124 | - :ref:`release-v28-0-0-deprecated-pipe-tasks` 125 | - :ref:`release-v28-0-0-deprecated-ctrl-mpexec` 126 | - :ref:`release-v28-0-0-deprecated-analysis-tools` 127 | - :ref:`release-v28-0-0-deprecated-cp-pipe` 128 | - :ref:`release-v28-0-0-deprecated-lsst-skymap` 129 | - :ref:`release-v28-0-0-deprecated-ip-diffim` 130 | - :ref:`release-v28-0-0-deprecated-analysis-ap` 131 | - :ref:`release-v28-0-0-deprecated-daf-butler` 132 | - :ref:`release-v28-0-0-deprecated-configurations` 133 | 134 | These tasks or methods are now deprecated and will be removed in the next major release. 135 | See :jira:`RFC-945` for a clarification on the accelerated deprecation policy. 136 | 137 | .. _release-v28-0-0-deprecated-verify: 138 | 139 | Deprecations in lsst.verify 140 | --------------------------- 141 | 142 | `~lsst.verify.tasks.ConfigApdbLoader` is deprecated. 
Please use `~lsst.verify.tasks.ApdbMetricConfig.apdb_config_url`. 143 | 144 | .. _release-v28-0-0-deprecated-ip-isr: 145 | 146 | Deprecations in lsst.ip.isr 147 | --------------------------- 148 | 149 | ``isrTask*.makeBinnedImages`` are no longer used. Please use the ``BinExposureTask`` subtask instead. 150 | 151 | .. _release-v28-0-0-deprecated-obs-chft: 152 | 153 | Deprecations in lsst.obs.cfht 154 | ----------------------------- 155 | 156 | ``CfhtIsrTask.makeBinnedImages`` is no longer used. Please use the ``BinExposureTask`` subtask instead. 157 | 158 | .. _release-v28-0-0-deprecated-meas-extensions-shapeHSM: 159 | 160 | Deprecations in lsst.meas.extensions.shapeHSM 161 | --------------------------------------------- 162 | 163 | The ``shearType`` setters are deprecated. 164 | 165 | .. _release-v28-0-0-deprecated-ep-pipe: 166 | 167 | Deprecations in lsst.ap.pipe 168 | ---------------------------- 169 | 170 | ``createApFakes.CreateRandomApFakesTask`` is replaced by :doc:`source_injection ` tasks. 171 | 172 | .. 
_release-v28-0-0-deprecated-pipe-tasks: 173 | 174 | Deprecations in lsst.pipe.tasks 175 | ------------------------------- 176 | 177 | The following tasks are replaced by :doc:`source_injection ` tasks: 178 | 179 | * `~lsst.pipe.tasks.matchFakes.MatchFakesTask` 180 | * `~lsst.pipe.tasks.insertFakes.InsertFakesTask` 181 | * `~lsst.pipe.tasks.processCcdWithFakes.ProcessCcdWithFakesTask` 182 | * `~lsst.pipe.tasks.processCcdWithFakes.ProcessCcdWithVariableFakesTask` 183 | 184 | The following unused classes in ``diff_matched_tract_catalog`` will be removed: 185 | 186 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.MeasurementType` 187 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.Statistic` 188 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.Median` 189 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.SigmaIQR` 190 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.SigmaMAD` 191 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.Percentile` 192 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.SourceType` 193 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.MatchType` 194 | 195 | In `~lsst.pipe.tasks.diff_matched_tract_catalog.DiffMatchedTractCatalogConfig` the attribute ``compute_stats`` will be removed. 196 | 197 | .. _release-v28-0-0-deprecated-ctrl-mpexec: 198 | 199 | Deprecations in lsst.ctrl.mpexec 200 | -------------------------------- 201 | 202 | `~lsst.ctrl.mpexec.pipeline2dot` and `~lsst.ctrl.mpexec.graph2dot` should now be imported from ``lsst.pipe.base.dot_tools`` and will be removed. 203 | 204 | .. _release-v28-0-0-deprecated-analysis-tools: 205 | 206 | Deprecations in lsst.analysis.tools 207 | ----------------------------------- 208 | 209 | `~lsst.analysis.tools.actions.vector.TreecorrConfig` is no longer a part of :doc:`analysis_tools ` (see :jira:`DM-45899`). 210 | 211 | .. _release-v28-0-0-deprecated-cp-pipe: 212 | 213 | Deprecations in lsst.cp.pipe 214 | ---------------------------- 215 | 216 | `~lsst.cp.pipe.OverscanModel`. 
`~lsst.cp.pipe.SimpleModel`, `~lsst.cp.pipe.SimulatedModel`, and `~lsst.cp.pipe.SegmentSimulator` moved to ``lsst.ip.isr.deferredCharge``, and will be removed. 217 | 218 | .. _release-v28-0-0-deprecated-lsst-skymap: 219 | 220 | Deprecations in lsst.skymap 221 | --------------------------- 222 | 223 | The functions ``angToCoord`` and ``coordToAng``, and the classes ``HealpixTractInfo``, ``HealpixSkyMapConfig``, and ``HealpixSkyMap``, will be removed. 224 | ``getInnerSkyPolygon/inner_sky_polygon`` are deprecated in favor of `~lsst.skymap.TractInfo.inner_sky_region`. 225 | 226 | .. _release-v28-0-0-deprecated-ip-diffim: 227 | 228 | Deprecations in lsst.ip.diffim 229 | ------------------------------ 230 | 231 | The plugin ``NativeDipoleCentroid`` is deprecated. 232 | 233 | .. _release-v28-0-0-deprecated-daf-butler: 234 | 235 | Deprecations in daf.butler 236 | -------------------------- 237 | 238 | ``Butler.collections`` should no longer be used to get the list of default collections. 239 | Use ``Butler.collections.default`` instead. 240 | 241 | .. _release-v28-0-0-deprecated-analysis-ap: 242 | 243 | Deprecations in lsst.analysis.ap 244 | -------------------------------- 245 | 246 | ``legacyPlotUtils`` will be removed. 247 | 248 | .. _release-v28-0-0-deprecated-configurations: 249 | 250 | Deprecated task configurations 251 | ------------------------------ 252 | 253 | These configurations are deprecated: 254 | 255 | * `~lsst.ip.diffim.AlardLuptonPreconvolveSubtractConfig`: ``badSourceFlags``, in favor of setting the equivalent field on the ``sourceSelector`` subtask instead. 256 | * `~lsst.ip.isr.IsrStatisticsTaskConfig`: ``doApplyGainsForCtiStatistics``. 257 | * `~lsst.ip.isr.DeferredChargeConfig`: ``useGains``. 258 | * `~lsst.pipe.tasks.characterizeImage.CharacterizeImageConfig`: ``doComputeSummaryStats`` (moved to `~lsst.pipe.tasks.calibrate.CalibrateTask`) and ``doMaskStreaks`` (moved to `~lsst.ip.diffim.DetectAndMeasureTask`). 
259 | * `~lsst.pipe.tasks.diff_matched_tract_catalog.DiffMatchedTractCatalogConfig`: ``column_ref_extended``, ``column_ref_extended_inverted``, ``column_target_extended``, and ``compute_stats``. 260 | * `~lsst.cp.pipe.CpCtiSolveConfig`: ``useGains``. 261 | * `~lsst.cp.pipe.LinearitySolveConfig`: ``ignorePtcMask``. 262 | * `~lsst.cp.pipe.PhotonTransferCurveExtractConfig`: ``minMeanSignal`` and ``maxMeanSignal``. 263 | * `~lsst.ap.association.TransformDiaSourceCatalogConfig`: ``doPackFlags``. 264 | * `~lsst.ap.association.DiaPipelineConfig`: ``apdb`` has been replaced by ``apdb_config_url``. 265 | * `~lsst.ap.association.LoadDiaCatalogsConfig`: ``pixelMargin`` has been replaced by ``angleMargin``; ``doLoadForcedSources`` is also deprecated. 266 | * `~lsst.ap.association.FilterDiaSourceCatalogConfig`: ``doWriteTrailedSources``. Trailed sources will not be written out during production. 267 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | documenteer[pipelines]==0.8.2 2 | -------------------------------------------------------------------------------- /tasks.rst: -------------------------------------------------------------------------------- 1 | .. _task-index: 2 | 3 | ########## 4 | Task index 5 | ########## 6 | 7 | .. _pipelinetask-index: 8 | 9 | Pipeline tasks 10 | ============== 11 | 12 | .. lsst-pipelinetasks:: 13 | :root: lsst 14 | 15 | .. _commandlinetask-index: 16 | 17 | Command-line tasks 18 | ================== 19 | 20 | .. lsst-cmdlinetasks:: 21 | :root: lsst 22 | 23 | .. _regular-task-index: 24 | 25 | Tasks 26 | ===== 27 | 28 | .. lsst-tasks:: 29 | :root: lsst 30 | 31 | .. _configurables-index: 32 | 33 | Configurables 34 | ============= 35 | 36 | .. lsst-configurables:: 37 | :root: lsst 38 | 39 | .. _configs-index: 40 | 41 | Configs 42 | ======= 43 | 44 | .. 
lsst-configs:: 45 | :root: lsst 46 | -------------------------------------------------------------------------------- /ups/pipelines_lsst_io.table: -------------------------------------------------------------------------------- 1 | # Specify direct dependencies to all packages included in the documentation. 2 | # Sort alphabetically for maintainability. 3 | setupRequired(afw) 4 | setupRequired(alert_packet) 5 | setupRequired(analysis_ap) 6 | setupRequired(analysis_tools) 7 | setupRequired(ap_association) 8 | setupRequired(ap_pipe) 9 | setupRequired(ap_verify) 10 | setupRequired(base) 11 | setupRequired(cbp) 12 | setupRequired(cell_coadds) 13 | setupRequired(coadd_utils) 14 | setupRequired(cp_pipe) 15 | setupRequired(ctrl_bps) 16 | setupRequired(ctrl_bps_htcondor) 17 | setupRequired(ctrl_bps_panda) 18 | setupRequired(ctrl_bps_parsl) 19 | setupRequired(ctrl_mpexec) 20 | setupRequired(daf_butler) 21 | setupRequired(daf_relation) 22 | setupRequired(dax_apdb) 23 | setupRequired(display_ds9) 24 | setupRequired(display_firefly) 25 | setupRequired(drp_pipe) 26 | setupRequired(drp_tasks) 27 | setupRequired(faro) 28 | setupRequired(fgcmcal) 29 | setupRequired(gauss2d) 30 | setupRequired(gauss2d_fit) 31 | setupRequired(geom) 32 | setupRequired(ip_diffim) 33 | setupRequired(ip_isr) 34 | setupRequired(jointcal) 35 | setupRequired(log) 36 | setupRequired(meas_algorithms) 37 | setupRequired(meas_astrom) 38 | setupRequired(meas_base) 39 | setupRequired(meas_deblender) 40 | setupRequired(meas_extensions_multiprofit) 41 | setupRequired(meas_extensions_gaap) 42 | setupRequired(meas_extensions_photometryKron) 43 | setupRequired(meas_extensions_piff) 44 | setupRequired(meas_extensions_psfex) 45 | setupRequired(meas_extensions_scarlet) 46 | setupRequired(meas_extensions_shapeHSM) 47 | setupRequired(meas_extensions_simpleShape) 48 | setupRequired(meas_extensions_trailedSources) 49 | setupRequired(meas_modelfit) 50 | setupRequired(meas_transiNet) 51 | setupRequired(modelfit_parameters) 
52 | setupRequired(multiprofit) 53 | setupRequired(obs_base) 54 | setupRequired(obs_cfht) 55 | setupRequired(obs_decam) 56 | setupRequired(obs_lsst) 57 | setupRequired(pex_config) 58 | setupRequired(pex_exceptions) 59 | setupRequired(pipe_base) 60 | setupRequired(pipe_tasks) 61 | setupRequired(resources) 62 | setupRequired(sconsUtils) 63 | setupRequired(shapelet) 64 | setupRequired(skymap) 65 | setupRequired(source_injection) 66 | setupRequired(utils) 67 | setupRequired(verify) 68 | setupRequired(verify_metrics) 69 | --------------------------------------------------------------------------------