├── .gitignore ├── README.md ├── acs_cte_forward_model ├── acs_cte_forward_model_example.ipynb └── requirements.txt ├── acs_findsat_mrt └── acs_findsat_mrt_example.ipynb ├── acs_logo.png ├── acs_pixel_area_maps ├── acs_pixel_area_maps.ipynb ├── p_module │ ├── __init__.py │ ├── notebook_tools.py │ ├── plot.py │ └── plots.py └── requirements.txt ├── acs_polarization_tools └── acs_polarization_tools.ipynb ├── acs_reduction ├── acs_reduction.ipynb ├── p_module │ ├── __init__.py │ └── plot.py └── requirements.txt ├── acs_saturation_trails ├── acs_saturation_trails.ipynb ├── p_module │ ├── __init__.py │ └── plot.py └── requirements.txt ├── acs_sbc_dark_analysis ├── acs_sbc_dark_analysis.ipynb └── requirements.txt └── acs_subarrays ├── acs_subarrays.ipynb └── requirements.txt /.gitignore: -------------------------------------------------------------------------------- 1 | .*~ 2 | *~ 3 | *.pyc 4 | __pycache__/ 5 | .ipynb_checkpoints/ 6 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [![Developed For: ACS](https://img.shields.io/badge/developed%20for-ACS-orange.svg?style=flat)](http://www.stsci.edu/hst/acs) [![Powered By: STScI](https://img.shields.io/badge/powered%20by-STScI-blue.svg?colorA=707170&colorB=3e8ddd&style=flat)](http://www.stsci.edu/) 2 | 3 | ![ACS logo](acs_logo.png) 4 | 5 | # acs-notebook 6 | 7 | This repository contains Jupyter notebooks to demonstrate how to calibrate and analyze data from the *Hubble Space Telescope* (*HST*) Advanced Camera for Surveys (ACS). Users are advised to visit the [ACS website](http://www.stsci.edu/hst/acs), [Instrument Handbook](http://www.stsci.edu/hst/acs/documents/handbooks/current/cover.html), and [Data Handbook](http://www.stsci.edu/hst/acs/documents/handbooks/currentDHB/acs_cover.html) for more information about the current status of ACS, instrument specifications, and data analysis. 
8 | 9 | Users who need help transitioning from IRAF/PyRAF to Python should see the [stak-notebooks](https://github.com/spacetelescope/stak-notebooks) repository. 10 | 11 | If you have questions about HST data analysis, calibration software, instrument capabilities, and/or the methods discussed in this repository, please visit the [HST Help Desk](http://hsthelp.stsci.edu). Through the help desk portal, you can explore the HST Knowledge Base and request additional help from experts. 12 | 13 | # ALL NOTEBOOKS MIGRATED 14 | All notebooks in this repository have been migrated to 15 | 16 | [The ACS directory in hst_notebooks](https://github.com/spacetelescope/hst_notebooks/tree/main/notebooks/ACS) 17 | 18 | Please see that repository for the most up-to-date information. 19 | -------------------------------------------------------------------------------- /acs_cte_forward_model/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS CTE Forward Model notebook 2 | 3 | numpy 4 | matplotlib 5 | astropy 6 | photutils>=1.1.0 7 | astroquery 8 | acstools 9 | -------------------------------------------------------------------------------- /acs_findsat_mrt/acs_findsat_mrt_example.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "b232525b", 6 | "metadata": {}, 7 | "source": [ 8 | "<div class=\"alert alert-block alert-danger\">
\n", 9 | "REQUIREMENT: Before proceeding, install or update your\n", 10 | "stenv distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 11 | "
" 12 | ] 13 | }, 14 | { 15 | "cell_type": "markdown", 16 | "id": "a3e6a89b", 17 | "metadata": {}, 18 | "source": [ 19 | "\n", 20 | "# Satellite trail detection in ACS/WFC data using acstools.findsat_mrt\n", 21 | "\n", 22 | "This notebook provides examples of how to find and create masks for satellite trails in ACS/WFC imaging data using acstools.findsat_mrt, which is based on the method described in ACS ISR 2022-08. Many of the tools presented here should be applicable to any imaging data.\n", 23 | "\n", 24 | "### Table of Contents:\n", 25 | "\n", 26 | "[Introduction](#intro_ID)
\n", 27 | "[Imports, Setup, and Data](#imports)
\n", 28 | "\n", 29 | "[Example 1: Step-by-step guide to find trails in an FLC image](#example1)
\n", 30 | "[Example 2: Quick run on an FLC image](#example2)
\n", 31 | "[Example 3: Find trails in an FLC image using the WFC wrapper](#example3)
\n", 32 | "[Example 4: Step-by-step guide to find trails in a DRC image](#example4)
\n", 33 | "[Example 5: Find trails in a DRC image using the WFC wrapper](#example5)
\n", 34 | "[Example 6: Create a new kernel for detection](#example6)
\n", 35 | "\n", 36 | "### About this Notebook\n", 37 | "**Author:** David V. Stark, ACS Instrument Team, Space Telescope Science Institute
\n", 38 | "**First Published On:** 5/13/2023
\n", 39 | "**Updated On:** 5/15/2023" 40 | ] 41 | }, 42 | { 43 | "cell_type": "markdown", 44 | "id": "e99e4ec8", 45 | "metadata": {}, 46 | "source": [ 47 | "\n", 48 | "## Introduction\n", 49 | "\n", 50 | "Despite being in orbit, HST imaging data still suffers from contamination by artificial satellites that can compromise science data unless they are identified and masked. This notebook presents examples of how to identify satellite trails in ACS/WFC data. The routine is also effective at identifying other linear features such as diffraction spikes and glint (see Section 4.5 of the ACS DHB for further discussion on these artifacts). \n", 51 | "\n", 52 | "A full description of the algorithm is provided in ACS ISR 2022-08. To briefly summarize, the Median Radon Transform (MRT) is calculated for an input image and used to identify linear signals in the data. The MRT is similar to the standard Radon Transform except that it calculates the median, rather than the sum, of data along all possible paths through an image. This modification makes the algorithm more robust against false signals from localized sources (e.g., stars, galaxies) but still very sensitive to persistent linear features, even well below the background noise level. \n", 53 | "\n", 54 | "Additional post-processing is done to filter out spurious detections, primarily eliminating them based on trail S/N, width, and persistence across the image. These parameters, especially the maximum allowed trail width, are tuned for ACS/WFC data binned 2x2 and may be different for images from other instruments. Once the final set of trails is identified and characterized, a mask can be created. The routine provides numerous ways of visualizing the results, as will be demonstrated below.\n", 55 | "\n", 56 | "The following examples illustrate how to use `acstools.findsat_mrt` to identify satellite trails and then create masks for them. 
Examples 1 and 4 go through the analysis step by step, including how to preprocess data and run individual routines inside `findsat_mrt`. Examples 2, 3, and 5 demonstrate how to automate many of these steps. Our demonstrations stop at the creation of the masks. We leave it to the user to decide the best way to apply the masks to their own analysis." 57 | ] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "id": "47e71a5f", 62 | "metadata": {}, 63 | "source": [ 64 | "\n", 65 | "## Imports, setup, and data\n", 66 | "\n", 67 | "It is recommended that you use the latest stenv Python environment when using this notebook. In particular, you must use acstools v3.6.0 or greater in order to run this notebook. You can check your version with\n", 68 | "\n", 69 | "`conda list acstools`\n", 70 | "\n", 71 | "and update if necessary with\n", 72 | "\n", 73 | "`conda update acstools`\n", 74 | "\n", 75 | "Set your working directory and import the needed packages with the following" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": null, 81 | "id": "2ecb79b9", 82 | "metadata": {}, 83 | "outputs": [], 84 | "source": [ 85 | "# check your own working directory\n", 86 | "import os\n", 87 | "print('Current working directory is {}'.format(os.getcwd()))\n", 88 | "\n", 89 | "# update as needed with\n", 90 | "# os.chdir('/Users/dstark/acs_work/satellite_trails/findsat_mrt/')" 91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "execution_count": null, 96 | "id": "11679561", 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "# import modules and setup\n", 101 | "import matplotlib.pyplot as plt\n", 102 | "import numpy as np\n", 103 | "\n", 104 | "from astropy.io import fits\n", 105 | "from astropy.nddata import bitmask, block_reduce, block_replicate\n", 106 | "from acstools.findsat_mrt import TrailFinder, WfcWrapper\n", 107 | "\n", 108 | "# These are optional configurations\n", 109 | "%matplotlib inline\n", 110 | "plt.rcParams[\"figure.figsize\"] = (8, 6)\n", 111 
| "plt.rcParams['font.serif'] = \"Georgia\"\n", 112 | "plt.rcParams['font.family'] = \"serif\"" 113 | ] 114 | }, 115 | { 116 | "cell_type": "markdown", 117 | "id": "2cf59b69", 118 | "metadata": {}, 119 | "source": [ 120 | "Download the example data needed. Examples 1-3 use jc8m32j5q_flc.fits, while examples 4-5 use hst_13498_32_acs_wfc_f606w_jc8m32j5_drc.fits. " 121 | ] 122 | }, 123 | { 124 | "cell_type": "markdown", 125 | "id": "8f36a30f", 126 | "metadata": {}, 127 | "source": [ 128 | "\n", 129 | "\n", 130 | "## Example 1: Finding trails in an FLC image\n", 131 | "\n", 132 | "FLC images are individual exposures processed by the CALACS pipeline. The data contain two chips, but we only analyze one here. \n", 133 | "\n", 134 | "We start by reading in an image and doing some pre-processing to remove bad pixels, subtract a median background, and make the image a bit smaller (to speed up the calculation of the MRT)." 135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "execution_count": null, 140 | "id": "d67ec590", 141 | "metadata": {}, 142 | "outputs": [], 143 | "source": [ 144 | "# Read in the image files and header information\n", 145 | "image_file = 'jc8m32j5q_flc.fits'\n", 146 | "ext = 4 # ACS image data are in extensions 1 or 4, we'll just use 4 for now (chip 1)\n", 147 | "with fits.open(image_file) as h:\n", 148 | " image = h[ext].data # image data\n", 149 | " dq = h[ext+2].data # data quality bitmasks\n", 150 | "\n", 151 | " header = h[0].header # primary header\n", 152 | " image_header = h[1].header # image header" 153 | ] 154 | }, 155 | { 156 | "cell_type": "markdown", 157 | "id": "bf3e40f2", 158 | "metadata": {}, 159 | "source": [ 160 | "Below, we make a mask for bad pixels. We're ignoring cosmic rays here because the routines that flag them often partially (but not fully) mask satellite trails. By default, any masked pixels are set to `NaN`." 
161 | ] 162 | }, 163 | { 164 | "cell_type": "code", 165 | "execution_count": null, 166 | "id": "93a4ada8", 167 | "metadata": {}, 168 | "outputs": [], 169 | "source": [ 170 | "mask = bitmask.bitfield_to_boolean_mask(dq, ignore_flags=[4096, 8192, 16384])\n", 171 | "image[mask == True] = np.nan" 172 | ] 173 | }, 174 | { 175 | "cell_type": "markdown", 176 | "id": "a5d6905b", 177 | "metadata": {}, 178 | "source": [ 179 | "Below we subtract the background from the image. Here we just do a simple median." 180 | ] 181 | }, 182 | { 183 | "cell_type": "code", 184 | "execution_count": null, 185 | "id": "d90c5e49", 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "image = image - np.nanmedian(image)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "markdown", 194 | "id": "d27b0c53", 195 | "metadata": {}, 196 | "source": [ 197 | "The MRT is computationally demanding and WFC images are big. To help things a bit, let's rebin the images." 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "id": "32e5a69c", 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "binsize = 2 # adjust this as needed\n", 208 | "image_rebin = block_reduce(image, binsize, func=np.nansum)" 209 | ] 210 | }, 211 | { 212 | "cell_type": "markdown", 213 | "id": "e09d8359", 214 | "metadata": {}, 215 | "source": [ 216 | "We now set up `TrailFinder`. Many of the parameters in the call below are optional (and set to their current values by default) but we show them to illustrate the setup. Of note is that I'm explicitly defining the image header keys to save. These can be useful later when analyzing trail population properties. The keywords being saved here were chosen to ensure we know the original exposure ipppssoot and which chip was analyzed. Additional keywords are saved that store information about the orientation of the telescope when the image was taken. In principle, the user can save any header keywords they like. 
We have also set `plot=False` in this example, so we can demonstrate how to manually create plots. Setting `plot=True` will automatically generate plots after specific processes are finished. Be aware that not all possible keyword parameters are defined below. See the documentation for complete information." 217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": null, 222 | "id": "4d4632ab", 223 | "metadata": {}, 224 | "outputs": [], 225 | "source": [ 226 | "# Now we can set up TrailFinder\n", 227 | "s = TrailFinder(image=image_rebin,\n", 228 | " header=header,\n", 229 | " image_header=image_header,\n", 230 | " save_image_header_keys=['ROOTNAME', 'CCDCHIP', 'CRPIX1', 'CRPIX2', 'CRVAL1', 'CRVAL2',\n", 231 | " 'ORIENTAT','RA_APER', 'DEC_APER', 'PA_APER'],\n", 232 | " processes=8,\n", 233 | " plot=False,\n", 234 | " threshold=5,\n", 235 | " max_width=75, \n", 236 | " check_persistence=True,\n", 237 | " min_persistence=0.5,\n", 238 | " output_root='example1')" 239 | ] 240 | }, 241 | { 242 | "cell_type": "markdown", 243 | "id": "e6d6f406", 244 | "metadata": {}, 245 | "source": [ 246 | "Before we actually run anything, let's plot the image we are analyzing. You should see two satellite trails in this example." 
247 | ] 248 | }, 249 | { 250 | "cell_type": "code", 251 | "execution_count": null, 252 | "id": "8397469f", 253 | "metadata": {}, 254 | "outputs": [], 255 | "source": [ 256 | "s.plot_image()" 257 | ] 258 | }, 259 | { 260 | "cell_type": "markdown", 261 | "id": "63291b30", 262 | "metadata": {}, 263 | "source": [ 264 | "If you're having trouble seeing the trails, you can adjust the scale keyword (the min and max values to show given as multiples of the image standard deviation)" 265 | ] 266 | }, 267 | { 268 | "cell_type": "code", 269 | "execution_count": null, 270 | "id": "a9779382", 271 | "metadata": {}, 272 | "outputs": [], 273 | "source": [ 274 | "s.plot_image(scale=[-1, 1])" 275 | ] 276 | }, 277 | { 278 | "cell_type": "markdown", 279 | "id": "2ad4c287", 280 | "metadata": {}, 281 | "source": [ 282 | "Next we run the Median Radon Transform. This step can take some time depending on the image size and number of processes being used. This tutorial assumes you can run 8 processes at the same time, but adjust as needed. If you're not sure how many processes you can run, you can see how many CPU cores are available and adjust based on that." 283 | ] 284 | }, 285 | { 286 | "cell_type": "code", 287 | "execution_count": null, 288 | "id": "a3a00b7a", 289 | "metadata": {}, 290 | "outputs": [], 291 | "source": [ 292 | "import os\n", 293 | "os.cpu_count()" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": null, 299 | "id": "78cbd13a", 300 | "metadata": {}, 301 | "outputs": [], 302 | "source": [ 303 | "s.processes = 8 # adjust this if necessary\n", 304 | "s.run_mrt()" 305 | ] 306 | }, 307 | { 308 | "cell_type": "markdown", 309 | "id": "14db26e7", 310 | "metadata": {}, 311 | "source": [ 312 | "Now we will plot the MRT. You may be able to spot the signals from the satellite trails as two somewhat wide point-like sources." 
313 | ] 314 | }, 315 | { 316 | "cell_type": "code", 317 | "execution_count": null, 318 | "id": "29c74a89", 319 | "metadata": {}, 320 | "outputs": [], 321 | "source": [ 322 | "s.plot_mrt()" 323 | ] 324 | }, 325 | { 326 | "cell_type": "markdown", 327 | "id": "5b6b93fd", 328 | "metadata": {}, 329 | "source": [ 330 | "Note that the x axis is in pixels, not degrees or radians. The `theta` array ranges from 0 to 180 with a spacing of 0.5 degrees, hence 360 pixels.\n", 331 | "\n", 332 | "We next run the source finder on the MRT. You can create your own detection kernels, or use the defaults provided (see [Example 6](#example6) for how to create detection kernels). Depending on the settings, this can pick up a lot more than the actual trails we're interested in. There are additional steps we'll take later to filter these false detections out. The ones we found and their location on the MRT are shown below.\n", 333 | "\n", 334 | "The `threshold` in this case refers to the signal-to-noise ratio of a feature found in the MRT. The default is 5." 335 | ] 336 | }, 337 | { 338 | "cell_type": "code", 339 | "execution_count": null, 340 | "id": "0d5ef5dd", 341 | "metadata": { 342 | "scrolled": true 343 | }, 344 | "outputs": [], 345 | "source": [ 346 | "s.threshold = 5 # detection threshold\n", 347 | "s.find_mrt_sources() # finds the sources\n", 348 | "s.plot_mrt(show_sources=True) # overplots the sources on top of the MRT" 349 | ] 350 | }, 351 | { 352 | "cell_type": "markdown", 353 | "id": "2530aa32", 354 | "metadata": {}, 355 | "source": [ 356 | "We filter the sources further based on a reassessment of their S/N, width, and persistence. The default parameters (namely width) have been chosen for ACS data binned by 2 pixels in each direction. It's possible different defaults will be better for different imaging data." 
357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": null, 362 | "id": "82692609", 363 | "metadata": { 364 | "scrolled": true 365 | }, 366 | "outputs": [], 367 | "source": [ 368 | "# Parameters that affect how the filtering works\n", 369 | "s.threshold = 5\n", 370 | "s.max_width = 75\n", 371 | "s.check_persistence = True\n", 372 | "s.min_persistence = 0.5\n", 373 | "\n", 374 | "# now filter\n", 375 | "s.filter_sources()\n", 376 | "\n", 377 | "# note: some extra columns have been added to the source list\n", 378 | "s.source_list" 379 | ] 380 | }, 381 | { 382 | "cell_type": "markdown", 383 | "id": "bdcc282a", 384 | "metadata": {}, 385 | "source": [ 386 | "Several columns have been added to the source list that characterize the observed streak. Also, the `status` array has values of 0, 1, and 2 now (it just had 0 before). Those with `status=2` are sources that passed all filtering stages (checks for SNR and width, then persistence). Those with `status=1` are sources that passed the first filtering stage (checks for SNR and width), but not the second (persistence check). And `status=0` are sources that did not pass the filtering steps.\n", 387 | "\n", 388 | "The `plot_mrt` command will overplot the different statuses." 389 | ] 390 | }, 391 | { 392 | "cell_type": "code", 393 | "execution_count": null, 394 | "id": "abb07c2c", 395 | "metadata": {}, 396 | "outputs": [], 397 | "source": [ 398 | "s.plot_mrt(show_sources=True)" 399 | ] 400 | }, 401 | { 402 | "cell_type": "markdown", 403 | "id": "01ddbb1b", 404 | "metadata": {}, 405 | "source": [ 406 | "Now we can make the mask itself. By default it only uses sources in the MRT with `status=2`. We make two types of masks, one a simple boolean mask, and one a segmentation mask where pixels corresponding to each streak are assigned the trail's ID number. We create these below." 
407 | ] 408 | }, 409 | { 410 | "cell_type": "code", 411 | "execution_count": null, 412 | "id": "35d1c54b", 413 | "metadata": { 414 | "scrolled": false 415 | }, 416 | "outputs": [], 417 | "source": [ 418 | "# make the mask\n", 419 | "s.mask_include_status = [2]\n", 420 | "s.make_mask()" 421 | ] 422 | }, 423 | { 424 | "cell_type": "code", 425 | "execution_count": null, 426 | "id": "852171d1", 427 | "metadata": {}, 428 | "outputs": [], 429 | "source": [ 430 | "# plot the mask and segmentation map\n", 431 | "s.plot_mask()\n", 432 | "s.plot_segment()" 433 | ] 434 | }, 435 | { 436 | "cell_type": "markdown", 437 | "id": "4d9739d3", 438 | "metadata": {}, 439 | "source": [ 440 | "We can also overlay the mask on top of the image to make sure it makes sense." 441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": null, 446 | "id": "db757931", 447 | "metadata": {}, 448 | "outputs": [], 449 | "source": [ 450 | "s.plot_image(overlay_mask=True)" 451 | ] 452 | }, 453 | { 454 | "cell_type": "markdown", 455 | "id": "03e97d6e", 456 | "metadata": {}, 457 | "source": [ 458 | "We can save the results now. You have the option of saving the catalog, mask, MRT, and a diagnostic image that shows the results. In this example we'll just save everything." 459 | ] 460 | }, 461 | { 462 | "cell_type": "code", 463 | "execution_count": null, 464 | "id": "28e56996", 465 | "metadata": {}, 466 | "outputs": [], 467 | "source": [ 468 | "# define what to save\n", 469 | "s.save_mask = True\n", 470 | "s.save_mrt = True\n", 471 | "s.save_catalog = True\n", 472 | "s.save_diagnostic = True\n", 473 | "\n", 474 | "s.save_output()" 475 | ] 476 | }, 477 | { 478 | "cell_type": "markdown", 479 | "id": "142718eb", 480 | "metadata": {}, 481 | "source": [ 482 | "Keep in mind that the mask we have created is applicable to the rebinned image. To convert it into a mask that can be applied to the original unbinned image, we need to resample it using the `block_replicate` function. 
The rescaled mask is plotted below. Note the difference in image size, but the mask pattern remains the same." 483 | ] 484 | }, 485 | { 486 | "cell_type": "code", 487 | "execution_count": null, 488 | "id": "d601e25f", 489 | "metadata": { 490 | "scrolled": true 491 | }, 492 | "outputs": [], 493 | "source": [ 494 | "full_mask = block_replicate(s.mask, binsize, conserve_sum=False)\n", 495 | "fig, ax = plt.subplots()\n", 496 | "ax.imshow(full_mask, origin='lower')" 497 | ] 498 | }, 499 | { 500 | "cell_type": "markdown", 501 | "id": "94bd6fb7", 502 | "metadata": {}, 503 | "source": [ 504 | "## \n", 505 | "## Example 2: Quick run of TrailFinder on an FLC image\n", 506 | "\n", 507 | "Example 1 thoroughly demonstrated the steps to read in an FLC file, pre-process it, and identify trails. This example demonstrates how one can run many of the steps simultaneously once a file is read in and all parameters are set. \n", 508 | "\n", 509 | "First, we read in and preprocess the data file exactly as before." 510 | ] 511 | }, 512 | { 513 | "cell_type": "code", 514 | "execution_count": null, 515 | "id": "f904d971", 516 | "metadata": {}, 517 | "outputs": [], 518 | "source": [ 519 | "# Read in the image files and header information\n", 520 | "image_file = 'jc8m32j5q_flc.fits'\n", 521 | "ext = 4 # ACS image data are in extensions 1 or 4, we'll just use 4 for now (chip 1)\n", 522 | "with fits.open(image_file) as h:\n", 523 | " image = h[ext].data # image data\n", 524 | " dq = h[ext+2].data # data quality bitmasks\n", 525 | " \n", 526 | " header = h[0].header # primary header\n", 527 | " image_header = h[1].header # image header\n", 528 | "\n", 529 | "# make a mask for bad pixels.\n", 530 | "mask = bitmask.bitfield_to_boolean_mask(dq, ignore_flags=[4096, 8192, 16384])\n", 531 | "image[mask == True] = np.nan\n", 532 | "\n", 533 | "# Subtract the background from the image.\n", 534 | "image = image - np.nanmedian(image)\n", 535 | "\n", 536 | "# Rebin the image to speed up calculation\n", 537 | "image_rebin = 
block_reduce(image, 2, func=np.nansum)" 538 | ] 539 | }, 540 | { 541 | "cell_type": "markdown", 542 | "id": "bd74bebf", 543 | "metadata": {}, 544 | "source": [ 545 | "And initialize trail finder as before" 546 | ] 547 | }, 548 | { 549 | "cell_type": "code", 550 | "execution_count": null, 551 | "id": "64ac5854", 552 | "metadata": {}, 553 | "outputs": [], 554 | "source": [ 555 | "s2 = TrailFinder(image=image_rebin,\n", 556 | " header=header,\n", 557 | " image_header=image_header,\n", 558 | " save_image_header_keys=['ROOTNAME', 'CCDCHIP', 'CRPIX1', 'CRPIX2', 'CRVAL1', 'CRVAL2',\n", 559 | " 'ORIENTAT','RA_APER', 'DEC_APER', 'PA_APER'],\n", 560 | " processes=8,\n", 561 | " plot=False,\n", 562 | " threshold=5,\n", 563 | " max_width=75, \n", 564 | " check_persistence=True,\n", 565 | " min_persistence=0.5,\n", 566 | " output_root='example2')" 567 | ] 568 | }, 569 | { 570 | "cell_type": "markdown", 571 | "id": "2bcc16d8", 572 | "metadata": {}, 573 | "source": [ 574 | "If you're feeling ok about the setup, run all the subsequent steps together with the `run_all` command (this calculates the MRT, finds MRT sources, filters the sources, and saves the output)" 575 | ] 576 | }, 577 | { 578 | "cell_type": "code", 579 | "execution_count": null, 580 | "id": "aa725eb6", 581 | "metadata": {}, 582 | "outputs": [], 583 | "source": [ 584 | "s2.run_all()" 585 | ] 586 | }, 587 | { 588 | "cell_type": "markdown", 589 | "id": "023f39c6", 590 | "metadata": {}, 591 | "source": [ 592 | "If we plot the mask, it should look identical to the one in the previous example." 
593 | ] 594 | }, 595 | { 596 | "cell_type": "code", 597 | "execution_count": null, 598 | "id": "672c813b", 599 | "metadata": {}, 600 | "outputs": [], 601 | "source": [ 602 | "s2.plot_mask()" 603 | ] 604 | }, 605 | { 606 | "cell_type": "markdown", 607 | "id": "3418fe79", 608 | "metadata": {}, 609 | "source": [ 610 | "## \n", 611 | "## Example 3: Find trails in an FLC image using the WFC wrapper\n", 612 | "\n", 613 | "The approaches shown in examples 1 and 2 can be useful for imaging data from any telescope, not just ACS/WFC data. However, for ACS/WFC data, we provide a convenience wrapper that performs even more of the steps all together, including reading the image and pre-processing it.\n", 614 | "\n", 615 | "The `WfcWrapper` class has the same properties as the `TrailFinder` class, but with a few additional keywords. It also contains the additional routines that read the image, rebin, mask, and subtract the background. By default, these will be run automatically when `WfcWrapper` is initialized, although this can be turned off. In most cases, you probably will only need to adjust the `binsize` keyword. The specific value of `binsize` is up to the user. Larger values speed up the MRT calculation, but keep in mind that the parameters to filter out spurious trails (e.g., `max_width`) are tuned to WFC data binned 2x2. A user may want to start with a larger value for `binsize` and reduce it once they get a sense for the computation time." 616 | ] 617 | }, 618 | { 619 | "cell_type": "code", 620 | "execution_count": null, 621 | "id": "a88ab476", 622 | "metadata": { 623 | "scrolled": false 624 | }, 625 | "outputs": [], 626 | "source": [ 627 | "w = WfcWrapper('jc8m32j5q_flc.fits', binsize=2, extension=4, processes=8, output_root='example3')" 628 | ] 629 | }, 630 | { 631 | "cell_type": "markdown", 632 | "id": "6246e090", 633 | "metadata": {}, 634 | "source": [ 635 | "We can plot the image to see that it looks like the one from the last example after preprocessing." 
636 | ] 637 | }, 638 | { 639 | "cell_type": "code", 640 | "execution_count": null, 641 | "id": "3c8e4041", 642 | "metadata": {}, 643 | "outputs": [], 644 | "source": [ 645 | "w.plot_image()" 646 | ] 647 | }, 648 | { 649 | "cell_type": "markdown", 650 | "id": "e76ab415", 651 | "metadata": {}, 652 | "source": [ 653 | "From here, everything is the same as the last example:" 654 | ] 655 | }, 656 | { 657 | "cell_type": "code", 658 | "execution_count": null, 659 | "id": "59a21001", 660 | "metadata": {}, 661 | "outputs": [], 662 | "source": [ 663 | "w.run_mrt()\n", 664 | "w.find_mrt_sources()\n", 665 | "w.filter_sources()" 666 | ] 667 | }, 668 | { 669 | "cell_type": "markdown", 670 | "id": "49573b7a", 671 | "metadata": {}, 672 | "source": [ 673 | "Below is the resulting MRT and sources" 674 | ] 675 | }, 676 | { 677 | "cell_type": "code", 678 | "execution_count": null, 679 | "id": "dfe00c81", 680 | "metadata": { 681 | "scrolled": true 682 | }, 683 | "outputs": [], 684 | "source": [ 685 | "w.plot_mrt(show_sources=True)" 686 | ] 687 | }, 688 | { 689 | "cell_type": "markdown", 690 | "id": "f32b19ac", 691 | "metadata": {}, 692 | "source": [ 693 | "Lastly, we generate the mask" 694 | ] 695 | }, 696 | { 697 | "cell_type": "code", 698 | "execution_count": null, 699 | "id": "9ff107d4", 700 | "metadata": {}, 701 | "outputs": [], 702 | "source": [ 703 | "w.make_mask()\n", 704 | "w.plot_mask()" 705 | ] 706 | }, 707 | { 708 | "cell_type": "markdown", 709 | "id": "0f752317", 710 | "metadata": {}, 711 | "source": [ 712 | "If you're really feeling very confident, you can run everything in a single line by setting `execute=True`." 
713 | ] 714 | }, 715 | { 716 | "cell_type": "code", 717 | "execution_count": null, 718 | "id": "c6c9d53d", 719 | "metadata": { 720 | "scrolled": true 721 | }, 722 | "outputs": [], 723 | "source": [ 724 | "w = WfcWrapper('jc8m32j5q_flc.fits', binsize=2, extension=4, output_root='example3', processes=8,\n", 725 | " execute=True)" 726 | ] 727 | }, 728 | { 729 | "cell_type": "markdown", 730 | "id": "6e2d8290", 731 | "metadata": {}, 732 | "source": [ 733 | "We'll plot the image and mask together to check that everything looks ok" 734 | ] 735 | }, 736 | { 737 | "cell_type": "code", 738 | "execution_count": null, 739 | "id": "cd8046a6", 740 | "metadata": {}, 741 | "outputs": [], 742 | "source": [ 743 | "w.plot_image(overlay_mask=True)" 744 | ] 745 | }, 746 | { 747 | "cell_type": "markdown", 748 | "id": "69675f8c", 749 | "metadata": {}, 750 | "source": [ 751 | "\n", 752 | "## Example 4: Finding trails in a DRC image\n", 753 | "\n", 754 | "Applying `TrailFinder` to a DRC image (that shows both chips together) can boost sensitivity by increasing the number of pixels over which we search for trails. The DRC files also remove the distortion in the original FLC files (though this does not appear to create significant curvature in most trails). \n", 755 | "\n", 756 | "Here, we demonstrate the steps that go into preparing a DRC image to be analyzed. The subsequent example will illustrate how to do all of this in a single line.\n", 757 | "\n", 758 | "There are no DQ arrays for the DRC files, so we ignore the pre-processing steps that incorporated those." 
759 | ] 760 | }, 761 | { 762 | "cell_type": "code", 763 | "execution_count": null, 764 | "id": "2611ffaf", 765 | "metadata": {}, 766 | "outputs": [], 767 | "source": [ 768 | "# Read in the image files and header information\n", 769 | "image_file = 'hst_13498_32_acs_wfc_f606w_jc8m32j5_drc.fits'\n", 770 | "ext = 1\n", 771 | "with fits.open(image_file) as h:\n", 772 | " image = h[ext].data # image data\n", 773 | " wht = h[ext+1].data\n", 774 | " image = image*wht # wht is effective exposure time, so this turns it into counts\n", 775 | " \n", 776 | " header = h[0].header # primary header\n", 777 | " image_header = h[1].header # image header" 778 | ] 779 | }, 780 | { 781 | "cell_type": "code", 782 | "execution_count": null, 783 | "id": "e0aa5079", 784 | "metadata": {}, 785 | "outputs": [], 786 | "source": [ 787 | "# Flag anything with wht == 0 as bad\n", 788 | "image[wht == 0] = np.nan\n", 789 | "\n", 790 | "# Subtract the background from the image. \n", 791 | "median = np.nanmedian(image)\n", 792 | "image = image - median" 793 | ] 794 | }, 795 | { 796 | "cell_type": "code", 797 | "execution_count": null, 798 | "id": "01bb287c", 799 | "metadata": {}, 800 | "outputs": [], 801 | "source": [ 802 | "# Let's rebin the images\n", 803 | "binsize = 2\n", 804 | "image_rebin = block_reduce(image, binsize, func=np.nansum)" 805 | ] 806 | }, 807 | { 808 | "cell_type": "markdown", 809 | "id": "6f010359", 810 | "metadata": {}, 811 | "source": [ 812 | "Setting up `TrailFinder` is essentially the same as earlier examples at this point. We'll use the default settings. In fact, nearly all of the steps from here on out are the same." 
813 | ] 814 | }, 815 | { 816 | "cell_type": "code", 817 | "execution_count": null, 818 | "id": "ba69935b", 819 | "metadata": {}, 820 | "outputs": [], 821 | "source": [ 822 | "s4 = TrailFinder(image=image_rebin, processes=8, output_root='example4')" 823 | ] 824 | }, 825 | { 826 | "cell_type": "markdown", 827 | "id": "66ef431e", 828 | "metadata": {}, 829 | "source": [ 830 | "We can do a quick plot of our image to make sure things look ok" 831 | ] 832 | }, 833 | { 834 | "cell_type": "code", 835 | "execution_count": null, 836 | "id": "0bce4e3d", 837 | "metadata": { 838 | "scrolled": false 839 | }, 840 | "outputs": [], 841 | "source": [ 842 | "s4.plot_image()" 843 | ] 844 | }, 845 | { 846 | "cell_type": "markdown", 847 | "id": "c1cb96f9", 848 | "metadata": {}, 849 | "source": [ 850 | "Now run the MRT calculation and plot the results" 851 | ] 852 | }, 853 | { 854 | "cell_type": "code", 855 | "execution_count": null, 856 | "id": "82cc852c", 857 | "metadata": { 858 | "scrolled": false 859 | }, 860 | "outputs": [], 861 | "source": [ 862 | "s4.run_mrt()\n", 863 | "s4.plot_mrt(scale=[-1, 5]) # adjusted scale manually due to varying background in image" 864 | ] 865 | }, 866 | { 867 | "cell_type": "markdown", 868 | "id": "26dd9c60", 869 | "metadata": {}, 870 | "source": [ 871 | "This example has a clear gradient in the background due to the cluster. This causes some large scale variation in the RT, but you can see the \"point source\" signals from the satellite trails around `x,y = (90,700)` and `x,y = (300,700)`. This is a case where we may have wanted to explore some different background subtraction methods, but we'll proceed with the simpler approach here. Now we'll try to pull the sources out." 
872 | ] 873 | }, 874 | { 875 | "cell_type": "code", 876 | "execution_count": null, 877 | "id": "07693400", 878 | "metadata": { 879 | "scrolled": true 880 | }, 881 | "outputs": [], 882 | "source": [ 883 | "s4.find_mrt_sources()" 884 | ] 885 | }, 886 | { 887 | "cell_type": "markdown", 888 | "id": "b0b37524", 889 | "metadata": {}, 890 | "source": [ 891 | "And below we plot the MRT with the sources overlaid" 892 | ] 893 | }, 894 | { 895 | "cell_type": "code", 896 | "execution_count": null, 897 | "id": "5b213dca", 898 | "metadata": { 899 | "scrolled": false 900 | }, 901 | "outputs": [], 902 | "source": [ 903 | "s4.plot_mrt(show_sources=True)" 904 | ] 905 | }, 906 | { 907 | "cell_type": "markdown", 908 | "id": "0efb4332", 909 | "metadata": {}, 910 | "source": [ 911 | "It's clearly shredding those large-scale features quite a bit, but we'll try to filter these out." 912 | ] 913 | }, 914 | { 915 | "cell_type": "code", 916 | "execution_count": null, 917 | "id": "e774d81b", 918 | "metadata": {}, 919 | "outputs": [], 920 | "source": [ 921 | "s4.filter_sources()" 922 | ] 923 | }, 924 | { 925 | "cell_type": "markdown", 926 | "id": "7bb302d0", 927 | "metadata": {}, 928 | "source": [ 929 | "Let's re-plot the MRT with sources to see what made it through" 930 | ] 931 | }, 932 | { 933 | "cell_type": "code", 934 | "execution_count": null, 935 | "id": "039addef", 936 | "metadata": {}, 937 | "outputs": [], 938 | "source": [ 939 | "s4.plot_mrt(show_sources=True)" 940 | ] 941 | }, 942 | { 943 | "cell_type": "markdown", 944 | "id": "65779354", 945 | "metadata": {}, 946 | "source": [ 947 | "That seems to have worked! 
Let's make the mask to confirm." 948 | ] 949 | }, 950 | { 951 | "cell_type": "code", 952 | "execution_count": null, 953 | "id": "5314fd66", 954 | "metadata": {}, 955 | "outputs": [], 956 | "source": [ 957 | "s4.make_mask()\n", 958 | "s4.plot_mask()\n", 959 | "s4.plot_segment()" 960 | ] 961 | }, 962 | { 963 | "cell_type": "markdown", 964 | "id": "53ecf16d", 965 | "metadata": {}, 966 | "source": [ 967 | "Let's make a version plotting the mask on top of the original image." 968 | ] 969 | }, 970 | { 971 | "cell_type": "code", 972 | "execution_count": null, 973 | "id": "681d29a2", 974 | "metadata": {}, 975 | "outputs": [], 976 | "source": [ 977 | "s4.plot_image(overlay_mask=True)" 978 | ] 979 | }, 980 | { 981 | "cell_type": "markdown", 982 | "id": "acc92ac5", 983 | "metadata": {}, 984 | "source": [ 985 | "\n", 986 | "## Example 5: Finding trails in a DRC image using the WFC Wrapper\n", 987 | "\n", 988 | "All of the setup from the last example can be streamlined using the `WfcWrapper` class." 989 | ] 990 | }, 991 | { 992 | "cell_type": "code", 993 | "execution_count": null, 994 | "id": "8a1f9771", 995 | "metadata": { 996 | "scrolled": false 997 | }, 998 | "outputs": [], 999 | "source": [ 1000 | "from acstools.findsat_mrt import WfcWrapper\n", 1001 | "w2 = WfcWrapper('hst_13498_32_acs_wfc_f606w_jc8m32j5_drc.fits', binsize=2, extension=1, processes=8,\n", 1002 | "                output_root='example5')" 1003 | ] 1004 | }, 1005 | { 1006 | "cell_type": "markdown", 1007 | "id": "ecbc3cf1", 1008 | "metadata": {}, 1009 | "source": [ 1010 | "Now run the full pipeline." 1011 | ] 1012 | }, 1013 | { 1014 | "cell_type": "code", 1015 | "execution_count": null, 1016 | "id": "36bf9557", 1017 | "metadata": { 1018 | "scrolled": true 1019 | }, 1020 | "outputs": [], 1021 | "source": [ 1022 | "w2.run_all()" 1023 | ] 1024 | }, 1025 | { 1026 | "cell_type": "markdown", 1027 | "id": "177c03a2", 1028 | "metadata": {}, 1029 | "source": [ 1030 | "Let's plot the final mask to ensure it looks the same as the earlier 
examples." 1031 | ] 1032 | }, 1033 | { 1034 | "cell_type": "code", 1035 | "execution_count": null, 1036 | "id": "7dc7f533", 1037 | "metadata": { 1038 | "scrolled": true 1039 | }, 1040 | "outputs": [], 1041 | "source": [ 1042 | "w2.plot_mask()" 1043 | ] 1044 | }, 1045 | { 1046 | "cell_type": "markdown", 1047 | "id": "d02fe703", 1048 | "metadata": {}, 1049 | "source": [ 1050 | "And there you go!" 1051 | ] 1052 | }, 1053 | { 1054 | "cell_type": "markdown", 1055 | "id": "7cabd7a2", 1056 | "metadata": {}, 1057 | "source": [ 1058 | "\n", 1059 | "## Example 6: Create a new kernel for trail detection\n", 1060 | "\n", 1061 | "We include a function called `create_mrt_line_kernel` that can be used to generate kernels for detecting trails of a specified size in the MRT. Note that kernels with widths of 1, 3, 7, and 15 pixels (convolved with a simple Gaussian HST/ACS PSF model) are included already, but perhaps you want to generate a kernel with a new width, or one convolved with a different PSF." 1062 | ] 1063 | }, 1064 | { 1065 | "cell_type": "code", 1066 | "execution_count": null, 1067 | "id": "0b4782ac", 1068 | "metadata": {}, 1069 | "outputs": [], 1070 | "source": [ 1071 | "from acstools.utils_findsat_mrt import create_mrt_line_kernel" 1072 | ] 1073 | }, 1074 | { 1075 | "cell_type": "markdown", 1076 | "id": "1f5f4a32", 1077 | "metadata": {}, 1078 | "source": [ 1079 | "Let's generate a kernel for a trail that has an inherent width of 5 pixels and is convolved with a Gaussian PSF with `sigma=3`." 1080 | ] 1081 | }, 1082 | { 1083 | "cell_type": "code", 1084 | "execution_count": null, 1085 | "id": "e890d06d", 1086 | "metadata": { 1087 | "scrolled": false 1088 | }, 1089 | "outputs": [], 1090 | "source": [ 1091 | "out = create_mrt_line_kernel(5, 3, processes=8, plot=True)" 1092 | ] 1093 | }, 1094 | { 1095 | "cell_type": "markdown", 1096 | "id": "d9acfcce", 1097 | "metadata": {}, 1098 | "source": [ 1099 | "The first plot shows the model streak. The second plot shows the resulting MRT. 
The kernel is created by taking a cutout around the signal in the MRT. The third double-plot shows 1D slices of the signal in the MRT, with orange lines showing the location of the maximum values. These serve as first guesses of the center, after which the center is redetermined using a Gaussian fit and the cutout is extracted with the kernel perfectly centered. The fourth plot above shows the final kernel.\n", 1100 | "\n", 1101 | "The kernel can be saved by defining the `outfile` keyword in `create_mrt_line_kernel`. By adding this file path to the `kernels` keyword in `TrailFinder` or `WfcWrapper`, it will be used for source detection when running `find_mrt_sources`." 1102 | ] 1103 | } 1104 | ], 1105 | "metadata": { 1106 | "kernelspec": { 1107 | "display_name": "Python 3 (ipykernel)", 1108 | "language": "python", 1109 | "name": "python3" 1110 | }, 1111 | "language_info": { 1112 | "codemirror_mode": { 1113 | "name": "ipython", 1114 | "version": 3 1115 | }, 1116 | "file_extension": ".py", 1117 | "mimetype": "text/x-python", 1118 | "name": "python", 1119 | "nbconvert_exporter": "python", 1120 | "pygments_lexer": "ipython3", 1121 | "version": "3.9.13" 1122 | } 1123 | }, 1124 | "nbformat": 4, 1125 | "nbformat_minor": 5 1126 | } 1127 | -------------------------------------------------------------------------------- /acs_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/spacetelescope/acs-notebook/49962e53efa23575340acc154446294354848c27/acs_logo.png -------------------------------------------------------------------------------- /acs_pixel_area_maps/acs_pixel_area_maps.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "# Obtaining Pixel Area Maps for ACS Data\n", 22 | "\n", 23 | "## Introduction\n", 24 | "\n", 25 | "***\n", 26 | "\n", 27 | "The optical design of ACS introduces larger geometric distortion than in other *HST* instruments. In the event a user wishes to perform photometry on data that have not been processed by AstroDrizzle, a correction must be applied to account for the different sizes of the pixels on the sky across the field of view.\n", 28 | "\n", 29 | "A pixel area map (PAM), which is an image where each pixel value describes that pixel's area on the sky relative to the native plate scale, is used for this correction. The distortion corrections applied by both AstroDrizzle and the PAMs include observation-specific corrections (e.g., velocity distortion). For the best results, users are advised to create a PAM for each observation individually.\n", 30 | "\n", 31 | "To transform a distorted FLT/FLC image so that it is suitable for photometry, users must multiply the image by the PAM and divide the result by the exposure time so that the image has units of electrons/second. After this transformation, the information on the [ACS Zeropoints](http://www.stsci.edu/hst/instrumentation/acs/data-analysis/zeropoints) page can be used to convert flux measurements into physical units.\n", 32 | "\n", 33 | "### This tutorial will show you how to...\n", 34 | "\n", 35 | "#### 1. [Construct the Pixel Area Map](#_obtain)\n", 36 | "\n", 37 | "* Use `pamutils` from `stsci.skypac` to create a PAM for each science extension of an observation.\n", 38 | "\n", 39 | "#### 2. [Apply the Pixel Area Map](#_drkdrizzle)\n", 40 | "\n", 41 | "* Multiply an FLT image by its PAM.\n", 42 | "* Compare the original and PAM-corrected images."
43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "## Imports\n", 50 | "***\n", 51 | "\n", 52 | "Here we list the Python packages used in this notebook. Links to the documentation for each module are provided for convenience.\n", 53 | "\n", 54 | "| Package Name | module | docs | used for |\n", 55 | "|------------------|:-----------------|:-------------:|:------------|\n", 56 | "| `os` | `system` | link| command line input |\n", 57 | "| `os` | `environ` | link| setting environments |\n", 58 | "|`shutil` | `rmtree` | link| remove directory tree |\n", 59 | "|`matplotlib` |`pyplot` | link| plotting functions |\n", 60 | "|`astroquery.mast` |`Observations` | link| download data from MAST |\n", 61 | "|`astropy.io` | `fits` | link| access and update fits files |\n", 62 | "|`stsci.skypac` |`pamutils`| link| construct pixel area map |" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": null, 68 | "metadata": {}, 69 | "outputs": [], 70 | "source": [ 71 | "import os\n", 72 | "import shutil\n", 73 | "\n", 74 | "import matplotlib.pyplot as plt\n", 75 | "\n", 76 | "from astroquery.mast import Observations\n", 77 | "from astropy.io import fits\n", 78 | "from stsci.skypac import pamutils" 79 | ] 80 | }, 81 | { 82 | "cell_type": "markdown", 83 | "metadata": {}, 84 | "source": [ 85 | "## Download the Data\n", 86 | "***\n", 87 | "\n", 88 | "Here we download all of the data required for this notebook. This is an important step! Some of the image processing steps require all relevant files to be in the working directory. We recommend working with a brand new directory for every new set of data.\n", 89 | "\n", 90 | "#### [GO Proposal 9438](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=9438): \"The Origin of the Intergalactic Globular Cluster Population in Abell 1185\"\n", 91 | "\n", 92 | "For this example, we will only retrieve data associated with the Observation ID **J6ME13QHQ**. 
Using the Python package `astroquery`, we can retrieve files from the [MAST](http://archive.stsci.edu) archive.\n", 93 | "\n", 94 | "
\n", 95 | "MAY CHANGE: The argument \"mrp_only\" stands for \"minimum recommended products only\". It currently needs to be set to False, although in the future, False is intended to be set as the default and can be left out.\n", 96 | "
" 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": null, 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "obs_table = Observations.query_criteria(proposal_id=9438, obs_id ='J6ME13QHQ')\n", 106 | "\n", 107 | "dl_table = Observations.download_products(obs_table['obsid'], \n", 108 | " mrp_only=False, \n", 109 | " productSubGroupDescription=['FLT'])" 110 | ] 111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "metadata": {}, 115 | "source": [ 116 | "We'll use the packages `os` and `shutil` to put all of these files in our working directory for convenience and do a little housekeeping." 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "for row in dl_table:\n", 126 | " oldfname = row['Local Path']\n", 127 | " newfname = os.path.basename(oldfname)\n", 128 | " os.rename(oldfname, newfname)\n", 129 | " \n", 130 | "shutil.rmtree('mastDownload')" 131 | ] 132 | }, 133 | { 134 | "cell_type": "markdown", 135 | "metadata": {}, 136 | "source": [ 137 | "## File Information\n", 138 | "***\n", 139 | "The structure of the fits files from ACS may be different depending on what kind of observation was made. For more information, refer to Section 2.2 of the [ACS Data Handbook](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 140 | "\n", 141 | "#### FLT Files (WFC-Specific)\n", 142 | "\n", 143 | "| Ext | Name | Type | Contains |\n", 144 | "|:----------:|------------------|--------------|:---------------------------------------------------------|\n", 145 | "|0 | PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 146 | "|1 | SCI (Image) | (ImageHDU) | WFC2 raw image data. |\n", 147 | "|2 | ERR (Error) | (ImageHDU) | WFC2 error array. |\n", 148 | "|3 | DQ (Data Quality)| (ImageHDU) | WFC2 data quality array. 
|\n", 149 | "|4 | SCI (Image) | (ImageHDU) | WFC1 raw image data. |\n", 150 | "|5 | ERR (Error) | (ImageHDU) | WFC1 error array. |\n", 151 | "|6 | DQ (Data Quality)| (ImageHDU) | WFC1 data quality array. |\n", 152 | "|7-10 | D2IMARR | (ImageHDU) | Filter-independent CCD pixel-grid distortion corrections.|\n", 153 | "|11-14| WCSDVARR | (ImageHDU) | Filter-dependent non-polynomial distortion corrections. |\n", 154 | "|15 | WCSCORR | (ImageHDU) | History of changes to the WCS solution. |" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": {}, 160 | "source": [ 161 | "Here, we set our files to variable names below for convenience." 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": null, 167 | "metadata": {}, 168 | "outputs": [], 169 | "source": [ 170 | "flt_file = 'j6me13qhq_flt.fits'" 171 | ] 172 | }, 173 | { 174 | "cell_type": "markdown", 175 | "metadata": {}, 176 | "source": [ 177 | "You can always use `.info()` on an HDUlist for an overview of the structure" 178 | ] 179 | }, 180 | { 181 | "cell_type": "code", 182 | "execution_count": null, 183 | "metadata": {}, 184 | "outputs": [], 185 | "source": [ 186 | "with fits.open(flt_file) as hdulist:\n", 187 | " hdulist.info()" 188 | ] 189 | }, 190 | { 191 | "cell_type": "markdown", 192 | "metadata": {}, 193 | "source": [ 194 | "## Construct Pixel Area Map\n", 195 | "***\n", 196 | "\n", 197 | "Function input for `pamutils.pam_from_file` is (input_file, extension, output_file). To create a PAM for the first science extension (WFC2) of this observation, we need to specify HDU Extension 1." 
198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "pamutils.pam_from_file(flt_file, ext=1, output_pam='j6me13qhq_wfc2_pam.fits')" 207 | ] 208 | }, 209 | { 210 | "cell_type": "markdown", 211 | "metadata": {}, 212 | "source": [ 213 | "For full-frame WFC observations, a second PAM for the WFC1 science array (extension 4) can be created as follows:" 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": null, 219 | "metadata": {}, 220 | "outputs": [], 221 | "source": [ 222 | "pamutils.pam_from_file(flt_file, ext=4, output_pam='j6me13qhq_wfc1_pam.fits')" 223 | ] 224 | }, 225 | { 226 | "cell_type": "markdown", 227 | "metadata": {}, 228 | "source": [ 229 | "## Apply the Pixel Area Map\n", 230 | "***\n", 231 | "\n", 232 | "Because the pixel area map is an array of flux corrections based on 2d image distortion, you can multiply it by your data element-by-element to produce the corrected image. Here, we present the results." 
233 | ] 234 | }, 235 | { 236 | "cell_type": "code", 237 | "execution_count": null, 238 | "metadata": {}, 239 | "outputs": [], 240 | "source": [ 241 | "from p_module import plot" 242 | ] 243 | }, 244 | { 245 | "cell_type": "code", 246 | "execution_count": null, 247 | "metadata": {}, 248 | "outputs": [], 249 | "source": [ 250 | "plot.triple_pam_plot(flt_file, 'j6me13qhq_wfc2_pam.fits', 'WFC2 PAM Correction Results')" 251 | ] 252 | }, 253 | { 254 | "cell_type": "code", 255 | "execution_count": null, 256 | "metadata": {}, 257 | "outputs": [], 258 | "source": [ 259 | "plot.triple_pam_plot(flt_file, 'j6me13qhq_wfc1_pam.fits', 'WFC1 PAM Correction Results')" 260 | ] 261 | }, 262 | { 263 | "cell_type": "markdown", 264 | "metadata": {}, 265 | "source": [ 266 | "### For more help:\n", 267 | "\n", 268 | "More details may be found on the [ACS website](http://www.stsci.edu/hst/instrumentation/acs) and in the [ACS Instrument](https://hst-docs.stsci.edu/display/ACSIHB) and [Data Handbooks](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf). Geometric Distortion information can be found in [Section 5.6.4](https://hst-docs.stsci.edu/display/ACSIHB/5.6+ACS+Point+Spread+Functions) and [Section 10.4](https://hst-docs.stsci.edu/display/ACSIHB/10.4+Geometrical+Distortion+in+the+ACS) of the ACS Instrument Handbook.\n", 269 | "\n", 270 | "Please visit the [HST Help Desk](http://hsthelp.stsci.edu). Through the help desk portal, you can explore the *HST* Knowledge Base and request additional help from experts." 
271 | ] 272 | } 273 | ], 274 | "metadata": { 275 | "kernelspec": { 276 | "display_name": "Python 3 (ipykernel)", 277 | "language": "python", 278 | "name": "python3" 279 | }, 280 | "language_info": { 281 | "codemirror_mode": { 282 | "name": "ipython", 283 | "version": 3 284 | }, 285 | "file_extension": ".py", 286 | "mimetype": "text/x-python", 287 | "name": "python", 288 | "nbconvert_exporter": "python", 289 | "pygments_lexer": "ipython3", 290 | "version": "3.9.13" 291 | } 292 | }, 293 | "nbformat": 4, 294 | "nbformat_minor": 2 295 | } 296 | -------------------------------------------------------------------------------- /acs_pixel_area_maps/p_module/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/spacetelescope/acs-notebook/49962e53efa23575340acc154446294354848c27/acs_pixel_area_maps/p_module/__init__.py -------------------------------------------------------------------------------- /acs_pixel_area_maps/p_module/notebook_tools.py: -------------------------------------------------------------------------------- 1 | import ipywidgets 2 | 3 | 4 | def side2(server, img1, img2, width=500): 5 | 6 | v1 = server.get_viewer('v1') 7 | v2 = server.get_viewer('v2') 8 | 9 | v1.load(img1) 10 | v2.load(img2) 11 | 12 | box = ipywidgets.HBox() 13 | h1 = ipywidgets.HTML() 14 | h2 = ipywidgets.HTML() 15 | 16 | h1.value = v1.embed(height=650, width=width)._repr_html_() 17 | h2.value = v2.embed(height=650, width=width)._repr_html_() 18 | 19 | box.children = [h1, h2] 20 | 21 | return box 22 | 23 | 24 | def side3(server, img1, img2, img3, width=300, xcen=None, ycen=None, zoom_x=5, zoom_y=5, 25 | offset=25, cutlevel=[0, 200]): 26 | 27 | v1 = server.get_viewer('v1') 28 | v2 = server.get_viewer('v2') 29 | v3 = server.get_viewer('v3') 30 | 31 | v1.load(img1) 32 | v2.load(img2) 33 | v3.load(img3) 34 | 35 | objs = [v1, v2, v3] 36 | 37 | for i, obj in enumerate(objs): 38 | if xcen is not None: 39 | if ycen is not 
None: 40 | obj.set_pan(xcen[i]+offset, ycen[i]) 41 | 42 | obj.scale_to(zoom_x, zoom_y) 43 | 44 | obj.cut_levels(cutlevel[0], cutlevel[1]) 45 | 46 | canvas = obj.add_canvas() 47 | Circle = canvas.get_draw_class('circle') 48 | canvas.add(Circle(xcen[i], ycen[i], radius=20, color='cyan')) 49 | 50 | box = ipywidgets.HBox() 51 | h1 = ipywidgets.HTML() 52 | h2 = ipywidgets.HTML() 53 | h3 = ipywidgets.HTML() 54 | 55 | h1.value = v1.embed(height=650, width=width)._repr_html_() 56 | h2.value = v2.embed(height=650, width=width)._repr_html_() 57 | h3.value = v3.embed(height=650, width=width)._repr_html_() 58 | 59 | box.children = [h1, h2, h3] 60 | 61 | return box -------------------------------------------------------------------------------- /acs_pixel_area_maps/p_module/plot.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | from astropy.io import fits 3 | from astropy.visualization import (ZScaleInterval, LinearStretch, 4 | ImageNormalize) 5 | 6 | def ds9_imitate(ax,image): 7 | norm = ImageNormalize(image, 8 | interval=ZScaleInterval(), 9 | stretch=LinearStretch()) 10 | 11 | ax.imshow(image, cmap='bone', norm=norm) 12 | return 13 | 14 | def triple_pam_plot(flt_file, pam_file, figtitle): 15 | fl_img = fits.getdata(flt_file, ext=1) 16 | pam_img = fits.getdata(pam_file) 17 | 18 | fig = plt.figure(figsize=(20,4)) 19 | fig.suptitle(figtitle,fontsize=20) 20 | 21 | ax = fig.add_subplot(1, 3, 1) 22 | ds9_imitate(ax, fl_img) 23 | ax.set_title('Raw') 24 | 25 | ax2 = fig.add_subplot(1, 3, 2, yticks=[]) 26 | ds9_imitate(ax2, pam_img) 27 | ax2.set_title('Pixel Area Map') 28 | 29 | pamd_img = fl_img*pam_img 30 | 31 | ax3 = fig.add_subplot(1, 3, 3, yticks=[]) 32 | ds9_imitate(ax3, pamd_img) 33 | ax3.set_title('Raw x Pixel Area Map') 34 | 35 | plt.subplots_adjust(wspace=0.05) 36 | return 37 | -------------------------------------------------------------------------------- /acs_pixel_area_maps/p_module/plots.py: 
-------------------------------------------------------------------------------- 1 | from astropy.io import fits 2 | 3 | import plotly 4 | import plotly.graph_objs as go 5 | import numpy as np 6 | 7 | 8 | def drizzle_rms_plot(pixfrac, rms_med): 9 | 10 |     data = go.Scatter(x=pixfrac, y=rms_med, name='Data', mode='markers', marker=dict(size=10)) 11 | 12 |     layout = go.Layout(yaxis=dict(title='RMS/Median', showline=True, ticks='inside', mirror=True), 13 |                        xaxis=dict(title='Drizzle pixfrac', showline=True, ticks='inside', mirror=True), 14 |                        shapes=[dict(type='line', x0=0, x1=1.1, y0=0.2, y1=0.2, line=dict(color='rgb(255, 40, 0)'))], 15 |                        title='Drizzle pixfrac Optimization', 16 |                        autosize=False, 17 |                        height=600, width=900) 18 | 19 |     fig = go.Figure(data=[data], layout=layout) 20 | 21 |     return fig 22 | 23 | 24 | def star_2d(img, y, x, size=50, low=10, high=150, spacing=20): 25 | 26 |     with fits.open(img) as hdu: 27 |         img_data = hdu[0].data[y-size: y+size, x-size: x+size] 28 | 29 |     data = go.Contour(z=img_data, 30 |                       y=np.arange(y-size, y+size), 31 |                       x=np.arange(x-size, x+size), 32 |                       autocontour=False, 33 |                       colorscale='Jet', 34 |                       contours=dict(start=low, end=high, size=spacing)) 35 | 36 |     layout = go.Layout(autosize=False, 37 |                        height=600, width=600) 38 | 39 |     fig = go.Figure(data=[data], layout=layout) 40 | 41 |     return fig -------------------------------------------------------------------------------- /acs_pixel_area_maps/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS Pixel Area Maps notebook 2 | 3 | matplotlib 4 | astroquery 5 | astropy 6 | stsci.skypac 7 | -------------------------------------------------------------------------------- /acs_polarization_tools/acs_polarization_tools.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "# Using ACS Polarization Tools\n", 22 | "\n", 23 | "## Introduction\n", 24 | "\n", 25 | "***\n", 26 | "\n", 27 | "In December 2020, the ACS Team made available a Python module called `polarization_tools` in the `acstools` package to facilitate data analysis of ACS polarization data. This package contains both tables of calibration terms and class methods for polarization calculations.\n", 28 | "\n", 29 | "Note that this example does not use imaging data (i.e., FITS files). Where necessary, we will indicate values that come from ACS FITS files.\n", 30 | "\n", 31 | "\n", 32 | "### This tutorial will show you how to...\n", 33 | "\n", 34 | "#### [Retrieve Polarization Calibration Coefficients](#_coeffs) \n", 35 | "\n", 36 | "* Get the latest calibration coefficients that are used in `acstools`\n", 37 | "\n", 38 | "#### [Calculate Polarization Properties](#_properties)\n", 39 | "\n", 40 | "* Use the `acstools.polarization_tools` module to compute:\n", 41 | " * Stokes parameters\n", 42 | " * Fractional polarization\n", 43 | " * Electric field vector position angle" 44 | ] 45 | }, 46 | { 47 | "cell_type": "markdown", 48 | "metadata": {}, 49 | "source": [ 50 | "## Imports\n", 51 | "\n", 52 | "***\n", 53 | "\n", 54 | "Here we list the Python packages used in this notebook. 
Links to the documentation for each module are provided for convenience.\n", 55 | "\n", 56 | "| Package Name | module | docs | used for |\n", 57 | "|------------------|:-----------------|:-------------:|:------------|\n", 58 | "|`acstools` |`polarization_tools`| link| polarization information and methods |" 59 | ] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": null, 64 | "metadata": {}, 65 | "outputs": [], 66 | "source": [ 67 | "from acstools import polarization_tools" 68 | ] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "metadata": {}, 73 | "source": [ 74 | "### Retrieve Polarization Calibration Coefficients\n", 75 | "\n", 76 | "***\n", 77 | "\n", 78 | "The `polarization_tools` module contains all of the necessary calibration terms from the tables in Section 5.3 of the ACS Data Handbook (see Tables 5.6 and 5.7). These coefficients are stored in YAML files in `acstools` and may be retrieved as `astropy.Table` tables:" 79 | ] 80 | }, 81 | { 82 | "cell_type": "code", 83 | "execution_count": null, 84 | "metadata": {}, 85 | "outputs": [], 86 | "source": [ 87 | "polarizer_tables = polarization_tools.PolarizerTables.from_package_data()\n", 88 | "\n", 89 | "# print the ACS/WFC polarizer efficiencies\n", 90 | "polarizer_tables.wfc_efficiency" 91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "execution_count": null, 96 | "metadata": {}, 97 | "outputs": [], 98 | "source": [ 99 | "# print the average ACS/WFC cross-polarization leak corrections\n", 100 | "polarizer_tables.wfc_transmission" 101 | ] 102 | }, 103 | { 104 | "cell_type": "markdown", 105 | "metadata": {}, 106 | "source": [ 107 | "In addition, the tables contain metadata that describe their derivation:" 108 | ] 109 | }, 110 | { 111 | "cell_type": "code", 112 | "execution_count": null, 113 | "metadata": {}, 114 | "outputs": [], 115 | "source": [ 116 | "# print the ACS/WFC cross-polarization leak correction metadata\n", 117 | "polarizer_tables.wfc_transmission.meta" 118 | ] 119 | }, 120 | { 121 
| "cell_type": "markdown", 122 | "metadata": {}, 123 | "source": [ 124 | "### Calculate Polarization Properties\n", 125 | "\n", 126 | "***\n", 127 | "\n", 128 | "The `acstools.polarization_tools.Polarization` class contains methods for calculating the polarization properties of a source given ACS photometry as input. The inputs may be either in electrons or electrons/second, as long as all three (one each for POL0, POL60, and POL120) are consistently in the same units. In addition, the class can accept either a single float value for each photometric measurement or a `numpy` array. This has the advantage that the class can construct Stokes images from the ACS photometry.\n", 129 | "\n", 130 | "For this example, we will use photometry measurements of the polarized calibration star Vela 1-81 (an OB supergiant). The observations were taken using the ACS/WFC detector and by crossing the polarization filters with the F606W filter. The average PA_V3 value from the primary headers of the FITS files is 294.41272 degrees." 
131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": null, 136 | "metadata": {}, 137 | "outputs": [], 138 | "source": [ 139 | "# Measured photometry from ACS/WFC data\n", 140 | "pol0 = 60603.96\n", 141 | "pol60 = 62382.70\n", 142 | "pol120 = 67898.63\n", 143 | "\n", 144 | "# PA_V3 angle\n", 145 | "pa_v3 = 294.41272\n", 146 | "\n", 147 | "pol_data = polarization_tools.Polarization(pol0, pol60, pol120, 'F606W', 'WFC', pa_v3)\n", 148 | "pol_data.calc_stokes()\n", 149 | "pol_data.calc_polarization()\n", 150 | "print(f'Stokes I = {pol_data.stokes_i}, Stokes Q = {pol_data.stokes_q}, Stokes U = {pol_data.stokes_u}')\n", 151 | "print(f'Fractional polarization = {pol_data.polarization:0.2%}, Position Angle = {pol_data.angle}')" 152 | ] 153 | }, 154 | { 155 | "cell_type": "markdown", 156 | "metadata": {}, 157 | "source": [ 158 | "Notice that we did not provide any of the calibration coefficients from the Data Handbook or from the [Retrieve Polarization Calibration Coefficients](#_coeffs) section. This is because the `Polarization` class retrieves the values using the spectral filter and detector information. 
These calibration values may be overwritten manually by setting the appropriate attributes:" 159 | ] 160 | }, 161 | { 162 | "cell_type": "code", 163 | "execution_count": null, 164 | "metadata": {}, 165 | "outputs": [], 166 | "source": [ 167 | "# print the efficiency coefficients and the cross-polarization leak term used\n", 168 | "print(f'c0 = {pol_data.c0}, c60 = {pol_data.c60}, c120 = {pol_data.c120}, cross-leak = {pol_data.transmission_correction}')" 169 | ] 170 | }, 171 | { 172 | "cell_type": "markdown", 173 | "metadata": {}, 174 | "source": [ 175 | "### For more help:\n", 176 | "\n", 177 | "More details may be found on the [ACS website](http://www.stsci.edu/hst/instrumentation/acs) and in the [ACS Instrument](https://hst-docs.stsci.edu/display/ACSIHB) and [Data Handbooks](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 178 | "\n", 179 | "Please visit the [HST Help Desk](http://hsthelp.stsci.edu). Through the help desk portal, you can explore the *HST* Knowledge Base and request additional help from experts." 180 | ] 181 | } 182 | ], 183 | "metadata": { 184 | "kernelspec": { 185 | "display_name": "Python 3 (ipykernel)", 186 | "language": "python", 187 | "name": "python3" 188 | }, 189 | "language_info": { 190 | "codemirror_mode": { 191 | "name": "ipython", 192 | "version": 3 193 | }, 194 | "file_extension": ".py", 195 | "mimetype": "text/x-python", 196 | "name": "python", 197 | "nbconvert_exporter": "python", 198 | "pygments_lexer": "ipython3", 199 | "version": "3.9.13" 200 | } 201 | }, 202 | "nbformat": 4, 203 | "nbformat_minor": 4 204 | } 205 | -------------------------------------------------------------------------------- /acs_reduction/acs_reduction.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "# ACS/WFC Image Reduction\n", 22 | "\n", 23 | "## Introduction\n", 24 | "\n", 25 | "***\n", 26 | "\n", 27 | "This notebook covers the steps necessary to calibrate Advanced Camera for Surveys (ACS) Wide Field Channel (WFC) observations to produce a distortion-corrected image ready for photometry.\n", 28 | "\n", 29 | "For most observations, reprocessing the raw files with the calibration pipeline is no longer required as the [MAST](http://archive.stsci.edu) archive is now static and any changes to the pipeline or reference files automatically triggers a reprocessing of the data. However, users may wish to reprocess their data with custom reference files.\n", 30 | "\n", 31 | "This notebook is intended for users with a (very!) basic understanding of python and photometry.\n", 32 | "\n", 33 | "You will need approximately **13 GB of space** available for this exercise.\n", 34 | "\n", 35 | "### This tutorial will show you how to...\n", 36 | "\n", 37 | "#### 1. [Calibrate Raw Files](#_calibrate) \n", 38 | "\n", 39 | "* Query the Calibration Reference Data System ([CRDS](https://hst-crds.stsci.edu/)) for the current best reference files applicable to a given observation\n", 40 | "* Update the `*_raw.fits` primary headers with new calibration information\n", 41 | "* Retrieve calibration files from CRDS and set up the reference file directory\n", 42 | "* Process files with `calacs`\n", 43 | "\n", 44 | "#### 2. [Update the WCS](#_wcs) \n", 45 | "\n", 46 | "* Update the FLT/FLC file WCS header keywords" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "## Imports\n", 54 | "***\n", 55 | "\n", 56 | "Here we list the Python packages used in this notebook. 
Links to the documentation for each module are provided for convenience.\n",
57 |     "\n",
58 |     "| Package Name | module | docs | used for |\n",
59 |     "|------------------|:-----------------|:-------------:|:------------|\n",
60 |     "| `os` | `system` | link|command line input|\n",
61 |     "| `os` | `environ` | link| setting environments |\n",
62 |     "|`shutil` | `rmtree` | link| remove directory tree |\n",
63 |     "|`glob` | `glob` | link| search for files based on Unix shell rules |\n",
64 |     "|`astroquery.mast` |`Observations` | link| download data from MAST |\n",
65 |     "|`astropy.io` | `fits` | link| access and update fits files |\n",
66 |     "|`astropy.table` | `Table` | link| constructing and editing in a tabular format |\n",
67 |     "|`stwcs` |`updatewcs` | link| update wcs solution |"
68 |    ]
69 |   },
70 |   {
71 |    "cell_type": "code",
72 |    "execution_count": null,
73 |    "metadata": {},
74 |    "outputs": [],
75 |    "source": [
76 |     "import os\n",
77 |     "import shutil\n",
78 |     "import glob\n",
79 |     "\n",
80 |     "from astroquery.mast import Observations\n",
81 |     "\n",
82 |     "from astropy.io import fits\n",
83 |     "from astropy.table import Table\n",
84 |     "\n",
85 |     "from stwcs import updatewcs"
86 |    ]
87 |   },
88 |   {
89 |    "cell_type": "markdown",
90 |    "metadata": {},
91 |    "source": [
92 |     "## Download the Data \n",
93 |     "***\n",
94 |     "\n",
95 |     "Here we download all of the data required for this notebook. This is an important step! Some of the image processing steps require all relevant files to be in the working directory. We recommend working with a brand new directory for every new set of data.\n",
96 |     "\n",
97 |     "#### [GO Proposal 10775](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=10775): \"An ACS Survey of Galactic Globular Clusters\"\n",
98 |     "\n",
99 |     "For this example, we will only retrieve data associated with the Observation ID **J9L960010**. Using the python package `astroquery`, we can access the [MAST](http://archive.stsci.edu) archive. 
\n", 100 | "\n", 101 | "We will need to grab the raw files, the telemetry files, and the association file for this observation set.\n", 102 | "\n", 103 | "
\n", 104 | "MAY CHANGE: The argument \"mrp_only\" stands for \"minimum recommended products only\". It currently needs to be set to False, although in the future, False is intended to be set as the default and can be left out.\n", 105 | "
" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": null, 111 | "metadata": {}, 112 | "outputs": [], 113 | "source": [ 114 | "obs_table = Observations.query_criteria(proposal_id=10775, obs_id='J9L960010')\n", 115 | "\n", 116 | "\n", 117 | "dl_table = Observations.download_products(obs_table['obsid'],\n", 118 | " productSubGroupDescription=['RAW', 'ASN', 'SPT'],\n", 119 | " mrp_only=False)" 120 | ] 121 | }, 122 | { 123 | "cell_type": "markdown", 124 | "metadata": {}, 125 | "source": [ 126 | "We'll use the packages `os` and `shutil` to put all of these files in our working directory and do a little housekeeping." 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": null, 132 | "metadata": {}, 133 | "outputs": [], 134 | "source": [ 135 | "for row in dl_table:\n", 136 | " oldfname = row['Local Path']\n", 137 | " newfname = os.path.basename(oldfname)\n", 138 | " os.rename(oldfname, newfname)\n", 139 | " \n", 140 | "# Delete the mastDownload directory and all subdirectories it contains.\n", 141 | "shutil.rmtree('mastDownload')" 142 | ] 143 | }, 144 | { 145 | "cell_type": "markdown", 146 | "metadata": {}, 147 | "source": [ 148 | "Here we set our filenames to variable names for convenience using `glob.glob`." 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [ 157 | "asn_file = 'j9l960010_asn.fits'\n", 158 | "raw_files = glob.glob('*_raw.fits')" 159 | ] 160 | }, 161 | { 162 | "cell_type": "markdown", 163 | "metadata": {}, 164 | "source": [ 165 | "## File Information\n", 166 | "***\n", 167 | "The structure of the fits files from ACS may be different depending on what kind of observation was made. 
For more information, refer to Section 2.2 of the [ACS Data Handbook](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 168 | "\n", 169 | "### Association Files\n", 170 | "\n", 171 | "| Ext | Name | Type | Contains |\n", 172 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 173 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 174 | "|1| ASN (Association)| (BinTableHDU)| Table of files associated with this group. |" 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": {}, 180 | "source": [ 181 | "### Raw Files (WFC-Specific)\n", 182 | "\n", 183 | "| Ext | Name | Type | Contains |\n", 184 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 185 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 186 | "|1| SCI (Image) | (ImageHDU) | WFC2 raw image data. |\n", 187 | "|2| ERR (Error) | (ImageHDU) | WFC2 error array. |\n", 188 | "|3| DQ (Data Quality)| (ImageHDU) | WFC2 data quality array. |\n", 189 | "|4| SCI (Image) | (ImageHDU) | WFC1 raw image data. |\n", 190 | "|5| ERR (Error) | (ImageHDU) | WFC1 error array. |\n", 191 | "|6| DQ (Data Quality)| (ImageHDU) | WFC1 data quality array. 
|" 192 | ] 193 | }, 194 | { 195 | "cell_type": "markdown", 196 | "metadata": {}, 197 | "source": [ 198 | "You can always use `.info()` on an HDUlist for an overview of the structure" 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": null, 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "with fits.open(asn_file) as hdulist:\n", 208 | " hdulist.info()\n", 209 | " \n", 210 | "with fits.open(raw_files[0]) as hdulist:\n", 211 | " hdulist.info()" 212 | ] 213 | }, 214 | { 215 | "cell_type": "markdown", 216 | "metadata": {}, 217 | "source": [ 218 | "## Calibrate Raw Files \n", 219 | "\n", 220 | "***\n", 221 | "\n", 222 | "Now that we have the `*_raw.fits` files, we can process them with the ACS calibration pipeline `calacs`. \n", 223 | "\n", 224 | "#### Updating Headers for CRDS\n", 225 | "\n", 226 | "By default, the association file will trigger the creation of a drizzled product. In order to avoid this, we will filter the association file to only include table entries with `MEMTYPE` equal to 'EXP-DTH'. This will remove the 'PROD-DTH' entry that prompts AstroDrizzle." 
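The `MEMTYPE` filtering described above can be sketched on a stand-in table. The member names below are hypothetical; in the notebook the real table comes from the ASN extension of the association file.

```python
import numpy as np

# Stand-in for asn_hdu[1].data: hypothetical member names, real MEMTYPE values.
asn_tab = np.array([('J9L960A7Q', 'EXP-DTH'),
                    ('J9L960A9Q', 'EXP-DTH'),
                    ('J9L960011', 'PROD-DTH')],
                   dtype=[('MEMNAME', 'U9'), ('MEMTYPE', 'U8')])

# Boolean masking keeps the dithered-exposure members and drops the
# 'PROD-DTH' product row that would trigger AstroDrizzle.
exp_only = asn_tab[asn_tab['MEMTYPE'] == 'EXP-DTH']
print(exp_only['MEMNAME'])  # ['J9L960A7Q' 'J9L960A9Q']
```

The same boolean-mask indexing works on the `FITS_rec` table returned by `astropy.io.fits`, which is what the next cell relies on.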
227 | ] 228 | }, 229 | { 230 | "cell_type": "code", 231 | "execution_count": null, 232 | "metadata": {}, 233 | "outputs": [], 234 | "source": [ 235 | "with fits.open(asn_file, mode='update') as asn_hdu:\n", 236 | " asn_tab = asn_hdu[1].data\n", 237 | " asn_tab = asn_tab[asn_tab['MEMTYPE'] == 'EXP-DTH']" 238 | ] 239 | }, 240 | { 241 | "cell_type": "markdown", 242 | "metadata": {}, 243 | "source": [ 244 | "Due to the computationally intense processing required to CTE correct full-frame ACS/WFC images, we have disabled the CTE correction here by default, however it can be turned on by changing the following variable to True:" 245 | ] 246 | }, 247 | { 248 | "cell_type": "code", 249 | "execution_count": null, 250 | "metadata": {}, 251 | "outputs": [], 252 | "source": [ 253 | "cte_correct = False" 254 | ] 255 | }, 256 | { 257 | "cell_type": "markdown", 258 | "metadata": {}, 259 | "source": [ 260 | "Calibration steps can be enabled or disabled by setting the switch keywords in the primary header to 'PERFORM' or 'OMIT', respectively. Switch keywords all end with the string `CORR` (e.g., `BLEVCORR` and `DARKCORR`). In this case, we want to update `PCTECORR`." 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": null, 266 | "metadata": {}, 267 | "outputs": [], 268 | "source": [ 269 | "for file in raw_files:\n", 270 | " \n", 271 | " if cte_correct: \n", 272 | " value = 'PERFORM'\n", 273 | " else: \n", 274 | " value = 'OMIT'\n", 275 | " \n", 276 | " fits.setval(file, 'PCTECORR', value=value)" 277 | ] 278 | }, 279 | { 280 | "cell_type": "markdown", 281 | "metadata": {}, 282 | "source": [ 283 | "#### Querying CRDS for Reference Files\n", 284 | "\n", 285 | "Before running `calacs`, we need to set some environment variables for several subsequent calibration tasks.\n", 286 | "\n", 287 | "We will point to a subdirectory called `crds_cache/` using the JREF environment variable. The `JREF` variable is used for ACS reference files. 
Other instruments use other variables, e.g., `IREF` for WFC3."
288 |    ]
289 |   },
290 |   {
291 |    "cell_type": "code",
292 |    "execution_count": null,
293 |    "metadata": {},
294 |    "outputs": [],
295 |    "source": [
296 |     "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n",
297 |     "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n",
298 |     "os.environ['CRDS_PATH'] = './crds_cache'\n",
299 |     "os.environ['jref'] = './crds_cache/references/hst/acs/'"
300 |    ]
301 |   },
302 |   {
303 |    "cell_type": "markdown",
304 |    "metadata": {},
305 |    "source": [
306 |     "The code block below will query CRDS for the best reference files currently available for these datasets and update the header keywords to point to these new files. We will use the Python package `os` to run terminal commands. In the terminal, the line would be:\n",
307 |     "\n",
308 |     "    crds bestrefs --files [filename] --sync-references=1 --update-bestrefs\n",
309 |     " \n",
310 |     "...where 'filename' is the name of your fits file."
311 |    ]
312 |   },
313 |   {
314 |    "cell_type": "code",
315 |    "execution_count": null,
316 |    "metadata": {},
317 |    "outputs": [],
318 |    "source": [
319 |     "for file in raw_files:\n",
320 |     "    command_line_input = 'crds bestrefs --files {:} --sync-references=1 --update-bestrefs'.format(file)\n",
321 |     "    os.system(command_line_input)"
322 |    ]
323 |   },
324 |   {
325 |    "cell_type": "markdown",
326 |    "metadata": {},
327 |    "source": [
328 |     "#### Running calacs\n",
329 |     "\n",
330 |     "Finally, we can run `calacs` on the association file. It will produce `*_flt.fits`. The FLT files have had the default CCD calibration steps (bias subtraction, dark subtraction, flat field normalization) performed.\n",
331 |     "\n",
332 |     "<div class=\"alert alert-warning\">
If the CTE correction is enabled...\n", 333 | "\n", 334 | " * ...this next step will take a long time to complete. The CTE correction is computationally expensive and will use all of the cores on a machine by default. On an 8 core machine, CTE correcting a full-frame ACS/WFC image can take approximately 15 minutes per RAW file. \n", 335 | " \n", 336 | "\n", 337 | " * ...`*_flc.fits` will also be produced. The FLC files are CTE-corrected but otherwise identical to the FLT files.\n", 338 | "
" 339 | ] 340 | }, 341 | { 342 | "cell_type": "code", 343 | "execution_count": null, 344 | "metadata": {}, 345 | "outputs": [], 346 | "source": [ 347 | "os.system('calacs.e j9l960010_asn.fits');" 348 | ] 349 | }, 350 | { 351 | "cell_type": "markdown", 352 | "metadata": {}, 353 | "source": [ 354 | "Selecting an image to plot, depending on whether or not you enabled CTE correction earlier." 355 | ] 356 | }, 357 | { 358 | "cell_type": "code", 359 | "execution_count": null, 360 | "metadata": {}, 361 | "outputs": [], 362 | "source": [ 363 | "if cte_correct:\n", 364 | " fl_fits = 'j9l960a7q_flc.fits'\n", 365 | "else:\n", 366 | " fl_fits = 'j9l960a7q_flt.fits'" 367 | ] 368 | }, 369 | { 370 | "cell_type": "markdown", 371 | "metadata": {}, 372 | "source": [ 373 | "#### Plotting results\n", 374 | "\n", 375 | "As a check of our calibrated products, we will plot a subsection of one of the input images." 376 | ] 377 | }, 378 | { 379 | "cell_type": "code", 380 | "execution_count": null, 381 | "metadata": {}, 382 | "outputs": [], 383 | "source": [ 384 | "from p_module import plot" 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": null, 390 | "metadata": {}, 391 | "outputs": [], 392 | "source": [ 393 | "raw_image = fits.getdata('j9l960a7q_raw.fits')\n", 394 | "cal_image = fits.getdata(fl_fits)\n", 395 | "\n", 396 | "plot.calib_compare_plot(raw_image, cal_image)" 397 | ] 398 | }, 399 | { 400 | "cell_type": "markdown", 401 | "metadata": {}, 402 | "source": [ 403 | "Comparing the FLT calibrated image to the RAW uncalibrated one, we can see that image artifacts have been removed. Most noticeably, hot columns in the bias have been subtracted." 
404 | ] 405 | }, 406 | { 407 | "cell_type": "code", 408 | "execution_count": null, 409 | "metadata": {}, 410 | "outputs": [], 411 | "source": [ 412 | "if cte_correct:\n", 413 | " img_files = 'j9l9*a[9-f]q_flc.fits'\n", 414 | "else:\n", 415 | " img_files = 'j9l9*a[9-f]q_flt.fits'\n", 416 | "\n", 417 | "updatewcs.updatewcs(img_files, use_db=False)" 418 | ] 419 | }, 420 | { 421 | "cell_type": "markdown", 422 | "metadata": {}, 423 | "source": [ 424 | "## Conclusion\n", 425 | "\n", 426 | "***\n", 427 | "\n", 428 | "The FLT and FLC images are not yet suitable for photometry. Before performing any analysis on the images, we still need to remove detector artifacts, cosmic rays, and geometric distortion. [AstroDrizzle](http://www.stsci.edu/scientific-community/software/drizzlepac.html) can do all of these steps and produce a single mosaic image that incorporates all of the individual exposures.\n", 429 | "\n", 430 | "Users who do not use `astrodrizzle` to correct data for distortion will need to apply a pixel area map to their data to correct for the distorted pixel area projected onto the sky before performing photometry. For those who would like to learn how to create a pixel area map, a Jupyter Notebook on the subject can be found [here](https://github.com/spacetelescope/acs-notebook/blob/master/notebooks/pixel_area_maps.ipynb)." 431 | ] 432 | }, 433 | { 434 | "cell_type": "markdown", 435 | "metadata": {}, 436 | "source": [ 437 | "### For more help:\n", 438 | "\n", 439 | "More details may be found on the [ACS website](http://www.stsci.edu/hst/instrumentation/acs) and in the [ACS Instrument](https://hst-docs.stsci.edu/display/ACSIHB) and [Data Handbooks](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 440 | "\n", 441 | "Please visit the [HST Help Desk](http://hsthelp.stsci.edu). 
Through the help desk portal, you can explore the *HST* Knowledge Base and request additional help from experts." 442 | ] 443 | } 444 | ], 445 | "metadata": { 446 | "kernelspec": { 447 | "display_name": "Python 3 (ipykernel)", 448 | "language": "python", 449 | "name": "python3" 450 | }, 451 | "language_info": { 452 | "codemirror_mode": { 453 | "name": "ipython", 454 | "version": 3 455 | }, 456 | "file_extension": ".py", 457 | "mimetype": "text/x-python", 458 | "name": "python", 459 | "nbconvert_exporter": "python", 460 | "pygments_lexer": "ipython3", 461 | "version": "3.9.13" 462 | } 463 | }, 464 | "nbformat": 4, 465 | "nbformat_minor": 2 466 | } 467 | -------------------------------------------------------------------------------- /acs_reduction/p_module/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/spacetelescope/acs-notebook/49962e53efa23575340acc154446294354848c27/acs_reduction/p_module/__init__.py -------------------------------------------------------------------------------- /acs_reduction/p_module/plot.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | from astropy.io import fits 3 | from astropy.visualization import (ZScaleInterval, LinearStretch, 4 | ImageNormalize) 5 | 6 | def ds9_imitate(ax, image): 7 | norm = ImageNormalize(image, 8 | interval=ZScaleInterval(), 9 | stretch=LinearStretch()) 10 | 11 | ax.imshow(image, cmap='bone', norm=norm) 12 | return 13 | 14 | def triple_pam_plot(flt_file, pam_file, figtitle): 15 | fl_img = fits.getdata(flt_file, ext=1) 16 | pam_img = fits.getdata(pam_file) 17 | 18 | fig = plt.figure(figsize=(20,4)) 19 | fig.suptitle(figtitle,fontsize=20) 20 | 21 | ax = fig.add_subplot(1, 3, 1) 22 | ds9_imitate(ax, fl_img) 23 | ax.set_title('Raw') 24 | 25 | ax2 = fig.add_subplot(1, 3, 2, yticks=[]) 26 | ds9_imitate(ax2, pam_img) 27 | ax2.set_title('Pixel Area Map') 28 | 29 | 
pamd_img = fl_img*pam_img 30 | 31 | ax3 = fig.add_subplot(1, 3, 3, yticks=[]) 32 | ds9_imitate(ax3, pamd_img) 33 | ax3.set_title('Raw x Pixel Area Map') 34 | 35 | plt.subplots_adjust(wspace=0.05) 36 | return 37 | 38 | def calib_compare_plot(raw_image, cal_image): 39 | fig = plt.figure(figsize=(14,14)) 40 | 41 | ax = fig.add_subplot(2,1,1) 42 | ds9_imitate(ax, raw_image) 43 | ax.set_title('Raw') 44 | 45 | ax2 = fig.add_subplot(2,1,2) 46 | ds9_imitate(ax2, cal_image) 47 | ax2.set_title('Flat-Fielded') 48 | 49 | return 50 | -------------------------------------------------------------------------------- /acs_reduction/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS Reduction notebook 2 | 3 | astroquery 4 | astropy 5 | stwcs 6 | -------------------------------------------------------------------------------- /acs_saturation_trails/acs_saturation_trails.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "# ACS Linearity with Saturated Stars\n", 22 | "\n", 23 | "## Introduction\n", 24 | "\n", 25 | "***\n", 26 | "\n", 27 | "The ACS/WFC CCD becomes saturated around 80000 counts. When this occurs, excess charge from the source spills out lengthwise along the columns of the CCD. This can lead to issues with photometry when using very bright stars, since a significant portion of the star's flux may fall outside of a reasonable extraction radius.\n", 28 | "\n", 29 | "However, accurate relative photometry can be obtained as long as a large enough aperture is selected to contain the spilled flux ([ACS ISR 2004-01](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/instrument-science-reports-isrs/_documents/isr0401.pdf)). While one could simply use a larger circular aperture, that may introduce error when working with a crowded field (where bright stars are often located).\n", 30 | "\n", 31 | "Here we present a method to identify and perform photometry on saturated sources by defining a custom aperture that is a combination of a standard 0.5\" arcsecond circular aperture and the pixels affected by saturation trails. This method has been tested on ACS/WFC observations of 47 Tuc in the F660W band. The plot below shows the results of using this alternative method to recover flux.\n", 32 | "\n", 33 | "![title](photometry_plot.png)\n", 34 | "\n", 35 | "### This tutorial will show you how to...\n", 36 | "\n", 37 | "#### 1. [Prepare Images](#_prep) \n", 38 | "\n", 39 | "* Apply Pixel Area Map\n", 40 | "* Separate by long and short exposure\n", 41 | "* Make sure you have images of the same field\n", 42 | "\n", 43 | "#### 2. 
[Identify Saturated Stars](#_identify)\n",
44 |     "\n",
45 |     "* Identify the saturated pixels using the data quality (DQ) array\n",
46 |     "* Determine whether or not the saturation trails extend significantly away from the target\n",
47 |     "\n",
48 |     "#### 3. [Bleed the Saturation Mask](#_bleed)\n",
49 |     "\n",
50 |     "* Construct a convolution kernel\n",
51 |     "* Bleed the saturation mask with the convolution kernel\n",
52 |     "\n",
53 |     "#### 4. [Define a Custom Aperture](#_define)\n",
54 |     "\n",
55 |     "* Isolate central clump from your saturation mask\n",
56 |     "* Obtain circular aperture as a boolean mask\n",
57 |     "* Combine circular aperture with saturation mask\n",
58 |     "\n",
59 |     "#### 5. [Photometry with a Custom Aperture](#_phot)\n",
60 |     "\n",
61 |     "* Extract counts with the custom aperture\n",
62 |     "* Estimate background to be subtracted\n",
63 |     "\n",
64 |     "#### 6. [Additional Results](#_results)\n",
65 |     "\n",
66 |     "* A worked example with several stars"
67 |    ]
68 |   },
69 |   {
70 |    "cell_type": "markdown",
71 |    "metadata": {},
72 |    "source": [
73 |     "## Imports\n",
74 |     "\n",
75 |     "***\n",
76 |     "\n",
77 |     "Here we list the Python packages used in this notebook. 
Links to the documentation for each module are provided for convenience.\n",
78 |     "\n",
79 |     "| Package Name | module | docs | used for |\n",
80 |     "|------------------|:-----------------|:-------------:|:------------|\n",
81 |     "| `os` | `system` | link|command line input|\n",
82 |     "|`shutil` | `rmtree` | link| remove directory tree |\n",
83 |     "|`numpy` | `_s` | link| construct array slice object |\n",
84 |     "|`matplotlib` |`pyplot` | link| plotting |\n",
85 |     "|`astroquery.mast` |`Observations` | link| download data from MAST |\n",
86 |     "|`astropy.io` | `fits` | link| access and update fits files |\n",
87 |     "|`astropy.table` | `Table` | link| constructing and editing in a tabular format |\n",
88 |     "|`astropy.stats` |`sigma_clip`| link| sigma clipping image for background estimation |\n",
89 |     "|`scipy.signal` |`convolve2d`| link| convolve saturation mask with kernel |\n",
90 |     "|`stsci.skypac` |`pamutils`| link|obtain pixel area maps (PAM) |\n",
91 |     "|`photutils` |`CircularAperture`| link| aperture object for photometry |\n",
92 |     "|`matplotlib.patches`|`Circle`| link| draw circle on a plot |"
93 |    ]
94 |   },
95 |   {
96 |    "cell_type": "code",
97 |    "execution_count": null,
98 |    "metadata": {},
99 |    "outputs": [],
100 |    "source": [
101 |     "import os\n",
102 |     "import shutil\n",
103 |     "\n",
104 |     "import numpy as np\n",
105 |     "import matplotlib.pyplot as plt\n",
106 |     "\n",
107 |     "from astroquery.mast import Observations\n",
108 |     "\n",
109 |     "from astropy.io import fits\n",
110 |     "from astropy.table import Table\n",
111 |     "from astropy.stats import sigma_clip\n",
112 |     "\n",
113 |     "from scipy.signal import convolve2d\n",
114 |     "from stsci.skypac import pamutils\n",
115 |     "\n",
116 |     "from photutils import CircularAperture\n",
117 |     "from matplotlib.patches import Circle\n",
118 |     "from p_module import plot"
119 |    ]
120 |   },
121 |   {
122 |    "cell_type": "markdown",
123 |    "metadata": {},
124 |    "source": [
125 |     "Here we set environment variables for later use with the 
Calibration Reference Data System (CRDS)." 126 | ] 127 | }, 128 | { 129 | "cell_type": "code", 130 | "execution_count": null, 131 | "metadata": {}, 132 | "outputs": [], 133 | "source": [ 134 | "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", 135 | "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n", 136 | "os.environ['CRDS_PATH'] = './crds_cache'\n", 137 | "os.environ['jref'] = './crds_cache/references/hst/acs/'" 138 | ] 139 | }, 140 | { 141 | "cell_type": "markdown", 142 | "metadata": {}, 143 | "source": [ 144 | "## Download the Data\n", 145 | "\n", 146 | "***\n", 147 | "\n", 148 | "Here we download all of the data required for this notebook. This is an important step! Some of the image processing steps require all relevant files to be in the working directory. We recommend working with a brand new directory for every new set of data.\n", 149 | "\n", 150 | "#### [GO Proposal 14949](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=14949): \"ACS External CTE Monitor\"\n", 151 | "\n", 152 | "Using the python package `astroquery`, we can download files from the [MAST](http://archive.stsci.edu) archive.\n", 153 | "\n", 154 | "
\n", 155 | "MAY CHANGE: The argument \"mrp_only\" stands for \"minimum recommended products only\". It currently needs to be set to False, although in the future, False is intended to be set as the default and can be left out.\n", 156 | "
" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": null, 162 | "metadata": {}, 163 | "outputs": [], 164 | "source": [ 165 | "obs_table = Observations.query_criteria(proposal_id=14949, filters='F606W')\n", 166 | "\n", 167 | "dl_table = Observations.download_products(obs_table['obsid'],\n", 168 | " productSubGroupDescription=['FLC'],\n", 169 | " mrp_only=False)" 170 | ] 171 | }, 172 | { 173 | "cell_type": "markdown", 174 | "metadata": {}, 175 | "source": [ 176 | "We'll use the package `os` to put all of these files in our working directory for convenience." 177 | ] 178 | }, 179 | { 180 | "cell_type": "code", 181 | "execution_count": null, 182 | "metadata": {}, 183 | "outputs": [], 184 | "source": [ 185 | "for row in dl_table:\n", 186 | " oldfname = row['Local Path']\n", 187 | " newfname = os.path.basename(oldfname)\n", 188 | " os.rename(oldfname, newfname)" 189 | ] 190 | }, 191 | { 192 | "cell_type": "markdown", 193 | "metadata": {}, 194 | "source": [ 195 | "Now that all of our files are in the current working directory, we delete the leftover MAST file structure." 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": {}, 202 | "outputs": [], 203 | "source": [ 204 | "shutil.rmtree('mastDownload')" 205 | ] 206 | }, 207 | { 208 | "cell_type": "markdown", 209 | "metadata": {}, 210 | "source": [ 211 | "### File Information \n", 212 | "The structure of the fits files from ACS may be different depending on what kind of observation was made. 
\n",
213 |     "For more information, refer to Section 2.2 of the [ACS Data Handbook](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n",
214 |     "\n",
215 |     "#### Raw Files\n",
216 |     "\n",
217 |     "A standard raw image file from a subarray has the same structure as you'd expect from a full-frame observation from ACS/WFC.\n",
218 |     "\n",
219 |     "| Ext | Name | Type | Contains |\n",
220 |     "|--------|------------------|--------------|:-------------------------------------------------------|\n",
221 |     "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n",
222 |     "|1| SCI (Image) | (ImageHDU) | Raw image data. |\n",
223 |     "|2| ERR (Error) | (ImageHDU) | Error array. |\n",
224 |     "|3| DQ (Data Quality)| (ImageHDU) | Data quality array. |\n",
225 |     "\n",
226 |     "#### SPT Files\n",
227 |     "\n",
228 |     "SPT files contain telemetry and engineering data from the telescope.\n",
229 |     "\n",
230 |     "| Ext | Name | Type | Contains |\n",
231 |     "|--------|------------------|--------------|:-------------------------------------------------------|\n",
232 |     "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n",
233 |     "|1| UDL (Image) | (ImageHDU) | Raw image data. |"
234 |    ]
235 |   },
236 |   {
237 |    "cell_type": "markdown",
238 |    "metadata": {},
239 |    "source": [
240 |     "You can always use `.info()` on an HDUlist for an overview of the structure."
241 |    ]
242 |   },
243 |   {
244 |    "cell_type": "code",
245 |    "execution_count": null,
246 |    "metadata": {},
247 |    "outputs": [],
248 |    "source": [
249 |     "with fits.open('jdg302ctq_flc.fits') as hdulist:\n",
250 |     "    hdulist.info()"
251 |    ]
252 |   },
253 |   {
254 |    "cell_type": "markdown",
255 |    "metadata": {},
256 |    "source": [
257 |     "## 1. Prepare Images \n",
258 |     "***\n",
259 |     "\n",
260 |     "For this notebook, we will need two well-aligned images of the same field on the sky. One image should have a short exposure time (e.g., 
40 seconds) and the other should have a long exposure time (e.g., 400 seconds). Here we assume you already know which images those are, and set those observation files to appropriate variable names."
261 |    ]
262 |   },
263 |   {
264 |    "cell_type": "code",
265 |    "execution_count": null,
266 |    "metadata": {},
267 |    "outputs": [],
268 |    "source": [
269 |     "fname_short = 'jdg302ctq_flc.fits'\n",
270 |     "fname_long = 'jdg301c4q_flc.fits'"
271 |    ]
272 |   },
273 |   {
274 |    "cell_type": "markdown",
275 |    "metadata": {},
276 |    "source": [
277 |     "Before we use our images for photometry, we will need to apply a pixel area map (PAM) correction. This step corrects the difference in flux across the CCD due to distortion. A dedicated notebook on PAM corrections can be found in the ACS notebook collection.\n",
278 |     "\n",
279 |     "First, we will work with the short exposure image."
280 |    ]
281 |   },
282 |   {
283 |    "cell_type": "code",
284 |    "execution_count": null,
285 |    "metadata": {},
286 |    "outputs": [],
287 |    "source": [
288 |     "fitsfile = fname_short"
289 |    ]
290 |   },
291 |   {
292 |    "cell_type": "markdown",
293 |    "metadata": {},
294 |    "source": [
295 |     "Now we can extract the image from the fits file using the python package `fits`. Here, I use the name \"raw_short\" to indicate that this image has not had the PAM correction applied, and is the short exposure image."
296 |    ]
297 |   },
298 |   {
299 |    "cell_type": "code",
300 |    "execution_count": null,
301 |    "metadata": {},
302 |    "outputs": [],
303 |    "source": [
304 |     "raw_short = fits.getdata(fitsfile)"
305 |    ]
306 |   },
307 |   {
308 |    "cell_type": "markdown",
309 |    "metadata": {},
310 |    "source": [
311 |     "Now we need to obtain the PAM for this image using the python package `pamutils`. To construct the new filename for the PAM, we will use the python package `os` to grab the basename of our fits file, and append '_pam.fits' at the end."
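The PAM correction itself is just a pixel-by-pixel multiplication (applied later in the notebook as `img_short = raw_short * pam_short`). A toy sketch with made-up numbers, not real ACS values:

```python
import numpy as np

# Toy 2x2 "image" and pixel area map (values invented for illustration):
# each pixel is rescaled by its relative on-sky area, so that summed counts
# over an aperture are in consistent flux units despite distortion.
raw = np.array([[100.0, 100.0],
                [100.0, 100.0]])
pam = np.array([[0.98, 1.00],
                [1.01, 1.03]])

corrected = raw * pam
print(corrected[1, 1])  # 103.0
```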
312 | ] 313 | }, 314 | { 315 | "cell_type": "code", 316 | "execution_count": null, 317 | "metadata": {}, 318 | "outputs": [], 319 | "source": [ 320 | "pname = os.path.basename(fitsfile).split('.')[0] + '_pam.fits'\n", 321 | "print(pname)" 322 | ] 323 | }, 324 | { 325 | "cell_type": "markdown", 326 | "metadata": {}, 327 | "source": [ 328 | "Now we can run `pam_from_file` on our fits file to create our PAM." 329 | ] 330 | }, 331 | { 332 | "cell_type": "code", 333 | "execution_count": null, 334 | "metadata": {}, 335 | "outputs": [], 336 | "source": [ 337 | "pamutils.pam_from_file(fitsfile, ext=1, output_pam=pname)" 338 | ] 339 | }, 340 | { 341 | "cell_type": "markdown", 342 | "metadata": {}, 343 | "source": [ 344 | "Once our PAM has been written to file, we can extract it with `fits` for later use." 345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": null, 350 | "metadata": {}, 351 | "outputs": [], 352 | "source": [ 353 | "pam_short = fits.getdata(pname)" 354 | ] 355 | }, 356 | { 357 | "cell_type": "markdown", 358 | "metadata": {}, 359 | "source": [ 360 | "Finally, we can apply the PAM corrections to our \"raw\" image." 361 | ] 362 | }, 363 | { 364 | "cell_type": "code", 365 | "execution_count": null, 366 | "metadata": {}, 367 | "outputs": [], 368 | "source": [ 369 | "img_short = raw_short * pam_short" 370 | ] 371 | }, 372 | { 373 | "cell_type": "markdown", 374 | "metadata": {}, 375 | "source": [ 376 | "There is one more array we'll need to extract from our fits file. The data quality (DQ) array labels saturated pixels with the flag number 256. As seen from our [file information](#_fileinfo), the DQ array can be found in extension 3 of the HDU list." 
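A note on the 256 flag: DQ values are bit flags, so a saturated pixel that also carries another flag stores the sum (e.g., 256 + 64 = 320). The equality test used in the next cell selects pixels flagged solely as saturated; a bitwise test is the more general check. A small sketch on toy DQ values:

```python
import numpy as np

# Toy DQ array: 0 = clean, 64 = some other flag,
# 256 = saturated only, 320 = saturated + flag 64.
dq = np.array([[0, 256],
               [320, 64]])

eq_mask = dq == 256        # misses the 320 pixel
bit_mask = (dq & 256) > 0  # catches any pixel whose DQ includes bit 256

print(eq_mask.sum(), bit_mask.sum())  # 1 2
```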
377 | ] 378 | }, 379 | { 380 | "cell_type": "code", 381 | "execution_count": null, 382 | "metadata": {}, 383 | "outputs": [], 384 | "source": [ 385 | "dq_short = fits.getdata(fitsfile, ext=3)==256" 386 | ] 387 | }, 388 | { 389 | "cell_type": "markdown", 390 | "metadata": {}, 391 | "source": [ 392 | "Here I repeat all of the previous steps with the long exposure image, changing variable names where necessary." 393 | ] 394 | }, 395 | { 396 | "cell_type": "code", 397 | "execution_count": null, 398 | "metadata": {}, 399 | "outputs": [], 400 | "source": [ 401 | "fitsfile = fname_long\n", 402 | "\n", 403 | "dq_long = fits.getdata(fitsfile, ext=3)==256\n", 404 | "raw_long = fits.getdata(fitsfile)\n", 405 | "\n", 406 | "pname = os.path.basename(fitsfile).split('.')[0] + '_pam.fits'\n", 407 | "pamutils.pam_from_file(fitsfile, ext=1, output_pam=pname)\n", 408 | "\n", 409 | "pam_long = fits.getdata(pname)\n", 410 | "\n", 411 | "img_long = raw_long * pam_long" 412 | ] 413 | }, 414 | { 415 | "cell_type": "markdown", 416 | "metadata": {}, 417 | "source": [ 418 | "## 2. Identify Saturated Stars \n", 419 | "***\n", 420 | "\n", 421 | "Before we begin our modified aperture photometry routine, we should determine whether or not our sources are saturated. We can identify saturated stars by whether or not their saturation trails extend past a typical extraction radius.\n", 422 | "\n", 423 | "Here we have the local coordinates of a bright star in our field." 424 | ] 425 | }, 426 | { 427 | "cell_type": "code", 428 | "execution_count": null, 429 | "metadata": {}, 430 | "outputs": [], 431 | "source": [ 432 | "local_coord = {'x':1711, 'y':225}" 433 | ] 434 | }, 435 | { 436 | "cell_type": "markdown", 437 | "metadata": {}, 438 | "source": [ 439 | "We will make cutouts around our source with a radius of 100 pixels. This size cutout is typically big enough to contain saturation trails from the brightest stars. We will also assume that our extraction aperture has a radius of 0.5 arcseconds. 
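A caveat on the `== 256` comparison used here: DQ values are bit-encoded, so a saturated pixel that also carries another flag (say, 256 + 64) would be missed by a strict equality test. A minimal `numpy` sketch with hypothetical DQ values shows the difference between equality and a bitwise check:

```python
import numpy as np

# Hypothetical DQ values: 0 = clean, 64 = warm pixel,
# 256 = full-well saturated, 320 = saturated AND warm (256 + 64)
dq = np.array([0, 256, 320, 64])

eq_mask = dq == 256        # misses the pixel carrying an extra flag
bit_mask = (dq & 256) > 0  # catches every pixel with the saturation bit set

print(eq_mask.sum(), bit_mask.sum())
```

For the images in this example the strict test is adequate, but the bitwise form is the safer habit when other flags may be present.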
Knowing that the ACS pixel scale is ~20 pixels/arcsecond, we can calculate our aperture radius in pixels." 440 | ] 441 | }, 442 | { 443 | "cell_type": "code", 444 | "execution_count": null, 445 | "metadata": { 446 | "scrolled": false 447 | }, 448 | "outputs": [], 449 | "source": [ 450 | "pix_per_arcsec = 20\n", 451 | "cutout_radius = 100\n", 452 | "aperture_radius = 0.5 * pix_per_arcsec" 453 | ] 454 | }, 455 | { 456 | "cell_type": "markdown", 457 | "metadata": {}, 458 | "source": [ 459 | "We can use a `numpy` slice object (`np.s_`) to make cutouts around our source. It will be convenient to define a function that constructs this \"cutter\" for us." 460 | ] 461 | }, 462 | { 463 | "cell_type": "code", 464 | "execution_count": null, 465 | "metadata": {}, 466 | "outputs": [], 467 | "source": [ 468 | "def make_cutter(x, y, cutout_radius=100):\n", 469 | "    \n", 470 | "    # Makes a 2D array slice object centered around x, y\n", 471 | "    \n", 472 | "    starty, endy = (y - cutout_radius), (y + cutout_radius)\n", 473 | "    startx, endx = (x - cutout_radius), (x + cutout_radius)\n", 474 | "    \n", 475 | "    return np.s_[starty:endy, startx:endx]" 476 | ] 477 | }, 478 | { 479 | "cell_type": "markdown", 480 | "metadata": {}, 481 | "source": [ 482 | "Now we can take a cutout of our image around the source." 483 | ] 484 | }, 485 | { 486 | "cell_type": "code", 487 | "execution_count": null, 488 | "metadata": {}, 489 | "outputs": [], 490 | "source": [ 491 | "cutter = make_cutter(local_coord['x'], local_coord['y'])" 492 | ] 493 | }, 494 | { 495 | "cell_type": "markdown", 496 | "metadata": {}, 497 | "source": [ 498 | "Before we try out our cutter, let's take a look at our full frame image.
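As a quick illustration of what `make_cutter` returns, here is a minimal sketch (with a small stand-in array and a shrunken radius) showing that an `np.s_` slice object can be stored and reused to index any aligned 2-D array:

```python
import numpy as np

# Small stand-in for an image; radius shrunk from 100 to 2 for display
img = np.arange(100).reshape(10, 10)

def make_cutter(x, y, cutout_radius=2):
    # Rows (y) come first when indexing a 2-D numpy array
    return np.s_[y - cutout_radius:y + cutout_radius,
                 x - cutout_radius:x + cutout_radius]

cutter = make_cutter(x=5, y=5)
cutout = img[cutter]

print(cutout.shape)  # a (2 * radius) x (2 * radius) window around (5, 5)
```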
499 | ] 500 | }, 501 | { 502 | "cell_type": "code", 503 | "execution_count": null, 504 | "metadata": {}, 505 | "outputs": [], 506 | "source": [ 507 | "plot.ds9_imitate(plt, img_short)" 508 | ] 509 | }, 510 | { 511 | "cell_type": "markdown", 512 | "metadata": {}, 513 | "source": [ 514 | "Now by indexing our image with our cutter, we can grab just the cutout we need!" 515 | ] 516 | }, 517 | { 518 | "cell_type": "code", 519 | "execution_count": null, 520 | "metadata": {}, 521 | "outputs": [], 522 | "source": [ 523 | "plot.ds9_imitate(plt, img_short[cutter])" 524 | ] 525 | }, 526 | { 527 | "cell_type": "markdown", 528 | "metadata": {}, 529 | "source": [ 530 | "We can visually confirm that this source is affected by saturation trails in the short exposure. What about the long exposure image? Since our images are aligned, we can use the same coordinates (and the same cutter!) as before." 531 | ] 532 | }, 533 | { 534 | "cell_type": "code", 535 | "execution_count": null, 536 | "metadata": {}, 537 | "outputs": [], 538 | "source": [ 539 | "plot.ds9_imitate(plt, img_long[cutter])" 540 | ] 541 | }, 542 | { 543 | "cell_type": "markdown", 544 | "metadata": {}, 545 | "source": [ 546 | "We can also apply the same cutter to our DQ saturated pixel array!" 547 | ] 548 | }, 549 | { 550 | "cell_type": "code", 551 | "execution_count": null, 552 | "metadata": {}, 553 | "outputs": [], 554 | "source": [ 555 | "plt.imshow(dq_short[cutter], cmap='bone')" 556 | ] 557 | }, 558 | { 559 | "cell_type": "markdown", 560 | "metadata": {}, 561 | "source": [ 562 | "As we expect, we do not see very much saturation in our short exposure image. What about our long exposure image?" 
563 | ] 564 | }, 565 | { 566 | "cell_type": "code", 567 | "execution_count": null, 568 | "metadata": {}, 569 | "outputs": [], 570 | "source": [ 571 | "plt.imshow(dq_long[cutter], cmap='bone')" 572 | ] 573 | }, 574 | { 575 | "cell_type": "markdown", 576 | "metadata": {}, 577 | "source": [ 578 | "Now we see a large clump of saturated pixels spilling along the y-axis!" 579 | ] 580 | }, 581 | { 582 | "cell_type": "markdown", 583 | "metadata": {}, 584 | "source": [ 585 | "For both of these images, we want to see whether or not the saturated pixels fall outside the range of our typical 0.5\" extraction radius." 586 | ] 587 | }, 588 | { 589 | "cell_type": "code", 590 | "execution_count": null, 591 | "metadata": {}, 592 | "outputs": [], 593 | "source": [ 594 | "fig = plt.figure(figsize=[5,5])\n", 595 | "ax = fig.add_subplot(111)\n", 596 | "\n", 597 | "circ_patch = Circle((cutout_radius, cutout_radius),\n", 598 | " radius=aperture_radius,\n", 599 | " color='C1',\n", 600 | " linewidth=2,\n", 601 | " fill=False)\n", 602 | "\n", 603 | "ax.imshow(dq_short[cutter], cmap='bone')\n", 604 | "ax.add_patch(circ_patch)" 605 | ] 606 | }, 607 | { 608 | "cell_type": "code", 609 | "execution_count": null, 610 | "metadata": {}, 611 | "outputs": [], 612 | "source": [ 613 | "fig = plt.figure(figsize=[5,5])\n", 614 | "ax = fig.add_subplot(111)\n", 615 | "\n", 616 | "circ_patch = Circle((cutout_radius, cutout_radius),\n", 617 | " radius=aperture_radius,\n", 618 | " color='C1',\n", 619 | " linewidth=2,\n", 620 | " fill=False)\n", 621 | "ax.imshow(dq_long[cutter], cmap='bone', origin='lower')\n", 622 | "ax.add_patch(circ_patch)" 623 | ] 624 | }, 625 | { 626 | "cell_type": "markdown", 627 | "metadata": {}, 628 | "source": [ 629 | "Since the saturated pixels extend past our extraction radius, we need to use a different method to improve photometry" 630 | ] 631 | }, 632 | { 633 | "cell_type": "markdown", 634 | "metadata": {}, 635 | "source": [ 636 | "## 3. 
Bleed the Saturation Mask \n", 637 | "\n", 638 | "First we need to define a kernel to bleed our saturation mask. We can do this by hand. Since pixels affected by saturation will spill charge along columns, all we need is to convolve our image with a column kernel." 639 | ] 640 | }, 641 | { 642 | "cell_type": "code", 643 | "execution_count": null, 644 | "metadata": {}, 645 | "outputs": [], 646 | "source": [ 647 | "bleed_kernel = np.array([[0,1,0],\n", 648 | " [0,1,0],\n", 649 | " [0,1,0],\n", 650 | " [0,1,0],\n", 651 | " [0,1,0]])\n", 652 | "\n", 653 | "plt.imshow(bleed_kernel, origin='lower');" 654 | ] 655 | }, 656 | { 657 | "cell_type": "markdown", 658 | "metadata": {}, 659 | "source": [ 660 | "We will use the `scipy` function `convolve2d()` to convolve our cutout with our kernel. Here, `mode='same'` ensures that the returned array is the same shape as the input array." 661 | ] 662 | }, 663 | { 664 | "cell_type": "code", 665 | "execution_count": null, 666 | "metadata": {}, 667 | "outputs": [], 668 | "source": [ 669 | "conv_sat = convolve2d(dq_long[cutter], bleed_kernel, mode='same')" 670 | ] 671 | }, 672 | { 673 | "cell_type": "markdown", 674 | "metadata": {}, 675 | "source": [ 676 | "After convolution, we need to convert to a boolean array." 677 | ] 678 | }, 679 | { 680 | "cell_type": "code", 681 | "execution_count": null, 682 | "metadata": {}, 683 | "outputs": [], 684 | "source": [ 685 | "sat_aperture = np.array([x > 0 for x in conv_sat]).astype(bool)" 686 | ] 687 | }, 688 | { 689 | "cell_type": "markdown", 690 | "metadata": {}, 691 | "source": [ 692 | "Finally, let's take a look at our mask to make sure it \"bled out\" properly." 
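To see why the convolution "bleeds" the mask along columns, here is a minimal sketch with a single hypothetical saturated pixel. The 5x3 kernel above has zero-filled side columns, so it acts like a 5x1 column kernel: each flagged pixel spreads two rows up and two rows down.

```python
import numpy as np
from scipy.signal import convolve2d

mask = np.zeros((7, 5), dtype=int)
mask[3, 2] = 1  # one hypothetical saturated pixel

bleed_kernel = np.ones((5, 1), dtype=int)  # equivalent to the 5x3 kernel above

# mode='same' keeps the output the same shape as the input mask
bled = convolve2d(mask, bleed_kernel, mode='same') > 0

print(bled[:, 2].astype(int))  # the flag now spans rows 1 through 5
print(bled[:, 1].astype(int))  # neighboring columns stay clear
```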
693 | ] 694 | }, 695 | { 696 | "cell_type": "code", 697 | "execution_count": null, 698 | "metadata": {}, 699 | "outputs": [], 700 | "source": [ 701 | "fig = plt.figure(figsize=[5,5])\n", 702 | "ax = fig.add_subplot(111)\n", 703 | "\n", 704 | "ax.imshow(sat_aperture, cmap='bone', origin='lower');" 705 | ] 706 | }, 707 | { 708 | "cell_type": "markdown", 709 | "metadata": {}, 710 | "source": [ 711 | "## 4. Define a Custom Aperture \n", 712 | "\n", 713 | "Now we want to create a new aperture which includes the pixels with the spilled charge. If we want to use the saturation mask we just created, we need to isolate only the clump associated with our star.\n", 714 | "\n", 715 | "Here, we give you a function which will return a mask with only the central clump." 716 | ] 717 | }, 718 | { 719 | "cell_type": "code", 720 | "execution_count": null, 721 | "metadata": {}, 722 | "outputs": [], 723 | "source": [ 724 | "# Isolate associated clump from saturation mask\n", 725 | "\n", 726 | "def find_central_clump(boolean_mask):\n", 727 | " \n", 728 | " from scipy import ndimage\n", 729 | "\n", 730 | " central_index = tuple((np.array(np.shape(boolean_mask))/2).astype(int))\n", 731 | "\n", 732 | " label, num_label = ndimage.label(boolean_mask == True)\n", 733 | " size = np.bincount(label.ravel())\n", 734 | " \n", 735 | " clump_labels = range(size[1:].shape[0])\n", 736 | " \n", 737 | " is_central_clump = False\n", 738 | " \n", 739 | " for cl in clump_labels:\n", 740 | " \n", 741 | " clump_mask = label == (cl + 1)\n", 742 | " idxs = [tuple(i) for i in np.argwhere(clump_mask)]\n", 743 | " is_central_clump = central_index in idxs\n", 744 | "\n", 745 | " if is_central_clump:\n", 746 | " \n", 747 | " return clump_mask\n", 748 | " \n", 749 | " else: continue\n", 750 | " \n", 751 | " if not is_central_clump:\n", 752 | " \n", 753 | " return 0" 754 | ] 755 | }, 756 | { 757 | "cell_type": "markdown", 758 | "metadata": {}, 759 | "source": [ 760 | "We can apply this function to our mask to isolate 
the central clump." 761 | ] 762 | }, 763 | { 764 | "cell_type": "code", 765 | "execution_count": null, 766 | "metadata": {}, 767 | "outputs": [], 768 | "source": [ 769 | "central_clump = find_central_clump(sat_aperture)" 770 | ] 771 | }, 772 | { 773 | "cell_type": "markdown", 774 | "metadata": {}, 775 | "source": [ 776 | "Now we can plot the resulting array to see the clump of interest isolated at the center." 777 | ] 778 | }, 779 | { 780 | "cell_type": "code", 781 | "execution_count": null, 782 | "metadata": {}, 783 | "outputs": [], 784 | "source": [ 785 | "plt.imshow(central_clump, origin='lower');" 786 | ] 787 | }, 788 | { 789 | "cell_type": "markdown", 790 | "metadata": {}, 791 | "source": [ 792 | "We can use the package `photutils` to define a circular aperture. To combine it with our mask, we need the circular aperture in mask form. Luckily, this is a built-in feature of aperture objects!" 793 | ] 794 | }, 795 | { 796 | "cell_type": "code", 797 | "execution_count": null, 798 | "metadata": {}, 799 | "outputs": [], 800 | "source": [ 801 | "aperture = CircularAperture((cutout_radius, cutout_radius), aperture_radius)\n", 802 | "aperture_mask = np.array(aperture.to_mask())\n", 803 | "\n", 804 | "plt.imshow(aperture_mask, origin='lower')" 805 | ] 806 | }, 807 | { 808 | "cell_type": "markdown", 809 | "metadata": {}, 810 | "source": [ 811 | "To match the size of our cutout, we can create a new array with our circular aperture at the center" 812 | ] 813 | }, 814 | { 815 | "cell_type": "code", 816 | "execution_count": null, 817 | "metadata": {}, 818 | "outputs": [], 819 | "source": [ 820 | "circular_mask = np.zeros(np.shape(sat_aperture))\n", 821 | "\n", 822 | "aperture_dim = np.shape(aperture_mask)\n", 823 | "cutout_dim = np.shape(circular_mask)\n", 824 | "\n", 825 | "insert_start = int((cutout_dim[0] - aperture_dim[0]) / 2)\n", 826 | "insert_end = int(insert_start + aperture_dim[0])\n", 827 | "\n", 828 | "circular_mask[insert_start:insert_end, 
insert_start:insert_end] = aperture_mask\n", 829 | "    \n", 830 | "circular_mask = circular_mask.astype(bool)\n", 831 | "\n", 832 | "plt.imshow(circular_mask, origin='lower')" 833 | ] 834 | }, 835 | { 836 | "cell_type": "markdown", 837 | "metadata": {}, 838 | "source": [ 839 | "We can use the `numpy` function `logical_or()` to combine both of our masks to form one boolean array." 840 | ] 841 | }, 842 | { 843 | "cell_type": "code", 844 | "execution_count": null, 845 | "metadata": {}, 846 | "outputs": [], 847 | "source": [ 848 | "combined_aperture = np.logical_or(central_clump, circular_mask)\n", 849 | "\n", 850 | "plt.imshow(combined_aperture, origin='lower')" 851 | ] 852 | }, 853 | { 854 | "cell_type": "markdown", 855 | "metadata": {}, 856 | "source": [ 857 | "## 5. Photometry with a Custom Aperture \n", 858 | "\n", 859 | "Now that we have our custom aperture, let's use that aperture to perform photometry for one source on both our short and long exposure images.\n", 860 | "\n", 861 | "We'll start with the short exposure image. As before, we will use our cutter to make a cutout around the source." 862 | ] 863 | }, 864 | { 865 | "cell_type": "code", 866 | "execution_count": null, 867 | "metadata": {}, 868 | "outputs": [], 869 | "source": [ 870 | "img_cutout = img_short[cutter]" 871 | ] 872 | }, 873 | { 874 | "cell_type": "markdown", 875 | "metadata": {}, 876 | "source": [ 877 | "To obtain the flux in the aperture, all we need to do is to apply the mask to the cutout, and then sum the values." 878 | ] 879 | }, 880 | { 881 | "cell_type": "code", 882 | "execution_count": null, 883 | "metadata": {}, 884 | "outputs": [], 885 | "source": [ 886 | "flux_sum = np.sum(img_cutout[combined_aperture])" 887 | ] 888 | }, 889 | { 890 | "cell_type": "markdown", 891 | "metadata": {}, 892 | "source": [ 893 | "To get the local background for each source, we will sigma-clip the image and calculate the median background value.
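The `find_central_clump` helper above is built on `scipy.ndimage.label`, which assigns a distinct integer label to each connected group of `True` pixels. A minimal sketch with a hypothetical two-clump mask shows the idea of keeping only the clump that contains the central pixel (assuming, as in the cutouts here, that the center pixel is flagged):

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((7, 7), dtype=bool)
mask[2:5, 3] = True  # clump containing the central pixel (3, 3)
mask[0, 0] = True    # unrelated clump in a corner

labels, n_clumps = ndimage.label(mask)     # here n_clumps == 2
center = tuple(np.array(mask.shape) // 2)  # (3, 3)

central_clump = labels == labels[center]   # keep only the central clump

print(n_clumps, central_clump.sum())
```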
" 894 | ] 895 | }, 896 | { 897 | "cell_type": "code", 898 | "execution_count": null, 899 | "metadata": {}, 900 | "outputs": [], 901 | "source": [ 902 | "bkg_data = sigma_clip(img_cutout, sigma=2, maxiters=10)" 903 | ] 904 | }, 905 | { 906 | "cell_type": "markdown", 907 | "metadata": {}, 908 | "source": [ 909 | "We will then estimate the background in our new aperture by multiplying the median by the area covered by the aperture." 910 | ] 911 | }, 912 | { 913 | "cell_type": "code", 914 | "execution_count": null, 915 | "metadata": {}, 916 | "outputs": [], 917 | "source": [ 918 | "new_aperture_area = np.sum(combined_aperture)\n", 919 | "bkg_sum = np.median(bkg_data) * new_aperture_area" 920 | ] 921 | }, 922 | { 923 | "cell_type": "markdown", 924 | "metadata": {}, 925 | "source": [ 926 | "Subtract the estimated background from our flux sum, and you're finished!" 927 | ] 928 | }, 929 | { 930 | "cell_type": "code", 931 | "execution_count": null, 932 | "metadata": {}, 933 | "outputs": [], 934 | "source": [ 935 | "final_sum_short = flux_sum - bkg_sum\n", 936 | "\n", 937 | "print(final_sum_short)" 938 | ] 939 | }, 940 | { 941 | "cell_type": "markdown", 942 | "metadata": {}, 943 | "source": [ 944 | "Below, I repeat the photometry steps for this source on the long exposure image." 
945 | ] 946 | }, 947 | { 948 | "cell_type": "code", 949 | "execution_count": null, 950 | "metadata": {}, 951 | "outputs": [], 952 | "source": [ 953 | "img_cutout = img_long[cutter]\n", 954 | "flux_sum = np.sum(img_cutout[combined_aperture])\n", 955 | "bkg_data = sigma_clip(img_cutout, sigma=2, maxiters=10)\n", 956 | "bkg_sum = np.median(bkg_data) * new_aperture_area\n", 957 | "\n", 958 | "final_sum_long = flux_sum - bkg_sum\n", 959 | "\n", 960 | "print(final_sum_long)" 961 | ] 962 | }, 963 | { 964 | "cell_type": "markdown", 965 | "metadata": {}, 966 | "source": [ 967 | "If we have recovered the lost flux with our new aperture, our star in the 400 second exposure should have ~10 times the flux of our star in the 40 second exposure." 968 | ] 969 | }, 970 | { 971 | "cell_type": "code", 972 | "execution_count": null, 973 | "metadata": {}, 974 | "outputs": [], 975 | "source": [ 976 | "final_sum_long/final_sum_short" 977 | ] 978 | }, 979 | { 980 | "cell_type": "markdown", 981 | "metadata": {}, 982 | "source": [ 983 | "## 6. Additional Results\n", 984 | "\n", 985 | "Here we perform all of the photometry steps on a list of three stars. This section of the notebook is intended as a worked example for multiple stars, and therefore will not guide you through each step.\n", 986 | "\n", 987 | "Since we are dealing with photometry of more than one star, it will be convenient to define a table to store information for each star. We will create columns for x-position, y-position, and the final flux sum for each of the images. We set the table length to `n` rows, one for each star, and fill it with zeros to start.
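The expected factor follows directly from the ratio of exposure times, since for an unsaturated source counts scale linearly with exposure time. A trivial sketch with hypothetical flux sums shows the sanity check:

```python
# Hypothetical aperture sums (electrons) for the 40 s and 400 s exposures
exptime_short, exptime_long = 40.0, 400.0
flux_short, flux_long = 5.2e5, 4.9e6

expected_ratio = exptime_long / exptime_short  # 10.0 for these exposures
measured_ratio = flux_long / flux_short

recovered = 100.0 * measured_ratio / expected_ratio
print(f"{recovered:.1f}% of the expected long-exposure flux recovered")
```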
988 | ] 989 | }, 990 | { 991 | "cell_type": "code", 992 | "execution_count": null, 993 | "metadata": {}, 994 | "outputs": [], 995 | "source": [ 996 | "local_coords = [(1711, 225), (1205, 238), (3159, 312)]\n", 997 | "n = len(local_coords)\n", 998 | "\n", 999 | "dtype = [('x', 'i4'), \n", 1000 | " ('y', 'i4'), \n", 1001 | " ('flux_short', 'f8'), \n", 1002 | " ('flux_long', 'f8'), \n", 1003 | " ('flux_ratio', 'f8')]\n", 1004 | "\n", 1005 | "source_table = Table(data=np.zeros(n, dtype=dtype))\n", 1006 | "\n", 1007 | "source_table['x'] = [c[0] for c in local_coords]\n", 1008 | "source_table['y'] = [c[1] for c in local_coords]\n", 1009 | "\n", 1010 | "print(source_table)" 1011 | ] 1012 | }, 1013 | { 1014 | "cell_type": "markdown", 1015 | "metadata": {}, 1016 | "source": [ 1017 | "Below I have condensed the steps of this notebook into functions." 1018 | ] 1019 | }, 1020 | { 1021 | "cell_type": "code", 1022 | "execution_count": null, 1023 | "metadata": {}, 1024 | "outputs": [], 1025 | "source": [ 1026 | "def prepare_images(fname):\n", 1027 | " \n", 1028 | " pname = os.path.basename(fname).split('.')[0] + '_pam.fits'\n", 1029 | " pamutils.pam_from_file(fname, ext=1, output_pam=pname)\n", 1030 | "\n", 1031 | " raw_array = fits.getdata(fname)\n", 1032 | " pam_array = fits.getdata(pname)\n", 1033 | " img_array = raw_array * pam_array\n", 1034 | "\n", 1035 | " sat_array = fits.getdata(fname, ext=3)==256\n", 1036 | " \n", 1037 | " return img_array, sat_array\n", 1038 | "\n", 1039 | "def bleed_saturation_mask(sat_array):\n", 1040 | " \n", 1041 | " bleed_kernel = np.array([[0,1,0],\n", 1042 | " [0,1,0],\n", 1043 | " [0,1,0],\n", 1044 | " [0,1,0],\n", 1045 | " [0,1,0]])\n", 1046 | " \n", 1047 | " convolved = convolve2d(sat_array, bleed_kernel, mode='same')\n", 1048 | " bled_mask = np.array([x > 0 for x in convolved]).astype(bool)\n", 1049 | " \n", 1050 | " return bled_mask\n", 1051 | "\n", 1052 | "def photometry_on_cutout(img_cutout, custom_aperture):\n", 1053 | " \n", 1054 | " 
flux_sum = np.sum(img_cutout[custom_aperture])\n", 1055 | " bkg_data = sigma_clip(img_cutout, sigma=3, maxiters=10)\n", 1056 | " \n", 1057 | " aperture_area = np.sum(custom_aperture)\n", 1058 | " bkg_flux = np.median(bkg_data) * aperture_area\n", 1059 | " \n", 1060 | " return flux_sum-bkg_flux" 1061 | ] 1062 | }, 1063 | { 1064 | "cell_type": "markdown", 1065 | "metadata": {}, 1066 | "source": [ 1067 | "The following cell performs photometry on the three stars." 1068 | ] 1069 | }, 1070 | { 1071 | "cell_type": "code", 1072 | "execution_count": null, 1073 | "metadata": {}, 1074 | "outputs": [], 1075 | "source": [ 1076 | "for row in source_table:\n", 1077 | "\n", 1078 | " img_arr_s, _ = prepare_images(fname_short)\n", 1079 | " img_arr_l, sat_arr = prepare_images(fname_long)\n", 1080 | " sat_mask = bleed_saturation_mask(sat_arr)\n", 1081 | "\n", 1082 | " cutter = make_cutter(row['x'], row['y'])\n", 1083 | "\n", 1084 | " sat_aperture = find_central_clump(sat_mask[cutter])\n", 1085 | "\n", 1086 | " custom_aperture = np.logical_or(sat_aperture, circular_mask)\n", 1087 | "\n", 1088 | " row['flux_short'] = photometry_on_cutout(img_arr_s[cutter], custom_aperture)\n", 1089 | " row['flux_long'] = photometry_on_cutout(img_arr_l[cutter], custom_aperture)\n", 1090 | " \n", 1091 | "source_table['flux_ratio'] = source_table['flux_long']/ source_table['flux_short']" 1092 | ] 1093 | }, 1094 | { 1095 | "cell_type": "markdown", 1096 | "metadata": {}, 1097 | "source": [ 1098 | "Let's take a look at our table..." 1099 | ] 1100 | }, 1101 | { 1102 | "cell_type": "code", 1103 | "execution_count": null, 1104 | "metadata": {}, 1105 | "outputs": [], 1106 | "source": [ 1107 | "source_table" 1108 | ] 1109 | }, 1110 | { 1111 | "cell_type": "markdown", 1112 | "metadata": {}, 1113 | "source": [ 1114 | "While this method is an improvement for some saturated stars, it still has limitations. We can make a quick plot to show that the percentage of recovered flux decreases for brighter stars." 
1115 | ] 1116 | }, 1117 | { 1118 | "cell_type": "code", 1119 | "execution_count": null, 1120 | "metadata": {}, 1121 | "outputs": [], 1122 | "source": [ 1123 | "fig = plt.figure(figsize=(8,5))\n", 1124 | "ax = fig.add_subplot(111)\n", 1125 | "\n", 1126 | "ax.plot(source_table['flux_short']/np.max(source_table['flux_short']),\n", 1127 | " source_table['flux_ratio']*10, 'o')\n", 1128 | "\n", 1129 | "ax.text(.7, 101, 'Perfect Recovery', color='C1', fontsize=12)\n", 1130 | "ax.set_ylim([60, 104])\n", 1131 | "ax.set_xlabel('Relative Flux (40s exposure)', fontsize=12)\n", 1132 | "ax.set_ylabel('% recovered flux (400s exposure)', fontsize=12)\n", 1133 | "ax.axhline(y=100, linestyle='--', color='C1')\n", 1134 | "ax.grid(True, linestyle=':')" 1135 | ] 1136 | }, 1137 | { 1138 | "cell_type": "markdown", 1139 | "metadata": {}, 1140 | "source": [ 1141 | "### References\n", 1142 | "\n", 1143 | "***\n", 1144 | "\n", 1145 | "http://iopscience.iop.org/article/10.1086/444553\n", 1146 | "\n", 1147 | "http://documents.stsci.edu/hst/acs/documents/handbooks/DataHandbookv3/acs_Ch57.html\n", 1148 | "\n" 1149 | ] 1150 | } 1151 | ], 1152 | "metadata": { 1153 | "kernelspec": { 1154 | "display_name": "Python 3 (ipykernel)", 1155 | "language": "python", 1156 | "name": "python3" 1157 | }, 1158 | "language_info": { 1159 | "codemirror_mode": { 1160 | "name": "ipython", 1161 | "version": 3 1162 | }, 1163 | "file_extension": ".py", 1164 | "mimetype": "text/x-python", 1165 | "name": "python", 1166 | "nbconvert_exporter": "python", 1167 | "pygments_lexer": "ipython3", 1168 | "version": "3.9.13" 1169 | } 1170 | }, 1171 | "nbformat": 4, 1172 | "nbformat_minor": 2 1173 | } 1174 | -------------------------------------------------------------------------------- /acs_saturation_trails/p_module/__init__.py: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/spacetelescope/acs-notebook/49962e53efa23575340acc154446294354848c27/acs_saturation_trails/p_module/__init__.py -------------------------------------------------------------------------------- /acs_saturation_trails/p_module/plot.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | from astropy.io import fits 3 | from astropy.visualization import (ZScaleInterval, LinearStretch, 4 | ImageNormalize) 5 | 6 | def ds9_imitate(ax, image, extent=None): 7 | norm = ImageNormalize(image, 8 | interval=ZScaleInterval(), 9 | stretch=LinearStretch()) 10 | 11 | ax.imshow(image, cmap='bone', norm=norm, origin='lower', extent=extent) 12 | return 13 | 14 | def triple_pam_plot(flt_file, pam_file, figtitle): 15 | fl_img = fits.getdata(flt_file, ext=1) 16 | pam_img = fits.getdata(pam_file) 17 | 18 | fig = plt.figure(figsize=(20,4)) 19 | fig.suptitle(figtitle,fontsize=20) 20 | 21 | ax = fig.add_subplot(1, 3, 1) 22 | ds9_imitate(ax, fl_img) 23 | ax.set_title('Raw') 24 | 25 | ax2 = fig.add_subplot(1, 3, 2, yticks=[]) 26 | ds9_imitate(ax2, pam_img) 27 | ax2.set_title('Pixel Area Map') 28 | 29 | pamd_img = fl_img*pam_img 30 | 31 | ax3 = fig.add_subplot(1, 3, 3, yticks=[]) 32 | ds9_imitate(ax3, pamd_img) 33 | ax3.set_title('Raw x Pixel Area Map') 34 | 35 | plt.subplots_adjust(wspace=0.05) 36 | return 37 | 38 | def calib_compare_plot(raw_image, cal_image): 39 | fig = plt.figure(figsize=(14,14)) 40 | 41 | ax = fig.add_subplot(2,1,1) 42 | ds9_imitate(ax, raw_image) 43 | ax.set_title('Raw') 44 | 45 | ax2 = fig.add_subplot(2,1,2) 46 | ds9_imitate(ax2, cal_image) 47 | ax2.set_title('Flat-Fielded') 48 | 49 | return 50 | -------------------------------------------------------------------------------- /acs_saturation_trails/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS Saturation Trails notebook 2 | 3 | 
numpy 4 | matplotlib 5 | astroquery 6 | astropy 7 | scipy 8 | stsci 9 | photutils 10 | -------------------------------------------------------------------------------- /acs_sbc_dark_analysis/acs_sbc_dark_analysis.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": { 20 | "nbpresent": { 21 | "id": "0d33cc02-bd22-422e-974b-893db4bc8a2e" 22 | } 23 | }, 24 | "source": [ 25 | "# SBC Dark Analysis\n", 26 | "\n", 27 | "\n", 28 | "## Introduction\n", 29 | "***\n", 30 | "\n", 31 | "This notebook has been prepared as a demo on how to perform aperture photometry in SBC images that contain an elevated dark rate. This problem arises when the detector temperature goes above ~25 ºC. \n", 32 | "\n", 33 | "More information on the dark rate can be found in [ISR ACS 2017-04](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/instrument-science-reports-isrs/_documents/isr1704.pdf) (Avila 2017).\n", 34 | "\n", 35 | "### This tutorial will show you how to...\n", 36 | "\n", 37 | "#### 1. [Identify Images with Significant Dark Current](#_identify) \n", 38 | "\n", 39 | "* Open files and extract information\n", 40 | "* Organize information in a table\n", 41 | "* Sort table by temperature\n", 42 | "\n", 43 | "#### 2. [Combine Science Images](#_scidrizzle)\n", 44 | "\n", 45 | "* Use `AstroDrizzle` with ASN files to combine images.\n", 46 | "\n", 47 | "#### 3. [Combined Dark Images](#_drkdrizzle)\n", 48 | "\n", 49 | "* Identify which dark images to use for your data.\n", 50 | "* Use `AstroDrizzle` to combine dark images.\n", 51 | "\n", 52 | "#### 4. [Perform Photometry](#photometry)\n", 53 | "\n", 54 | "* Subtract dark current from science images using aperture photometry" 55 | ] 56 | }, 57 | { 58 | "cell_type": "markdown", 59 | "metadata": {}, 60 | "source": [ 61 | "## Imports\n", 62 | "***\n", 63 | "\n", 64 | "Here we list the Python packages used in this notebook. 
Links to the documentation for each module are provided for convenience.\n", 65 | "\n", 66 | "| Package Name     | module           | docs          | used for    |\n", 67 | "|------------------|:-----------------|:-------------:|:------------|\n", 68 | "| `os`             | `system`         | link|command line input|\n", 69 | "| `os`             | `environ`        | link| setting environments |\n", 70 | "|`shutil`          | `rmtree`         | link| remove directory tree |\n", 71 | "|`glob`            | `glob`           | link| search for files based on Unix shell rules |\n", 72 | "|`matplotlib`      |`pyplot`          | link| plotting |\n", 73 | "|`matplotlib`      |`colors.LogNorm`  | link| data normalization used for contrast plotting |\n", 74 | "|`matplotlib`      |`patches.Rectangle` | link| add rectangle patch to plot |\n", 75 | "|`numpy`           | `s_`             | link| construct array slice object |\n", 76 | "|`astroquery.mast` |`Observations`    | link| download data from MAST |\n", 77 | "|`drizzlepac`      |`astrodrizzle`    | link| drizzle combine images |\n", 78 | "|`astropy.io`      | `fits`           | link| access and update fits files |\n", 79 | "|`astropy.table`   | `Table`          | link| constructing and editing in a tabular format |\n", 80 | "|`astropy.wcs`     | `WCS`            | link| extract WCS information from header |\n", 81 | "|`photutils`       |`EllipticalAperture`| link| construct aperture object for plotting\n", 82 | "|`photutils`       |`aperture_photometry`| link| extract counts from aperture" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": null, 88 | "metadata": { 89 | "nbpresent": { 90 | "id": "6c8b2858-62fe-40bd-a114-8257661e0b0b" 91 | } 92 | }, 93 | "outputs": [], 94 | "source": [ 95 | "import os\n", 96 | "import shutil\n", 97 | "import glob\n", 98 | "\n", 99 | "import matplotlib.pyplot as plt\n", 100 | "import numpy as np\n", 101 | "\n", 102 | "from astroquery.mast import Observations\n", 103 | "from drizzlepac.astrodrizzle import AstroDrizzle as adriz\n", 104 | "\n", 105 | "from astropy.io import fits\n", 106 | "from astropy.table import Table\n", 107 | "from astropy import wcs\n", 108 | "\n", 109 | "from
matplotlib.colors import LogNorm\n", 110 | "from matplotlib.patches import Rectangle\n", 111 | "\n", 112 | "from photutils import EllipticalAperture\n", 113 | "from photutils import aperture_photometry" 114 | ] 115 | }, 116 | { 117 | "cell_type": "markdown", 118 | "metadata": { 119 | "nbpresent": { 120 | "id": "71a61091-6c4e-4567-880e-8d9908e4dfa4" 121 | } 122 | }, 123 | "source": [ 124 | "## Download the Data\n", 125 | "***\n", 126 | "\n", 127 | "Here we download all of the data required for this notebook. This is an important step! Some of the image processing steps require all relevant files to be in the working directory. We recommend working with a brand new directory for every new set of data.\n", 128 | "\n", 129 | "Using the Python package `astroquery`, we can retrieve files from the [MAST](http://archive.stsci.edu) archive.\n", 130 | "\n", 131 | "#### [GO Proposal 13655](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=13655): \"How Lyman alpha bites/beats the dust\"\n", 132 | "\n", 133 | "First, we will grab the FLT and ASN files from program 13655. For this example, we only want to retrieve the files from visit 11 of this program. We will specify program ID 'JCMC' along with observation set ID '11'.\n", 134 | "\n", 135 | "
\n", 136 | "MAY CHANGE: The argument \"mrp_only\" stands for \"minimum recommended products only\". It currently needs to be set to False, although in the future, False is intended to be set as the default and can be left out.\n", 137 | "
" 138 | ] 139 | }, 140 | { 141 | "cell_type": "code", 142 | "execution_count": null, 143 | "metadata": {}, 144 | "outputs": [], 145 | "source": [ 146 | "science_list = Observations.query_criteria(proposal_id='13655', obs_id='JCMC11*')\n", 147 | "\n", 148 | "sci_dl_table = Observations.download_products(science_list['obsid'], \n", 149 | " productSubGroupDescription=['ASN', 'FLT'],\n", 150 | " mrp_only=False)" 151 | ] 152 | }, 153 | { 154 | "cell_type": "markdown", 155 | "metadata": {}, 156 | "source": [ 157 | "#### [GO Proposal 13961](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=13961): \"SBC Dark Current Measurement\"\n", 158 | "\n", 159 | "Now we need a set of dark calibration images. You can use any calibration set as long as the dark rate in the image matches that of your science image (discussed later in this notebook). For convenience, here we download the RAW dark frames from one calibration program: GO Proposal 13961." 160 | ] 161 | }, 162 | { 163 | "cell_type": "code", 164 | "execution_count": null, 165 | "metadata": {}, 166 | "outputs": [], 167 | "source": [ 168 | "darks_list = Observations.query_criteria(proposal_id='13961', obstype='cal')\n", 169 | "\n", 170 | "drk_dl_table = Observations.download_products(darks_list['obsid'],\n", 171 | " productSubGroupDescription=['RAW'],\n", 172 | " mrp_only=False)" 173 | ] 174 | }, 175 | { 176 | "cell_type": "markdown", 177 | "metadata": {}, 178 | "source": [ 179 | "We'll use the packages `os` and `shutil` to put all of these files in our working directory for convenience and do a little housekeeping. Now let's place those images in the same directory as this notebook..." 
180 | ] 181 | }, 182 | { 183 | "cell_type": "code", 184 | "execution_count": null, 185 | "metadata": {}, 186 | "outputs": [], 187 | "source": [ 188 | "for dl_table in [sci_dl_table, drk_dl_table]:\n", 189 | "    \n", 190 | "    for row in dl_table:\n", 191 | "        oldfname = row['Local Path']\n", 192 | "        newfname = os.path.basename(oldfname)\n", 193 | "        os.rename(oldfname, newfname)\n", 194 | "\n", 195 | "shutil.rmtree('mastDownload/')" 196 | ] 197 | }, 198 | { 199 | "cell_type": "markdown", 200 | "metadata": {}, 201 | "source": [ 202 | "Below we use `glob.glob` to gather our filenames into list variables for convenience." 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "asn_list = glob.glob('*_asn.fits')\n", 212 | "flt_list = glob.glob('*_flt.fits')\n", 213 | "drk_list = glob.glob('*_raw.fits')" 214 | ] 215 | }, 216 | { 217 | "cell_type": "markdown", 218 | "metadata": {}, 219 | "source": [ 220 | "## File Information\n", 221 | "\n", 222 | "***\n", 223 | "\n", 224 | "The structure of the FITS files from ACS may differ depending on what kind of observation was made. For more information, refer to Section 2.2 of the [ACS Data Handbook](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 225 | "\n", 226 | "### Association Files\n", 227 | "\n", 228 | "Association files contain only one extension, a binary table listing the associated files and their types.\n", 229 | "\n", 230 | "| Ext    | Name             | Type         | Contains                                                |\n", 231 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 232 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 233 | "|1| ASN (Association)| (BinTableHDU)| Table of files associated with this group. 
|" 234 | ] 235 | }, 236 | { 237 | "cell_type": "markdown", 238 | "metadata": {}, 239 | "source": [ 240 | "### Raw Files\n", 241 | "\n", 242 | "A standard raw image file from the SBC has the same structure as you'd expect from full frame observation from ACS.\n", 243 | "\n", 244 | "| Ext | Name | Type | Contains |\n", 245 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 246 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 247 | "|1| SCI (Image) | (ImageHDU) | Raw image data. |\n", 248 | "|2| ERR (Error) | (ImageHDU) | Error array. |\n", 249 | "|3| DQ (Data Quality)| (ImageHDU) | Data quality array. |" 250 | ] 251 | }, 252 | { 253 | "cell_type": "markdown", 254 | "metadata": {}, 255 | "source": [ 256 | "### FLT Files\n", 257 | "\n", 258 | "SBC flat-fielded files have the same structure as the raw files, with additional HDUs for WCS corrections.\n", 259 | "\n", 260 | "| Ext | Name | Type | Contains |\n", 261 | "|----------|------------------|--------------|:-------------------------------------------------------|\n", 262 | "|0 | PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 263 | "|1 | SCI (Image) | (ImageHDU) | Raw image data. |\n", 264 | "|2 | ERR (Error) | (ImageHDU) | Error array. |\n", 265 | "|3 | DQ (Data Quality)| (ImageHDU) | Data quality array. |\n", 266 | "|4-5| WCSDVARR (WCS) | (ImageHDU) | Filter-dependent non-polynomial distortion corrections.|\n", 267 | "|6 | WCSCORR (WCS) | (BinTableHDU)| History of changes to the WCS solution. |" 268 | ] 269 | }, 270 | { 271 | "cell_type": "markdown", 272 | "metadata": {}, 273 | "source": [ 274 | "You can always use `.info()` on an HDUlist for an overview of the structure." 
275 | ] 276 | }, 277 | { 278 | "cell_type": "code", 279 | "execution_count": null, 280 | "metadata": {}, 281 | "outputs": [], 282 | "source": [ 283 | "with fits.open(drk_list[0]) as hdulist:\n", 284 | " hdulist.info()" 285 | ] 286 | }, 287 | { 288 | "cell_type": "markdown", 289 | "metadata": {}, 290 | "source": [ 291 | "## Identify Affected Observations\n", 292 | "\n", 293 | "***\n", 294 | "\n", 295 | "Let's take a look at some information from our science images. We want to find observations with an average temperature greater than 25$^o$C. We can organize the information in a `Table` object from `astropy.table` for convenience. Here, we define a table with column names and respective data types.\n", 296 | "\n", 297 | "Note: The FITS header keywords `mdecodt1` and `mdecodt2` refer to the temperature at the beginning and end of the exposure respectively." 298 | ] 299 | }, 300 | { 301 | "cell_type": "code", 302 | "execution_count": null, 303 | "metadata": {}, 304 | "outputs": [], 305 | "source": [ 306 | "flt_table = Table(names=('file', 'start', 'filter1', 'mdecodt1', 'mdecodt2', 'avgtemp'),\n", 307 | " dtype=('S64', 'S19', 'S6', 'f8', 'f8', 'f8'))" 308 | ] 309 | }, 310 | { 311 | "cell_type": "markdown", 312 | "metadata": {}, 313 | "source": [ 314 | "Now we need to obtain information from the headers. The temperatures are stored in the science extensions, while observation information is found in the primary header.\n", 315 | "\n", 316 | "
Pro-Tip: Adding rows to a table is a slow way to construct a table. For larger sets of data, consider\n", 317 | "\n", 318 | " constructing a table of known size.\n", 319 | "
" 320 | ] 321 | }, 322 | { 323 | "cell_type": "code", 324 | "execution_count": null, 325 | "metadata": {}, 326 | "outputs": [], 327 | "source": [ 328 | "for file in flt_list:\n", 329 | " filt = fits.getval(file,'FILTER1', ext=0)\n", 330 | " date = fits.getval(file,'DATE-OBS', ext=0)\n", 331 | " time = fits.getval(file,'TIME-OBS', ext=0)\n", 332 | " \n", 333 | " t1 = fits.getval(file,'MDECODT1', ext=1)\n", 334 | " t2 = fits.getval(file,'MDECODT2', ext=1)\n", 335 | "\n", 336 | " starttime = date + 'T' + time\n", 337 | " avgtemp = (t1+t2) / 2\n", 338 | " \n", 339 | " flt_table.add_row((file, starttime, filt, t1, t2, avgtemp))\n", 340 | "\n", 341 | "print(flt_table)" 342 | ] 343 | }, 344 | { 345 | "cell_type": "markdown", 346 | "metadata": {}, 347 | "source": [ 348 | "We can sort the table by column value for analysis. Since we are interested in temperature, we'll sort this table by the column 'avgtemp'" 349 | ] 350 | }, 351 | { 352 | "cell_type": "code", 353 | "execution_count": null, 354 | "metadata": {}, 355 | "outputs": [], 356 | "source": [ 357 | "flt_table.sort('avgtemp')\n", 358 | "print(flt_table)" 359 | ] 360 | }, 361 | { 362 | "cell_type": "markdown", 363 | "metadata": {}, 364 | "source": [ 365 | "Sorting the table by average temperature gives us a sense of how temperature of the SBC behaves over time. Only the last image was affected by a temperature of over 25$^o$C, and therefore the only image to be affected by elevated dark current. \n", 366 | "\n", 367 | "The table shows us that this image was taken with the filter F165LP- which is also same filter that the first image was taken with. This is not a coincidence! Take a moment to consider the time-based symmetry of the images, and what that means for the dark current of the combined images." 
368 | ] 369 | }, 370 | { 371 | "cell_type": "markdown", 372 | "metadata": {}, 373 | "source": [ 374 | "## Combine science images\n", 375 | "\n", 376 | "***\n", 377 | "\n", 378 | "Let's make drizzled products for each filter using the corresponding ASN files. The ASN files will tell AstroDrizzle which flat-fielded (FLT) images to combine for a given filter. Steps 1-6 of the drizzling procedure have been turned off since their purpose is to identify and mask cosmic rays, which do not affect SBC images.\n", 379 | "\n", 380 | "The drizzle keyword parameters below are the appropriate ones for SBC data. For \"final_scale\" we use the pixel scale of the SBC, 0.025 arcseconds per pixel." 381 | ] 382 | }, 383 | { 384 | "cell_type": "code", 385 | "execution_count": null, 386 | "metadata": {}, 387 | "outputs": [], 388 | "source": [ 389 | "driz_kwargs = {'runfile':'',\n", 390 | "               'context':False,\n", 391 | "               'build':False,\n", 392 | "               'preserve':False,\n", 393 | "               'clean':True,\n", 394 | "               'static':False,\n", 395 | "               'skysub':False,\n", 396 | "               'driz_separate':False,\n", 397 | "               'median':False,\n", 398 | "               'blot':False,\n", 399 | "               'driz_cr':False,\n", 400 | "               'driz_combine':True,\n", 401 | "               'final_wcs':True,\n", 402 | "               'final_scale':0.025}" 403 | ] 404 | }, 405 | { 406 | "cell_type": "markdown", 407 | "metadata": {}, 408 | "source": [ 409 | "Now we'll run AstroDrizzle on our list of association files."
410 | ] 411 | }, 412 | { 413 | "cell_type": "code", 414 | "execution_count": null, 415 | "metadata": {}, 416 | "outputs": [], 417 | "source": [ 418 | "for file in asn_list:\n", 419 | "    output_name = fits.getheader(file)['asn_id']\n", 420 | "    adriz(file, output=output_name, **driz_kwargs)" 421 | ] 422 | }, 423 | { 424 | "cell_type": "markdown", 425 | "metadata": {}, 426 | "source": [ 427 | "## Create dark images\n", 428 | "\n", 429 | "***\n", 430 | "\n", 431 | "We want to use dark frames to make a drizzled product that will be used to approximate the dark rate to be subtracted from the science product. The dark rate above 25$^o$C varies, so we need to find the dark frame that contains a *dark rate* similar to that of your affected image. \n", 432 | "\n", 433 | "Below we open the two F165LP science frames, one being a high temperature image and the other being a lower temperature image with negligible dark current. \n", 434 | "\n", 435 | "To measure the dark rate, we will take the sum of the pixels within a 200x200 box. The box will be placed off-center of our SBC image where the dark rate is low and consistent, at (y, x) = (750, 680). We will measure this sum for our science images as well as all of the dark frames.\n", 436 | "\n", 437 | "With the array handling package `numpy`, we can define a 2D array slice for later use." 438 | ] 439 | }, 440 | { 441 | "cell_type": "code", 442 | "execution_count": null, 443 | "metadata": {}, 444 | "outputs": [], 445 | "source": [ 446 | "cutter = np.s_[650:850, 580:780]" 447 | ] 448 | }, 449 | { 450 | "cell_type": "markdown", 451 | "metadata": {}, 452 | "source": [ 453 | "Now we can print out the sum of the pixels in each image cutout."
454 | ] 455 | }, 456 | { 457 | "cell_type": "code", 458 | "execution_count": null, 459 | "metadata": {}, 460 | "outputs": [], 461 | "source": [ 462 | "sci_list = ['jcmc11ctq_flt.fits',\n", 463 | " 'jcmc11e6q_flt.fits']\n", 464 | "\n", 465 | "print('Box Sums for Science Data:\\n')\n", 466 | "\n", 467 | "for file in sci_list:\n", 468 | " image = fits.getdata(file)\n", 469 | " img_slice = image[cutter]\n", 470 | " neatname = os.path.basename(file)\n", 471 | " print('{} --> {}'.format(neatname,np.sum(img_slice)))\n", 472 | "\n", 473 | "print('\\n----------------\\n')\n", 474 | "print('Box Sums for Dark Frames:\\n')\n", 475 | "\n", 476 | "for file in drk_list:\n", 477 | " image = fits.getdata(file)\n", 478 | " img_slice = image[cutter]\n", 479 | " neatname = os.path.basename(file)\n", 480 | " print('{} --> {}'.format(neatname, np.sum(img_slice)))" 481 | ] 482 | }, 483 | { 484 | "cell_type": "markdown", 485 | "metadata": {}, 486 | "source": [ 487 | "It looks like the two dark frames that come closest to the science frames are **jcrx01iqq** and **jcrx01iyq**. We will use those to make a combined master dark frame. Note that for programs with more exposures, you will need to do this for each input image in the combined mosaic.\n", 488 | "\n", 489 | "For better visualization, let's take a look at one of our science images and matching dark frame. We will also highlight the dark rate extraction box in each plot." 
490 | ] 491 | }, 492 | { 493 | "cell_type": "code", 494 | "execution_count": null, 495 | "metadata": {}, 496 | "outputs": [], 497 | "source": [ 498 | "plt_kwargs = {'norm':LogNorm(),\n", 499 | " 'interpolation':'nearest',\n", 500 | " 'cmap':'plasma',\n", 501 | " 'origin':'lower'}\n", 502 | "\n", 503 | "fig, ax = plt.subplots(1, 2, figsize=(16,12))\n", 504 | "\n", 505 | "science = fits.getdata('jcmc11ctq_flt.fits')\n", 506 | "dark = fits.getdata('jcrx01iqq_raw.fits')\n", 507 | "\n", 508 | "ax[0].set_title('Science Data')\n", 509 | "ax[1].set_title('Dark Frame')\n", 510 | "\n", 511 | "ax[0].imshow(science, **plt_kwargs)\n", 512 | "ax[1].imshow(dark, **plt_kwargs)\n", 513 | "\n", 514 | "# Must define twice since artist objects can only be used once.\n", 515 | "patch0 = Rectangle((680,750), width=200, height=200, alpha=0.5)\n", 516 | "patch1 = Rectangle((680,750), width=200, height=200, alpha=0.5)\n", 517 | "\n", 518 | "ax[0].add_patch(patch0)\n", 519 | "ax[1].add_patch(patch1);" 520 | ] 521 | }, 522 | { 523 | "cell_type": "markdown", 524 | "metadata": {}, 525 | "source": [ 526 | "To preserve important information in the header specific to the science image, such as the WCS solution, we will insert the data of the dark images into copies of the science images. We also must remember to adjust the exposure time of the copies of the science frames to that of the dark frames so that the drizzled products have the correct count rates." 
527 | ] 528 | }, 529 | { 530 | "cell_type": "code", 531 | "execution_count": null, 532 | "metadata": {}, 533 | "outputs": [], 534 | "source": [ 535 | "flt_file = 'jcmc11ctq_flt.fits'\n", 536 | "drk_file = 'jcrx01iqq_raw.fits'\n", 537 | "new_file = 'dark1.fits'\n", 538 | "\n", 539 | "shutil.copy(flt_file, new_file)\n", 540 | "\n", 541 | "darkdat = fits.getdata(drk_file)\n", 542 | "exptime = fits.getval(drk_file, 'exptime', ext=0)\n", 543 | "\n", 544 | "with fits.open(new_file, mode='update') as hdu:\n", 545 | "    \n", 546 | "    hdu[1].data[:,:] = darkdat\n", 547 | "    hdu[0].header['exptime'] = exptime" 548 | ] 549 | }, 550 | { 551 | "cell_type": "code", 552 | "execution_count": null, 553 | "metadata": {}, 554 | "outputs": [], 555 | "source": [ 556 | "flt_file = 'jcmc11e6q_flt.fits'\n", 557 | "drk_file = 'jcrx01iyq_raw.fits'\n", 558 | "new_file = 'dark2.fits'\n", 559 | "\n", 560 | "shutil.copy(flt_file, new_file)\n", 561 | "\n", 562 | "darkdat = fits.getdata(drk_file)\n", 563 | "exptime = fits.getval(drk_file, 'exptime', ext=0)\n", 564 | "\n", 565 | "with fits.open(new_file, mode='update') as hdu:\n", 566 | "    \n", 567 | "    hdu[1].data[:,:] = darkdat\n", 568 | "    hdu[0].header['exptime'] = exptime" 569 | ] 570 | }, 571 | { 572 | "cell_type": "markdown", 573 | "metadata": {}, 574 | "source": [ 575 | "We can now make the drizzled dark frame using the two individual dark frames we just made as inputs. The drizzle parameters are the same as the ones used to make the science drizzled products." 576 | ] 577 | }, 578 | { 579 | "cell_type": "code", 580 | "execution_count": null, 581 | "metadata": {}, 582 | "outputs": [], 583 | "source": [ 584 | "adriz_output = adriz(['dark1.fits', 'dark2.fits'], output='masterdark', **driz_kwargs)" 585 | ] 586 | }, 587 | { 588 | "cell_type": "markdown", 589 | "metadata": {}, 590 | "source": [ 591 | "We will now display the images to confirm that they show similar elevated dark rates. 
You might want to display them in DS9 (or any other viewer) outside of this notebook so you can play with the stretch a bit and see bigger versions of the images." 592 | ] 593 | }, 594 | { 595 | "cell_type": "code", 596 | "execution_count": null, 597 | "metadata": {}, 598 | "outputs": [], 599 | "source": [ 600 | "# Some plotting parameters\n", 601 | "plt_kwargs = {'norm':LogNorm(vmin=1e-5, vmax=0.01),\n", 602 | "              'interpolation':'nearest',\n", 603 | "              'cmap':'plasma',\n", 604 | "              'origin':'lower'}\n", 605 | "\n", 606 | "f165lp = fits.getdata('jcmc11010_drz_sci.fits')\n", 607 | "masterdark = fits.getdata('masterdark_drz_sci.fits')\n", 608 | "\n", 609 | "fig, ax = plt.subplots(1, 2, figsize=(16, 12))\n", 610 | "\n", 611 | "ax[0].set_title('Drizzled Science Data')\n", 612 | "ax[1].set_title('Drizzled Dark Frame')\n", 613 | "\n", 614 | "ax[0].imshow(f165lp, **plt_kwargs)\n", 615 | "ax[1].imshow(masterdark, **plt_kwargs);" 616 | ] 617 | }, 618 | { 619 | "cell_type": "markdown", 620 | "metadata": {}, 621 | "source": [ 622 | "The images look comparable. We will now proceed to perform some photometric analysis to estimate the dark current in the source." 623 | ] 624 | }, 625 | { 626 | "cell_type": "markdown", 627 | "metadata": {}, 628 | "source": [ 629 | "## Photometry\n", 630 | "\n", 631 | "***\n", 632 | "\n", 633 | "Now we will use the `photutils` package to set up the two apertures. We will use these apertures to measure the flux of different regions in the images.\n", 634 | "\n", 635 | "The first aperture is centered on our target at (735, 710) and is elliptical in shape to encompass all of the flux from the source. The other aperture will be the exact same shape, but located near the edge of the detector at (200, 200). 
" 636 | ] 637 | }, 638 | { 639 | "cell_type": "code", 640 | "execution_count": null, 641 | "metadata": {}, 642 | "outputs": [], 643 | "source": [ 644 | "aper = EllipticalAperture([(735, 710), (200, 200)], a=70, b=40, theta=0.5*np.pi)" 645 | ] 646 | }, 647 | { 648 | "cell_type": "markdown", 649 | "metadata": {}, 650 | "source": [ 651 | "Let's overplot the two apertures in the images so you can see their locations." 652 | ] 653 | }, 654 | { 655 | "cell_type": "code", 656 | "execution_count": null, 657 | "metadata": {}, 658 | "outputs": [], 659 | "source": [ 660 | "fig,ax = plt.subplots(1, 2, figsize=(16, 12))\n", 661 | "\n", 662 | "ax[0].set_title('Drizzled Science Data')\n", 663 | "ax[1].set_title('Drizzled Dark Frames')\n", 664 | "\n", 665 | "ax[0].imshow(f165lp, **plt_kwargs)\n", 666 | "ax[1].imshow(masterdark, **plt_kwargs)\n", 667 | "\n", 668 | "aper.plot(ax[0])\n", 669 | "aper.plot(ax[1])" 670 | ] 671 | }, 672 | { 673 | "cell_type": "markdown", 674 | "metadata": {}, 675 | "source": [ 676 | "Finally, we do the photometry using the two apertures on both images. We print out the tables to see the results." 
677 | ] 678 | }, 679 | { 680 | "cell_type": "code", 681 | "execution_count": null, 682 | "metadata": {}, 683 | "outputs": [], 684 | "source": [ 685 | "f165lp_phot = aperture_photometry(f165lp, aper)\n", 686 | "masterdark_phot = aperture_photometry(masterdark, aper)\n", 687 | "\n", 688 | "sumdiff = f165lp_phot['aperture_sum'] - masterdark_phot['aperture_sum']\n", 689 | "\n", 690 | "print('Science data photometry:\\n')\n", 691 | "print(f165lp_phot)\n", 692 | "print('\\n')\n", 693 | "\n", 694 | "print('Dark frame photometry:\\n')\n", 695 | "print(masterdark_phot)\n", 696 | "print('\\n')\n", 697 | "\n", 698 | "print('\\nDifference of aperture sums (science - dark):\\n')\n", 699 | "print(sumdiff)" 700 | ] 701 | }, 702 | { 703 | "cell_type": "markdown", 704 | "metadata": {}, 705 | "source": [ 706 | "The target aperture has 2.89 cts/sec, while the same aperture in the dark frame has 0.322 cts/sec. That means that ~11% of the flux in your source comes from dark current and should be subtracted out, leaving a flux for your source of 2.564 cts/sec. " 707 | ] 708 | }, 709 | { 710 | "cell_type": "markdown", 711 | "metadata": {}, 712 | "source": [ 713 | "## Final Thoughts\n", 714 | "\n", 715 | "***\n", 716 | "\n", 717 | "1. The difference in flux in the second aperture (the one in the lower left portion of the image) shows that there is a small residual background of ~0.02 cts/sec in the science frame. This could be real background from the sky (and not dark current from the detector) that you might want to account for properly in your flux and error budget.\n", 718 | "\n", 719 | "2. The dark frame we created does not have the exact same dark count rate as we measured in the science frame. You could try searching for other darks that more closely resemble your science frame. \n", 720 | "\n", 721 | "3. 
These problems can be avoided using a few strategies detailed in [ISR ACS 2018-07](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/instrument-science-reports-isrs/_documents/isr1807.pdf) (Avila 2018)." 722 | ] 723 | }, 724 | { 725 | "cell_type": "markdown", 726 | "metadata": {}, 727 | "source": [ 728 | "### For more help:\n", 729 | "\n", 730 | "More details may be found on the [ACS website](http://www.stsci.edu/hst/instrumentation/acs) and in the [ACS Instrument](https://hst-docs.stsci.edu/display/ACSIHB) and [Data Handbooks](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 731 | "\n", 732 | "Please visit the [HST Help Desk](http://hsthelp.stsci.edu). Through the help desk portal, you can explore the *HST* Knowledge Base and request additional help from experts." 733 | ] 734 | } 735 | ], 736 | "metadata": { 737 | "kernelspec": { 738 | "display_name": "Python 3 (ipykernel)", 739 | "language": "python", 740 | "name": "python3" 741 | }, 742 | "language_info": { 743 | "codemirror_mode": { 744 | "name": "ipython", 745 | "version": 3 746 | }, 747 | "file_extension": ".py", 748 | "mimetype": "text/x-python", 749 | "name": "python", 750 | "nbconvert_exporter": "python", 751 | "pygments_lexer": "ipython3", 752 | "version": "3.9.13" 753 | } 754 | }, 755 | "nbformat": 4, 756 | "nbformat_minor": 2 757 | } 758 | -------------------------------------------------------------------------------- /acs_sbc_dark_analysis/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS SBC Dark Analysis notebook 2 | 3 | matplotlib 4 | numpy 5 | astroquery 6 | drizzlepac 7 | astropy 8 | photutils 9 | -------------------------------------------------------------------------------- /acs_subarrays/acs_subarrays.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "
\n", 8 | "REQUIREMENT:\n", 9 | " Before proceeding, install or update your\n", 10 | " \n", 11 | " stenv\n", 12 | " \n", 13 | " distribution. stenv is the replacement for AstroConda, which is unsupported as of February 2023.\n", 14 | "
" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "# ACS Image Reduction for Subarray Data\n", 22 | "\n", 23 | "## Introduction\n", 24 | "\n", 25 | "***\n", 26 | "\n", 27 | "Subarray data requires different considerations than working with full frame images. This notebook will guide you through working with standard post-SM4 subarray data and custom subarray data.\n", 28 | "\n", 29 | "After Servicing Mission 4 (SM4; May 2009), the installation of the ASIC during the repair of ACS introduced $1/f$ noise in all ACS images. In the calacs pipeline, only full-frame data have this striping removed. To correct subarray data, the alternative acs_destripe_plus pipeline must be used, which will apply all of calibration steps normally performed by calacs in addition to de-striping for subarray data. De-striping is only possible for 2K subarrays after SM4 until Cycle 24, after which a change to the flight software makes all subarrays eligible for de-striping.\n", 30 | "\n", 31 | "### This tutorial will show you how to handle...\n", 32 | "\n", 33 | "#### [Post-SM4 Subarray Data](#_postsm4) \n", 34 | "\n", 35 | "* Update header keywords.\n", 36 | "* Clean Subarray images with the option to correct CTE losses.\n", 37 | "* Update WCS solution.\n", 38 | "\n", 39 | "#### [Custom Subarray Data](#_custom)\n", 40 | "\n", 41 | "* Use `AstroDrizzle` with ASN files to combine images." 42 | ] 43 | }, 44 | { 45 | "cell_type": "markdown", 46 | "metadata": {}, 47 | "source": [ 48 | "## Imports\n", 49 | "\n", 50 | "***\n", 51 | "\n", 52 | "Here we list the Python packages used in this notebook. 
Links to the documentation for each module are provided for convenience.\n", 53 | "\n", 54 | "| Package Name     | module           | docs          | used for    |\n", 55 | "|------------------|:-----------------|:-------------:|:------------|\n", 56 | "| `os`             | `system`         | link|command line input|\n", 57 | "| `os`             | `environ`        | link| setting environments |\n", 58 | "|`shutil`          | `rmtree`         | link| remove directory tree |\n", 59 | "|`glob`            | `glob`           | link| search for files based on Unix shell rules |\n", 60 | "|`astroquery.mast` |`Observations`    | link| download data from MAST |\n", 61 | "|`astropy.io`      | `fits`           | link| access and update FITS files |\n", 62 | "|`astropy.table`   | `Table`          | link| constructing and editing in a tabular format |\n", 63 | "|`stwcs`           |`updatewcs`       | link| update WCS solution |\n", 64 | "|`acstools`        |`acs_destripe_plus`| link| de-stripe ACS images and optionally CTE correct |\n", 65 | "|`acstools`        |`utils_calib`| link| check for settings in OSCNTAB |" 66 | ] 67 | }, 68 | { 69 | "cell_type": "code", 70 | "execution_count": null, 71 | "metadata": {}, 72 | "outputs": [], 73 | "source": [ 74 | "import os\n", 75 | "import shutil\n", 76 | "import glob\n", 77 | "\n", 78 | "from astroquery.mast import Observations\n", 79 | "from astropy.io import fits\n", 80 | "from astropy.table import Table\n", 81 | "\n", 82 | "from stwcs import updatewcs\n", 83 | "from acstools import (acs_destripe_plus, utils_calib)" 84 | ] 85 | }, 86 | { 87 | "cell_type": "markdown", 88 | "metadata": {}, 89 | "source": [ 90 | "Here we set environment variables for later use with the Calibration Reference Data System (CRDS)."
91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "execution_count": null, 96 | "metadata": {}, 97 | "outputs": [], 98 | "source": [ 99 | "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", 100 | "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n", 101 | "os.environ['CRDS_PATH'] = './crds_cache'\n", 102 | "os.environ['jref'] = './crds_cache/references/hst/acs/'" 103 | ] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "metadata": {}, 108 | "source": [ 109 | "## Download the Data\n", 110 | "\n", 111 | "***\n", 112 | "\n", 113 | "Here we download all of the data required for this notebook. This is an important step! Some of the image processing steps require all relevant files to be in the working directory. We recommend working with a brand new directory for every new set of data.\n", 114 | "\n", 115 | "#### [GO Proposal 14511](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=14511): \"ACS CCD Stability Monitor\"\n", 116 | "\n", 117 | "Our data for the first example comes from a routine calibration program of two images using the post-Cycle 24 WFC1A-1K 1024 x 2072 subarray. We will only use the data associated with the observation id `JD5702JWQ`.\n", 118 | "\n", 119 | "Using the python package `astroquery`, we can download files from the [MAST](http://archive.stsci.edu) archive.\n", 120 | "\n", 121 | "
\n", 122 | "MAY CHANGE: The argument \"mrp_only\" stands for \"minimum recommended products only\". It currently needs to be set to False, although in the future, False is intended to be set as the default and can be left out.\n", 123 | "
" 124 | ] 125 | }, 126 | { 127 | "cell_type": "code", 128 | "execution_count": null, 129 | "metadata": {}, 130 | "outputs": [], 131 | "source": [ 132 | "obs_table = Observations.query_criteria(proposal_id=14511, obs_id='JD5702JWQ')\n", 133 | "\n", 134 | "dl_table = Observations.download_products(obs_table['obsid'],\n", 135 | " productSubGroupDescription=['RAW'],\n", 136 | " mrp_only=False)" 137 | ] 138 | }, 139 | { 140 | "cell_type": "markdown", 141 | "metadata": {}, 142 | "source": [ 143 | "We'll use the package `os` to put all of these files in our working directory for convenience." 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": null, 149 | "metadata": {}, 150 | "outputs": [], 151 | "source": [ 152 | "for row in dl_table:\n", 153 | " oldfname = row['Local Path']\n", 154 | " newfname = os.path.basename(oldfname)\n", 155 | " os.rename(oldfname, newfname)" 156 | ] 157 | }, 158 | { 159 | "cell_type": "markdown", 160 | "metadata": {}, 161 | "source": [ 162 | "#### [GO Proposal 10206](https://stdatu.stsci.edu/proposal_search.php?mission=hst&id=10206): \"What drives the outflows in powerful radio galaxies?\"\n", 163 | "\n", 164 | "For the second example, we will use a ramp filter observation of the galaxy PLS1345+12 (*HST* proposal 10206). The association name is J92SA0010, and we will only use one image in the association: J92SA0W6Q.\n", 165 | "\n", 166 | "Again, we use `astroquery`, to download files from the [MAST](http://archive.stsci.edu) archive. Along with the raw images, we will also need the spt (telemetry) files to reduce our custom subarray data." 
167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": null, 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [ 175 | "obs_table = Observations.query_criteria(proposal_id=10206, obs_id='J92SA0010')\n", 176 | "\n", 177 | "dl_table = Observations.download_products(obs_table['obsid'],\n", 178 | " productSubGroupDescription=['RAW', 'SPT'],\n", 179 | " mrp_only=False)" 180 | ] 181 | }, 182 | { 183 | "cell_type": "markdown", 184 | "metadata": {}, 185 | "source": [ 186 | "Again, we'll use the package `os` to put all of these files in our working directory for convenience." 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "for row in dl_table:\n", 196 | " oldfname = row['Local Path']\n", 197 | " newfname = os.path.basename(oldfname)\n", 198 | " os.rename(oldfname, newfname)" 199 | ] 200 | }, 201 | { 202 | "cell_type": "markdown", 203 | "metadata": {}, 204 | "source": [ 205 | "Now that all of our files are in the current working directory, we delete the leftover MAST file structure." 206 | ] 207 | }, 208 | { 209 | "cell_type": "code", 210 | "execution_count": null, 211 | "metadata": {}, 212 | "outputs": [], 213 | "source": [ 214 | "shutil.rmtree('mastDownload')" 215 | ] 216 | }, 217 | { 218 | "cell_type": "markdown", 219 | "metadata": {}, 220 | "source": [ 221 | "### File Information\n", 222 | "\n", 223 | "The structure of the fits files from ACS may be different depending on what kind of observation was made. 
For more information, refer to Section 2.2 of the [ACS Data Handbook](https://hst-docs.stsci.edu/display/ACSIHB).\n", 224 | "\n", 225 | "#### Raw Files\n", 226 | "\n", 227 | "A standard raw image file from a subarray has the same structure as you'd expect from a full-frame observation from ACS/WFC.\n", 228 | "\n", 229 | "| Ext    | Name             | Type         | Contains                                                |\n", 230 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 231 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 232 | "|1| SCI (Image) | (ImageHDU) | Raw image data. |\n", 233 | "|2| ERR (Error) | (ImageHDU) | Error array. |\n", 234 | "|3| DQ (Data Quality)| (ImageHDU) | Data quality array. |\n", 235 | "\n", 236 | "#### SPT Files\n", 237 | "\n", 238 | "SPT files contain telemetry and engineering data from the telescope.\n", 239 | "\n", 240 | "| Ext    | Name             | Type         | Contains                                                |\n", 241 | "|--------|------------------|--------------|:-------------------------------------------------------|\n", 242 | "|0| PRIMARY | (PrimaryHDU) | Meta-data related to the entire file. |\n", 243 | "|1| UDL (Image) | (ImageHDU) | Telemetry and engineering data. 
|" 244 | ] 245 | }, 246 | { 247 | "cell_type": "markdown", 248 | "metadata": {}, 249 | "source": [ 250 | "You can always use `.info()` on an HDUlist for an overview of the structure" 251 | ] 252 | }, 253 | { 254 | "cell_type": "code", 255 | "execution_count": null, 256 | "metadata": {}, 257 | "outputs": [], 258 | "source": [ 259 | "with fits.open('j92sa0w6q_raw.fits') as hdulist:\n", 260 | " hdulist.info()\n", 261 | " \n", 262 | "with fits.open('j92sa0w6q_spt.fits') as hdulist:\n", 263 | " hdulist.info()" 264 | ] 265 | }, 266 | { 267 | "cell_type": "markdown", 268 | "metadata": {}, 269 | "source": [ 270 | "## De-striping and CTE Correction of Post-SM4 Subarray Observations\n", 271 | "\n", 272 | "***\n", 273 | "\n", 274 | "#### Download Calibration Files: Update Header Keywords\n", 275 | "\n", 276 | "We can call the [Calibration Reference Data System](https://hst-crds.stsci.edu/) (CRDS) to get the associated calibration files for this image.\n", 277 | "\n", 278 | "First we will need to turn on the CTE correction switch in the primary header. Turning on this switch will notify the [CRDS](https://hst-crds.stsci.edu) `bestrefs` tool to add the names of the CTE correction parameters table '`PCTETAB`' and CTE-corrected dark current image '`DRKCFILE`' reference files to the header." 279 | ] 280 | }, 281 | { 282 | "cell_type": "code", 283 | "execution_count": null, 284 | "metadata": {}, 285 | "outputs": [], 286 | "source": [ 287 | "sbc_fits = 'jd5702jwq_raw.fits'\n", 288 | "\n", 289 | "with fits.open(sbc_fits, mode='update') as hdulist:\n", 290 | " hdulist[0].header['PCTECORR'] = 'PERFORM'" 291 | ] 292 | }, 293 | { 294 | "cell_type": "markdown", 295 | "metadata": {}, 296 | "source": [ 297 | "#### Download Calibration Files\n", 298 | "\n", 299 | "The CRDS program can be run from the terminal to download the files specific to your observation. This program is packaged with astroconda. 
\n", 300 | "\n", 301 | "The command line input to download our files for `jd5702jwq_raw.fits` is as follows:\n", 302 | "\n", 303 | " crds bestrefs --files jd5702jwq_raw.fits --sync-references=1 --update-bestrefs\n", 304 | " \n", 305 | "Here, we use the python package `os` to run this command." 306 | ] 307 | }, 308 | { 309 | "cell_type": "code", 310 | "execution_count": null, 311 | "metadata": {}, 312 | "outputs": [], 313 | "source": [ 314 | "cmd_input = 'crds bestrefs --files {:} --sync-references=1 --update-bestrefs'.format(sbc_fits)\n", 315 | "os.system(cmd_input);" 316 | ] 317 | }, 318 | { 319 | "cell_type": "markdown", 320 | "metadata": {}, 321 | "source": [ 322 | "#### Clean Subarray Images\n", 323 | "\n", 324 | "Next we will run the [acs_destripe_plus](http://acstools.readthedocs.io/en/stable/acs_destripe_plus.html) code on our image. This will execute all of the calibration steps that are set to 'PERFORM' in the primary header of the FITS file. \n", 325 | "\n", 326 | "The `acs_destripe_plus` code will produce the FLT file is the calibrated output product from `CALACS`, `jd5702jmq_flt.fits`. With the CTE correction turned on, an FLC file will also be produced, which is the same as the FLT file but with the CTE correction applied." 327 | ] 328 | }, 329 | { 330 | "cell_type": "code", 331 | "execution_count": null, 332 | "metadata": {}, 333 | "outputs": [], 334 | "source": [ 335 | "acs_destripe_plus.destripe_plus(sbc_fits, cte_correct=True)" 336 | ] 337 | }, 338 | { 339 | "cell_type": "markdown", 340 | "metadata": {}, 341 | "source": [ 342 | "#### Correct the WCS\n", 343 | "\n", 344 | "The subarray products produced by this process do not have the proper WCS information in the header. The WCS is normally updated by the pipeline via an additional call to AstroDrizzle. Here, we can manually update the WCS of our FLC product using `stwcs.updatewcs`." 
345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": null, 350 | "metadata": {}, 351 | "outputs": [], 352 | "source": [ 353 | "updatewcs.updatewcs('jd5702jwq_flc.fits', use_db=False)" 354 | ] 355 | }, 356 | { 357 | "cell_type": "markdown", 358 | "metadata": {}, 359 | "source": [ 360 | "## Reducing Custom Subarray Data\n", 361 | "\n", 362 | "***\n", 363 | "\n", 364 | "#### Download Calibration Files\n", 365 | "\n", 366 | "Like before, we can use CRDS to get the associated calibration files for this image." 367 | ] 368 | }, 369 | { 370 | "cell_type": "code", 371 | "execution_count": null, 372 | "metadata": {}, 373 | "outputs": [], 374 | "source": [ 375 | "sbc_fits = 'j92sa0w6q_raw.fits'\n", 376 | "cmd_input = 'crds bestrefs --files {:} --sync-references=1 --update-bestrefs'.format(sbc_fits)\n", 377 | "os.system(cmd_input);" 378 | ] 379 | }, 380 | { 381 | "cell_type": "markdown", 382 | "metadata": {}, 383 | "source": [ 384 | "#### Access OSCN Table\n", 385 | "\n", 386 | "First, we read the primary and SCI extension headers of the raw image. The `OSCNTAB` file name stored in the header has the format 'jref\\\\$oscn_name.fits', so we will need to split the string on the '$' character to recover the file name." 387 | ] 388 | }, 389 | { 390 | "cell_type": "code", 391 | "execution_count": null, 392 | "metadata": {}, 393 | "outputs": [], 394 | "source": [ 395 | "prihdr = fits.getheader(sbc_fits)\n", 396 | "scihdr = fits.getheader(sbc_fits, ext=1)" 397 | ] 398 | }, 399 | { 400 | "cell_type": "markdown", 401 | "metadata": {}, 402 | "source": [ 403 | "We also want to retrieve the `OSCNTAB` reference file from the JREF directory. We can get the name of the file from the primary header of the image." 
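,
 "\n",
 "\n",
 "As a sketch of the string handling (the file name below is made up for illustration; your header will contain a different value):\n",
 "\n",
 "```python\n",
 "example_value = 'jref$q911321mj_osc.fits'  # hypothetical OSCNTAB header value\n",
 "example_name = example_value.split('$')[-1]\n",
 "print(example_name)  # q911321mj_osc.fits\n",
 "```"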
404 | ] 405 | }, 406 | { 407 | "cell_type": "code", 408 | "execution_count": null, 409 | "metadata": {}, 410 | "outputs": [], 411 | "source": [ 412 | "oscn_name = prihdr['OSCNTAB'].split('$')[-1]\n", 413 | "path_to_oscn = os.path.join(os.environ['jref'], oscn_name)\n", 414 | "\n", 415 | "print(path_to_oscn)" 416 | ] 417 | }, 418 | { 419 | "cell_type": "markdown", 420 | "metadata": {}, 421 | "source": [ 422 | "The `utils_calib.check_oscntab` function from [acstools](http://acstools.readthedocs.io) checks whether any entry in the `OSCNTAB` file matches the combination of readout amplifier, image size, and number of bias prescan columns for a given subarray observation. We will need several header keyword values to test whether a subarray is in `OSCNTAB`." 423 | ] 424 | }, 425 | { 426 | "cell_type": "code", 427 | "execution_count": null, 428 | "metadata": {}, 429 | "outputs": [], 430 | "source": [ 431 | "oscnrec = fits.getdata(path_to_oscn)\n", 432 | "oscnhdr = fits.getheader(path_to_oscn)\n", 433 | "\n", 434 | "oscntable = Table(oscnrec)" 435 | ] 436 | }, 437 | { 438 | "cell_type": "markdown", 439 | "metadata": {}, 440 | "source": [ 441 | "The raw image has 1180 columns and 1200 rows, which does not correspond to any entry in the `OSCNTAB` file, but a visual examination of the image shows that it contains bias prescan columns." 442 | ] 443 | }, 444 | { 445 | "cell_type": "markdown", 446 | "metadata": {}, 447 | "source": [ 448 | "#### Get the Bias Prescan Information\n", 449 | "\n", 450 | "From the primary and SCI extension headers of the science image that we opened earlier, we can get information about the readout amplifier and the dimensions of the image." 
451 | ] 452 | }, 453 | { 454 | "cell_type": "code", 455 | "execution_count": null, 456 | "metadata": {}, 457 | "outputs": [], 458 | "source": [ 459 | "amp = prihdr['CCDAMP']\n", 460 | "xsize = scihdr['NAXIS1']\n", 461 | "ysize = scihdr['NAXIS2']\n", 462 | "\n", 463 | "print(xsize, ysize)" 464 | ] 465 | }, 466 | { 467 | "cell_type": "markdown", 468 | "metadata": {}, 469 | "source": [ 470 | "To get information on the number of prescan columns (if any), we need to access the SPT first extension header." 471 | ] 472 | }, 473 | { 474 | "cell_type": "code", 475 | "execution_count": null, 476 | "metadata": {}, 477 | "outputs": [], 478 | "source": [ 479 | "spthdr = fits.getheader('j92sa0w6q_spt.fits', ext=1)\n", 480 | "\n", 481 | "leading = spthdr['OVERSCNA']\n", 482 | "trailing = spthdr['OVERSCNB']" 483 | ] 484 | }, 485 | { 486 | "cell_type": "markdown", 487 | "metadata": {}, 488 | "source": [ 489 | "Finally, we check whether this subarray definition is in the `OSCNTAB` file. The function returns a boolean result, which we save in the variable `supported`." 490 | ] 491 | }, 492 | { 493 | "cell_type": "code", 494 | "execution_count": null, 495 | "metadata": {}, 496 | "outputs": [], 497 | "source": [ 498 | "supported = utils_calib.check_oscntab(path_to_oscn, amp, xsize, ysize, leading, trailing)\n", 499 | "print(supported)" 500 | ] 501 | }, 502 | { 503 | "cell_type": "markdown", 504 | "metadata": {}, 505 | "source": [ 506 | "#### Update OSCNTAB\n", 507 | "\n", 508 | "Now that we have confirmed that the `OSCNTAB` file does not contain information about our subarray data, we need to add a new row to the table with our definitions. Let's first view the first few rows of `OSCNTAB` to see what our new entry needs to look like." 
509 | ] 510 | }, 511 | { 512 | "cell_type": "code", 513 | "execution_count": null, 514 | "metadata": {}, 515 | "outputs": [], 516 | "source": [ 517 | "print(oscntable)" 518 | ] 519 | }, 520 | { 521 | "cell_type": "markdown", 522 | "metadata": {}, 523 | "source": [ 524 | "We can also choose to view all of the column names." 525 | ] 526 | }, 527 | { 528 | "cell_type": "code", 529 | "execution_count": null, 530 | "metadata": {}, 531 | "outputs": [], 532 | "source": [ 533 | "print(oscntable.colnames)" 534 | ] 535 | }, 536 | { 537 | "cell_type": "markdown", 538 | "metadata": {}, 539 | "source": [ 540 | "Several column names are obvious, but here we define the less obvious ones. \n", 541 | "\n", 542 | "| Column Name | Description |\n", 543 | "| --- | --- |\n", 544 | "| BINX, BINY | Binning in X and Y. ACS data are never binned, so these will always be 1. |\n", 545 | "| TRIMXn | Number of prescan columns on the left (1) and right (2) sides of the image to remove. |\n", 546 | "| TRIMYn | Number of virtual rows on the bottom (1) and top (2) sides of the image to remove. For subarray data, these are always 0. |\n", 547 | "| BIASSECTAn | Start and end columns to use for the bias level estimation on the left (A) side of the image. |\n", 548 | "| BIASSECTBn | Start and end columns to use for the bias level estimation on the right (B) side of the image. |\n", 549 | "| VXn, VYn | The coordinates of the bottom-left (VX1, VY1) and top-right (VX2, VY2) corners of the virtual overscan region. |" 550 | ] 551 | }, 552 | { 553 | "cell_type": "markdown", 554 | "metadata": {}, 555 | "source": [ 556 | "The following line sets `chip` to 1 if the subarray is on WFC1, and 2 if the subarray is on WFC2." 
557 | ] 558 | }, 559 | { 560 | "cell_type": "code", 561 | "execution_count": null, 562 | "metadata": {}, 563 | "outputs": [], 564 | "source": [ 565 | "chip = 1 if amp in ['A', 'B'] else 2" 566 | ] 567 | }, 568 | { 569 | "cell_type": "markdown", 570 | "metadata": {}, 571 | "source": [ 572 | "For the BIASSECTAn and BIASSECTBn values, we want to use the six columns of prescan nearest the exposed area of the CCD. " 573 | ] 574 | }, 575 | { 576 | "cell_type": "code", 577 | "execution_count": null, 578 | "metadata": {}, 579 | "outputs": [], 580 | "source": [ 581 | "bias_a = [0, 0]\n", 582 | "if leading > 0:\n", 583 | " bias_a[1] = leading\n", 584 | " bias_a[0] = leading-5 if leading > 5 else 0\n", 585 | " \n", 586 | "bias_b = [0, 0]\n", 587 | "if trailing > 0:\n", 588 | " bias_b[0] = xsize+1\n", 589 | " bias_b[1] = xsize+5 if trailing > 5 else xsize+trailing" 590 | ] 591 | }, 592 | { 593 | "cell_type": "markdown", 594 | "metadata": {}, 595 | "source": [ 596 | "Now we can define a new row for our settings. For subarray data, as there is no virtual overscan, the VXn and VYn values will always be 0. Here we use a dictionary to explicitly define the new values." 
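,
 "\n",
 "\n",
 "Before filling in the table entry, it can help to sanity check the bias-section arithmetic from the previous cell (the `leading` value here is made up for illustration):\n",
 "\n",
 "```python\n",
 "leading = 22  # hypothetical number of leading prescan columns\n",
 "bias_a = [leading-5, leading]\n",
 "print(bias_a)  # [17, 22] -> six columns: 17 through 22\n",
 "```"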
597 | ] 598 | }, 599 | { 600 | "cell_type": "code", 601 | "execution_count": null, 602 | "metadata": {}, 603 | "outputs": [], 604 | "source": [ 605 | "new_row = {'CCDAMP':amp,\n", 606 | " 'CCDCHIP':chip, \n", 607 | " 'BINX':1, \n", 608 | " 'BINY':1, \n", 609 | " 'NX':xsize,\n", 610 | " 'NY':ysize, \n", 611 | " 'TRIMX1':leading, \n", 612 | " 'TRIMX2':trailing, \n", 613 | " 'TRIMY1':0, \n", 614 | " 'TRIMY2':0, \n", 615 | " 'BIASSECTA1':bias_a[0], \n", 616 | " 'BIASSECTA2':bias_a[1], \n", 617 | " 'BIASSECTB1':bias_b[0], \n", 618 | " 'BIASSECTB2':bias_b[1], \n", 619 | " 'DESCRIPTION':'Custom OSCN', \n", 620 | " 'VX1':0, \n", 621 | " 'VX2':0, \n", 622 | " 'VY1':0, \n", 623 | " 'VY2':0}" 624 | ] 625 | }, 626 | { 627 | "cell_type": "markdown", 628 | "metadata": {}, 629 | "source": [ 630 | "Now we need to open the custom `OSCNTAB` file and update the table to have our new definition. We will also need to update the raw FITS file to point to the new custom `OSCNTAB` reference file." 631 | ] 632 | }, 633 | { 634 | "cell_type": "code", 635 | "execution_count": null, 636 | "metadata": {}, 637 | "outputs": [], 638 | "source": [ 639 | "oscntable.add_row(new_row)\n", 640 | "print(oscntable)" 641 | ] 642 | }, 643 | { 644 | "cell_type": "markdown", 645 | "metadata": {}, 646 | "source": [ 647 | "Now we have an identical FITS rec table, but with an additional row for our new information." 648 | ] 649 | }, 650 | { 651 | "cell_type": "code", 652 | "execution_count": null, 653 | "metadata": {}, 654 | "outputs": [], 655 | "source": [ 656 | "oscnrec_new = fits.FITS_rec.from_columns(oscnrec.columns, \n", 657 | " nrows=len(oscntable))\n", 658 | "print(Table(oscnrec_new))" 659 | ] 660 | }, 661 | { 662 | "cell_type": "markdown", 663 | "metadata": {}, 664 | "source": [ 665 | "Let's populate that last row with our new data!" 
666 | ] 667 | }, 668 | { 669 | "cell_type": "code", 670 | "execution_count": null, 671 | "metadata": {}, 672 | "outputs": [], 673 | "source": [ 674 | "oscnrec_new[-1] = tuple(oscntable[-1])" 675 | ] 676 | }, 677 | { 678 | "cell_type": "markdown", 679 | "metadata": {}, 680 | "source": [ 681 | "Reprint the table and check that we have entered it correctly." 682 | ] 683 | }, 684 | { 685 | "cell_type": "code", 686 | "execution_count": null, 687 | "metadata": {}, 688 | "outputs": [], 689 | "source": [ 690 | "print(Table(oscnrec_new))" 691 | ] 692 | }, 693 | { 694 | "cell_type": "markdown", 695 | "metadata": {}, 696 | "source": [ 697 | "Now we can open our `OSCNTAB` file and replace the old FITS rec with our new one." 698 | ] 699 | }, 700 | { 701 | "cell_type": "code", 702 | "execution_count": null, 703 | "metadata": {}, 704 | "outputs": [], 705 | "source": [ 706 | "with fits.open(path_to_oscn, mode='update') as hdu:\n", 707 | " hdu[1].data = oscnrec_new" 708 | ] 709 | }, 710 | { 711 | "cell_type": "markdown", 712 | "metadata": {}, 713 | "source": [ 714 | "Using the same check we did earlier, we can use `check_oscntab` to see if our settings are defined. If everything was done correctly, the following line should print \"True\"!" 715 | ] 716 | }, 717 | { 718 | "cell_type": "code", 719 | "execution_count": null, 720 | "metadata": {}, 721 | "outputs": [], 722 | "source": [ 723 | "supported = utils_calib.check_oscntab(path_to_oscn, amp, xsize, ysize, leading, trailing)\n", 724 | "\n", 725 | "print('Defined in OSCNTAB? 
{}'.format(supported))" 726 | ] 727 | }, 728 | { 729 | "cell_type": "markdown", 730 | "metadata": {}, 731 | "source": [ 732 | "### For more help:\n", 733 | "\n", 734 | "More details may be found on the [ACS website](http://www.stsci.edu/hst/instrumentation/acs) and in the [ACS Instrument](https://hst-docs.stsci.edu/display/ACSIHB) and [Data Handbooks](http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/other-documents/_documents/acs_dhb.pdf).\n", 735 | "\n", 736 | "Please visit the [HST Help Desk](http://hsthelp.stsci.edu). Through the help desk portal, you can explore the *HST* Knowledge Base and request additional help from experts." 737 | ] 738 | } 739 | ], 740 | "metadata": { 741 | "kernelspec": { 742 | "display_name": "Python 3 (ipykernel)", 743 | "language": "python", 744 | "name": "python3" 745 | }, 746 | "language_info": { 747 | "codemirror_mode": { 748 | "name": "ipython", 749 | "version": 3 750 | }, 751 | "file_extension": ".py", 752 | "mimetype": "text/x-python", 753 | "name": "python", 754 | "nbconvert_exporter": "python", 755 | "pygments_lexer": "ipython3", 756 | "version": "3.9.13" 757 | } 758 | }, 759 | "nbformat": 4, 760 | "nbformat_minor": 2 761 | } 762 | -------------------------------------------------------------------------------- /acs_subarrays/requirements.txt: -------------------------------------------------------------------------------- 1 | # The required packages for the ACS Subarray notebook 2 | 3 | astroquery 4 | astropy 5 | stwcs 6 | acstools 7 | --------------------------------------------------------------------------------