├── CatSim
├── .gitignore
├── CatSimTutorial_SimulationsAHM_1503.ipynb
├── LightCurveExample.ipynb
├── SignalToNoise_151207.ipynb
├── connection2UWDB_150504.ipynb
├── coordinateTransformations_150427.ipynb
├── custom_agn_example.ipynb
├── generateAgnCatalog_150309.ipynb
└── sed_photometry_demo.ipynb
├── DM
├── .gitignore
├── LSST_detection_measurement.ipynb
└── PHOSIM_DM_startup.md
├── MAF
├── .gitignore
├── Hour Angle MAF example.ipynb
├── RotSkyPos MAF Tutorial.ipynb
├── RotSkyPos MAF Tutorial.v3.ipynb
└── Transient_Example.ipynb
├── README.md
└── SE
└── Calculating SNR.ipynb
/CatSim/.gitignore:
--------------------------------------------------------------------------------
1 | #just an empty file so this directory shows up
2 | *.txt
3 | *.fits
4 | *.dat
5 |
--------------------------------------------------------------------------------
/CatSim/CatSimTutorial_SimulationsAHM_1503.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "First, some philosophy:\n",
8 | "\n",
9 | "The CatSim stack principally exists to create catalogs from simulated universes. The default simulated universe that CatSim accesses lives on a machine at the University of Washington called \"fatboy\". This simulated universe is really a database that consists of a distribution of galaxies drawn from the Millennium N-body simulation, and Milky Way stars generated with the GalFast software. The most up-to-date documentation of what fatboy provides is here\n",
10 | "\n",
11 | "https://confluence.lsstcorp.org/display/SIM/Database+Contents+--+Catalog+Simulations\n",
12 | "\n",
13 | "https://confluence.lsstcorp.org/display/SIM/Database+Schema\n",
14 | "\n",
15 | "Whenever this notebook refers to \"querying the database,\" it is referring to this database on fatboy (it is possible, in principle, to connect CatSim to your own database containing your own simulated universe; that is beyond the scope of this demo).\n",
16 | "\n",
17 | "See the link below for instructions on how to access the fatboy database\n",
18 | "\n",
19 | "https://confluence.lsstcorp.org/display/SIM/Accessing+the+UW+CATSIM+Database\n",
20 | "\n",
21 | "Even though catalogs are generated by querying fatboy's database, CatSim is designed so that the user should never have to write any raw SQL queries. This is due to the way the catalog-generating classes in CatSim have been written. In broad strokes:\n",
22 | "\n",
23 | "* The user instantiates a daughter of the `InstanceCatalog` class which is in charge of actually writing the catalog. This is the class that contains information regarding what astronomical objects and what data regarding those objects should be written to the catalog.\n",
24 | "\n",
25 | "\n",
26 | "* The user passes the `InstanceCatalog` an instantiation of a daughter of the `CatalogDBObject` class. The `CatalogDBObject` class is the class which actually manipulates the connection to fatboy. Specific `CatalogDBObject` classes are written to connect to specific tables in the fatboy database. Thus, there is one `CatalogDBObject` class for galaxies, a different `CatalogDBObject` class for Solar System objects, a different `CatalogDBObject` for main sequence stars, a different `CatalogDBObject` for white dwarfs, etc. The full list of available `CatalogDBObject` daughter classes is here https://confluence.lsstcorp.org/display/SIM/Database+Schema. `CatalogDBObject` daughter classes serve one other purpose: naming conventions for data columns vary between fatboy and the `InstanceCatalog` classes (and, indeed, between tables in fatboy). Declination is called `dec` in the galaxy table but `decl` in the star tables. `CatalogDBObject` classes provide simple mappings that smooth over these differences and put all of the database columns into a uniform syntax.\n",
27 | "\n",
28 | "\n",
29 | "* The user also passes the `InstanceCatalog` an instantiation of the `ObservationMetaData` class. This is the class which contains data about the state of the simulated telescope. For the purposes of generating a catalog, the `ObservationMetaData` provides the RA and Dec at which the telescope is pointed as well as the size and shape of its field of view (i.e. it controls the question \"which objects in fatboy's database are actually seen by my telescope and thus should be written to my catalog?\").\n",
30 | "\n",
31 | "When you ask your `InstanceCatalog` to write itself out, what actually happens is that the `CatalogDBObject` performs a query on fatboy using information from the `ObservationMetaData` to tell it which objects to return and information from the `InstanceCatalog` to tell it what data columns associated with those objects to return.\n",
32 | "\n",
33 | "Let's do that now.\n",
34 | "\n",
35 | "PS: make sure you have set up all of the sims packages before attempting to run this notebook. If you haven't, exit the notebook and run\n",
36 | "\n",
37 | " source loadLSST.bash\n",
38 | " setup sims_catUtils -t sims\n",
39 | " setup sims_GalSimInterface -t sims\n",
40 | " \n",
41 | "in the shell from which you want to run this notebook\n",
42 | "\n",
43 | "PPS: if this notebook does not answer your questions, you can find more tutorials in \n",
44 | "\n",
45 | " ~lsst_home/yourOperatingSystem/sims_catUtils/yourVersion/examples/tutorials/\n",
46 | "\n",
47 | "where `yourOperatingSystem` will be 'DarwinX86' for Mac users and 'Linux64' for Linux users. `yourVersion` will be a Git SHA-1 indicating the state of the `sims_catUtils` code when you installed it."
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "execution_count": null,
53 | "metadata": {
54 | "collapsed": true,
55 | "scrolled": true
56 | },
57 | "outputs": [],
58 | "source": [
59 | "import numpy\n",
60 | "from lsst.sims.catalogs.definitions import InstanceCatalog\n",
61 | "\n",
62 | "class simpleStarCatalog(InstanceCatalog):\n",
63 | " column_outputs = ['raJ2000', 'decJ2000', 'sedFilename']\n",
64 | " \n",
65 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}\n",
66 | "\n",
67 | "from lsst.sims.utils import ObservationMetaData\n",
68 | "\n",
69 | "myObsMetadata = ObservationMetaData(pointingRA=45.0, pointingDec=-10.0,\n",
70 | " boundType='circle', boundLength=0.02)\n",
71 | "\n",
72 | "from lsst.sims.catUtils.baseCatalogModels import StarObj\n",
73 | "\n",
74 | "starTableConnection = StarObj()\n",
75 | "\n",
76 | "myCatalog = simpleStarCatalog(starTableConnection,\n",
77 | " obs_metadata = myObsMetadata)\n",
78 | "\n",
79 | "myCatalog.write_catalog('test_catalog.txt')\n",
80 | "\n",
81 | "readCatalog = open('test_catalog.txt', 'r').readlines()\n",
82 | "\n",
83 | "!cat test_catalog.txt"
84 | ]
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 | "So what just happened?\n",
91 | "\n",
92 | " import numpy\n",
93 | "    from lsst.sims.catalogs.definitions import InstanceCatalog\n",
94 | " class simpleStarCatalog(InstanceCatalog):\n",
95 | "\n",
96 | " column_outputs = ['raJ2000', 'decJ2000', 'sedFilename']\n",
97 | " \n",
98 | " transformations = {'raJ2000':numpy.degrees,\n",
99 | " 'decJ2000':numpy.degrees}\n",
100 | "\n",
101 | "Here we defined a daughter class of `InstanceCatalog`. We told it what columns we wanted the catalog to contain by defining the `column_outputs` member variable, and we told it that we wanted RA and Dec to be written in degrees using the `transformations` member variable. Note: CatSim philosophy is that angles are all handled internally in radians but externally in degrees. Thus, any time you have to provide an angle, it should be in degrees. However, the angles are immediately translated into radians inside the code. Because our `InstanceCatalog` daughter class takes those internally-handled angles and writes them to disk, we have to explicitly convert them to degrees.\n",
102 | "\n",
103 | " from lsst.sims.utils import ObservationMetaData\n",
104 | "    myObsMetadata = ObservationMetaData(pointingRA=45.0,\n",
105 | "                                        pointingDec=-10.0,\n",
106 | " boundType='circle', boundLength=0.02)\n",
107 | " \n",
108 | "Here, we constructed an `ObservationMetaData` object centered on (RA, Dec) = (45, -10) in degrees (because this is an angle exposed to the user). We also told it that we were going to want a circular field of view with a radius of 0.02 degrees. Currently, `ObservationMetaData` also supports `boundType='box'`, in which case `boundLength` is either the half-side length of a square or a tuple `(halfSideLengthRA, halfSideLengthDec)`; a short sketch of the box option follows this cell.\n",
109 | "\n",
110 | " from lsst.sims.catUtils.baseCatalogModels import StarObj\n",
111 | " starTableConnection = StarObj()\n",
112 | "\n",
113 | "This is where we instantiated our `CatalogDBObject` daughter class. In this case, we instantiated the class `StarObj` which, we see from\n",
114 | "\n",
115 | "https://confluence.lsstcorp.org/display/SIM/Database+Schema\n",
116 | "\n",
117 | "provides a union of main sequence, red giant branch, white dwarf, RR Lyrae, and blue horizontal branch stars.\n",
118 | "\n",
119 | "We combined the `InstanceCatalog`, `CatalogDBObject`, and `ObservationMetaData` by instantiating our catalog class, passing the `CatalogDBObject` as an argument and the `ObservationMetaData` as a `kwarg`.\n",
120 | "\n",
121 | " myCatalog = simpleStarCatalog(starTableConnection,\n",
122 | " obs_metadata = myObsMetadata)\n",
123 | "\n",
124 | "This ultimately allowed us to call `myCatalog.write_catalog('test_catalog.txt')` and write the catalog to the file `test_catalog.txt`. This is all one absolutely has to do to get a simulated catalog out of CatSim."
125 | ]
126 | },
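  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick illustration of the `boundType='box'` option mentioned above, the snippet below sketches an `ObservationMetaData` with a square field of view. This is a minimal sketch; `boxObsMetadata` is not used anywhere else in this notebook.\n",
    "\n",
    "    from lsst.sims.utils import ObservationMetaData\n",
    "\n",
    "    # a square field of view with a half-side length of 0.02 degrees\n",
    "    boxObsMetadata = ObservationMetaData(pointingRA=45.0, pointingDec=-10.0,\n",
    "                                         boundType='box', boundLength=0.02)\n",
    "\n",
    "Passing a tuple instead, e.g. `boundLength=(0.02, 0.01)`, would specify different half-side lengths in RA and Dec."
   ]
  },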
127 | {
128 | "cell_type": "markdown",
129 | "metadata": {},
130 | "source": [
131 | "Note: `InstanceCatalog` is associated with a meta-class that keeps a registry of all defined `InstanceCatalog` daughter classes. This meta-class will throw an exception if you try to run the cell above (or any cell that defines an `InstanceCatalog` daughter class) more than once. It will warn you that you are trying to define a class that has already been defined. If you need to re-run a catalog-class-defining cell, you will have to restart the kernel of this IPython notebook. The same warning applies for cells that define `CatalogDBObject` daughter classes."
132 | ]
133 | },
134 | {
135 | "cell_type": "markdown",
136 | "metadata": {},
137 | "source": [
138 | "# Where do InstanceCatalogs get their data?"
139 | ]
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 | "In the simple example above, all of the data in our catalog was provided by the `CatalogDBObject` class. The database on fatboy contains very basic information about all of the objects in our simulated universe. The class `StarObj` contains simple functions that map that data into the syntax of our `InstanceCatalog`. Calling `write_catalog()` in this case literally just performed an SQL query and returned the results. Indeed, we can see this by doing"
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "execution_count": null,
151 | "metadata": {
152 | "collapsed": true,
153 | "scrolled": true
154 | },
155 | "outputs": [],
156 | "source": [
157 | "starTableConnection.show_mapped_columns()"
158 | ]
159 | },
160 | {
161 | "cell_type": "markdown",
162 | "metadata": {},
163 | "source": [
164 | "This command prints to screen all of the database columns that `starTableConnection` natively provides and, indeed, we see that `raJ2000`, `decJ2000`, and `sedFilename` all exist in the (mapped) database table provided by `starTableConnection`. Suppose, however, that we wanted to include data not provided by the `CatalogDBObject` class. For example, the star database tables contain mean RA and Dec and proper motions for all of the stars in fatboy. What if we want to write a catalog that writes out stars at their proper-motion-corrected positions on the sky? In that case, we add the new data columns by adding getter methods to our `InstanceCatalog` daughter class."
165 | ]
166 | },
167 | {
168 | "cell_type": "code",
169 | "execution_count": null,
170 | "metadata": {
171 | "collapsed": true,
172 | "scrolled": true
173 | },
174 | "outputs": [],
175 | "source": [
176 | "class demoProperMotionCatalog(InstanceCatalog):\n",
177 | " column_outputs = ['raJ2000', 'decJ2000', 'correctedRA', 'correctedDec']\n",
178 | "\n",
179 | " transformations = {'raJ2000':numpy.degrees,\n",
180 | " 'decJ2000':numpy.degrees,\n",
181 | " 'correctedRA':numpy.degrees,\n",
182 | " 'correctedDec':numpy.degrees}\n",
183 | " \n",
184 | " def get_correctedRA(self):\n",
185 | " dt = self.obs_metadata.mjd.TAI - 51544.0\n",
186 | " ra = self.column_by_name('raJ2000')\n",
187 | " speed = self.column_by_name('properMotionRa')\n",
188 | " return ra + speed*dt\n",
189 | " \n",
190 | " def get_correctedDec(self):\n",
191 | " dt = self.obs_metadata.mjd.TAI - 51544.0\n",
192 | " dec = self.column_by_name('decJ2000')\n",
193 | " speed = self.column_by_name('properMotionDec')\n",
194 | " return dec + speed*dt"
195 | ]
196 | },
197 | {
198 | "cell_type": "code",
199 | "execution_count": null,
200 | "metadata": {
201 | "collapsed": true,
202 | "scrolled": true
203 | },
204 | "outputs": [],
205 | "source": [
206 | "myObsMetadata = ObservationMetaData(pointingRA=45.0, pointingDec=-10.0,\n",
207 | " boundType='circle', boundLength=0.02,\n",
208 | " mjd=57098.0)\n",
209 | "\n",
210 | "myProperMotionCat = demoProperMotionCatalog(starTableConnection,\n",
211 | " obs_metadata=myObsMetadata)\n",
212 | "\n",
213 | "myProperMotionCat.write_catalog('proper_motion_example.txt')\n",
214 | "!cat proper_motion_example.txt"
215 | ]
216 | },
217 | {
218 | "cell_type": "markdown",
219 | "metadata": {},
220 | "source": [
221 | "How did we do that?\n",
222 | "\n",
223 | "In `column_outputs` we asked the `InstanceCatalog` to write the columns `correctedRA` and `correctedDec` which do not exist in the database table provided by `starTableConnection`. In order for the `InstanceCatalog` to calculate these columns on the fly, we had to define getter methods in our `InstanceCatalog` daughter class. Literally, the InstanceCatalog inspects itself, looking for methods named `get_columnName()`. These methods need to return numpy arrays whose elements are in the same order as the rows in the database table. These getter methods are called whenever the `InstanceCatalog` writes itself to disk. They are also called whenever another getter method calls `self.column_by_name('columnName')` (which is how getter method columns can be calculated using columns defined in the database).\n",
224 | "\n",
225 | "Obviously, there are a few problems with what we did above. The first is that it was inefficient. We used two getters when we could have used one. The `@compound` decorator allows a single getter method to return two columns as follows."
226 | ]
227 | },
228 | {
229 | "cell_type": "code",
230 | "execution_count": null,
231 | "metadata": {
232 | "collapsed": true,
233 | "scrolled": true
234 | },
235 | "outputs": [],
236 | "source": [
237 | "from lsst.sims.catalogs.decorators import compound\n",
238 | "\n",
239 | "class demoProperMotionCatalog2(InstanceCatalog):\n",
240 | " column_outputs = ['raJ2000', 'decJ2000', 'correctedRA', 'correctedDec']\n",
241 | "\n",
242 | " transformations = {'raJ2000':numpy.degrees,\n",
243 | " 'decJ2000':numpy.degrees,\n",
244 | " 'correctedRA':numpy.degrees,\n",
245 | " 'correctedDec':numpy.degrees}\n",
246 | " \n",
247 | " @compound('correctedRA', 'correctedDec')\n",
248 | " def get_correctedCoords(self):\n",
249 | " dt = self.obs_metadata.mjd.TAI - 51544.0\n",
250 | " ra = self.column_by_name('raJ2000')\n",
251 | " speedRa = self.column_by_name('properMotionRa')\n",
252 | " dec = self.column_by_name('decJ2000')\n",
253 | " speedDec = self.column_by_name('properMotionDec')\n",
254 | " \n",
255 | " #The new columns must be returned as rows of a numpy array\n",
256 | " #in the order that they were specified to the @compound getter\n",
257 | " return numpy.array([ra + speedRa*dt, dec + speedDec*dt])"
258 | ]
259 | },
260 | {
261 | "cell_type": "code",
262 | "execution_count": null,
263 | "metadata": {
264 | "collapsed": true,
265 | "scrolled": true
266 | },
267 | "outputs": [],
268 | "source": [
269 | "myObsMetadata = ObservationMetaData(pointingRA=45.0, pointingDec=-10.0,\n",
270 | " boundType='circle', boundLength=0.02,\n",
271 | " mjd=57098.0)\n",
272 | "\n",
273 | "myProperMotionCat = demoProperMotionCatalog2(starTableConnection,\n",
274 | " obs_metadata=myObsMetadata)\n",
275 | "\n",
276 | "myProperMotionCat.write_catalog('proper_motion_example.txt')\n",
277 | "!cat proper_motion_example.txt"
278 | ]
279 | },
280 | {
281 | "cell_type": "markdown",
282 | "metadata": {},
283 | "source": [
284 | "The other thing that is wrong with this catalog is that we have not applied proper motion correctly. This is just a cartoon which I simplified for the purposes of this demo. Forgive me. The correct calculation is much more complicated. Fortunately, we have already implemented that calculation in the CatSim stack in the form of the Astrometry mixin.\n",
285 | "\n",
286 | "Mixins are Python classes that exist solely to define methods to be inherited by other classes. In the case of CatSim, we use mixins to provide getter methods to be inherited by `InstanceCatalog` daughter classes. Many mixins exist in CatSim to provide calculated columns to `InstanceCatalog`s. They can be listed with\n",
287 | "\n",
288 | "```\n",
289 | "from lsst.sims.catUtils import mixins\n",
290 | "\n",
291 | "dir(mixins)\n",
292 | "```\n",
293 | "\n",
294 | "One of the mixins -- `AstrometryStars` -- provides getters for the columns `raObserved` and `decObserved` which apply proper motion, parallax, precession, nutation, aberration, and refraction by the atmosphere to RA and Dec to get the observed geocentric RA, Dec (as opposed to the usual coordinates in the International Celestial Reference System). We can include these columns in our InstanceCatalog like this:"
295 | ]
296 | },
297 | {
298 | "cell_type": "code",
299 | "execution_count": null,
300 | "metadata": {
301 | "collapsed": true,
302 | "scrolled": true
303 | },
304 | "outputs": [],
305 | "source": [
306 | "from lsst.sims.catUtils.mixins import AstrometryStars\n",
307 | "\n",
308 | "class astrometricCatalog(InstanceCatalog, AstrometryStars):\n",
309 | " column_outputs = ['raJ2000', 'decJ2000', 'raObserved', 'decObserved']\n",
310 | " transformations = {'raJ2000':numpy.degrees,\n",
311 | " 'decJ2000':numpy.degrees,\n",
312 | " 'raObserved':numpy.degrees,\n",
313 | " 'decObserved':numpy.degrees}"
314 | ]
315 | },
316 | {
317 | "cell_type": "code",
318 | "execution_count": null,
319 | "metadata": {
320 | "collapsed": true,
321 | "scrolled": true
322 | },
323 | "outputs": [],
324 | "source": [
325 | "myObsMetadata = ObservationMetaData(pointingRA=45.0, pointingDec=-10.0,\n",
326 | " boundType='circle', boundLength=0.02,\n",
327 | " mjd=57098.0)\n",
328 | "\n",
329 | "myAstrometricCat = astrometricCatalog(starTableConnection,\n",
330 | " obs_metadata=myObsMetadata)\n",
331 | "\n",
332 | "myAstrometricCat.write_catalog('astrometry_example.txt')\n",
333 | "!cat astrometry_example.txt"
334 | ]
335 | },
336 | {
337 | "cell_type": "markdown",
338 | "metadata": {},
339 | "source": [
340 | "By defining `astrometricCatalog` to inherit both from `InstanceCatalog` and `AstrometryStars`, we made sure that the `astrometricCatalog` combined the basic functionality of the `InstanceCatalog` with the getter methods provided by `AstrometryStars`, giving us access to the columns `raObserved` and `decObserved`.\n",
341 | "\n",
342 | "So far, we have seen how an `InstanceCatalog` can get columns either directly from the database (as translated through the `CatalogDBObject`) or from getter methods. There is one other way to provide columns to the `InstanceCatalog`: default values."
343 | ]
344 | },
345 | {
346 | "cell_type": "code",
347 | "execution_count": null,
348 | "metadata": {
349 | "collapsed": true,
350 | "scrolled": true
351 | },
352 | "outputs": [],
353 | "source": [
354 | "class defaultColumnExampleCatalog(InstanceCatalog):\n",
355 | " column_outputs = ['raJ2000', 'decJ2000', 'fudgeFactor1', 'fudgeFactor2']\n",
356 | " \n",
357 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}\n",
358 | " \n",
359 | " default_columns = [('fudgeFactor1', 1.1, float),\n",
360 | " ('fudgeFactor2', 'hello', (str,5))]"
361 | ]
362 | },
363 | {
364 | "cell_type": "code",
365 | "execution_count": null,
366 | "metadata": {
367 | "collapsed": true,
368 | "scrolled": true
369 | },
370 | "outputs": [],
371 | "source": [
372 | "fudgeCat = defaultColumnExampleCatalog(starTableConnection,\n",
373 | " obs_metadata=myObsMetadata)\n",
374 | "\n",
375 | "fudgeCat.write_catalog('default_example.txt')\n",
376 | "!cat default_example.txt"
377 | ]
378 | },
379 | {
380 | "cell_type": "markdown",
381 | "metadata": {},
382 | "source": [
383 | "The member variable `default_columns` allows you to assign default values to columns that do not exist either in the `CatalogDBObject` or as getter methods. `default_columns` is a list of tuples. Each tuple corresponds to a different column being defaulted. The first entry in the tuple is the name of the column. The second entry is its value. The third entry is its data type."
384 | ]
385 | },
386 | {
387 | "cell_type": "markdown",
388 | "metadata": {},
389 | "source": [
390 | "# Getting realistic ObservationMetaData"
391 | ]
392 | },
393 | {
394 | "cell_type": "markdown",
395 | "metadata": {},
396 | "source": [
397 | "In all of the examples so far, we have been creating our `ObservationMetaData` objects by hand. It is, however, possible to create `ObservationMetaData` objects that are based on actual pointings generated with the LSST Operations Simulator (OpSim). We do this using the `ObservationMetaDataGenerator` class defined in the `sims_catUtils` package. We will use a cartoon OpSim output included with the simulations software stack. More realistic OpSim outputs can be downloaded from\n",
398 | "\n",
399 | "https://www.lsst.org/scientists/simulations/opsim/opsim-v335-benchmark-surveys"
400 | ]
401 | },
402 | {
403 | "cell_type": "code",
404 | "execution_count": null,
405 | "metadata": {
406 | "collapsed": true,
407 | "scrolled": true
408 | },
409 | "outputs": [],
410 | "source": [
411 | "import os\n",
412 | "import eups\n",
413 | "from lsst.sims.catUtils.utils import ObservationMetaDataGenerator\n",
414 | "\n",
415 | "opsimPath = os.path.join(eups.productDir('sims_data'),'OpSimData')\n",
416 | "opsimDB = os.path.join(opsimPath,'opsimblitz1_1133_sqlite.db')\n",
417 | "\n",
418 | "#you need to provide ObservationMetaDataGenerator with the connection\n",
419 | "#string to an OpSim output database. This is the connection string\n",
420 | "#to a test database that comes when you install CatSim.\n",
421 | "obs_generator = ObservationMetaDataGenerator(database=opsimDB, driver='sqlite')"
422 | ]
423 | },
424 | {
425 | "cell_type": "markdown",
426 | "metadata": {},
427 | "source": [
428 | "The `ObservationMetaDataGenerator` allows you to query OpSim pointings according to physical criteria. It then returns `ObservationMetaData` objects based on the pointings that meet your criteria. For example, let's say we wanted 10 pointings with 5 < RA < 8 degrees."
429 | ]
430 | },
431 | {
432 | "cell_type": "code",
433 | "execution_count": null,
434 | "metadata": {
435 | "collapsed": true,
436 | "scrolled": true
437 | },
438 | "outputs": [],
439 | "source": [
440 | "obsMetaDataResults = obs_generator.getObservationMetaData(limit=10, fieldRA=(5.0, 8.0))\n",
441 | "\n",
442 | "for obs_metadata in obsMetaDataResults: \n",
443 | " print(obs_metadata.pointingRA)"
444 | ]
445 | },
446 | {
447 | "cell_type": "markdown",
448 | "metadata": {},
449 | "source": [
450 | "To see the full list of parameters on which you can query OpSim pointings using the `ObservationMetaDataGenerator`:"
451 | ]
452 | },
453 | {
454 | "cell_type": "code",
455 | "execution_count": null,
456 | "metadata": {
457 | "collapsed": true,
458 | "scrolled": true
459 | },
460 | "outputs": [],
461 | "source": [
462 | "help(ObservationMetaDataGenerator.getObservationMetaData)"
463 | ]
464 | },
465 | {
466 | "cell_type": "markdown",
467 | "metadata": {},
468 | "source": [
469 | "# Generating Images with PhoSim"
470 | ]
471 | },
472 | {
473 | "cell_type": "markdown",
474 | "metadata": {},
475 | "source": [
476 | "PhoSim operates as a piece of software totally independent from CatSim. The documentation for PhoSim can be found here:\n",
477 | "\n",
478 | "https://confluence.lsstcorp.org/pages/viewpage.action?pageId=4129126\n",
479 | "\n",
480 | "To convert CatSim catalogs into PhoSim images, you must use CatSim to write your catalog in a format that PhoSim expects and then (by hand) feed that catalog into PhoSim. Fortunately, we have written scripts that know how to format CatSim catalogs for just this purpose. The script\n",
481 | "\n",
482 | " sims_catUtils/examples/generatePhosimInput.py\n",
483 | "\n",
484 | "will write out a catalog `phoSim_example.txt` that one can run through PhoSim using the command\n",
485 | "\n",
486 | "    phosim phoSim_example.txt\n",
487 | "\n",
488 | "to get out FITS images. Below, we will step through `generatePhosimInput.py`, explaining how the script operates."
489 | ]
490 | },
491 | {
492 | "cell_type": "code",
493 | "execution_count": null,
494 | "metadata": {
495 | "collapsed": true,
496 | "scrolled": true
497 | },
498 | "outputs": [],
499 | "source": [
500 | "from __future__ import with_statement\n",
501 | "from lsst.sims.utils import ObservationMetaData\n",
502 | "from lsst.sims.catalogs.definitions import InstanceCatalog\n",
503 | "from lsst.sims.catalogs.db import CatalogDBObject\n",
504 | "from lsst.sims.catUtils.exampleCatalogDefinitions import \\\n",
505 | " (PhoSimCatalogPoint, PhoSimCatalogSersic2D, PhoSimCatalogZPoint,\n",
506 | " DefaultPhoSimHeaderMap)\n",
507 | "\n",
508 | "from lsst.sims.catUtils.baseCatalogModels import *"
509 | ]
510 | },
511 | {
512 | "cell_type": "markdown",
513 | "metadata": {},
514 | "source": [
515 | "Most of what is above ought to be familiar. `PhoSimCatalogPoint`, `PhoSimCatalogSersic2D` and `PhoSimCatalogZPoint` are `InstanceCatalog` daughter classes written to format catalogs of point sources, Sersic profiles, and extragalactic point sources (i.e. AGN), respectively, for input into PhoSim."
516 | ]
517 | },
518 | {
519 | "cell_type": "code",
520 | "execution_count": null,
521 | "metadata": {
522 | "collapsed": true,
523 | "scrolled": true
524 | },
525 | "outputs": [],
526 | "source": [
527 | "obs_metadata_list = obs_generator.getObservationMetaData(obsHistID=10)\n",
528 | "obs_metadata = obs_metadata_list[0]"
529 | ]
530 | },
531 | {
532 | "cell_type": "markdown",
533 | "metadata": {},
534 | "source": [
535 | "Now we are in a difficult position. We would like to generate PhoSim images with all kinds of astronomical objects included. However, the way the `CatalogDBObject` classes are written, each class only interfaces with one database table, i.e. only one kind of astronomical object. Fortunately, `InstanceCatalog` provides functionality that allows you to write multiple catalogs to a single file. The code below loops over all of the different varieties of stellar object and writes them to the same PhoSim-ready catalog file.\n",
536 | "\n",
537 | "Note: The cells below will run for a long time as they try to find all of the galaxies in one particular LSST field of view (there are a lot of them). You can adjust this by reducing the size of your field of view with the `ObservationMetaData.boundLength` attribute. `boundLength` is the radius of the field of view in degrees (1.75 is the nominal LSST field of view radius)."
538 | ]
539 | },
540 | {
541 | "cell_type": "code",
542 | "execution_count": null,
543 | "metadata": {
544 | "collapsed": true,
545 | "scrolled": true
546 | },
547 | "outputs": [],
548 | "source": [
549 | "starObjNames = ['msstars', 'bhbstars', 'wdstars', 'rrlystars', 'cepheidstars']\n",
550 | "\n",
551 | "doHeader= True\n",
552 | "for starName in starObjNames:\n",
553 | " stars = CatalogDBObject.from_objid(starName)\n",
554 | " star_phoSim=PhoSimCatalogPoint(stars,obs_metadata=obs_metadata) #the class for phoSim input files\n",
555 | " #containing point sources\n",
556 | " if (doHeader):\n",
557 | " star_phoSim.phoSimHeaderMap = DefaultPhoSimHeaderMap\n",
558 | " with open(\"phoSim_example.txt\",\"w\") as fh:\n",
559 | " star_phoSim.write_header(fh)\n",
560 | " doHeader = False\n",
561 | "\n",
562 | " #below, write_header=False prevents the code from overwriting the header just written\n",
563 | " #write_mode = 'a' allows the code to append the new objects to the output file, rather\n",
564 | " #than overwriting the file for each different class of object.\n",
565 | " star_phoSim.write_catalog(\"phoSim_example.txt\",write_mode='a',write_header=False,chunk_size=20000)\n"
566 | ]
567 | },
568 | {
569 | "cell_type": "markdown",
570 | "metadata": {},
571 | "source": [
572 | "Now, we will do the same thing for galaxy components, using the `PhoSimCatalogSersic2D` `InstanceCatalog` class for galaxy bulges and disks and using `PhoSimCatalogZPoint` for AGNs. "
573 | ]
574 | },
575 | {
576 | "cell_type": "code",
577 | "execution_count": null,
578 | "metadata": {
579 | "scrolled": true
580 | },
581 | "outputs": [],
582 | "source": [
583 | "gals = CatalogDBObject.from_objid('galaxyBulge')\n",
584 | "\n",
585 | "#now append a bunch of objects with 2D sersic profiles to our output file\n",
586 | "galaxy_phoSim = PhoSimCatalogSersic2D(gals, obs_metadata=obs_metadata)\n",
587 | "galaxy_phoSim.write_catalog(\"phoSim_example.txt\",write_mode='a',write_header=False,chunk_size=20000)\n",
588 | "\n",
589 | "gals = CatalogDBObject.from_objid('galaxyDisk')\n",
590 | "galaxy_phoSim = PhoSimCatalogSersic2D(gals, obs_metadata=obs_metadata)\n",
591 | "galaxy_phoSim.write_catalog(\"phoSim_example.txt\",write_mode='a',write_header=False,chunk_size=20000)\n",
592 | "\n",
593 | "gals = CatalogDBObject.from_objid('galaxyAgn')\n",
594 | "\n",
595 | "#PhoSimCatalogZPoint is the phoSim input class for extragalactic point sources (there will be no parallax\n",
596 | "#or proper motion)\n",
597 | "galaxy_phoSim = PhoSimCatalogZPoint(gals, obs_metadata=obs_metadata)\n",
598 | "galaxy_phoSim.write_catalog(\"phoSim_example.txt\",write_mode='a',write_header=False,chunk_size=20000)\n"
599 | ]
600 | },
601 | {
602 | "cell_type": "markdown",
603 | "metadata": {},
604 | "source": [
605 | "We now have a catalog file `phoSim_example.txt` that contains all the information PhoSim needs to generate a simulated LSST focal plane (albeit with a very small populated field of view; note that our `ObservationMetaData` only drew a circle with a radius of 0.05 degrees).\n",
606 | "\n",
607 | "The output generated by PhoSim is described here:\n",
608 | "\n",
609 | "https://confluence.lsstcorp.org/display/PHOSIM/Output"
610 | ]
611 | },
612 | {
613 | "cell_type": "markdown",
614 | "metadata": {},
615 | "source": [
616 | "# Generating Images with GalSim\n",
617 | "\n",
618 | "Unlike PhoSim, GalSim can be run seamlessly with the CatSim stack. The code governing the CatSim-GalSim interface resides in the package\n",
619 | "\n",
620 | " sims_GalSimInterface\n",
621 | "\n",
622 | "The basic philosophy of the CatSim-GalSim interface is as follows:\n",
623 | "\n",
624 | "\n",
625 | "* The user instantiates a GalSim catalog class as defined in `sims_GalSimInterface/python/lsst/sims/GalSimInterface/galSimCatalogs.py`. These are `InstanceCatalog` daughter classes designed to amass the information that GalSim needs to generate images of a field of view. Like any `InstanceCatalog` daughter class, these classes require a `CatalogDBObject` and an `ObservationMetaData`.\n",
626 | "\n",
627 | "\n",
628 | "* Each GalSim catalog class contains an instantiation of the class `GalSimInterpreter` defined in `sims_GalSimInterface/python/lsst/sims/GalSimInterface/galSimInterpreter.py`. As the user iterates over the GalSim catalog (for instance, by calling `write_catalog`), astronomical objects are passed to the `GalSimInterpreter`. The `GalSimInterpreter` determines which chips in the camera each astronomical object falls on and adds that astronomical object to the corresponding chips' FITS files.\n",
629 | "\n",
630 | "\n",
631 | "* Once the user has iterated over the catalog, the method `write_images` (which belongs to the GalSim Catalog class) will write out FITS images for all of the chips in the camera, based on the objects that have been passed through the `GalSimInterpreter`.\n",
632 | "\n",
633 | "Examples of how to use this system can be found in\n",
634 | "\n",
635 | " sims_GalSimInterface/examples/"
636 | ]
637 | },
638 | {
639 | "cell_type": "code",
640 | "execution_count": null,
641 | "metadata": {
642 | "collapsed": true,
643 | "scrolled": true
644 | },
645 | "outputs": [],
646 | "source": [
647 | "from lsst.sims.GalSimInterface import GalSimStars, DoubleGaussianPSF\n",
648 | "from lsst.obs.lsstSim import LsstSimMapper #this is to model the LSST camera\n",
649 | "\n",
650 | "class testGalSimStars(GalSimStars):\n",
651 | " bandpassNames = ['u', 'r']\n",
652 | " camera = LsstSimMapper().camera\n",
653 | " PSF = DoubleGaussianPSF()"
654 | ]
655 | },
656 | {
657 | "cell_type": "code",
658 | "execution_count": null,
659 | "metadata": {
660 | "collapsed": true,
661 | "scrolled": true
662 | },
663 | "outputs": [],
664 | "source": [
665 | "#use our ObservationMetaDataGenerator from above\n",
666 | "obsMetaDataResults = obs_generator.getObservationMetaData(limit=1, fieldRA = (-5.0, 5.0),\n",
667 | "                                                           boundType = 'circle', boundLength = 0.01)\n",
668 | "\n",
669 | "myTestCat = testGalSimStars(starTableConnection, obs_metadata=obsMetaDataResults[0])\n",
670 | "myTestCat.write_catalog('galsim_test_catalog.txt')\n",
671 | "myTestCat.write_images(nameRoot = 'testImage')"
672 | ]
673 | },
674 | {
675 | "cell_type": "markdown",
676 | "metadata": {},
677 | "source": [
678 | "This code generates a series of FITS images, one for each detector in our camera. The files are named `testImage_detectorName_filterName.fits`. It also produces a catalog `galsim_test_catalog.txt` that contains information about the camera in the header as well as information about each object, namely:\n",
679 | "\n",
680 | "* Its source type ('point' or 'sersic')\n",
681 | "\n",
682 | "* A unique identifier\n",
683 | "\n",
684 | "* The name of the chip its centroid fell on\n",
685 | "\n",
686 | "* Its location on the pupil\n",
687 | "\n",
688 | "* The name of the file containing its SED\n",
689 | "\n",
690 | "* Its major and minor axes\n",
691 | "\n",
692 | "* The index of its Sersic Profile\n",
693 | "\n",
694 | "* Its half light radius\n",
695 | "\n",
696 | "* Its position angle\n",
697 | "\n",
698 | "* A list of all of the detectors it may have illuminated"
699 | ]
700 | },
701 | {
702 | "cell_type": "markdown",
703 | "metadata": {},
704 | "source": [
705 | "# Customizing the GalSim Catalog classes\n",
706 | "\n",
707 | "As with all of the `InstanceCatalog` daughter classes we have seen so far, the GalSim Catalog is customized using class member variables. Specifically, you may want to specify\n",
708 | "\n",
709 | "\n",
710 | "* `bandpassNames` = a list of the names of the bandpasses you are observing through\n",
711 | "\n",
712 | "* `bandpassDir` = a string denoting the directory in which your bandpass throughput files live\n",
713 | "\n",
714 | "* `bandpassRoot` = a string that is the root of the name of your bandpass files\n",
715 | "\n",
716 | "* `componentList` = a list of filenames (files kept in `bandpassDir`) specifying the throughputs of the individual (non-filter) components of your camera (i.e. mirrors, lenses, etc.)\n",
717 | "\n",
718 | "* `atmoTransmissionName` = the name of a file (also kept in `bandpassDir`) specifying the transmissivity of the atmosphere\n",
719 | "\n",
720 | "* `skySEDname` = the name of a file (also kept in `bandpassDir`) representing the emission spectrum of the dark sky in ergs/cm^2/s/nm (note: this is not yet used by the code, but is required as a placeholder).\n",
721 | "\n",
722 | "All of the above default to files consistent with the LSST design.\n",
723 | "\n",
724 | "The `GalSimCatalog` assumes that your bandpass throughput files are stored in files with the naming convention:\n",
725 | "\n",
726 | " for bpn in bandpass_names:\n",
727 | " fileName = self.bandpassDir + '/' + self.bandpassRoot + '_' + bpn + '.dat'\n",
728 | "\n",
729 | "It reads in these files and convolves them with the hardware throughputs from `componentList` and the atmospheric transmissivity from `atmoTransmissionName` to produce the total throughput of your instrument.\n",
730 | "\n",
731 | "Other user-specified variables are:\n",
732 | "\n",
733 | "* `camera` = a member variable containing an instantiation of an afwCameraGeom camera object. This defaults to a small test camera. The code above shows how to specify the LSST camera.\n",
734 | "\n",
735 | "\n",
736 | "* `PSF` = a member variable containing an instantiation of a class describing the point spread function for your catalog. This must be a class that inherits from `PSFbase` and defines certain methods as described in the `PSFbase` docstring; a brief sketch combining several of these customizations appears in the next cell.\n"
737 | ]
738 | },
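  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make the customization above concrete, here is a minimal sketch of a catalog class that overrides several of these member variables. The directory and file-name root are placeholders (the defaults already match the LSST design), so treat this as an illustration of the pattern rather than a recommended configuration.\n",
    "\n",
    "    from lsst.sims.GalSimInterface import GalSimStars, DoubleGaussianPSF\n",
    "    from lsst.obs.lsstSim import LsstSimMapper\n",
    "\n",
    "    class customThroughputGalSimStars(GalSimStars):\n",
    "        bandpassNames = ['g', 'i']\n",
    "        bandpassDir = '/path/to/my/throughputs'  # placeholder directory\n",
    "        bandpassRoot = 'myFilter'  # expects files named myFilter_g.dat, myFilter_i.dat\n",
    "        camera = LsstSimMapper().camera\n",
    "        PSF = DoubleGaussianPSF()"
   ]
  },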
739 | {
740 | "cell_type": "code",
741 | "execution_count": null,
742 | "metadata": {
743 | "collapsed": true,
744 | "scrolled": true
745 | },
746 | "outputs": [],
747 | "source": [
748 | "from lsst.sims.GalSimInterface import PSFbase\n",
749 | "\n",
750 | "help(PSFbase)"
751 | ]
752 | },
753 | {
754 | "cell_type": "markdown",
755 | "metadata": {},
756 | "source": [
757 | "Noise will be handled similarly, by assigning a noise class instantiation to the member variable `noise_and_background`. An example of this is provided in the class `ExampleCCDNoise`."
758 | ]
759 | },
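  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of what that assignment might look like is below. It assumes `ExampleCCDNoise` can be constructed with default arguments; see the `help()` output in the next cell for the actual signature.\n",
    "\n",
    "    from lsst.sims.GalSimInterface import GalSimStars, ExampleCCDNoise, DoubleGaussianPSF\n",
    "    from lsst.obs.lsstSim import LsstSimMapper\n",
    "\n",
    "    class noisyGalSimStars(GalSimStars):\n",
    "        camera = LsstSimMapper().camera\n",
    "        PSF = DoubleGaussianPSF()\n",
    "        # default construction assumed here; pass seeds/parameters as needed\n",
    "        noise_and_background = ExampleCCDNoise()"
   ]
  },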
760 | {
761 | "cell_type": "code",
762 | "execution_count": null,
763 | "metadata": {
764 | "collapsed": true,
765 | "scrolled": true
766 | },
767 | "outputs": [],
768 | "source": [
769 | "from lsst.sims.GalSimInterface import ExampleCCDNoise\n",
770 | "\n",
771 | "help(ExampleCCDNoise)"
772 | ]
773 | },
774 | {
775 | "cell_type": "markdown",
776 | "metadata": {},
777 | "source": [
778 | "As in the case of PhoSim, we are presented with the challenge of how to generate images containing different types of astronomical object when the `CatalogDBObject` classes are segregated the way they are. In this case, the `GalSimCatalog` classes contain a method `copyGalSimInterpreter` which allows you to pass the `GalSimInterpreter` and `camera` instantiation from one `GalSimCatalog` to another. Since the `GalSimInterpreter` is the code that actually handles writing FITS images, this allows you to add multiple catalogs of astronomical objects to the same set of FITS images. This is demonstrated in the script\n",
779 | "\n",
780 | " sims_GalSimInterface/examples/galSimCompoundGenerator.py"
781 | ]
782 | }
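  ,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The snippet below sketches that pattern, building on the star catalog `myTestCat` defined earlier in this notebook. The galaxy-side class and database connection (`testGalSimGalaxies`, `galaxyTableConnection`) are hypothetical stand-ins, and the argument convention of `copyGalSimInterpreter` is assumed here; see `galSimCompoundGenerator.py` for a working version.\n",
    "\n",
    "    # hypothetical galaxy catalog class and database connection, for illustration only\n",
    "    galaxy_cat = testGalSimGalaxies(galaxyTableConnection,\n",
    "                                    obs_metadata=obsMetaDataResults[0])\n",
    "\n",
    "    # copy the GalSimInterpreter (and camera) from the star catalog so that both\n",
    "    # catalogs draw onto the same set of images (argument order assumed)\n",
    "    galaxy_cat.copyGalSimInterpreter(myTestCat)\n",
    "\n",
    "    # append to the existing text catalog, then write out the combined images\n",
    "    galaxy_cat.write_catalog('galsim_test_catalog.txt', write_mode='a', write_header=False)\n",
    "    galaxy_cat.write_images(nameRoot='testImage')"
   ]
  }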
783 | ],
784 | "metadata": {
785 | "kernelspec": {
786 | "display_name": "Python 3",
787 | "language": "python",
788 | "name": "python3"
789 | },
790 | "language_info": {
791 | "codemirror_mode": {
792 | "name": "ipython",
793 | "version": 3
794 | },
795 | "file_extension": ".py",
796 | "mimetype": "text/x-python",
797 | "name": "python",
798 | "nbconvert_exporter": "python",
799 | "pygments_lexer": "ipython3",
800 | "version": "3.6.2"
801 | }
802 | },
803 | "nbformat": 4,
804 | "nbformat_minor": 1
805 | }
806 |
--------------------------------------------------------------------------------
/CatSim/LightCurveExample.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 31,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "from __future__ import with_statement\n",
12 | "import numpy as np"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "First, open an ssh tunnel to the CatSim database hosted by the University of Washington. Open a terminal window and type\n",
20 | "\n",
21 | "```\n",
22 | "ssh -L 51433:fatboy.phys.washington.edu:1433 simsuser@gateway.astro.washington.edu\n",
23 | "```\n",
24 | "\n",
25 | "There is some configuration that you will have to do to make sure this works. Instructions are here:\n",
26 | "\n",
27 | "https://confluence.lsstcorp.org/display/SIM/Accessing+the+UW+CATSIM+Database"
28 | ]
29 | },
30 | {
31 | "cell_type": "markdown",
32 | "metadata": {},
33 | "source": [
34 | "Now, we need to download a database of OpSim-simulated pointings from\n",
35 | "\n",
36 | "https://www.lsst.org/scientists/simulations/opsim/opsim-surveys-data-directory\n",
37 | "\n",
38 | "and specify its location with the ```opsimdb``` variable"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": 32,
44 | "metadata": {
45 | "collapsed": true
46 | },
47 | "outputs": [],
48 | "source": [
49 | "import os\n",
50 | "opsimdb = os.path.join(\"/Users\",\"danielsf\",\"physics\")\n",
51 | "opsimdb = os.path.join(opsimdb, \"lsst_150412\", \"Development\", \"garage\")\n",
52 | "opsimdb = os.path.join(opsimdb, \"OpSimData\", \"minion_1016_sqlite.db\")"
53 | ]
54 | },
55 | {
56 | "cell_type": "markdown",
57 | "metadata": {},
58 | "source": [
59 | "Connections to the CatSim database of celestial objects are handled via the classes stored in ```lsst.sims.catUtils.baseCatalogModels```. To see what's available, you can use"
60 | ]
61 | },
62 | {
63 | "cell_type": "code",
64 | "execution_count": 33,
65 | "metadata": {},
66 | "outputs": [
67 | {
68 | "data": {
69 | "text/plain": [
70 | "['BaseCatalogConfig',\n",
71 | " 'BaseCatalogModels',\n",
72 | " 'BaseCatalogObj',\n",
73 | " 'BhbStarObj',\n",
74 | " 'BrightStarObj',\n",
75 | " 'CepheidStarObj',\n",
76 | " 'CometObj',\n",
77 | " 'DwarfGalStarObj',\n",
78 | " 'EasterEggStarObj',\n",
79 | " 'EbStarObj',\n",
80 | " 'GalaxyAgnObj',\n",
81 | " 'GalaxyBulgeObj',\n",
82 | " 'GalaxyDiskObj',\n",
83 | " 'GalaxyModels',\n",
84 | " 'GalaxyObj',\n",
85 | " 'GalaxyTileCompoundObj',\n",
86 | " 'GalaxyTileObj',\n",
87 | " 'ImageAgnObj',\n",
88 | " 'LensGalaxyObj',\n",
89 | " 'MBAObj',\n",
90 | " 'MiscSolarSystemObj',\n",
91 | " 'MsStarObj',\n",
92 | " 'NEOObj',\n",
93 | " 'OpSim3_61DBObject',\n",
94 | " 'RRLyStarObj',\n",
95 | " 'SNDBObj',\n",
96 | " 'SolarSystemObj',\n",
97 | " 'SsmModels',\n",
98 | " 'StarBase',\n",
99 | " 'StarModels',\n",
100 | " 'StarObj',\n",
101 | " 'WdStarObj',\n",
102 | " '__builtins__',\n",
103 | " '__cached__',\n",
104 | " '__doc__',\n",
105 | " '__file__',\n",
106 | " '__loader__',\n",
107 | " '__name__',\n",
108 | " '__package__',\n",
109 | " '__path__',\n",
110 | " '__spec__',\n",
111 | " 'snModels']"
112 | ]
113 | },
114 | "execution_count": 33,
115 | "metadata": {},
116 | "output_type": "execute_result"
117 | }
118 | ],
119 | "source": [
120 | "import lsst.sims.catUtils.baseCatalogModels as baseCatalogModels\n",
121 | "\n",
122 | "dir(baseCatalogModels)"
123 | ]
124 | },
125 | {
126 | "cell_type": "markdown",
127 | "metadata": {},
128 | "source": [
129 | "or the confluence page\n",
130 | "\n",
131 | "https://confluence.lsstcorp.org/display/SIM/Database+Schema\n",
132 | "\n",
133 | "For this tutorial, we will instantiate a connection to the table of RR Lyrae stars"
134 | ]
135 | },
136 | {
137 | "cell_type": "code",
138 | "execution_count": 34,
139 | "metadata": {},
140 | "outputs": [],
141 | "source": [
142 | "from lsst.sims.catUtils.baseCatalogModels import RRLyStarObj\n",
143 | "stardb = RRLyStarObj()"
144 | ]
145 | },
146 | {
147 | "cell_type": "markdown",
148 | "metadata": {},
149 | "source": [
150 | "Now we will create a LightCurveGenerator, connecting it to both our database of celestial objects and our database of simulated pointings"
151 | ]
152 | },
153 | {
154 | "cell_type": "code",
155 | "execution_count": 35,
156 | "metadata": {},
157 | "outputs": [],
158 | "source": [
159 | "from lsst.sims.catUtils.utils import StellarLightCurveGenerator\n",
160 | "lc_gen = StellarLightCurveGenerator(stardb, opsimdb)"
161 | ]
162 | },
163 | {
164 | "cell_type": "markdown",
165 | "metadata": {},
166 | "source": [
167 | "Now let us query our database of pointings for all of the pointings in a range of RA and Dec and a set of filters"
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": 36,
173 | "metadata": {},
174 | "outputs": [
175 | {
176 | "name": "stdout",
177 | "output_type": "stream",
178 | "text": [
179 | "parameters (60.0, 65.0) (-15.0, -10.0) ('g', 'r', 'i') None\n"
180 | ]
181 | }
182 | ],
183 | "source": [
184 | "raRange = (60.0, 65.0)\n",
185 | "decRange = (-15.0, -10.0)\n",
186 | "bandpass = ('g', 'r', 'i')\n",
187 | "pointings = lc_gen.get_pointings(raRange, decRange, bandpass=bandpass)"
188 | ]
189 | },
190 | {
191 | "cell_type": "markdown",
192 | "metadata": {},
193 | "source": [
194 | "`pointings` is now a 2-D list. Each row of the list is a set of pointings (represented by instantiations of the `ObservationMetaData` class) at different dates and through different filters but centered on the same patch of sky."
195 | ]
196 | },
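  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For example, you could peek at the first pointing of the first field like this (a sketch; it uses the `ObservationMetaData` attributes `pointingRA`, `pointingDec`, and `mjd` seen elsewhere in these tutorials, while the `bandpass` attribute name is an assumption):\n",
    "\n",
    "    first_obs = pointings[0][0]\n",
    "    print(first_obs.pointingRA, first_obs.pointingDec)  # degrees\n",
    "    print(first_obs.mjd.TAI)  # date of the observation\n",
    "    print(first_obs.bandpass)  # filter name, e.g. 'g' (attribute name assumed)"
   ]
  },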
197 | {
198 | "cell_type": "code",
199 | "execution_count": 37,
200 | "metadata": {},
201 | "outputs": [
202 | {
203 | "name": "stdout",
204 | "output_type": "stream",
205 | "text": [
206 | "3\n"
207 | ]
208 | }
209 | ],
210 | "source": [
211 | "print(len(pointings))"
212 | ]
213 | },
214 | {
215 | "cell_type": "markdown",
216 | "metadata": {},
217 | "source": [
218 | "Now we will use the `LightCurveGenerator` to extract the light curves of all of the objects in our selected pointings."
219 | ]
220 | },
221 | {
222 | "cell_type": "code",
223 | "execution_count": 38,
224 | "metadata": {},
225 | "outputs": [
226 | {
227 | "name": "stdout",
228 | "output_type": "stream",
229 | "text": [
230 | "starting query\n",
231 | "query took 0.11227202415466309\n",
232 | "starting query\n",
233 | "query took 0.1597890853881836\n",
234 | "starting query\n",
235 | "query took 0.11198782920837402\n",
236 | "light curves took 7.219829e+00 seconds to generate\n"
237 | ]
238 | }
239 | ],
240 | "source": [
241 | "lc_dict, truth_dict = lc_gen.light_curves_from_pointings(pointings)"
242 | ]
243 | },
244 | {
245 | "cell_type": "markdown",
246 | "metadata": {},
247 | "source": [
248 | "Let's analyze one of these light curves using the `gatspy` time series analysis package described here\n",
249 | "\n",
250 | "https://jakevdp.github.io/blog/2015/06/13/lomb-scargle-in-python/"
251 | ]
252 | },
253 | {
254 | "cell_type": "code",
255 | "execution_count": 39,
256 | "metadata": {},
257 | "outputs": [],
258 | "source": [
259 | "lc = lc_dict[853673991]"
260 | ]
261 | },
262 | {
263 | "cell_type": "markdown",
264 | "metadata": {},
265 | "source": [
266 | "For better or worse, variability in CatSim is represented as a JSON-encoded dict that points to a light curve which lives in the LSST `sims_sed_library` package."
267 | ]
268 | },
269 | {
270 | "cell_type": "code",
271 | "execution_count": 40,
272 | "metadata": {},
273 | "outputs": [
274 | {
275 | "name": "stdout",
276 | "output_type": "stream",
277 | "text": [
278 | "{\"pars\":{\"filename\":\"rrly_lc/RRab/1981725_per.txt\", \"tStartMjd\":3.335375066201238e+004}, \"varMethodName\":\"applyRRly\"}\n"
279 | ]
280 | }
281 | ],
282 | "source": [
283 | "print(truth_dict[853673991])"
284 | ]
285 | },
286 | {
287 | "cell_type": "markdown",
288 | "metadata": {},
289 | "source": [
290 | "Inspecting the `1981725_per.txt` light curve file, we see that this corresponds to an RR Lyrae star with a period of 0.485873 days."
291 | ]
292 | },
293 | {
294 | "cell_type": "markdown",
295 | "metadata": {},
296 | "source": [
297 | "Let's first just plot one of our light curves."
298 | ]
299 | },
300 | {
301 | "cell_type": "code",
302 | "execution_count": 41,
303 | "metadata": {},
304 | "outputs": [
305 | {
306 | "data": {
307 | "text/plain": [
308 | "[,\n",
309 | " ]"
310 | ]
311 | },
312 | "execution_count": 41,
313 | "metadata": {},
314 | "output_type": "execute_result"
315 | },
316 | {
317 | "data": {
318 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYwAAAEKCAYAAAAB0GKPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAG15JREFUeJzt3X+0XWV95/H3B2JAqIAxwQYwDVlDsCw0gDctrJZZBBfV\nRirjaBem0AZtDYtaimORinaE6jCDtKOjpSpZmQiMNEI1oONvtGBaSxMTShECLCyWIT80oahBWi4/\n8p0/zr7k3JNz7n3uvWfv/ZyzP6+17rrn7PPsc77753c/z3723ooIzMzMJnNA3QGYmdlgcMIwM7Mk\nThhmZpbECcPMzJI4YZiZWRInDDMzS+KEYWZmSZwwzMwsiROGmZklmVV3AP00d+7cWLhwYd1hmJkN\njC1btjweEfNSyg5Vwli4cCGbN2+uOwwzs4Eh6dHUsm6SMjOzJE4YZmaWxAnDzMySOGGYmVkSJwwz\nM0vihGFmZkmcMMzMLIkThpmZJSktYUhaK2mXpPs6hl8s6UFJ90u6pst4x0u6p+1vj6R3lRWnwbnX\n3cW5191VdxiWwMvK6lTmld7XA9cCN44NkLQMOAdYEhGjko7sHCkiHgJOKsofCGwHbi0xTrNkYzvr\nmy88reZIqtfkac9ZlcultBpGRGwAnugYfBFwdUSMFmV2TfI1rwX+OSKSL103M7NyVH0OYzFwuqSN\nkr4taekk5d8KrKsgLrNpaVIT0dade9i6c0/dYViNqr754CxgDnAqsBS4RdKiiIjOgpJmA28ELp/o\nCyWtAlYBLFiwoO8BN4GbGAaHl5XVqeqEsQ1YXySITZL2AnOB3V3K/jpwd0T8aKIvjIjVwGqAkZGR\n/RKP9eY26XSeV3DC/MPqDsG6qHKdrLpJ6jZgGYCkxcBs4PEeZVfg5igzs2yUVsOQtA44A5graRtw\nBbAWWFt0tX0GWBkRIekoYE1ELC/GPRQ4C7iwrPjMJpNSq2hSjaNJ02rdlZYwImJFj4/O71J2B7C8\n7f1TwMtKCs0GlJuFzOo1VE/cs6nxjjfd2Lxy0rLcDMV1GDY4mtQ11MymzzUMsx7qqEW4BmM5c8Kw\ngeGdqFm9nDDMpqAzablGYHWrct1zwjDv7AZIzgkq59imY9impx+cMBrMG0R+vCwsZ+4lZdaDe4+Z\njecahtkMuEZgTeKEYbVz01i6nOdRzrFNx7BNTz84YTRY3RuEm3ssZz6Q2Z8ThlkPTd5ReGdp3fik\nt9mA88n5ZshhOTthmNnAymEn2iRukrLadN4B1iwnbo7bnxNGg9XdTu1EYZZu6849dYfghGGTqzux\n2MSGebl43dsnh2eq+xyG2YBzO75VxTUMq52PHs0GgxNGxsqujte9o67796eiaU0jTZlOmxonDDPL\nlhNXXpwwzAZck3eqTZ72Ovikt5mZJXENwyyBj2Stbjmsg04Ymcj5pGqOMZlZ9dwkZWZmSZwwauKL\nrcwm5m0kP26SypibgswsJ04YmXByGC/nczrWHF4Px3OTVCZc/Taz3LmGYWZZ6jyqb8LRfu7TWFrC\nkLQWOBvYFREntg2/GHgn8Dzw5Yi4rMu4RwBrgBOBAN4eEUN1+J3rCmFm1kuZNYzrgWuBG8cGSFoG\nnAMsiYhRSUf2GPdjwNci4i2SZgOHlBintcn9CMcm5uVnZSotYUTEBkkLOwZfBFwdEaNFmV2d40k6\nHPiPwAVFmWeAZ8qKMxfewM0sd1Wfw1gMnC7pKuBp4NKI+G5HmWOB3cCnJS0BtgCXRMRT1YZqTeYj\n9fx4WdSv6oQxC5gDnAosBW6RtCgioqPMKcDFEbFR0seA9wL/tdsXSloFrAJYsGBBmbH3Va47pBye\nG9xk/Vovcl2/ZmIYp6lT7tNWdcLYBqwvEsQmSXuBubRqFO1ltkXExuL952gljK4iYjWwGmBkZCR6\nlbM0OTw32KZvbIfjLtr9kfsOvGpVX4dxG7AMQNJiYDbweHuBiPgh8Jik44tBrwW2VhmkmZntr8xu\nteuAM4C5krYBVwBrgbWS7qN1IntlRISko4A1EbG8GP1i4Kaih9QjwNvKinMiTagCD5Iql4eXudn+\nyuwltaLHR+d3KbsDWN72/h5gpKTQsuAdUt7qOljo1+8N4/o1jNM0aHylt42Ty0aZSxxmto/vJdVA\nvm+VmU2HaxgT8FFuHnwuySwPThgN5h1xb54nZvtzk5SZmSVxwmigmy88Lfsj6EE5zzIocZr1w6RN\nUpIEnAcsiogPSloA/HxEbCo9OjPcPGR5aXJTbso5jE8Ae4EzgQ8CTwKfp3UvqKHVhJVimKdtqpqw\nvM1mKiVh/HJEnCLpHwEi4sfFFdhm1mdOXJazlITxrKQDaT35DknzaNU4zEozKDvMQYnTpibHxJ1D\nTCkJ4+PArcCRxXMs3gL8SalRmZllKqckUrVJE0ZE3CRpC627xgr4TxHxQOmR1SSHLG7jVbFMvLxn\nztvO8OuZMCTNaXu7C1jX/llEPFFmYGZN5J2t5WyiGsYWWuctBCwAfly8PgL4f7QepWpmVppBv2tw\nP+UQU88L9yLi2IhYBHwT+I2ImBsRLwPOBr5RVYBm7XyhXHN52dcv5aT3qRHxjrE3EfFVSdeUGFOt\ncsjiNp6XyWBoynJq8rmalISxQ9KfAJ8p3p8H7CgvJDOz/DUxcaQkjBW0Hq96a/F+QzHMzKxUTdoZ\nD4KUbrVPAJdUEIvZpLwDMatPys0H76C4yrtdRJxZSkRmZl3kcrAwFkcTT8CnNEld2vb6YODNwHPl\nhGNmNhhySWBVSmmS2tIx6DuShv7W5k08oWVmLd7+u0tpkmq/4vsA4DXA4aVFVLMmVjPHeCOxnHh9\nzE9Kk1T7Fd/PAT8AfrfMoMzMLD8pCeMXI+Lp9gGSDiopHjMzy1RKwvh74JSOYXd1GTZUXA0226e9\neaiupqIqf9fbf3cT3a3254GjgRdLOplWkxTAYcAhFcRmFfNGYjnx+pifiWoYrwMuAI4BPtI2/Eng\nfSXGVCuvpGZm3fVMGBFxA3CDpDdHxOcrjMnMLHtN7MU1UZPU+RHxGWChpHd3fh4RH+kymvVBE1fE\nVJ439Wif33XNey/z+k3UJHVo8f/nqgjEzMzyNlGT1HXF/z+dzhdLWkvrYUu7IuLEtuEXA+8Enge+\nHBGXdRn3X2idK3keeC4iRqYTw0z5aNbMbJ+UK73nAe8AFraXj4i3TzLq9cC1wI1t37UMOAdYEhGj\nko6cYPxlEfH4ZPFZ/zhBmqVr4naSch3GF4C/pfWo1udTvzgiNkha2DH4IuDqiBgtyuxK/b4maeKK\nmKpJ8ybXBJ5rXFa+lIRxSET8cZ9+bzFwuqSrgKeBSyPiu13KBfBNSc8D10XE6l5fKGkVsApgwYIF\nfQrTzMw6pSSML0laHhFf6dPvzQFOBZYCt0haFBGdz9v41YjYXjRZ3S7pwYjY0O0Li2SyGmBkZGS/\n53bMhI+gzMz2SUkYlwDvkzQKP
Evriu+IiMOm8XvbgPVFgtgkaS8wF9jdXigithf/d0m6FfglWo+G\ntRI5QZrZRFKeh/GSPv7ebcAy4A5Ji4HZwLgT25IOBQ6IiCeL178GfLCPMZj1VdPa9Jsynba/lF5S\n3W4y+FPg0Yjo+eQ9SeuAM4C5krYBVwBrgbWS7gOeAVZGREg6ClgTEcuBlwO3ShqL768i4mtTm6zB\n1rQdkHXn5W+57QtSmqQ+QevOtN8r3r8KuA84XNJFEfGNbiNFxIoe33d+l7I7gOXF60eAJQlxmTVK\nbjsPa54DEsrsAE6OiNdExGuAk4BHgLOAa8oMzqzpzr3urkY/BdLyklLDWBwR94+9iYitkl4ZEY8U\nzUZmjeYjfmuKlIRxv6RPAp8t3p8LbC2euvdsaZHVJIdqv3dAZgb57QtSEsYFwO8D7yrefwe4lFay\nWFZOWGbWKbedhzVPSrfafwf+Z/HX6Wd9j8jMsqjpmnVK6VZ7HPA/gBOAg8eGR8SiEuMyGxjeuVtT\npDRJfZrWNRQfpdUE9TbSelcNJG/0Zmbdpez4XxwR3wIUEY9GxJXAG8oNy8xssDShC3RKDWNU0gHA\nw5L+ANiOn8JnVirXdC1HqTcfPAT4Q+BDwJnAyjKDMhskZe3cfW7EIK/1IKWX1NjzKn5G6/zF0Mlp\ngZiZ5Sqll9QI8H7gFxj/iNZXlxiXmdlAacIBZ0qT1E3Ae2jdfHBvueGYmVmuUhLG7oj4YumRmNk4\ng3zE6mbe/slpHqYkjCskrQG+BYyODYyI9aVFVbGcFoiZWa5SEsbbgFcCL2Jfk1QAQ5MwzMxscikJ\nY2lEHF96JGZmlrWUhPH3kk6IiK2lR5MBt71aDjqvGL75wtMGat0chBgHRU7LPSVhnArcI+kHtM5h\nCAh3qzUza5aUhPH60qMws2nJ6ejThl/Kld6PVhGI2aDwTtqaKqWG0SjeCRjUnxS6/a7XTaubE4bZ\nEKg7wVl5clqmQ/sgJLMmuPnC0yrZoTThWQ82uZ41DElP0rpAr6uIOKyUiMwyl9MRX66aWONpwjT3\nTBgR8RIASR8CdgL/h1aX2vOA+ZVEZ1aTKjf6JuxobDiknMN4Y0QsaXv/SUn/BHygpJgqN5UN1hu3\n5cjro1Uh5RzGU5LOk3SgpAMknQc8VXZgZmaWl5Qaxm8BHyv+AvhOMcysVK7N5WOqy8DLbDilXLj3\nL8A55Ydi1kzeuQ6HJizHlEe0zgPeASxk/CNa3z7JeGuBs4FdEXFi2/CLgXcCzwNfjojLeox/ILAZ\n2B4RZ086JTMwlQXdhJWiiVybMZtcSpPUF4C/Bb5Jayef6nrgWuDGsQGSltGqrSyJiFFJR04w/iXA\nA4C775plyom2WVISxiER8cdT/eKI2CBpYcfgi4CrI2K0KLOr27iSjgHeAFwFvHuqv23DwTuhyXmH\nbVVK6SX1JUnL+/R7i4HTJW2U9G1JS3uU+1/AZex7wp+ZmdUspYZxCfA+SaPAs+x7HsZ0mopmAXNo\nPWNjKXCLpEUR8cIV5ZLGzntskXTGZF8oaRWwCmDBggXTCMnMR+hmKVJ6Sb2kj7+3DVhfJIhNkvYC\nc4HdbWV+BXhjUas5GDhM0mci4vwe8a0GVgOMjIz0vJWJ2VRM1NTjZqB9PA+aJelutZJeChxHawcO\ntM5RTOP3bgOWAXdIWgzMBh5vLxARlwOXF797BnBpr2Rh1nTeYVuVUrrV/h6tZqljgHtoNSfdBZw5\nyXjrgDOAuZK2AVcAa4G1ku4DngFWRkRIOgpYExH9OldiNiNbd+6p7LdcY9nH8yJvqecwlgL/EBHL\nJL0S+O+TjRQRK3p8tF9tISJ2APsli4i4E7gzIUYzMytZSsJ4OiKeloSkgyLiQUnHlx6ZWY1OmN+7\nT0cOR78+Erc6pCSMbZKOoHX+4XZJPwb8nG8zs5pVfeCQ0kvqTcXLKyXdARwOfK3UqMxqVuWRu2sJ\n+3he5C21l9QpwK9S3K02Ip4pNSozM8uu6TGll9QHgN8E1heDPi3pryPiv5UamTVebhtLFVKnuUnz\nxPKRUsM4j9bNAp8GkHQ1re61ThhmVpomHjBMVdXzJuVeUjtou2APOAjYXk44+Tv3urteWJHNzJqk\nZw1D0l/QOmfxU+B+SbcX788CNlUTnplZmmGskeQ2LRM1SW0u/m8Bbm0bfmdp0Zi1ufnC016o0eW2\n4ZQldTqHcedo+euZMCLihs5hkk6JiLvLDcnMmqo9EToZ5iepW22bNcApZQQyKLwSm1lTTTVhqJQo\nzAq5NLXkEoel87Iq34QJQ5KAYyLisWLQn5YfkuWk7h2ndwLdeb40Q93bX6cJE0Zx6/GvAK8q3t9W\nSVRm1ki57Bitu5TrMO6e4NnbZtPma1rMBkvKOYxfBs6T9CjwFPue6f3qUiOzRsrlCDOXOGxqcmvC\n6be6py8lYbyu9CgsW8O64dnwOfe6u9i6c8+EzzKZyndB/et/3b/fKeX25n72hZmZTblbrVnf5Hb0\nZGYTc8Iws6FxwvzDhvpApO5pU0TUGkA/jYyMxObNmycvaMlyacs1s3JI2hIRIyllU7rVmpk1wkRd\nvd0N3AljHK8Q9fMyMMuXE0YC78TMbCqGdZ/hk942IZ+7MLMxThhmZoWJDpB88OSEMY5XiPp5GQye\nYexJN4zT1A9OGAm80pjlYxB25mXFVve0+6S3mZklccIwM7MkbpJKUHc10Cxnw7hdDOM09UNpCUPS\nWuBsYFdEnNg2/GLgncDzwJcj4rKO8Q4GNgAHFfF9LiKuKCtOMxssTd6Z1z3tZdYwrgeuBW4cGyBp\nGXAOsCQiRiUd2WW8UeDMiPiZpBcBfyfpqxHxDyXGamZmkyjtHEZEbACe6Bh8EXB1RIwWZXZ1GS8i\n4mfF2xcVf8Nzh8QuhvWqULOmGtZtuupzGIuB0yVdBTwNXBoR3+0sJOlAYAvwH4C/jIiN1YY5Xt3V\nQKuez1tZJ68T1feSmgXMAU4F3gPcIkmdhSLi+Yg4CTgG+CVJJ3aWGSNplaTNkjbv3r27rLjNrGLD\nepQ+yKpOGNuA9UWz0yZgLzC3V+GI+AlwB/D6CcqsjoiRiBiZN29e3wM2s4k1Zce+decetu7cU3cY\ntaq6Seo2YBlwh6TFwGzg8fYCkuYBz0bETyS9GDgL+HDFcVaqyVVcs2E0rNt0md1q1wFnAHMlbQOu\nANYCayXdBzwDrIyIkHQUsCYilgPzgRuK8xgHALdExJfKitOsm2Hd4G36Tph/WN0h1K60hBERK3p8\ndH6XsjuA5cXre4GTy4rLzAaDk3Z+/ExvM7MG8zO9zcys75wwzMwsiROGmZklccKwrDWlj7/ZdFS9\nfThh2FBxgjErjxOGmfX0qiu/zquu/HrdYYzjg4L6OGGYmVkSP3EvA74LZm+eJ2a9Vb19OGHYUHGC\nMSuPE4aZ9fS9K19Xdwj78UFBfXwOw8zMkriGkQEfMZn1R5nnA32u0TUMMzNL5IRh1oX7+t
fPyyA/\nThhmZpbE5zAy43bSPDT92c2DqsztxtukaxhmZpbINQybUFNrPH5+c/2ats4NAtcwzMwsiWsYmfFR\nVR68HMz254RhXbk7owEv3No8x1uEWPWcMGxCPtJutn8bfa7uELLXpPN8Podh0+KLqsyaxwnDzPqm\nrgMJH8BUw01SmcmlejvZ7w/7hW25LIe6jSycM6Xyw75eNJ1rGNaVj9jMrJNrGDYtvrDNrHmcMMys\nb+o6kGh602FVnDAyMygr/qDEOV3DPn1l8XwbboqIumPom5GRkdi8eXPdYZiZDQxJWyJiJKVsaSe9\nJa2VtEvSfR3DL5b0oKT7JV3TZbxXSLpD0taizCVlxWhmZunKbJK6HrgWuHFsgKRlwDnAkogYlXRk\nl/GeA/4oIu6W9BJgi6TbI2JribGamdkkSqthRMQG4ImOwRcBV0fEaFFmV5fxdkbE3cXrJ4EHgKPL\nitPMzNJUfR3GYuB0SRslfVvS0okKS1oInAxsnKDMKkmbJW3evXt3X4M1M7N9qk4Ys4A5wKnAe4Bb\nJKlbQUk/B3weeFdE9Lx8NCJWR8RIRIzMmzevjJjNzIzqE8Y2YH20bAL2AnM7C0l6Ea1kcVNErK84\nRjMz66LqhHEbsAxA0mJgNvB4e4GixvG/gQci4iMVx2dmZj2U2a12HXAXcLykbZJ+F1gLLCq62n4W\nWBkRIekoSV8pRv0V4LeBMyXdU/wtLytOMzNLU1q32ohY0eOj87uU3QEsL17/HdD1vIaZmdVnqK70\nlrQbeLTin51LR7PaAHDM1XDM1XDMM/MLEZHUY2ioEkYdJG1Ovaw+F465Go65Go65On4ehpmZJXHC\nMDOzJE4YM7e67gCmwTFXwzFXwzFXxOcwzMwsiWsYZmaWxAmjIOkISZ8rntXxgKTTJM2RdLukh4v/\nL20rf7mk70t6SNLr2oa/RtL3is8+PnavLEkHSbq5GL6xuLFiZTFLWijp39suhvxURjH/ZvHsk72S\nRjrK5zqfu8ac+Xz+s+L9vZJulXREW/lc53PXmDOfzx8q4r1H0jckHdVWvvb5PCMR4b9Ws9wNwO8V\nr2cDRwDXAO8thr0X+HDx+gTgn4CDgGOBfwYOLD7bROvmigK+Cvx6Mfz3gU8Vr98K3FxxzAuB+3p8\nT90x/yJwPHAnMNJWNuf53CvmnOfzrwGzimEfHpD1uVfMOc/nw9o+/8O238xiPs9oeuv88Vz+gMOB\nH1Cc02kb/hAwv3g9H3ioeH05cHlbua8DpxVlHmwbvgK4rr1M8XoWrYt2VGHMXTewHGJu+/xOxu98\ns53PE8Sc/XwuyryJ1s09B2I+d4l5UObz5cAnc5nPM/1zk1TLscBu4NOS/lHSGkmHAi+PiJ1FmR8C\nLy9eHw081jb+tmLY0cXrzuHjxomI54CfAi+rMGaAY4tq8rclnd4WV90x95LzfJ5wnAGYz2+ndSQ7\n7vc7Yss5Zsh4Pku6StJjwHnABzp/vyO2KmOeESeMllnAKbSOBE4GnqLVnPOCaKX4nLqUTTXmncCC\niDgJeDfwV5IOqzBeSIg5Q1ONOfv5LOn9tB6FfFPFcU1kqjFnPZ8j4v0R8Yoi3j+oOK7SOGG0bAO2\nRcTYk/0+R2tF+JGk+QDF/7FHym4HXtE2/jHFsO3F687h48aRNItWdfZfq4o5IkYj4l+L11totZ8u\nziTmXnKez13lPp8lXQCcDZxXHFCM+/2O2LKNOff53OYm4M2dv98RW5Uxz4gTBhARPwQek3R8Mei1\nwFbgi8DKYthK4AvF6y8Cby16MBwLHAdsKpqC9kg6tejl8Dsd44x911uAv2nbYEuPWdI8SQcWrxcV\nMT+SScy95Dyfu8p5Pkt6PXAZ8MaI+Le2UbKdz71iznw+H9dW7Bzgwbbfr3U+z1hdJ09y+wNOAjYD\n99J60NNLabUVfgt4GPgmMKet/PtpHdU8RNGjoRg+AtxXfHYt+y6OPBj4a+D7tHpELKoyZlpHOfcD\n9wB3A7+RUcxvonW0Ngr8CPj6AMznrjFnPp+/T6s9/J7i71MDMJ+7xpz5fP588fv3Av8XODqn+TyT\nP1/pbWZmSdwkZWZmSZwwzMwsiROGmZklccIwM7MkThhmZpbECcNshiSFpM+0vZ8labekLxXvL5B0\nbfH6Sknbi1taPCxpvaQT6ordbCqcMMxm7ingREkvLt6fxb4rdbv5aEScFBHHATcDfyNpXtlBms2U\nE4ZZf3wFeEPxegWwLmWkiLgZ+AbwWyXFZdY3Thhm/fFZWrd9OBh4NbBxkvLt7gZeWUpUZn3khGHW\nBxFxL61nNKygVduYCvU9ILMSOGGY9c8XgT8nsTmqzcnAA/0Px6y/ZtUdgNkQWQv8JCK+J+mMlBEk\nvZnWY0j/qMzAzPrBCcOsTyJiG/DxLh/NonVX2zH/RdL5wKG07lB6ZkTsriBEsxnx3WrNSibpo8DD\nEfGJumMxmwknDLMSSfoqMBv4zxHx07rjMZsJJwwzM0viXlJmZpbECcPMzJI4YZiZWRInDDMzS+KE\nYWZmSZwwzMwsyf8Hx5cXJQ6ih6IAAAAASUVORK5CYII=\n",
319 | "text/plain": [
320 | ""
321 | ]
322 | },
323 | "metadata": {},
324 | "output_type": "display_data"
325 | }
326 | ],
327 | "source": [
328 | "%matplotlib inline\n",
329 | "import matplotlib.pyplot as plt\n",
330 | "fig, ax = plt.subplots()\n",
331 | "\n",
332 | "ax.errorbar(lc['r']['mjd'], lc['r']['mag'], lc['r']['error'],\n",
333 | " fmt='', linestyle='None')\n",
334 | "ax.set(xlabel='MJD', ylabel='r-band magnitude')"
335 | ]
336 | },
337 | {
338 | "cell_type": "markdown",
339 | "metadata": {},
340 | "source": [
341 | "Now let's construct a Lomb-Scargle periodogram"
342 | ]
343 | },
344 | {
345 | "cell_type": "code",
346 | "execution_count": 42,
347 | "metadata": {},
348 | "outputs": [
349 | {
350 | "data": {
351 | "text/plain": [
352 | "[(0.0001, 1.0)]"
353 | ]
354 | },
355 | "execution_count": 42,
356 | "metadata": {},
357 | "output_type": "execute_result"
358 | },
359 | {
360 | "data": {
361 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAG3xJREFUeJzt3Xt0nPV95/HPtwZSSChOsZMlYGKSJUm9W0gThaTdXKB0\nEyA0PiR0g8khLQ2hbKBNT5tdXHbTXEg2ECDcfItxHHACcUricLOxMY7vd9nYlm+y5Zss3yT5ItmS\nZVvWd/+YGWk0msszo5l5Rnrer3N8juaZRzM//WR95vf8bo+5uwAAg98fhF0AAEB5EPgAEBEEPgBE\nBIEPABFB4ANARBD4ABARBD4ARASBDwARQeADQEScE9YbDxs2zEeOHBnW2wPAgLRmzZpmdx9eyPeG\nFvgjR45UdXV1WG8PAAOSme0p9Hvp0gGAiCDwASAiCHwAiAgCHwAigsAHgIgg8AEgIgh8AIgIAh+R\ntrSuWbua28IuBlAWoS28AirBV6aslCTtfujzIZcEKD1a+AAQEQQ+AEQEgQ8AEUHgA0BEEPgAEBEE\nPgBERKDAN7MbzKzWzOrMbGya5y8ys1fNbL2ZbTKzO4tfVABAf+QMfDMbImm8pBsljZI0xsxGpZx2\nr6TN7n61pGslPWZm5xW5rACAfgjSwr9GUp2773T305KmSxqdco5LutDMTNI7JB2R1FnUkgIA+iVI\n4F8qaW/S44b4sWTjJP2JpP2SaiR90927ilJCAEBRFGvQ9nOS1kl6j6QPSxpnZn+UepKZ3W1m1WZW\n3dTUVKS3BgAEESTw90kakfT4svixZHdKmuExdZJ2SfpQ6gu5+2R3r3L3quHDC7rpOgCgQEECf7Wk\nK83sivhA7G2SXkk5p17S9ZJkZu+W9EFJO4tZUABA/+TcLdPdO83sPklzJA2RNNXdN5nZPfHnJ0l6\nUNKzZlYjySTd7+7NJSw3ACBPgbZHdvdZkmalHJuU9PV+SZ8tbtEAAMXESlsAiAgCHwAigsAHgIgg\n8AEgIgh8AIgIAh8AIoLAB4CIIPABICIIfACICAIfACKCwAeAiCDwASAiCHwAiAgCHwAigsAHgIgg\n8AEgIgh8AIgIAh8AIoLAB4CIIPABICIIfACICAIfACKCwAeAiCDwASAiCHwAiAgCHwAigsAHgIgg\n8AEgIgh8AIgIAh8AIoLAB4ASO9vleqv+aNjFIPABoNSefHObbpmwTOv3Hgu1HAQ+AJTY5gPHJUmH\nWjtCLQeBDwARQeADQAkdbTutN7ccCrsYkgh8ACipFTsPh12EbgQ+AEQEgQ8AZeIhv3+gwDezG8ys\n1szqzGxshnOuNbN1ZrbJzBYWt5gAgP46J9cJZjZE0nhJ/11Sg6TVZvaKu29OOmeopAmSbnD3ejN7\nV6kKDAAoTJAW/jWS6tx9p7ufljRd0uiUc26XNMPd6yXJ3RuLW0wAQH8FCfxLJe1NetwQP5bsA5Le\naWYLzGyNmX21WAUEgMHCQn7/nF06ebzORyVdL+l8ScvNbIW7b0s+yczulnS3JF1++eVFemsAqFwW\ndsonCdLC3ydpRNLjy+LHkjVImuPube7eLGmRpKtTX8jdJ7t7lbtXDR8+vNAyAwAKECTwV0u60syu\nMLPzJN0m6ZWUc16W9EkzO8fMLpD0cUlbiltUABh4PGkuZtjTMnN26bh7p5ndJ2mOpCGSprr7JjO7\nJ/78JHffYmazJW2Q1CVpirtvLGXBAQD5CdSH7+6zJM1KOTYp5fEjkh4pXtEAAMXESlsAiAgCHwBK\naKDN0gEADAIEPgBEBIEPABFB4ANARBD4AFBCHvZqqyQEPgBEBIEPACXEtEwAQNkR+ABQJmH35xP4\nABARBD4AlEnY/fkEPgBEBIEPACVVOdN0CHwAKKnKWXlF4ANAmTBLBwBQFgQ+AEQEgQ8AEUHgA0BE\nEPgAUFI90zJZeAUAKAsCHwDKhGmZADCosfAKAFBmBD4ARASBDwARQeADQEmxWyYAoMwIfACICAIf\nACKCwAeAiCDwASAiCHwAiAgCHwAigsAHgLIJd18dAh8AIiJQ4JvZDWZWa2Z1ZjY2y3kfM7NOM7u1\neEUEABRDzsA3syGSxku6UdIoSWPMbFSG8x6W9EaxCwkAg0O42ywEaeFfI6nO3Xe6+2lJ0yWNTnPe\nP0r6raTGIpYPAFAkQQL/Ukl7kx43xI91M7NLJd0iaWLxigYAA1/Y97FNVqxB2yck3e/uXdlOMrO7\nzazazKqbmpqK9NYAULnCvq1hsnMCnLNP0oikx5fFjyWrkjTdYh9lwyTdZGad7v5S8knuPlnSZEmq\nqqqqoGoAgHIIN/aCBP5qSVea2RWKBf1tkm5PPsHdr0h8bWbPSnotNewBAOHKGfju3mlm90maI2mI\npKnuvsnM7ok/P6nEZQRKYtP+lrCLAJRVkBa+3H2WpFkpx9IGvbv/Xf+LBZRex5msQ07AoMNKWwAo\nocE4SwcAkFPlL7wCABQFm6cBAMqAwAeAiCDwAaCEKmmlLYEPABFB4ANACTEtEwBQdgQ+AJRJ2P35\nBD4ARASBDwBlEnZ/PoEPABFB4ANACVXQJB0CHwBKqYLWXRH4AFAuzNIBAJQFgQ8AEUHgA0BEEPgA\nEBEEPgCUUPK0TBZeAQDKgsAHgDJhWiZQ4VbtOqIZaxvCLgYGqEpaeHVO2AUAKt3/+OlySdIXP3JZ\nyCUB+ocWPgBEBIEPACXE5mkAgLIj8AGghCpp0JbAB4CIIPABICIIfERYJV1sA6VH4ANARBD4iLBK\nmjAHlB6BDwARQeADks52uR6fu02tHWfCLgoGsbBHjQh8FGRH0wn9t4d+r8bjHWEXpShmbzyoJ+dt\n1w9f2xJ2UYCSIfBRkKlLdmnfsZOas+lQ2EUpijNnuyRJHZ1nQy4JBpuwt0ROFijwzewGM6s1szoz\nG5vm+a+Y2QYzqzGzZWZ2dfGLCgADW9jTBHIGvpkNkTRe0o2SRkkaY2ajUk7bJekz7v6nkh6UNLnY\nBQWAgSjs2xomC9LCv0ZSnbvvdPfTkqZLGp18grsvc/ej8YcrJLFxOCpS4/EOjRw7Uwu3NYVdFKDs\nggT+pZL2Jj1uiB/L5GuSXk/3hJndbWbVZlbd1MQfHMpvw94WSdK0ZbvDLQgiKUh3/uETp9RysjSz\nxYo6aGtm1ykW+Pene97dJ7t7lbtXDR8+vJhvDQBl03zilOZsOhjo3HwHbT/6gzdV9YO5BZQqtyCB\nv0/SiKTHl8WP9WJmV0maImm0ux8uTvEAoPL87dRV+odfrFHbqc6SvP6Zs6WZ2hMk8FdLutLMrjCz\n8yTdJumV5BPM7HJJMyTd4e7bil9MAKgc9YfbJUlnK2nOZQA5b2Lu7p1mdp+kOZKGSJrq7pvM7J74\n85Mk/bukiyVNsNiQdKe7V5Wu2ACAfOUMfEly91mSZqUcm5T09V2S7ipu0QAgHGvrj2rY29+myy++\noN+vVUnTMgMFPjDYDKwLcZTbF
ycskyTtfujzWc8L0qOTfE6u7L/jZytzv2A/sLUCIozYR3nl+h+3\neHtzSd+fwEckVdBVNgawSuquCYLAR4T1/WttPnEqhHJUltqDx/UXP5qnw9RFTgNskg59+IiGvUfa\ntWBbky75oz/Met7Sut5LSGoPHi9lsSrKFycs1fqGFo3+8Hu0v6VD82ubdOtH2SUlrQHWsk+ghY9Q\nfOvF9Zq4YEfG5zfvb9VrG/YX7f3GPLNC335po05kWCjjGXpXfzBzc9HKUEpPzduuF6v35j4xi7X1\nx3S2q6cefKA1X8tpgFYNLXzk7Uevb9HzK+v79Rq/WdMgSfqf174/7fM3PbVYknTzVe/p1/sktLTH\n9iZJBHvQv9dDrQPjBi8/mRtb7/g3VSNynJmbDdTmK3KihY+8/XThzrTHO892ZWxBV4pMYZapMbur\nua2Epals/W3Etp3q1FXfnTM4dyYt8DMx7IsmAh9Fc+8La/VfvzMn7GKEouFou053doVdjKIo1syT\nusYTau3o1GNv1BbnBdFvBD6KppDbHS6rK+2841SJLp2zXa4vTVxWlNds7TijTz48X//3pZqivN7R\nttNauXPg7D945myXth5sDbsYZbOz6YSOd8SvZAO12HtOCnsaJ4GPkms6fkpH2k6nfe72KaVdWZhJ\nZ1fm1ri7a9uh4LNz2k/F7oNbrK6Lr0xZqS9PXqGurpCv/wO+/Q9nbtENTyzW3iPt6V9mgA5wZjJ3\nc0/DJtNgf6Ui8FFyH/vhm/rIg3P1Vv3R3CenyLebZP3eY73+IBPy+bP87dp9+uzjizRy7My8tqk9\n1HpK/+d32Vv5r6zfrxlrG7Kes/lArLUcVmtwRfzqImiYrY3/Xg9n+FBHj7CHwwl8lIy7a+O+lu7H\nt0xYlvf+4XuPpm81ZjJ6/FJ9fVp12rL0fpz6fM/Xm/a3qFC5Zi/906/e0r/8x/qCXz+I365p0OyN\nByTFuh8efG1zXlMsG46eLGp5wu7GKLbknydItVqBFfD7rfl3keZC4KNftsRbo2v2HOnz3KsbDujm\np5f0OtaZpsXceLxDYyavSNvt88CMwvrFEwumNu5r0aNzatV2OtbtMqsm2F2K8lGqy/pcYTJ748G0\ndfavL67XPb9cK0m667lq/WzJrlBnG5W6S2fLgdaydn8lz/QK8q6Fft79/bN9Gy79ReCjX16It2gP\ntvRdhr89YD/4nT9freU7D+s/0iwcWrkr9kEyfVW9bn56ceBy3fRUrE/55qeXaNz8uu7j+4/lbr0W\nu4UbxO7mNjXmMef/8IlTuueXa9JezSRL3KCjkFbmaxsOBDov8crJVxEP/K5GP1+6K+/3TPjF8t2B\nBq5rGlp045OLNWFBXc5zC1XT0NKrazG1KusaT2jP4cwfqIW28EuBwEdRzNtS+OXnpv29Z3ikttZq\nGlo0dkaNNu5r1a7mtl6rQTM52+VZt5rN9jdY13gi5+tnU8jg7bWPLtA1/29e4PMTYwsNObq8Ehlc\nSOSk27lx6pJdgbabeGFlvV5aV/hK6W+/vElfnrwi53n7jsV+/g0NubvhWjvO6PG529R5Nvi40J7D\nbfrrcUv04GvpV1y7u/7qJwv1mUcWZHyNXHW/79hJfeaR+doXoDHSXwQ++m3M5BWa8Vaf2xwX7KHZ\nW3s9/utxPd1C1z26QO9/YFavsYGE1NZ7ukVgiQBM3TOnP1IXc9U0HOtzzt4j7Wm7vbLJ9rG2Zk9s\noPRQa+8rq2Pt6QdOzaStB1vVeLx/K4e//9pmff6p4Fda5bKgtknfnP5W1nN+NGurnpy3Xa9vDN6t\nl+gy25DmdyoF7NKx3l/f8bOVuveFtd3Hfr2qXnsOt/d7a4wgCHz02/IMl9759t0+9PpWNR7v0ORF\n6VfyJnvizb63Tm46nnt3x0xFemldzwdWV46Cv7CyPu9W/Kd+PF9fmrg8r+/JZvnO9OsXPvz9ub0e\nn+qMjV2c7uzSDU8s1qceni8pdhXz3Vc25dX33doR256iM/V7KqDL4vTZLr2cdEUxccEOfefljb26\nmTrOxOriTLyFf+ZslyYt3NFdR+kkumP6M0KQWj2LtzdrZprusnSrwFtOnunHO/dF4KPoag8eV2Nr\nR6++84TmtlO65xdrMn7vF55eGug93tzS2N3KTUht4aXL7cQgc6rk7ovG1swfHEfaTuuB39Xob6eu\n6nmflDgoxcBwqqAfpokrgKlLd0uSTsX7or8+rVrPLtut3fG+50yzp06e7gnDe59fm/acZHWNx/t8\niCS64DrOnC361hvp6qHhaLsenr1Vzy3f0yswU2cqPb9ijx56faueydLA6BmfSDqWlODJxzNfXfWc\n/+S8zGMN6Qb/r/7eGxnPLwSBj6L73BOLtG5v+kvgp+dt1+xNmQOxT+sxi6NJM1TmbTmk3Yd792cX\nOi88Wwv/+RV7cn7/5gwfKkF85+WNgc5LLuGOptxjDqnrGbpSBnMX1Ka/Ylm9u6cbakeOsY0tB47r\nr36yqM8AaqI+Pv3j+d1bb2zc19I93z+XxdubNHLszF6/72w6ziQNsKZpNf9yxR6NHDtTW+NjEW2n\ns7Xws79Xckh/68XYdFt3796sL1aGHmkbHGW8QiLwURInz2T+I8qm0BuQfO254k1hO5Vlsddjc3u6\nkrq6XM8u3aVTZ/qev+dwW9augkyeW97zgRJ07vzKnenHBoJ0BySiJlPmfHXqqu5yZCrN+viHe2IA\n9a369B/2jUldbjc/vUS3BRiUlXo269u4v0UnT5/V16dV6xvPr+neITRVrvxcGy/f9NXB+8yTgz35\n99K9xYKk5hOxD6RfrqzX1d9/I9AHcbkR+Bjw5mS5Yiim1NlBj82t1Xdf3azH04wnfOaRBfrWixu0\nZs8R1R/uO5Pm7mnVuv2ZWOBlmp3x1t5j3V0qzSdO6QevbU47w+SBDKt7k+fo5wrBbE8ntrLOpVTT\nWXcmBecbmw9q7uZDmlVzUE/N297nQyjbh2TQa8fkq5rEFUKml73+sYV9js3f2ihJ2tUU6y77g362\n4JftKN5+UwQ+SuLfMiyYKubymOeW71bD0Xb9Q5YxgWIa80zvFun4+bEbuLycYfrh4u1N+tLE5fr0\nI/P7PPfG5kNatiPWpZFpvcLfTFre3U3w7Zc2asqSXd1dL/nM2Zf6hnZ7/IMk29VMwtzNh3TrxGU6\n0JL9PRP1MC8eeMky/YzpBi9T7c/yvt9IGVfo8t4BO2357ox7/CTM39qo0eOXakfTCR1rP63//ZsN\n3c8lXip16nA23t1d1vs1UnVPPIifP+736fv3b39mpWoCTDsNghugoCTaM/SLLkkzt7tQi7c36/G5\n24v2erms2pXftMqgsn0I1sSnnyb64E+c6pS7680tfUM1qK4u757RNG35bv3wlj/NegXwRpq9ifKV\nabB2bf1Rff6qS3odS27RJmYGJZzM0t8uxQZOk3+Ux+Zu02/WNmjh/7pO9RmCP9GXn661/l
Ie043T\n/R6X1jV3f7CneuLN7drR1KYrLr5AUvbxqyMZBoTzReCjrIq9wVZTBd9o+1h77j709tOd+unCzLd6\nTF1b8M+/Xqd//vW6fpXre69u6v66ZzC3tAOHt0zo2Yo6eTZLuq6S25/pWTD34Ks9C5427mvNuWjq\nC+OWatg7zut1bM/hdm3c15Jz0DmdKUvyWC2c8sPMrDmgGWuzf2C8un6//ukv/3POly7Wb4cuHQxo\niwb43ZR+PLtWKzIMukqxVl/HmbNpu0lyue7RBWmPJw8MF7oFzcvr9mnk2Jndc9vzkbxWYOrSXRqf\nZvpuQk3SAruHZ2/tNWiezr5jJ7U+TffHzU8vUWtH/6aEPjon+41cXLEB7PnxbrdcYZ+v5hOnCtpx\nNhmBD4To2WW7c55TyhWYXe66/rEFemFVfvco/ub02FXGz/JpAWfwzOLM8+Ar6ZaZ4+bXZf2Ac5fu\nyrG3UX+MHre015VSIejSQV7y2WYXxRFkn5hC/S7eR72jqbDdNDMtZMtHtq6vjjRTXsNUf6S9e0O/\nVDVptvsI4liQ6bOWeTZXPgh85CXTnatQOi8GnBYZhmJ9/G87dFwfePeFfY4Xui6jVD77+KKiv+a0\n5bkX8/V3Q78EunSQF9r3SBZkWmUQn318kebXFj7zaLD73qvpd+vMF4GPvOR7xyogqDt/vjrtLqgo\nHgIfeQn7vtoY3FLvkIbiIvCRl0rrUwUQHIGPvLQEWEwEoDIR+AAQEQQ+AEQEgQ8AEREo8M3sBjOr\nNbM6Mxub5nkzs6fiz28ws48Uv6ioBEzSAQaunIFvZkMkjZd0o6RRksaY2aiU026UdGX8392SJha5\nnACAfgrSwr9GUp2773T305KmSxqdcs5oSdM8ZoWkoWZ2SeoLYeB7+9uGhF0EAAUKEviXSkrerq8h\nfizfczAIXHLR+WEXAUCByjpoa2Z3m1m1mVU3NQ3sfcyj6pKL/jDsIqCIPvG+P+5zbNg73tbnWOKu\nVBe+LfN+i++84NxA7zn0gnN1y5/F2oPf+8J/CfQ9QVx12UXdX9/xiffq/HOH6Oar0nc0vP28zFeq\n70n6P37vde+XJH0wzcZuY665XJ+6cpiGXnCuRvzx+Xpv/M5V7734Al10/rn60H+6MOncEd1f//n7\nLtaXq2KPPzxiaPdrXz1iqCTpXRfG6v/HX7pKL9z1cV06NNbI+rPLh2Ysc1CWa7tbM/tzSd9198/F\nH/+bJLn7j5LO+amkBe7+q/jjWknXunvGnZWqqqq8urp0e0cDwGBkZmvcvaqQ7w3Swl8t6Uozu8LM\nzpN0m6RXUs55RdJX47N1PiGpJVvYAwDKL+d++O7eaWb3SZojaYikqe6+yczuiT8/SdIsSTdJqpPU\nLunO0hUZAFCIQDdAcfdZioV68rFJSV+7pHuLWzQAQDGx0hYAIoLAB4CIIPABICIIfACICAIfACIi\n58Krkr2x2XFJtaG8eeUZJqk57EJUCOqiB3XRg7ro8UF377v0N4BA0zJLpLbQ1WKDjZlVUxcx1EUP\n6qIHddHDzAreooAuHQCICAIfACIizMCfHOJ7Vxrqogd10YO66EFd9Ci4LkIbtAUAlBddOgAQESUP\nfG6A3iNAXXwlXgc1ZrbMzK4Oo5zlkKsuks77mJl1mtmt5SxfOQWpCzO71szWmdkmM1tY7jKWS4C/\nkYvM7FUzWx+vi0G5M6+ZTTWzRjPbmOH5wnLT3Uv2T7HtlHdIep+k8yStlzQq5ZybJL0uySR9QtLK\nUpYprH8B6+IvJL0z/vWNUa6LpPN+r9hOrbeGXe4Q/18MlbRZ0uXxx+8Ku9wh1sUDkh6Ofz1c0hFJ\n54Vd9hLUxaclfUTSxgzPF5SbpW7hcwP0Hjnrwt2XufvR+MMVki4rcxnLJcj/C0n6R0m/ldRYzsKV\nWZC6uF3SDHevlyR3H6z1EaQuXNKFZmaS3qFY4HeWt5il5+6LFPvZMikoN0sd+NwAvUe+P+fXFPsE\nH4xy1oWZXSrpFkkTy1iuMAT5f/EBSe80swVmtsbMvlq20pVXkLoYJ+lPJO2XVCPpm+7eVZ7iVZSC\ncjPMlbbIwMyuUyzwPxl2WUL0hKT73b0r1piLtHMkfVTS9ZLOl7TczFa4+7ZwixWKz0laJ+kvJb1f\n0lwzW+zureEWa2AodeDvkzQi6fFl8WP5njMYBPo5zewqSVMk3ejuh8tUtnILUhdVkqbHw36YpJvM\nrNPdXypPEcsmSF00SDrs7m2S2sxskaSrJQ22wA9SF3dKeshjHdl1ZrZL0ockrSpPEStGQblZ6i4d\nboDeI2ddmNnlkmZIumOQt95y1oW7X+HuI919pKTfSPrGIAx7KdjfyMuSPmlm55jZBZI+LmlLmctZ\nDkHqol6xKx2Z2bslfVDSzrKWsjIUlJslbeE7N0DvFrAu/l3SxZImxFu2nT4IN4wKWBeREKQu3H2L\nmc2WtEFSl6Qp7p52ut5AFvD/xYOSnjWzGsVmqNzv7oNuF00z+5WkayUNM7MGSd+RdK7Uv9xkpS0A\nRAQrbQEgIgh8AIgIAh8AIoLAB4CIIPABICIIfACICAIfACKCwAeAiPj/JAEkSiSzPy4AAAAASUVO\nRK5CYII=\n",
362 | "text/plain": [
363 | ""
364 | ]
365 | },
366 | "metadata": {},
367 | "output_type": "display_data"
368 | }
369 | ],
370 | "source": [
371 | "from gatspy.periodic import LombScargleFast\n",
372 | "model = LombScargleFast().fit(lc['r']['mjd'], lc['r']['mag'],\n",
373 | " lc['r']['error'])\n",
374 | "periods, power = model.periodogram_auto(nyquist_factor=100)\n",
375 | "\n",
376 | "fig, ax = plt.subplots()\n",
377 | "ax.plot(periods, power)\n",
378 | "ax.set(xlim=(0.0001, 1.0))"
379 | ]
380 | },
381 | {
382 | "cell_type": "markdown",
383 | "metadata": {},
384 | "source": [
385 | "Now let's find the highest-power period"
386 | ]
387 | },
388 | {
389 | "cell_type": "code",
390 | "execution_count": 43,
391 | "metadata": {},
392 | "outputs": [
393 | {
394 | "data": {
395 | "text/plain": [
396 | "gatspy.periodic.lomb_scargle_fast.LombScargleFast"
397 | ]
398 | },
399 | "execution_count": 43,
400 | "metadata": {},
401 | "output_type": "execute_result"
402 | }
403 | ],
404 | "source": [
405 | "type(model)\n"
406 | ]
407 | },
408 | {
409 | "cell_type": "code",
410 | "execution_count": 44,
411 | "metadata": {},
412 | "outputs": [
413 | {
414 | "name": "stdout",
415 | "output_type": "stream",
416 | "text": [
417 | "Finding optimal frequency:\n",
418 | " - Estimated peak width = 0.00186\n",
419 | " - Using 5 steps per peak; omega_step = 0.000372\n",
420 | " - User-specified period range: 0.1 to 1.4\n",
421 | " - Computing periods at 156928 steps\n",
422 | "Zooming-in on 5 candidate peaks:\n",
423 | " - Computing periods at 995 steps\n",
424 | "0.485871142483\n"
425 | ]
426 | }
427 | ],
428 | "source": [
429 | "model.optimizer.period_range = (0.1, 1.4)\n",
430 | "best_period = model.best_period\n",
431 | "print(best_period)"
432 | ]
433 | },
434 | {
435 | "cell_type": "markdown",
436 | "metadata": {},
437 | "source": [
438 | "As you can see, it is very close to our expected period of 0.485873 days."
439 | ]
440 | },
441 | {
442 | "cell_type": "markdown",
443 | "metadata": {},
444 | "source": [
445 | "Now let's analyze several light curves. We will start by scraping the light curve files in `sims_sed_library` to get their periods."
446 | ]
447 | },
448 | {
449 | "cell_type": "code",
450 | "execution_count": 45,
451 | "metadata": {},
452 | "outputs": [],
453 | "source": [
454 | "from lsst.utils import getPackageDir\n",
455 | "import time\n",
456 | "\n",
457 | "t_start = time.time()\n",
458 | "\n",
459 | "reference_dict = {}\n",
460 | "truth_dir = os.path.join(getPackageDir('sims_sed_library'), 'rrly_lc')\n",
461 | "\n",
462 | "rrab_dir = os.path.join(truth_dir, 'RRab')\n",
463 | "rrc_dir = os.path.join(truth_dir, 'RRc')\n",
464 | "\n",
465 | "for file_name in os.listdir(rrab_dir):\n",
466 | " if 'per' in file_name:\n",
467 | " full_name = os.path.join(rrab_dir, file_name)\n",
468 | " with open(full_name, \"r\") as input_file:\n",
469 | " for line in input_file:\n",
470 | " vv = line.split()\n",
471 | " if vv[1] == 'Period':\n",
472 | " reference_dict['rrly_lc/RRab/'+file_name] = float(vv[3])\n",
473 | " break\n",
474 | "\n",
475 | "for file_name in os.listdir(rrc_dir):\n",
476 | " if 'per' in file_name:\n",
477 | " full_name = os.path.join(rrc_dir, file_name)\n",
478 | " with open(full_name, \"r\") as input_file:\n",
479 | " for line in input_file:\n",
480 | " vv = line.split()\n",
481 | " if vv[1] == 'Period':\n",
482 | " reference_dict['rrly_lc/RRc/'+file_name] = float(vv[3])\n",
483 | " break\n"
484 | ]
485 | },
486 | {
487 | "cell_type": "markdown",
488 | "metadata": {},
489 | "source": [
490 | "Now let's loop through the first 10 light curves and calculate their highest-power periods."
491 | ]
492 | },
493 | {
494 | "cell_type": "code",
495 | "execution_count": 46,
496 | "metadata": {},
497 | "outputs": [
498 | {
499 | "name": "stdout",
500 | "output_type": "stream",
501 | "text": [
502 | "Finding optimal frequency:\n",
503 | " - Estimated peak width = 0.00183\n",
504 | " - Using 5 steps per peak; omega_step = 0.000367\n",
505 | " - User-specified period range: 0.1 to 1.4\n",
506 | " - Computing periods at 159151 steps\n",
507 | "Zooming-in on 5 candidate peaks:\n",
508 | " - Computing periods at 1000 steps\n",
509 | "Finding optimal frequency:\n",
510 | " - Estimated peak width = 0.00183\n",
511 | " - Using 5 steps per peak; omega_step = 0.000367\n",
512 | " - User-specified period range: 0.1 to 1.4\n",
513 | " - Computing periods at 159151 steps\n",
514 | "Zooming-in on 5 candidate peaks:\n",
515 | " - Computing periods at 1000 steps\n",
516 | "Finding optimal frequency:\n",
517 | " - Estimated peak width = 0.00183\n",
518 | " - Using 5 steps per peak; omega_step = 0.000367\n",
519 | " - User-specified period range: 0.1 to 1.4\n",
520 | " - Computing periods at 159151 steps\n",
521 | "Zooming-in on 5 candidate peaks:\n",
522 | " - Computing periods at 1000 steps\n",
523 | "Finding optimal frequency:\n",
524 | " - Estimated peak width = 0.00183\n",
525 | " - Using 5 steps per peak; omega_step = 0.000367\n",
526 | " - User-specified period range: 0.1 to 1.4\n",
527 | " - Computing periods at 159151 steps\n",
528 | "Zooming-in on 5 candidate peaks:\n",
529 | " - Computing periods at 1000 steps\n",
530 | "Finding optimal frequency:\n",
531 | " - Estimated peak width = 0.00183\n",
532 | " - Using 5 steps per peak; omega_step = 0.000367\n",
533 | " - User-specified period range: 0.1 to 1.4\n",
534 | " - Computing periods at 159151 steps\n",
535 | "Zooming-in on 5 candidate peaks:\n",
536 | " - Computing periods at 1000 steps\n",
537 | "Finding optimal frequency:\n",
538 | " - Estimated peak width = 0.00183\n",
539 | " - Using 5 steps per peak; omega_step = 0.000367\n",
540 | " - User-specified period range: 0.1 to 1.4\n",
541 | " - Computing periods at 159151 steps\n",
542 | "Zooming-in on 5 candidate peaks:\n",
543 | " - Computing periods at 1000 steps\n",
544 | "Finding optimal frequency:\n",
545 | " - Estimated peak width = 0.00183\n",
546 | " - Using 5 steps per peak; omega_step = 0.000367\n",
547 | " - User-specified period range: 0.1 to 1.4\n",
548 | " - Computing periods at 159151 steps\n",
549 | "Zooming-in on 5 candidate peaks:\n",
550 | " - Computing periods at 1000 steps\n",
551 | "Finding optimal frequency:\n",
552 | " - Estimated peak width = 0.00183\n",
553 | " - Using 5 steps per peak; omega_step = 0.000367\n",
554 | " - User-specified period range: 0.1 to 1.4\n",
555 | " - Computing periods at 159151 steps\n",
556 | "Zooming-in on 5 candidate peaks:\n",
557 | " - Computing periods at 1000 steps\n",
558 | "Finding optimal frequency:\n",
559 | " - Estimated peak width = 0.00183\n",
560 | " - Using 5 steps per peak; omega_step = 0.000367\n",
561 | " - User-specified period range: 0.1 to 1.4\n",
562 | " - Computing periods at 159151 steps\n",
563 | "Zooming-in on 5 candidate peaks:\n",
564 | " - Computing periods at 1000 steps\n",
565 | "Finding optimal frequency:\n",
566 | " - Estimated peak width = 0.00183\n",
567 | " - Using 5 steps per peak; omega_step = 0.000367\n",
568 | " - User-specified period range: 0.1 to 1.4\n",
569 | " - Computing periods at 159151 steps\n",
570 | "Zooming-in on 5 candidate peaks:\n",
571 | " - Computing periods at 1000 steps\n"
572 | ]
573 | }
574 | ],
575 | "source": [
576 | "import json\n",
577 | "\n",
578 | "true_period = []\n",
579 | "found_period = []\n",
580 | "\n",
581 | "for kk in list(lc_dict.keys())[:10]:\n",
582 | " lc_obj = lc_dict[kk]\n",
583 | " truth = json.loads(truth_dict[kk])\n",
584 | " file_name = truth['pars']['filename']\n",
585 | " for ff in list(lc_obj.keys())[:1]:\n",
586 | " lc = lc_obj[ff]\n",
587 | "\n",
588 | " model = LombScargleFast().fit(lc['mjd'], lc['mag'],\n",
589 | " lc['error'])\n",
590 | " model.optimizer.period_range = (0.1, 1.4)\n",
591 | " best_period = model.best_period\n",
592 | " true_period.append(reference_dict[file_name])\n",
593 | " found_period.append(best_period)\n",
594 | "\n",
595 | "true_period = np.array(true_period)\n",
596 | "found_peroid = np.array(found_period)"
597 | ]
598 | },
599 | {
600 | "cell_type": "markdown",
601 | "metadata": {},
602 | "source": [
603 | "Comparing to the true periods in our light curve files, we find very good agreement."
604 | ]
605 | },
606 | {
607 | "cell_type": "code",
608 | "execution_count": 47,
609 | "metadata": {},
610 | "outputs": [
611 | {
612 | "name": "stdout",
613 | "output_type": "stream",
614 | "text": [
615 | "0.499788 0.499783879203\n",
616 | "0.622184 0.622197385829\n",
617 | "0.519668 0.519664820029\n",
618 | "0.74297 0.742969569551\n",
619 | "0.565397 0.565398053065\n",
620 | "0.603456 0.603461616185\n",
621 | "0.71533 0.715328783451\n",
622 | "0.549881 0.549880333314\n",
623 | "0.716773 0.716771611366\n",
624 | "0.587908 0.587903631365\n"
625 | ]
626 | }
627 | ],
628 | "source": [
629 | "for tt, ff in zip(true_period, found_period):\n",
630 | " print(tt, ff)"
631 | ]
632 | },
633 | {
634 | "cell_type": "markdown",
635 | "metadata": {
636 | "collapsed": true
637 | },
638 | "source": [
639 | "# Generating your own cadence"
640 | ]
641 | },
642 | {
643 | "cell_type": "markdown",
644 | "metadata": {},
645 | "source": [
646 | "The work above used an OpSim-generated database of observations to define the observing cadence in the light curves. If you want to experiment with your own cadence, you will need to mock the OpSim database.\n",
647 | "\n",
648 | "The relevant table is the `Summary` table, whose schema can be found here\n",
649 | "\n",
650 | "https://www.lsst.org/scientists/simulations/opsim/summary-table-column-descriptions-v335\n",
651 | "\n",
652 | "Below: we will generate a somewhat random set of observations and construct a sqlite database that contains the relevant columns of the `Summary` table and use that as our cadence for light curve generation."
653 | ]
654 | },
655 | {
656 | "cell_type": "code",
657 | "execution_count": 48,
658 | "metadata": {},
659 | "outputs": [
660 | {
661 | "name": "stderr",
662 | "output_type": "stream",
663 | "text": [
664 | "WARNING: ErfaWarning: ERFA function \"utcut1\" yielded 1 of \"dubious year (Note 3)\" [astropy._erfa.core]\n"
665 | ]
666 | },
667 | {
668 | "name": "stdout",
669 | "output_type": "stream",
670 | "text": [
671 | "getting solar ra dec took 15.464292049407959\n",
672 | "460\n",
673 | "112\n"
674 | ]
675 | }
676 | ],
677 | "source": [
678 | "from lsst.sims.utils import ObservationMetaData, solarRaDec\n",
679 | "from lsst.sims.utils import altAzPaFromRaDec\n",
680 | "\n",
681 | "import warnings\n",
682 | "\n",
683 | "# suppress some warnings raised by astropy.time when you\n",
684 | "# calculate the relationship between UT1 and UTC in the future\n",
685 | "warnings.filterwarnings('ignore',message=\".*UT1-UTC.*\")\n",
686 | "\n",
687 | "rng = np.random.RandomState(87)\n",
688 | "\n",
689 | "# pick a point on the sky to center our observations\n",
690 | "field_ra = 112.0\n",
691 | "field_dec = -31.0\n",
692 | "\n",
693 | "# generate a random sample of MJDs\n",
694 | "mjd_candidates = rng.random_sample(1000)*3653.0+59580.0\n",
695 | "mjd_candidates.sort()\n",
696 | "\n",
697 | "obs_list = [ObservationMetaData(mjd=mm) for mm in mjd_candidates]\n",
698 | "\n",
699 | "# calculate the altitude and azimuth of the sun at those MJDs\n",
700 | "solar_alt = []\n",
701 | "solar_az = []\n",
702 | "for obs in obs_list:\n",
703 | " rr, dd = solarRaDec(obs.mjd)\n",
704 | " alt, az, pp = altAzPaFromRaDec(rr, dd, obs)\n",
705 | " solar_alt.append(alt)\n",
706 | " solar_az.append(az) \n",
707 | "\n",
708 | "print(\"getting solar ra dec took \",time.time()-t_start)\n",
709 | "\n",
710 | "# select only those MJDs where the sun is at least 10 degrees below\n",
711 | "# the horizon\n",
712 | "solar_alt = np.array(solar_alt)\n",
713 | "solar_az = np.array(solar_az)\n",
714 | "\n",
715 | "dexes_sundown = np.where(solar_alt<-10.0)\n",
716 | "print(len(dexes_sundown[0]))\n",
717 | "\n",
718 | "n_time = len(dexes_sundown[0])\n",
719 | "\n",
720 | "# further select MJDs where our field is more thatn 45 degrees\n",
721 | "# above the horizon\n",
722 | "obs_list = np.array(obs_list)[dexes_sundown]\n",
723 | "alt_list = []\n",
724 | "az_list = []\n",
725 | "for obs in obs_list:\n",
726 | " alt, az, pp = altAzPaFromRaDec(field_ra, field_dec, obs)\n",
727 | " alt_list.append(alt)\n",
728 | " az_list.append(az)\n",
729 | "\n",
730 | "alt_list = np.array(alt_list)\n",
731 | "dexes = np.where(alt_list>45.0)\n",
732 | "\n",
733 | "mjd_list = np.array([obs.mjd.TAI for obs in obs_list[dexes]])\n",
734 | "airmass_list = np.array([1.0/np.cos(0.5*np.pi-alt)\n",
735 | " for alt in np.radians(alt_list[dexes])])\n",
736 | "print(len(mjd_list))"
737 | ]
738 | },
739 | {
740 | "cell_type": "markdown",
741 | "metadata": {},
742 | "source": [
743 | "Load throughputs for the LSST bandpasses and site atmosphere at different airmasses"
744 | ]
745 | },
746 | {
747 | "cell_type": "code",
748 | "execution_count": 49,
749 | "metadata": {},
750 | "outputs": [
751 | {
752 | "name": "stdout",
753 | "output_type": "stream",
754 | "text": [
755 | "that took 33.23592495918274\n"
756 | ]
757 | }
758 | ],
759 | "source": [
760 | "import time\n",
761 | "t_start = time.time()\n",
762 | "from lsst.utils import getPackageDir\n",
763 | "from lsst.sims.photUtils import BandpassDict\n",
764 | "atmos_dir = os.path.join(getPackageDir('throughputs'), 'atmos')\n",
765 | "bp_dict = {}\n",
766 | "hw_dict = {}\n",
767 | "for airmass in np.arange(1.0, 2.6, 0.1):\n",
768 | " am_int = int(10*airmass)\n",
769 | " atmos_name = os.path.join(atmos_dir, 'atmos_%d.dat' % am_int)\n",
770 | " bp, hw = \\\n",
771 | " BandpassDict.loadBandpassesFromFiles(atmoTransmission=atmos_name)\n",
772 | " bp_dict[am_int] = bp\n",
773 | " hw_dict[am_int] = hw\n",
774 | "print(\"that took \",time.time()-t_start)"
775 | ]
776 | },
777 | {
778 | "cell_type": "markdown",
779 | "metadata": {},
780 | "source": [
781 | "Generate random seeing values."
782 | ]
783 | },
784 | {
785 | "cell_type": "code",
786 | "execution_count": 50,
787 | "metadata": {},
788 | "outputs": [],
789 | "source": [
790 | "fwhm_eff_list = rng.random_sample(len(mjd_list))*0.1+0.65"
791 | ]
792 | },
793 | {
794 | "cell_type": "markdown",
795 | "metadata": {},
796 | "source": [
797 | "Use the `sims_skybrightness` package to calculate the sky brightness and 5-sigma limiting magnitude at these MJDs."
798 | ]
799 | },
800 | {
801 | "cell_type": "code",
802 | "execution_count": 51,
803 | "metadata": {},
804 | "outputs": [
805 | {
806 | "name": "stderr",
807 | "output_type": "stream",
808 | "text": [
809 | "/Users/danielsf/physics/lsst_171025/stack/miniconda3-4.3.21-10a4fa6/DarwinX86/sims_skybrightness/2.7.0.sims/python/lsst/sims/skybrightness/skyModel.py:209: UserWarning: Adding component multiple times to the final output spectra.\n",
810 | " warnings.warn(\"Adding component multiple times to the final output spectra.\")\n",
811 | "/Users/danielsf/physics/lsst_171025/python/miniconda3-4.3.21/lib/python3.6/site-packages/ipykernel_launcher.py:14: DeprecationWarning: This function is deprecated. Please call randint(0, 2 + 1) instead\n",
812 | " \n"
813 | ]
814 | },
815 | {
816 | "name": "stdout",
817 | "output_type": "stream",
818 | "text": [
819 | "that took 8.505199909210205\n"
820 | ]
821 | }
822 | ],
823 | "source": [
824 | "t_start = time.time()\n",
825 | "\n",
826 | "import warnings\n",
827 | "warnings.filterwarnings('ignore', message='.*UT1-UTC.*')\n",
828 | "\n",
829 | "from lsst.sims.skybrightness import SkyModel\n",
830 | "model = SkyModel(lowerAtm=True, upperAtm=True, scatteredStar=True)\n",
831 | "\n",
832 | "from lsst.sims.utils import altAzPaFromRaDec\n",
833 | "from lsst.sims.photUtils import Sed, calcM5, PhotometricParameters\n",
834 | "\n",
835 | "bp_name_options = np.array(['g', 'r', 'i'])\n",
836 | "bp_name_list = bp_name_options[rng.random_integers(0,2,\n",
837 | " len(mjd_list))]\n",
838 | "photParams = PhotometricParameters()\n",
839 | "\n",
840 | "m5_list = []\n",
841 | "sky_brightness_list = []\n",
842 | "\n",
843 | "for mjd, airmass, fwhm, bp_name in \\\n",
844 | " zip(mjd_list, airmass_list,\n",
845 | " fwhm_eff_list, bp_name_list):\n",
846 | " \n",
847 | " model.setRaDecMjd(field_ra, field_dec, mjd, degrees=True)\n",
848 | " wv, fl = model.returnWaveSpec()\n",
849 | " ss = Sed(wavelen=wv, flambda=fl[0])\n",
850 | " am_int = int(10*airmass)\n",
851 | " m5 = calcM5(ss,\n",
852 | " bp_dict[am_int][bp_name],\n",
853 | " hw_dict[am_int][bp_name],\n",
854 | " photParams, FWHMeff=fwhm)\n",
855 | " m5_list.append(m5)\n",
856 | " sky_brightness_list.append(ss.calcMag(bp_dict[am_int][bp_name]))\n",
857 | "\n",
858 | "print(\"that took \",time.time()-t_start)"
859 | ]
860 | },
861 | {
862 | "cell_type": "code",
863 | "execution_count": 52,
864 | "metadata": {},
865 | "outputs": [
866 | {
867 | "name": "stdout",
868 | "output_type": "stream",
869 | "text": [
870 | "17.9100034492\n",
871 | "22.1698155192\n",
872 | "112.0\n",
873 | "-31.0\n"
874 | ]
875 | }
876 | ],
877 | "source": [
878 | "m5_list = np.array(m5_list)\n",
879 | "sky_brightness_list = np.array(sky_brightness_list)\n",
880 | "print(sky_brightness_list.min())\n",
881 | "print(sky_brightness_list.max())\n",
882 | "print(field_ra)\n",
883 | "print(field_dec)"
884 | ]
885 | },
886 | {
887 | "cell_type": "markdown",
888 | "metadata": {},
889 | "source": [
890 | "Create a sqlite database using the mock observation data we have just created."
891 | ]
892 | },
893 | {
894 | "cell_type": "code",
895 | "execution_count": 53,
896 | "metadata": {},
897 | "outputs": [],
898 | "source": [
899 | "import sqlite3\n",
900 | "db_file_name = \"example_test_cadence.db\"\n",
901 | "if os.path.exists(db_file_name):\n",
902 | " os.unlink(db_file_name)\n",
903 | "conn = sqlite3.connect(db_file_name)\n",
904 | "cc = conn.cursor()\n",
905 | "cc.execute('''CREATE TABLE Summary\n",
906 | " (fieldRA float, fieldDec float, expMJD float,\n",
907 | " filter text, FWHMeff float, fiveSigmaDepth float,\n",
908 | " filtSkyBrightness float)''')\n",
909 | "\n",
910 | "for mjd, bp, fwhm, m5, bright in \\\n",
911 | "zip(mjd_list, bp_name_list, fwhm_eff_list, m5_list, sky_brightness_list):\n",
912 | " cmd = '''INSERT INTO Summary VALUES(%.5f, %.5f, %.3f, '%s', %.5f, %.5f, %.5f)''' \\\n",
913 | " % (np.radians(field_ra), np.radians(field_dec), mjd, bp, fwhm, m5,\n",
914 | " bright)\n",
915 | " cc.execute(cmd)\n",
916 | "conn.commit()\n",
917 | "conn.close()"
918 | ]
919 | },
920 | {
921 | "cell_type": "markdown",
922 | "metadata": {},
923 | "source": [
924 | "Generate light curves using our mock pointing database"
925 | ]
926 | },
927 | {
928 | "cell_type": "code",
929 | "execution_count": 54,
930 | "metadata": {
931 | "collapsed": true
932 | },
933 | "outputs": [],
934 | "source": [
935 | "gen = StellarLightCurveGenerator(stardb, db_file_name)"
936 | ]
937 | },
938 | {
939 | "cell_type": "code",
940 | "execution_count": 55,
941 | "metadata": {},
942 | "outputs": [
943 | {
944 | "name": "stdout",
945 | "output_type": "stream",
946 | "text": [
947 | "parameters (111.0, 113.0) (-32.0, -30.0) ('g', 'r', 'i') None\n"
948 | ]
949 | }
950 | ],
951 | "source": [
952 | "pointings = gen.get_pointings((field_ra-1.0, field_ra+1.0),\n",
953 | " (field_dec-1.0, field_dec+1.0),\n",
954 | " bandpass=('g','r','i'))"
955 | ]
956 | },
957 | {
958 | "cell_type": "code",
959 | "execution_count": 56,
960 | "metadata": {},
961 | "outputs": [
962 | {
963 | "data": {
964 | "text/plain": [
965 | "1"
966 | ]
967 | },
968 | "execution_count": 56,
969 | "metadata": {},
970 | "output_type": "execute_result"
971 | }
972 | ],
973 | "source": [
974 | "len(pointings)"
975 | ]
976 | },
977 | {
978 | "cell_type": "code",
979 | "execution_count": 57,
980 | "metadata": {},
981 | "outputs": [
982 | {
983 | "data": {
984 | "text/plain": [
985 | "112"
986 | ]
987 | },
988 | "execution_count": 57,
989 | "metadata": {},
990 | "output_type": "execute_result"
991 | }
992 | ],
993 | "source": [
994 | "len(pointings[0])"
995 | ]
996 | },
997 | {
998 | "cell_type": "code",
999 | "execution_count": 58,
1000 | "metadata": {},
1001 | "outputs": [
1002 | {
1003 | "name": "stdout",
1004 | "output_type": "stream",
1005 | "text": [
1006 | "starting query\n",
1007 | "query took 0.26886725425720215\n",
1008 | "light curves took 7.821352e+00 seconds to generate\n"
1009 | ]
1010 | }
1011 | ],
1012 | "source": [
1013 | "lc, truth = gen.light_curves_from_pointings(pointings)"
1014 | ]
1015 | },
1016 | {
1017 | "cell_type": "code",
1018 | "execution_count": 59,
1019 | "metadata": {},
1020 | "outputs": [
1021 | {
1022 | "data": {
1023 | "text/plain": [
1024 | "82"
1025 | ]
1026 | },
1027 | "execution_count": 59,
1028 | "metadata": {},
1029 | "output_type": "execute_result"
1030 | }
1031 | ],
1032 | "source": [
1033 | "len(lc)"
1034 | ]
1035 | },
1036 | {
1037 | "cell_type": "code",
1038 | "execution_count": null,
1039 | "metadata": {
1040 | "collapsed": true
1041 | },
1042 | "outputs": [],
1043 | "source": []
1044 | }
1045 | ],
1046 | "metadata": {
1047 | "kernelspec": {
1048 | "display_name": "Python 3",
1049 | "language": "python",
1050 | "name": "python3"
1051 | },
1052 | "language_info": {
1053 | "codemirror_mode": {
1054 | "name": "ipython",
1055 | "version": 3
1056 | },
1057 | "file_extension": ".py",
1058 | "mimetype": "text/x-python",
1059 | "name": "python",
1060 | "nbconvert_exporter": "python",
1061 | "pygments_lexer": "ipython3",
1062 | "version": "3.6.2"
1063 | }
1064 | },
1065 | "nbformat": 4,
1066 | "nbformat_minor": 1
1067 | }
1068 |
--------------------------------------------------------------------------------
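For quick reference: once a mock pointing database like `example_test_cadence.db` has been written, it is worth reading the `Summary` table back to confirm its contents before handing it to a light curve generator. The snippet below is a minimal sketch that assumes the database created in the notebook above sits in the working directory; it uses only `sqlite3` and `numpy`.

    import sqlite3
    import numpy as np

    # read the mock Summary table back; assumes example_test_cadence.db
    # (written in the notebook above) is in the working directory
    conn = sqlite3.connect('example_test_cadence.db')
    cc = conn.cursor()
    rows = cc.execute('SELECT expMJD, "filter", fiveSigmaDepth '
                      'FROM Summary ORDER BY expMJD').fetchall()
    conn.close()

    mjd = np.array([r[0] for r in rows])
    print('number of mock visits:', len(rows))
    print('MJD range:', mjd.min(), '-', mjd.max())
    print('filters present:', sorted(set(r[1] for r in rows)))
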
/CatSim/SignalToNoise_151207.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "This notebook demonstrates how to use the photometric signal-to-noise routines defined in `sims_photUtils/../photUtils/SignalToNoise.py`."
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "First, let's just do some setup. We will create a list of 20 random spectra drawn from the library of Kurucz spectra stored in `sims_sed_library`. We will normalize them randomly in the u band."
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": null,
20 | "metadata": {
21 | "collapsed": false
22 | },
23 | "outputs": [],
24 | "source": [
25 | "import os\n",
26 | "import numpy as np\n",
27 | "from lsst.utils import getPackageDir\n",
28 | "from lsst.sims.photUtils import Bandpass, Sed\n",
29 | "\n",
30 | "nSpectra = 20\n",
31 | "\n",
32 | "sedDir = os.path.join(getPackageDir('sims_sed_library'), 'starSED', 'kurucz')\n",
33 | "bandpassDir = os.path.join(getPackageDir('throughputs'), 'baseline')\n",
34 | "\n",
35 | "np.random.seed(42)\n",
36 | "\n",
37 | "# create a random list of SED file names drawn from sedDir\n",
38 | "sedFullNameList = np.array(os.listdir(sedDir))\n",
39 | "sedDexes = np.unique(np.random.random_integers(0, len(sedFullNameList), nSpectra))\n",
40 | "sedNameList = sedFullNameList[sedDexes]\n",
41 | "\n",
42 | "# create a random list of normalizing magnitudes\n",
43 | "magNormList = np.random.random_sample(nSpectra)*10.0 + 18.0\n",
44 | "\n",
45 | "# read in the r bandpass\n",
46 | "rBandpass = Bandpass()\n",
47 | "rBandpass.readThroughput(os.path.join(bandpassDir, 'total_r.dat'))\n",
48 | "\n",
49 | "sedList = []\n",
50 | "for name, norm in zip(sedNameList, magNormList):\n",
51 | "\n",
52 | " fullName = os.path.join(sedDir, name)\n",
53 | "\n",
54 | " # instantiate and read in the SED\n",
55 | " spectrum = Sed()\n",
56 | " spectrum.readSED_flambda(fullName)\n",
57 | "\n",
58 | " # normalize the SED to have the specified magnitude in the r band\n",
59 | " ff = spectrum.calcFluxNorm(norm, rBandpass)\n",
60 | " spectrum.multiplyFluxNorm(ff)\n",
61 | " sedList.append(spectrum)"
62 | ]
63 | },
64 | {
65 | "cell_type": "markdown",
66 | "metadata": {},
67 | "source": [
68 | "We normalized our example spectra in the u band. Let's say we want to calculate signal to noise quantities in the i band. We can do that, by loading the file `total_i.dat` from the `baseline/` sub-directory of the `throughputs` eups pacakge."
69 | ]
70 | },
71 | {
72 | "cell_type": "code",
73 | "execution_count": null,
74 | "metadata": {
75 | "collapsed": false
76 | },
77 | "outputs": [],
78 | "source": [
79 | "bandpassDir = os.path.join(getPackageDir('throughputs'), 'baseline')\n",
80 | "\n",
81 | "total_i_bandpass = Bandpass()\n",
82 | "total_i_bandpass.readThroughput(os.path.join(bandpassDir, 'total_i.dat'))"
83 | ]
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "metadata": {},
88 | "source": [
89 | "`total_i.dat` contains the combined throughputof the lenses, the mirrors, the i filter, and the standard LSST atmosphere (airmass of 1.2). We could have constructed that throughput by hand by loading the throughputs for each of these components separately and then combining them with teh `readThroughputList` method. This is useful if we want to consider throughputs a different airmasses (the `throughputs` package contains a few atmospheric throughput files computed by MODTRAN at different airmasses).\n",
90 | "\n",
91 | "The cell below construct `total_i_bandpass` 'by hand' and verifies that it is the same (to within `1.0e-10`) as `total_i.dat`."
92 | ]
93 | },
94 | {
95 | "cell_type": "code",
96 | "execution_count": null,
97 | "metadata": {
98 | "collapsed": false
99 | },
100 | "outputs": [],
101 | "source": [
102 | "test_i_bandpass = Bandpass()\n",
103 | "componentList = ['detector.dat', 'm1.dat', 'm2.dat', 'm3.dat', \n",
104 | " 'lens1.dat', 'lens2.dat', 'lens3.dat', 'filter_i.dat', 'atmos_std.dat']\n",
105 | "test_i_bandpass.readThroughputList(componentList, rootDir=bandpassDir)\n",
106 | "\n",
107 | "np.testing.assert_array_equal(test_i_bandpass.wavelen, total_i_bandpass.wavelen)\n",
108 | "np.testing.assert_array_almost_equal(test_i_bandpass.sb,\n",
109 | " total_i_bandpass.sb,\n",
110 | " 10)"
111 | ]
112 | },
113 | {
114 | "cell_type": "markdown",
115 | "metadata": {},
116 | "source": [
117 | "For some of the calculations below, we will require the throughput of just the hardware (so: everything except the atmosphere). This is another use for the `readThroughputList` method."
118 | ]
119 | },
120 | {
121 | "cell_type": "code",
122 | "execution_count": null,
123 | "metadata": {
124 | "collapsed": true
125 | },
126 | "outputs": [],
127 | "source": [
128 | "hardware_i_bandpass = Bandpass()\n",
129 | "componentList = ['detector.dat', 'm1.dat', 'm2.dat', 'm3.dat', \n",
130 | " 'lens1.dat', 'lens2.dat', 'lens3.dat', 'filter_i.dat']\n",
131 | "hardware_i_bandpass.readThroughputList(componentList, rootDir=bandpassDir)"
132 | ]
133 | },
134 | {
135 | "cell_type": "markdown",
136 | "metadata": {},
137 | "source": [
138 | "We are also going to require a simulated sky emissivity spectrum. The `baseline/` sub-directory of `throughputs` also contains the file `darksky.dat`, which is our fiducial sky emissivity spectrum."
139 | ]
140 | },
141 | {
142 | "cell_type": "code",
143 | "execution_count": null,
144 | "metadata": {
145 | "collapsed": false
146 | },
147 | "outputs": [],
148 | "source": [
149 | "skySed = Sed()\n",
150 | "skySed.readSED_flambda(os.path.join(bandpassDir, 'darksky.dat'))"
151 | ]
152 | },
153 | {
154 | "cell_type": "markdown",
155 | "metadata": {},
156 | "source": [
157 | "Finally, there are a bunch of photometric parametrs (readnoise, gain, etc.) that will be required for signal-to-noise calculations. Those are contained (and passed around) in the class `PhotometricParameters`."
158 | ]
159 | },
160 | {
161 | "cell_type": "code",
162 | "execution_count": null,
163 | "metadata": {
164 | "collapsed": true
165 | },
166 | "outputs": [],
167 | "source": [
168 | "from lsst.sims.photUtils import PhotometricParameters\n",
169 | "\n",
170 | "photParams = PhotometricParameters()"
171 | ]
172 | },
173 | {
174 | "cell_type": "markdown",
175 | "metadata": {},
176 | "source": [
177 | "You can specify parameter values by hand when you call the `PhotometricParameters` constructor. If you do nothing, you will just get LSST default values."
178 | ]
179 | },
180 | {
181 | "cell_type": "code",
182 | "execution_count": null,
183 | "metadata": {
184 | "collapsed": false
185 | },
186 | "outputs": [],
187 | "source": [
188 | "help(PhotometricParameters.__init__)"
189 | ]
190 | },
191 | {
192 | "cell_type": "markdown",
193 | "metadata": {},
194 | "source": [
195 | "We are also going to be required to specify the `FWHMeff` (the full-width at half maximum in arcseconds of an idealized Gaussian PSF; this characterizes the seeing of an observation). If you don't want to figure out that value on your own, the class `LSSTdefaults` contains assumed LSST default values (as well as values for other observing parameters like m5)."
196 | ]
197 | },
198 | {
199 | "cell_type": "code",
200 | "execution_count": null,
201 | "metadata": {
202 | "collapsed": false
203 | },
204 | "outputs": [],
205 | "source": [
206 | "from lsst.sims.photUtils import LSSTdefaults\n",
207 | "\n",
208 | "defaults = LSSTdefaults()\n",
209 | "\n",
210 | "print help(defaults.FWHMeff)\n",
211 | "print '\\n'\n",
212 | "print help(defaults.effwavelen)\n",
213 | "print '\\n'\n",
214 | "print help(defaults.m5)\n",
215 | "print '\\n'\n",
216 | "print help(defaults.gamma)"
217 | ]
218 | },
219 | {
220 | "cell_type": "markdown",
221 | "metadata": {},
222 | "source": [
223 | "There are two ways to calculate the signal to noise of a source.\n",
224 | "\n",
225 | "The methods `calcSNR_sed` and `calcMagError_sed` assume that you have an SED for your source, and SED for sky emissivity, the throughput for the bandpass plus atmosphere, the throughput for the hardware only, as well as teh photometric parameters described above. These methods then calculate counts expected form the source and compare them with counts expected from the sky and other hardware-related noise contributors and give you a signal to noise ratio.\n",
226 | "\n",
227 | "The methods `calcSNR_m5` and `calcMagErro_m5` assume you do not have a full sky emissivity SED, but you do have an m5 value for the observation as well as magnitudes for your source, a total hardware plus atmosphere throughput curve, and the photometric parameters described above.\n",
228 | "\n",
229 | "Below, we demonstrate that these two methods give comparabel results."
230 | ]
231 | },
232 | {
233 | "cell_type": "markdown",
234 | "metadata": {},
235 | "source": [
236 | "First we must calculate the m5 value in the i bandpass for our observation. We can use the method `calcM5`."
237 | ]
238 | },
239 | {
240 | "cell_type": "code",
241 | "execution_count": null,
242 | "metadata": {
243 | "collapsed": false
244 | },
245 | "outputs": [],
246 | "source": [
247 | "import lsst.sims.photUtils.SignalToNoise as snr\n",
248 | "m5 = snr.calcM5(skySed, total_i_bandpass, hardware_i_bandpass, photParams, FWHMeff=0.8)\n",
249 | "print m5"
250 | ]
251 | },
252 | {
253 | "cell_type": "markdown",
254 | "metadata": {},
255 | "source": [
256 | "Calculate the signal-to-noise ration using the SED."
257 | ]
258 | },
259 | {
260 | "cell_type": "code",
261 | "execution_count": null,
262 | "metadata": {
263 | "collapsed": false
264 | },
265 | "outputs": [],
266 | "source": [
267 | "snr_sed_list = []\n",
268 | "for spectrum in sedList:\n",
269 | " val = snr.calcSNR_sed(spectrum, total_i_bandpass, skySed, hardware_i_bandpass,\n",
270 | " photParams, FWHMeff=0.8)\n",
271 | " snr_sed_list.append(val)\n",
272 | "\n",
273 | "snr_sed_list = np.array(snr_sed_list)"
274 | ]
275 | },
276 | {
277 | "cell_type": "markdown",
278 | "metadata": {},
279 | "source": [
280 | "Calculate the signal-to-noise ratio using only magnitudes. Note that, in addition to the signal-to-noise, `calcSNR_m5` also returns the `gamma` parameter from arXiv:0805.233 equation (5). This can be passed back into `calcSNR_m5` to avoid re-calculation if necessary."
281 | ]
282 | },
283 | {
284 | "cell_type": "code",
285 | "execution_count": null,
286 | "metadata": {
287 | "collapsed": false
288 | },
289 | "outputs": [],
290 | "source": [
291 | "magList = []\n",
292 | "for spectrum in sedList:\n",
293 | " mm = spectrum.calcMag(total_i_bandpass)\n",
294 | " magList.append(mm)\n",
295 | "magList = np.array(magList)\n",
296 | "\n",
297 | "snr_m5_list, gamma = snr.calcSNR_m5(magList, total_i_bandpass, m5, photParams)"
298 | ]
299 | },
300 | {
301 | "cell_type": "markdown",
302 | "metadata": {},
303 | "source": [
304 | "Compare."
305 | ]
306 | },
307 | {
308 | "cell_type": "code",
309 | "execution_count": null,
310 | "metadata": {
311 | "collapsed": false,
312 | "scrolled": true
313 | },
314 | "outputs": [],
315 | "source": [
316 | "print 'maximum deviation in snr: ',np.abs(snr_sed_list - snr_m5_list).max()\n",
317 | "\n",
318 | "print 'value of snr at maximum deviation: ', \\\n",
319 | " snr_sed_list[np.argmax(np.abs(snr_sed_list - snr_m5_list))]"
320 | ]
321 | },
322 | {
323 | "cell_type": "markdown",
324 | "metadata": {},
325 | "source": [
326 | "Calculate the magnitude error using the SED."
327 | ]
328 | },
329 | {
330 | "cell_type": "code",
331 | "execution_count": null,
332 | "metadata": {
333 | "collapsed": false
334 | },
335 | "outputs": [],
336 | "source": [
337 | "mag_error_sed_list = []\n",
338 | "for spectrum in sedList:\n",
339 | " sigma = snr.calcMagError_sed(spectrum, total_i_bandpass, skySed,\n",
340 | " hardware_i_bandpass, photParams,\n",
341 | " FWHMeff=0.8)\n",
342 | " mag_error_sed_list.append(sigma)\n",
343 | "\n",
344 | "mag_error_sed_list = np.array(mag_error_sed_list)"
345 | ]
346 | },
347 | {
348 | "cell_type": "markdown",
349 | "metadata": {},
350 | "source": [
351 | "Calculate the magnitude error using magnitudes. Note that, in addition to the magnitude error, `calcMagError_m5` also returns the `gamma` parameter from arXiv:0805.233 equation (5). This can be passed back into `calcSNR_m5` to avoid re-calculation if necessary."
352 | ]
353 | },
354 | {
355 | "cell_type": "code",
356 | "execution_count": null,
357 | "metadata": {
358 | "collapsed": false
359 | },
360 | "outputs": [],
361 | "source": [
362 | "mag_error_m5_list, gamma = snr.calcMagError_m5(magList, total_i_bandpass, m5, photParams)"
363 | ]
364 | },
365 | {
366 | "cell_type": "markdown",
367 | "metadata": {},
368 | "source": [
369 | "Compare."
370 | ]
371 | },
372 | {
373 | "cell_type": "code",
374 | "execution_count": null,
375 | "metadata": {
376 | "collapsed": false
377 | },
378 | "outputs": [],
379 | "source": [
380 | "print 'maximum deviation in magnitude error: ',np.abs(mag_error_sed_list - mag_error_m5_list).max()\n",
381 | "\n",
382 | "print 'value of magnitude at maximum deviation: ', \\\n",
383 | " magList[np.argmax(np.abs(mag_error_sed_list - mag_error_m5_list))]"
384 | ]
385 | },
386 | {
387 | "cell_type": "code",
388 | "execution_count": null,
389 | "metadata": {
390 | "collapsed": true
391 | },
392 | "outputs": [],
393 | "source": []
394 | }
395 | ],
396 | "metadata": {
397 | "kernelspec": {
398 | "display_name": "Python 2",
399 | "language": "python",
400 | "name": "python2"
401 | },
402 | "language_info": {
403 | "codemirror_mode": {
404 | "name": "ipython",
405 | "version": 2
406 | },
407 | "file_extension": ".py",
408 | "mimetype": "text/x-python",
409 | "name": "python",
410 | "nbconvert_exporter": "python",
411 | "pygments_lexer": "ipython2",
412 | "version": "2.7.10"
413 | }
414 | },
415 | "nbformat": 4,
416 | "nbformat_minor": 0
417 | }
418 |
--------------------------------------------------------------------------------
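As a point of reference for the m5-based routines demonstrated above, the cited equation (5) is commonly written as sigma_rand^2 = (0.04 - gamma) * x + gamma * x^2 with x = 10**(0.4*(m - m5)), and the corresponding SNR is roughly 2.5 / (ln(10) * sigma). The sketch below evaluates that formula directly with an assumed nominal gamma; it illustrates the scaling only and is not the `sims_photUtils` implementation itself.

    import numpy as np

    # evaluate the cited equation (5) for the random photometric error;
    # gamma = 0.039 is an assumed nominal value, not taken from the stack
    def sigma_rand(mag, m5, gamma=0.039):
        x = 10.0 ** (0.4 * (mag - m5))
        return np.sqrt((0.04 - gamma) * x + gamma * x ** 2)

    mags = np.array([22.0, 23.0, 24.0])
    sig = sigma_rand(mags, m5=24.0)
    print(sig)                       # approximate magnitude errors
    print(2.5 / (np.log(10) * sig))  # corresponding approximate SNR
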
/CatSim/coordinateTransformations_150427.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "This notebook demonstrates some the coordinate manipulation methods available through the Simulations stack. It will not show all of the available routines. To investigate for yourself the methods that are available, see\n",
8 | " \n",
9 | " $SIMS_UTILS_DIR/python/lsst/sims/utils/AstrometryUtils.py\n",
10 | " \n",
11 | " $SIMS_COORDUTILS_DIR/python/lsst/sims/coordUtils/CameraUtils.py\n",
12 | "\n",
13 | " $SIMS_UTILS_DIR/python/lsst/sims/utils/coordinateTransformations.py\n",
14 | "\n",
15 | "There is, unfortunately, still a little bit of redundancy between the methods offered in these two files. Hopefully, that will get cleaned up in the near future. All of the code uses PALPY as its backend, so it should not matter which methods you use to do a given calculation. PALPY source code is available at\n",
16 | "\n",
17 | "https://github.com/Starlink/palpy\n",
18 | "\n",
19 | "You should begin by loading the stack and setting up `sims_coordUtils` using\n",
20 | "\n",
21 | " setup sims_coordUtils -t sims"
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "metadata": {},
27 | "source": [
28 | "We will start with the highest level routines: taking (RA, Dec) pairs and determining which chip they land on. To do this, we must load a simulated camera. The code below will load the current map of the LSST camera."
29 | ]
30 | },
31 | {
32 | "cell_type": "code",
33 | "execution_count": null,
34 | "metadata": {
35 | "collapsed": false
36 | },
37 | "outputs": [],
38 | "source": [
39 | "from lsst.obs.lsstSim import LsstSimMapper\n",
40 | "camera = LsstSimMapper().camera"
41 | ]
42 | },
43 | {
44 | "cell_type": "markdown",
45 | "metadata": {},
46 | "source": [
47 | "The routines that connect (RA, Dec) on the sky with positions on the camera require information about how the telescope is pointed. As with nearly all CatSim applications, this data is stored in the `ObservationMetaData` class. Below, we will use the `ObservationMetaDataGenerator` introduced in the notebooks `generateAgnCatalog_150409.ipynb` and `CatSimTutorial_SimulationsAHM_1503.ipynb` to create a self-consistent `ObservationMetaData` instantation from an example OpSim run."
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "execution_count": null,
53 | "metadata": {
54 | "collapsed": false
55 | },
56 | "outputs": [],
57 | "source": [
58 | "import os\n",
59 | "import eups\n",
60 | "from lsst.sims.catUtils.utils import ObservationMetaDataGenerator\n",
61 | "\n",
62 | "#the code below just points to an OpSim output database that\n",
63 | "#is carried around with the Simulations stack for testing purposes\n",
64 | "opSimDbName = 'opsimblitz1_1133_sqlite.db'\n",
65 | "fullName = os.path.join(eups.productDir('sims_data'),'OpSimData',opSimDbName)\n",
66 | "\n",
67 | "obsMD_generator = ObservationMetaDataGenerator(database=fullName, driver='sqlite')"
68 | ]
69 | },
70 | {
71 | "cell_type": "markdown",
72 | "metadata": {},
73 | "source": [
74 | "Create an `ObservationMetaData` instantiation based on a pointing with 24 < RA < 100 (in degrees)"
75 | ]
76 | },
77 | {
78 | "cell_type": "code",
79 | "execution_count": null,
80 | "metadata": {
81 | "collapsed": false
82 | },
83 | "outputs": [],
84 | "source": [
85 | "boundLength=3.0 #the radius of our field of view in degrees\n",
86 | "obs_metadata = obsMD_generator.getObservationMetaData(fieldRA=(24.0,100.0),\n",
87 | " limit=1, boundLength=boundLength)\n",
88 | "print obs_metadata[0].pointingRA, obs_metadata[0].rotSkyPos"
89 | ]
90 | },
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {},
94 | "source": [
95 | "Now we will generate a sample of 10 (RA, Dec) pairs that are within our field of view."
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": null,
101 | "metadata": {
102 | "collapsed": false
103 | },
104 | "outputs": [],
105 | "source": [
106 | "import numpy\n",
107 | "epoch = 2000.0\n",
108 | "nsamples = 10\n",
109 | "numpy.random.seed(42)\n",
110 | "radius = boundLength*numpy.random.sample(nsamples)\n",
111 | "theta = 2.0*numpy.pi*numpy.random.sample(nsamples)\n",
112 | "\n",
113 | "raRaw = obs_metadata[0].pointingRA + radius*numpy.cos(theta)\n",
114 | "decRaw = obs_metadata[0].pointingDec + radius*numpy.sin(theta)"
115 | ]
116 | },
117 | {
118 | "cell_type": "markdown",
119 | "metadata": {},
120 | "source": [
121 | "To find what chip they fall on (given an `ObservationMetaData`), we the method `chipNameFromRaDec`."
122 | ]
123 | },
124 | {
125 | "cell_type": "code",
126 | "execution_count": null,
127 | "metadata": {
128 | "collapsed": false
129 | },
130 | "outputs": [],
131 | "source": [
132 | "from lsst.sims.coordUtils import chipNameFromRaDec\n",
133 | "\n",
134 | "chipNames = chipNameFromRaDec(ra=raRaw, dec=decRaw,\n",
135 | " camera=camera, epoch=epoch,\n",
136 | " obs_metadata=obs_metadata[0])\n",
137 | "\n",
138 | "print chipNames\n"
139 | ]
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 | " Note: currently, chipNameFromRaDec only returns the names of science chips (as opposed to wavefront sensors or guide chips) on which the object falls. It also does not know how to handle objects that fall on two chips at once."
146 | ]
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {},
151 | "source": [
152 | "There is also a method to find the pixel coordinates on the chip of each object."
153 | ]
154 | },
155 | {
156 | "cell_type": "code",
157 | "execution_count": null,
158 | "metadata": {
159 | "collapsed": false
160 | },
161 | "outputs": [],
162 | "source": [
163 | "from lsst.sims.coordUtils import pixelCoordsFromRaDec\n",
164 | "\n",
165 | "pixelCoords = pixelCoordsFromRaDec(ra=raRaw, dec=decRaw,\n",
166 | " camera=camera, epoch=epoch,\n",
167 | " obs_metadata=obs_metadata[0])\n",
168 | "\n",
169 | "for name, x, y in zip(chipNames, pixelCoords[0], pixelCoords[1]):\n",
170 | " print name, x, y"
171 | ]
172 | },
173 | {
174 | "cell_type": "markdown",
175 | "metadata": {},
176 | "source": [
177 | "And methods to calculate the pupil coordinates of an object in radians."
178 | ]
179 | },
180 | {
181 | "cell_type": "code",
182 | "execution_count": null,
183 | "metadata": {
184 | "collapsed": false,
185 | "scrolled": true
186 | },
187 | "outputs": [],
188 | "source": [
189 | "from lsst.sims.utils import pupilCoordsFromRaDec\n",
190 | "\n",
191 | "help(pupilCoordsFromRaDec)"
192 | ]
193 | },
194 | {
195 | "cell_type": "code",
196 | "execution_count": null,
197 | "metadata": {
198 | "collapsed": false
199 | },
200 | "outputs": [],
201 | "source": [
202 | "xPup, yPup = pupilCoordsFromRaDec(raRaw, decRaw,\n",
203 | " obs_metadata=obs_metadata[0], epoch=epoch)\n",
204 | "\n",
205 | "for x,y in zip(xPup, yPup):\n",
206 | " print x, y"
207 | ]
208 | },
209 | {
210 | "cell_type": "markdown",
211 | "metadata": {},
212 | "source": [
213 | "There are also methods to transform from the International Celestial Reference System to 'observed' RA/Dec\n",
214 | "\n",
215 | "* `observedFromICRS` applied precession, nutation, proper motion, parallax, radial velocity, annual aberration, diurnal aberration, and refraction. It relies on the methods below.\n",
216 | "\n",
217 | "* `appGeoFromICRS` calculates the apparent geocentric position of the object. It applies precession, nutation, proper motion, parallax, radial velocity, and annual aberration\n",
218 | "\n",
219 | "\n",
220 | "* `observedFromAppGeo` converts the apparent geocentric position to the observed position, adding diurnal aberration and refraction to the list of applied effects. You will generally only want to call `observedFromAppGeo` on coordinates that have already been corrected with `appGeoFromICRS`. This is what `observedFromICRS` does."
221 | ]
222 | },
223 | {
224 | "cell_type": "code",
225 | "execution_count": null,
226 | "metadata": {
227 | "collapsed": false
228 | },
229 | "outputs": [],
230 | "source": [
231 | "from lsst.sims.utils import observedFromICRS\n",
232 | "\n",
233 | "help(observedFromICRS)"
234 | ]
235 | },
236 | {
237 | "cell_type": "code",
238 | "execution_count": null,
239 | "metadata": {
240 | "collapsed": false
241 | },
242 | "outputs": [],
243 | "source": [
244 | "from lsst.sims.utils import appGeoFromICRS\n",
245 | "\n",
246 | "help(appGeoFromICRS)"
247 | ]
248 | },
249 | {
250 | "cell_type": "code",
251 | "execution_count": null,
252 | "metadata": {
253 | "collapsed": false
254 | },
255 | "outputs": [],
256 | "source": [
257 | "from lsst.sims.utils import observedFromAppGeo\n",
258 | "\n",
259 | "help(observedFromAppGeo)"
260 | ]
261 | },
262 | {
263 | "cell_type": "markdown",
264 | "metadata": {},
265 | "source": [
266 | "In `generateAgnCatalog_150409.ipynb` and `CatSimTutorial_SimulationsAHM_1503.ipynb` we introduced the idea of mixins and getter methods that allow you to seamlessly incorporate calculated quantities into simulated catalogs. `Astrometry.py` defines getter methods that allow you to incorporate the above calculations into catalogs.\n",
267 | "\n",
268 | "* Getters to correct the (RA, Dec) coordinates of stars are provided by the mixin `AstrometryStars`.\n",
269 | "\n",
270 | "\n",
271 | "* Getters to correct the (Ra, Dec) coordinates of galaxies are provided by the mixin `AstrometryGalaxies` (this is different from `AstrometryStars` in that `AstrometryGalaxies` knows not to bother looking for proper motion, parallax, or radial velocity)\n",
272 | "\n",
273 | "\n",
274 | "* Getters associated with camera-based quantities are provided by the mixin `CameraCoords`.\n",
275 | "\n",
276 | "Note: `AstrometryStars`, `AstrometryGalaxies`, and `CameraCoords` all inherit from the mixin `AstrometryBase` which provides getters for quantities and methods that are agnostic to the star/galaxy distinction"
277 | ]
278 | },
279 | {
280 | "cell_type": "code",
281 | "execution_count": null,
282 | "metadata": {
283 | "collapsed": false
284 | },
285 | "outputs": [],
286 | "source": [
287 | "from lsst.sims.catUtils.mixins import AstrometryBase\n",
288 | "for methodName in dir(AstrometryBase):\n",
289 | " if 'get_' in methodName:\n",
290 | " print methodName"
291 | ]
292 | },
293 | {
294 | "cell_type": "code",
295 | "execution_count": null,
296 | "metadata": {
297 | "collapsed": false
298 | },
299 | "outputs": [],
300 | "source": [
301 | "from lsst.sims.catUtils.mixins import AstrometryStars\n",
302 | "for methodName in dir(AstrometryStars):\n",
303 | " if 'get_' in methodName and methodName not in dir(AstrometryBase):\n",
304 | " print methodName"
305 | ]
306 | },
307 | {
308 | "cell_type": "code",
309 | "execution_count": null,
310 | "metadata": {
311 | "collapsed": false
312 | },
313 | "outputs": [],
314 | "source": [
315 | "from lsst.sims.catUtils.mixins import AstrometryGalaxies\n",
316 | "for methodName in dir(AstrometryGalaxies):\n",
317 | " if 'get_' in methodName and methodName not in dir(AstrometryBase):\n",
318 | " print methodName"
319 | ]
320 | },
321 | {
322 | "cell_type": "code",
323 | "execution_count": null,
324 | "metadata": {
325 | "collapsed": false
326 | },
327 | "outputs": [],
328 | "source": [
329 | "from lsst.sims.catUtils.mixins import CameraCoords\n",
330 | "for methodName in dir(CameraCoords):\n",
331 | " if 'get_' in methodName and methodName not in dir(AstrometryBase):\n",
332 | " print methodName"
333 | ]
334 | },
335 | {
336 | "cell_type": "markdown",
337 | "metadata": {},
338 | "source": [
339 | "Here we illustrate how to use these mixins to include coordinate transformations into simulated catalogs."
340 | ]
341 | },
342 | {
343 | "cell_type": "code",
344 | "execution_count": null,
345 | "metadata": {
346 | "collapsed": false
347 | },
348 | "outputs": [],
349 | "source": [
350 | "#make sure everything this cell needs is imported (harmless if these were already imported above)\n",
    | "import numpy\n",
351 | "from lsst.sims.catalogs.measures.instance import InstanceCatalog\n",
    | "from lsst.sims.catUtils.mixins import AstrometryStars, CameraCoords\n",
    | "from lsst.obs.lsstSim import LsstSimMapper\n",
352 | "\n",
353 | "class chipNameCatalog(InstanceCatalog, AstrometryStars, CameraCoords):\n",
354 | " column_outputs = ['raJ2000', 'decJ2000', 'raObserved', 'decObserved', \n",
355 | " 'chipName', 'xPix', 'yPix']\n",
356 | "\n",
357 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees,\n",
358 | " 'raObserved':numpy.degrees, 'decObserved':numpy.degrees}\n",
359 | " \n",
360 | " camera = LsstSimMapper().camera\n"
361 | ]
362 | },
363 | {
364 | "cell_type": "code",
365 | "execution_count": null,
366 | "metadata": {
367 | "collapsed": false
368 | },
369 | "outputs": [],
370 | "source": [
371 | "from lsst.sims.catUtils.baseCatalogModels import WdStarObj\n",
    | "import os  #needed for os.path.exists and os.unlink below\n",
372 | "\n",
373 | "#define a smaller ObservationMetaData so that we don't create an over large catalog\n",
374 | "obs_metadata = obsMD_generator.getObservationMetaData(fieldRA=(24.0, 100.0),\n",
375 | " limit=1, boundLength=0.5)\n",
376 | "\n",
377 | "#again, use the white dwarf database table so that we don't get too many objects\n",
378 | "#in this small example\n",
379 | "starDB = WdStarObj()\n",
380 | "\n",
381 | "testCat = chipNameCatalog(starDB, obs_metadata=obs_metadata[0])\n",
382 | "\n",
383 | "catName = 'test_cat.txt'\n",
384 | "\n",
385 | "if os.path.exists(catName):\n",
386 | " os.unlink(catName)\n",
387 | " \n",
388 | "testCat.write_catalog(catName)\n",
389 | "\n",
390 | "!cat test_cat.txt"
391 | ]
392 | }
393 | ],
394 | "metadata": {
395 | "kernelspec": {
396 | "display_name": "Python 2",
397 | "language": "python",
398 | "name": "python2"
399 | },
400 | "language_info": {
401 | "codemirror_mode": {
402 | "name": "ipython",
403 | "version": 2
404 | },
405 | "file_extension": ".py",
406 | "mimetype": "text/x-python",
407 | "name": "python",
408 | "nbconvert_exporter": "python",
409 | "pygments_lexer": "ipython2",
410 | "version": "2.7.11"
411 | }
412 | },
413 | "nbformat": 4,
414 | "nbformat_minor": 0
415 | }
416 |
--------------------------------------------------------------------------------
/CatSim/custom_agn_example.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "This notebook was written in August 2016 to demonstrate how to incorporate custom variability models into the CatSim treatment of AGNs (or, really, any variable sources). It assumes version 2.3.0 of the `lsst_sims` stack."
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "First, open an ssh tunnel to the CatSim database hosted by the University of Washington. Open a terminal window and type\n",
15 | "\n",
16 | "`ssh -L 51433:fatboy.phys.washington.edu:1433 simsuser@gateway.astro.washington.edu`\n",
17 | "\n",
18 | "There is some configuration that you will have to do to make sure this works. Instructions are here:\n",
19 | "\n",
20 | "https://confluence.lsstcorp.org/display/SIM/Accessing+the+UW+CATSIM+Database\n"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "Now, we need to download a database of OpSim-simulated pointings from\n",
28 | "\n",
29 | "https://www.lsst.org/scientists/simulations/opsim/opsim-surveys-data-directory\n",
30 | "\n",
31 | "and specify its location with the `opsimdb` variable below\n"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": null,
37 | "metadata": {
38 | "collapsed": true
39 | },
40 | "outputs": [],
41 | "source": [
42 | "import numpy as np\n",
43 | "import os"
44 | ]
45 | },
46 | {
47 | "cell_type": "code",
48 | "execution_count": null,
49 | "metadata": {
50 | "collapsed": true
51 | },
52 | "outputs": [],
53 | "source": [
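   | "#NOTE: the path below is specific to the original author's machine; point it\n",
   | "#at wherever you downloaded your OpSim sqlite database.\n",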
54 | "opsimdb = os.path.join(\"/Users\",\"danielsf\",\"physics\")\n",
55 | "opsimdb = os.path.join(opsimdb, \"lsst_150412\", \"Development\", \"garage\")\n",
56 | "opsimdb = os.path.join(opsimdb, \"OpSimData\", \"kraken_1042_sqlite.db\")"
57 | ]
58 | },
59 | {
60 | "cell_type": "markdown",
61 | "metadata": {},
62 | "source": [
63 | "Light curves are generated using a class called `LightCurveGenerator`. Sub-classes of `LightCurveGenerator` are already implemented for variable stars, type Ia supernovae, and AGNs. Below we show how to use the `AgnLightCurveGenerator` to generate light curves using the default CatSim AGN variability model (a damped random walk).\n",
64 | "\n",
65 | "Note: Throughout this notebook, we will be using the `GalaxyObj` class to establish our connection to fatboy (the UW-hosted CatSim database). `GalaxyObj` only contains objects in the -2.5 < RA < 2.5, -2.25 < Dec < 2.25 degree region of the sky.\n",
   | "\n",
   | "Note: CatSim stores all of the new `InstanceCatalog` daughter classes defined in a registry. This means that re-running this or any cell that defines an `InstanceCatalog` daughter class will cause an exception to be thrown. You will have to re-start the kernel if you wish to re-run the cell below."
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "execution_count": null,
138 | "metadata": {
139 | "collapsed": false
140 | },
141 | "outputs": [],
142 | "source": [
143 | "from lsst.sims.catalogs.decorators import compound\n",
144 | "from lsst.sims.catUtils.mixins import PhotometryGalaxies\n",
145 | "from lsst.sims.catUtils.utils.LightCurveGenerator import _baseLightCurveCatalog\n",
146 | "\n",
147 | "# _lightCurveCatalogClasses must inherit from _baseLightCurveCatalog.\n",
148 | "# _baseLightCurveCatalog defines some functionality that the LightCurveGenerator expects\n",
149 | "class _agnMeanMagCatalog(_baseLightCurveCatalog, PhotometryGalaxies):\n",
150 | " \n",
151 | " @compound(\"lightCurveMag\", \"sigma_lightCurveMag\")\n",
152 | " def get_lightCurvePhotometry(self):\n",
153 | " \"\"\"\n",
154 | " This method calculates the lightCurveMag and sigma_lightCurveMag values expected\n",
155 | " by the LightCurveGenerator. [u,g,r,i,z,y]Agn and sigma_[u,g,r,i,z,y]Agn are\n",
156 | " calculated by methods defined in the PhotometryGalaxies mixin imported above.\n",
157 | " \"\"\"\n",
158 | " return np.array([self.column_by_name(\"%sAgn\" % self.obs_metadata.bandpass),\n",
159 | " self.column_by_name(\"sigma_%sAgn\" % self.obs_metadata.bandpass)])\n",
160 | " "
161 | ]
162 | },
163 | {
164 | "cell_type": "code",
165 | "execution_count": null,
166 | "metadata": {
167 | "collapsed": false
168 | },
169 | "outputs": [],
170 | "source": [
171 | "lc_gen._lightCurveCatalogClass = _agnMeanMagCatalog\n",
172 | "\n",
173 | "# this is just a constraint on our SQL query to make sure we do not get\n",
174 | "# any objects that lack an AGN component\n",
175 | "lc_gen._constraint = 'sedname_agn IS NOT NULL'\n",
176 | "\n",
177 | "ra_bound = (-2.5, 2.5)\n",
178 | "dec_bound = (-2.25, 2.25)\n",
179 | "pointings = lc_gen.get_pointings(ra_bound, dec_bound, bandpass='r')\n",
180 | "\n",
181 | "lc_dict, truth_dict = lc_gen.light_curves_from_pointings(pointings,\n",
182 | " lc_per_field=10)\n"
183 | ]
184 | },
185 | {
186 | "cell_type": "markdown",
187 | "metadata": {},
188 | "source": [
189 | "If we plot one of the light curves from this simulation, we see that it is completely flat, since we only returned the mean magnitude of the AGN."
190 | ]
191 | },
192 | {
193 | "cell_type": "code",
194 | "execution_count": null,
195 | "metadata": {
196 | "collapsed": false
197 | },
198 | "outputs": [],
199 | "source": [
200 | "obj_name = lc_dict.keys()[5]\n",
201 | "lc = lc_dict[obj_name]\n",
202 | "\n",
203 | "fig, ax = plt.subplots()\n",
204 | "\n",
205 | "ax.errorbar(lc['r']['mjd'], lc['r']['mag'], lc['r']['error'],\n",
206 | " fmt='', linestyle='None')\n",
207 | "ax.scatter(lc['r']['mjd'], lc['r']['mag'], s=5, color='r')\n",
208 | "ax.set(xlabel='MJD', ylabel='r-band magnitude')"
209 | ]
210 | },
211 | {
212 | "cell_type": "markdown",
213 | "metadata": {},
214 | "source": [
215 | "Now let's introduce variability into our light curves. In the `_lightCurveCatalogClass` below, we define the magnitudes `[u,g,r,i,z,y]Agn_rando` which are the mean AGN magnitudes plus random variation. The getter method for `lightCurveMag` returns whichever of these random magnitudes is appropriate, based on the bandpass being observed at the time (encoded in the catalog's `self.obs_metadata`).\n",
216 | "\n",
217 | "The method to calculate the uncertainty in the random magnitudes follows a simple prescription set up in the CatSim framework."
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "execution_count": null,
223 | "metadata": {
224 | "collapsed": false
225 | },
226 | "outputs": [],
227 | "source": [
228 | "class _agnRandomMagCatalog(_baseLightCurveCatalog, PhotometryGalaxies):\n",
229 | " \n",
230 | " rng = np.random.RandomState(119)\n",
231 | " \n",
232 | " @compound('uAgn_rando', 'gAgn_rando', 'rAgn_rando',\n",
233 | " 'iAgn_rando', 'zAgn_rando', 'yAgn_rando')\n",
234 | " def get_randomMagnitudes(self):\n",
235 | " \"\"\"\n",
236 | " Calculate a varying magnitude that is the mean magnitude plus random noise.\n",
237 | " \"\"\"\n",
238 | " n_mags = len(self.column_by_name('uAgn'))\n",
239 | " return np.array([self.column_by_name('uAgn') + self.rng.random_sample(n_mags)*10.0,\n",
240 | " self.column_by_name('gAgn') + self.rng.random_sample(n_mags)*10.0,\n",
241 | " self.column_by_name('rAgn') + self.rng.random_sample(n_mags)*10.0,\n",
242 | " self.column_by_name('iAgn') + self.rng.random_sample(n_mags)*10.0,\n",
243 | " self.column_by_name('zAgn') + self.rng.random_sample(n_mags)*10.0,\n",
244 | " self.column_by_name('yAgn') + self.rng.random_sample(n_mags)*10.0])\n",
245 | "\n",
246 | " @compound('sigma_uAgn_rando', 'sigma_gAgn_rando', 'sigma_rAgn_rando',\n",
247 | "              'sigma_iAgn_rando', 'sigma_zAgn_rando', 'sigma_yAgn_rando')\n",
248 | " def get_rando_uncertainties(self):\n",
249 | " \"\"\"\n",
250 | " Calculate the uncertainty in the random magnitudes.\n",
251 | " \n",
252 | " The method _magnitudeUncertaintyGetter is defined in the PhotometryGalaxies mixin.\n",
253 | " The arguments for that method are:\n",
254 | " \n",
255 | " list of the magnitudes for which uncertainties are to be calculated\n",
256 | " list of the bandpass names associated with these magnitudes\n",
257 | " (so that m5 in that bandpass can be looked up from self.obs_metadata)\n",
258 | " name of the attribute containing the bandpasses\n",
259 | " (self.lsstBandpassDict is set by the method that calculates [u,g,r,i,z,y]Agn)\n",
260 | " \"\"\"\n",
261 | " return self._magnitudeUncertaintyGetter(['uAgn_rando', 'gAgn_rando', 'rAgn_rando',\n",
262 | " 'iAgn_rando', 'zAgn_rando', 'yAgn_rando'],\n",
263 | " ['u', 'g', 'r', 'i', 'z', 'y'],\n",
264 | " 'lsstBandpassDict')\n",
265 | " \n",
266 | " @compound(\"lightCurveMag\", \"sigma_lightCurveMag\")\n",
267 | " def get_lightCurvePhotometry(self):\n",
268 | " return np.array([self.column_by_name(\"%sAgn_rando\" % self.obs_metadata.bandpass),\n",
269 | " self.column_by_name(\"sigma_%sAgn_rando\" % self.obs_metadata.bandpass)])\n",
270 | " \n",
271 | " "
272 | ]
273 | },
274 | {
275 | "cell_type": "code",
276 | "execution_count": null,
277 | "metadata": {
278 | "collapsed": false
279 | },
280 | "outputs": [],
281 | "source": [
282 | "from lsst.sims.catUtils.baseCatalogModels import GalaxyObj\n",
283 | "\n",
284 | "lc_gen._lightCurveCatalogClass = _agnRandomMagCatalog\n",
285 | "\n",
286 | "ra_bound = (-2.5, 2.5)\n",
287 | "dec_bound = (-2.25, 2.25)\n",
288 | "pointings = lc_gen.get_pointings(ra_bound, dec_bound, bandpass='r')\n",
289 | "\n",
290 | "lc_dict_rando, truth_dict = lc_gen.light_curves_from_pointings(pointings,\n",
291 | " lc_per_field=10)"
292 | ]
293 | },
294 | {
295 | "cell_type": "markdown",
296 | "metadata": {},
297 | "source": [
298 | "Plotting one of these light curves shows random variation."
299 | ]
300 | },
301 | {
302 | "cell_type": "code",
303 | "execution_count": null,
304 | "metadata": {
305 | "collapsed": false
306 | },
307 | "outputs": [],
308 | "source": [
309 | "obj_name = lc_dict_rando.keys()[5]\n",
310 | "lc = lc_dict_rando[obj_name]\n",
311 | "\n",
312 | "fig, ax = plt.subplots()\n",
313 | "\n",
314 | "ax.errorbar(lc['r']['mjd'], lc['r']['mag'], lc['r']['error'],\n",
315 | " fmt='', linestyle='None')\n",
316 | "ax.scatter(lc['r']['mjd'], lc['r']['mag'], s=5, color='r')\n",
317 | "ax.set(xlabel='MJD', ylabel='r-band magnitude')"
318 | ]
319 | },
320 | {
321 | "cell_type": "markdown",
322 | "metadata": {},
323 | "source": [
324 | "In the previous example, we used the catalog attribute `self.lsstBandpassDict` when calling `_magnitudeUncertaintyGetter`. Ordinarily, `self.lsstBandpassDict` is set by the method which calculates the mean AGN magnitudes (the mean AGN magnitudes are calculated by loading in model SEDs and integrating them over the LSST bandpasses). Below, we alter how the mean LSST magnitudes are calculated so that we can show how `self.lsstBandpassDict` might be set by the user. Note that we have renamed the mean magnitudes `[u,g,r,i,z,y]Agn_x`. This is because `[u,g,r,i,z,y]Agn` are already calculated by a method defined in the `PhotometryGalaxies` mixin, and we do not want to confuse CatSim."
325 | ]
326 | },
327 | {
328 | "cell_type": "code",
329 | "execution_count": null,
330 | "metadata": {
331 | "collapsed": false
332 | },
333 | "outputs": [],
334 | "source": [
335 | "from lsst.sims.photUtils import BandpassDict\n",
336 | "\n",
337 | "class _alternateAgnCatalog(_baseLightCurveCatalog, PhotometryGalaxies):\n",
338 | "\n",
339 | " @compound('uAgn_x', 'gAgn_x', 'rAgn_x', 'iAgn_x', 'zAgn_x', 'yAgn_x')\n",
340 | " def get_magnitudes(self):\n",
341 | " \"\"\"\n",
342 | " Add a component to the mean magnitude that grows linearly with time\n",
343 | " (which probably means it is not a mean magnitude any more...)\n",
344 | " \"\"\"\n",
345 | " \n",
346 | " if not hasattr(self, 'lsstBandpassDict'):\n",
347 | " self.lsstBandpassDict = BandpassDict.loadTotalBandpassesFromFiles()\n",
348 | " \n",
349 | " delta = 5.0 * (self.obs_metadata.mjd.TAI-59580.0)/3650.0\n",
350 | "\n",
351 | " # self._magnitudeGetter is defined in the PhotometryGalaxies mixin.\n",
352 | "        # Its arguments are: a str indicating which galaxy component is being\n",
353 | " # simulated ('agn', 'disk', or 'bulge'), the BandpassDict containing\n",
354 | " # the bandpasses of the survey, a list of the columns defined by this\n",
355 | " # current getter method\n",
356 | " return self._magnitudeGetter('agn', self.lsstBandpassDict,\n",
357 | " self.get_magnitudes._colnames) + delta\n",
358 | " \n",
359 | " @compound('sigma_uAgn_x', 'sigma_gAgn_x', 'sigma_rAgn_x',\n",
360 | " 'sigma_iAgn_x', 'sigma_zAgn_x', 'sigma_yAgn_x')\n",
361 | " def get_uncertainties(self):\n",
362 | " return self._magnitudeUncertaintyGetter(['uAgn_x', 'gAgn_x', 'rAgn_x',\n",
363 | " 'iAgn_x', 'zAgn_x', 'yAgn_x'],\n",
364 | " ['u', 'g', 'r', 'i', 'z', 'y'],\n",
365 | " 'lsstBandpassDict')\n",
366 | " \n",
367 | " @compound(\"lightCurveMag\", \"sigma_lightCurveMag\")\n",
368 | " def get_lightCurvePhotometry(self):\n",
369 | " return np.array([self.column_by_name(\"%sAgn_x\" % self.obs_metadata.bandpass),\n",
370 | " self.column_by_name(\"sigma_%sAgn_x\" % self.obs_metadata.bandpass)])"
371 | ]
372 | },
373 | {
374 | "cell_type": "code",
375 | "execution_count": null,
376 | "metadata": {
377 | "collapsed": false
378 | },
379 | "outputs": [],
380 | "source": [
381 | "lc_gen._lightCurveCatalogClass = _alternateAgnCatalog\n",
382 | "\n",
383 | "ra_bound = (-2.5, 2.5)\n",
384 | "dec_bound = (-2.25, 2.25)\n",
385 | "pointings = lc_gen.get_pointings(ra_bound, dec_bound, bandpass='r')\n",
386 | "\n",
387 | "lc_dict_alt, truth_dict = lc_gen.light_curves_from_pointings(pointings,\n",
388 | " lc_per_field=10)"
389 | ]
390 | },
391 | {
392 | "cell_type": "markdown",
393 | "metadata": {},
394 | "source": [
395 | "Plotting one of these light curves shows an uninteresting linear rise in magnitude."
396 | ]
397 | },
398 | {
399 | "cell_type": "code",
400 | "execution_count": null,
401 | "metadata": {
402 | "collapsed": false
403 | },
404 | "outputs": [],
405 | "source": [
406 | "obj_name = lc_dict_alt.keys()[5]\n",
407 | "lc = lc_dict_alt[obj_name]\n",
408 | "\n",
409 | "fig, ax = plt.subplots()\n",
410 | "\n",
411 | "ax.errorbar(lc['r']['mjd'], lc['r']['mag'], lc['r']['error'],\n",
412 | " fmt='', linestyle='None')\n",
413 | "ax.scatter(lc['r']['mjd'], lc['r']['mag'], s=5, color='r')\n",
414 | "ax.set(xlabel='MJD', ylabel='r-band magnitude')"
415 | ]
416 | },
417 | {
418 | "cell_type": "markdown",
419 | "metadata": {},
420 | "source": [
421 | "Up until now we have been ignoring the existence of the `truth_dict`, which contains the reference information about the specific variability of each object in the `_lightCurveCatalogClass`'s catalog. Just as the magnitude and uncertainty in our light curves are generated by columns known as `lightCurveMag` and `sigma_lightCurveMag`, the `truth_dict` is constructed from a column called `truthInfo`. To calculate it, you must define a method `get_truthInfo()` in your `_lightCurveCatalogClass`. The method should return a numpy array in which each element is the truth information for the corresponding AGN. Below, we construct truth info that consists of a period, phase, and amplitude for a sinusoidal light curve component. We add it to the AGN magnitudes calculated by `get_magnitudes()`."
422 | ]
423 | },
424 | {
425 | "cell_type": "code",
426 | "execution_count": null,
427 | "metadata": {
428 | "collapsed": false
429 | },
430 | "outputs": [],
431 | "source": [
432 | "class _variableAgnCatalog(_baseLightCurveCatalog, PhotometryGalaxies):\n",
433 | "\n",
434 | " rng = np.random.RandomState(88)\n",
435 | " \n",
436 | " def get_truthInfo(self):\n",
437 | " if not hasattr(self, 'truth_cache'):\n",
438 | " self.truth_cache = {}\n",
439 | " \n",
440 | "        # get the uniqueIds of all of the AGN\n",
441 | " # (mostly so you know how many of them there are)\n",
442 | " id_val = self.column_by_name('uniqueId')\n",
443 | " \n",
444 | " output = []\n",
445 | " for ii in id_val:\n",
446 | " if ii in self.truth_cache:\n",
447 | " output.append(self.truth_cache[ii])\n",
448 | " else:\n",
449 | " period = self.rng.random_sample()*365.25\n",
450 | " phase = self.rng.random_sample()*2.0*np.pi\n",
451 | " amplitude = self.rng.random_sample()*10.0\n",
    | "                #remember these parameters so that the same object gets the same\n",
    | "                #truth values every time this getter is called\n",
    | "                self.truth_cache[ii] = (period, phase, amplitude)\n",
452 | "                output.append((period, phase, amplitude))\n",
453 | "\n",
454 | " return np.array(output)\n",
455 | " \n",
456 | " @compound('uAgn_x', 'gAgn_x', 'rAgn_x', 'iAgn_x', 'zAgn_x', 'yAgn_x')\n",
457 | " def get_magnitudes(self):\n",
458 | " \n",
459 | " if not hasattr(self, 'lsstBandpassDict'):\n",
460 | " self.lsstBandpassDict = BandpassDict.loadTotalBandpassesFromFiles()\n",
461 | " \n",
462 | " delta = 5.0 * (self.obs_metadata.mjd.TAI-59580.0)/3650.0\n",
463 | "\n",
464 | " var_params = self.column_by_name('truthInfo')\n",
465 | " wave = [vv[2]*np.sin(2.0*np.pi*self.obs_metadata.mjd.TAI/vv[0] + vv[1]) for vv in var_params]\n",
466 | " \n",
467 | " return self._magnitudeGetter('agn', self.lsstBandpassDict,\n",
468 | " self.get_magnitudes._colnames) + delta + wave\n",
469 | " \n",
470 | " @compound('sigma_uAgn_x', 'sigma_gAgn_x', 'sigma_rAgn_x',\n",
471 | " 'sigma_iAgn_x', 'sigma_zAgn_x', 'sigma_yAgn_x')\n",
472 | " def get_uncertainties(self):\n",
473 | " return self._magnitudeUncertaintyGetter(['uAgn_x', 'gAgn_x', 'rAgn_x',\n",
474 | " 'iAgn_x', 'zAgn_x', 'yAgn_x'],\n",
475 | " ['u', 'g', 'r', 'i', 'z', 'y'],\n",
476 | " 'lsstBandpassDict')\n",
477 | " \n",
478 | " @compound(\"lightCurveMag\", \"sigma_lightCurveMag\")\n",
479 | " def get_lightCurvePhotometry(self):\n",
480 | " return np.array([self.column_by_name(\"%sAgn_x\" % self.obs_metadata.bandpass),\n",
481 | " self.column_by_name(\"sigma_%sAgn_x\" % self.obs_metadata.bandpass)])"
482 | ]
483 | },
484 | {
485 | "cell_type": "code",
486 | "execution_count": null,
487 | "metadata": {
488 | "collapsed": false
489 | },
490 | "outputs": [],
491 | "source": [
492 | "lc_gen._lightCurveCatalogClass = _variableAgnCatalog\n",
493 | "\n",
494 | "ra_bound = (-2.5, 2.5)\n",
495 | "dec_bound = (-2.25, 2.25)\n",
496 | "pointings = lc_gen.get_pointings(ra_bound, dec_bound, bandpass='r')\n",
497 | "\n",
498 | "lc_dict_var, truth_dict = lc_gen.light_curves_from_pointings(pointings,\n",
499 | " lc_per_field=10)"
500 | ]
501 | },
502 | {
503 | "cell_type": "markdown",
504 | "metadata": {},
505 | "source": [
506 | "Plotting light curves from this `LightCurveGenerator`, we now see sinusoidal variation superimposed on the linear rise in the magnitude."
507 | ]
508 | },
509 | {
510 | "cell_type": "code",
511 | "execution_count": null,
512 | "metadata": {
513 | "collapsed": false
514 | },
515 | "outputs": [],
516 | "source": [
517 | "obj_name = lc_dict_var.keys()[5]\n",
518 | "lc = lc_dict_var[obj_name]\n",
519 | "\n",
520 | "fig, ax = plt.subplots()\n",
521 | "\n",
522 | "ax.errorbar(lc['r']['mjd'], lc['r']['mag'], lc['r']['error'],\n",
523 | " fmt='', linestyle='None')\n",
524 | "ax.scatter(lc['r']['mjd'], lc['r']['mag'], s=5, color='r')\n",
525 | "ax.set(xlabel='MJD', ylabel='r-band magnitude')"
526 | ]
527 | },
528 | {
529 | "cell_type": "markdown",
530 | "metadata": {},
531 | "source": [
532 | "The `truth_dict` contains the three parameters characterizing the sinusoidal component of the variation."
533 | ]
534 | },
535 | {
536 | "cell_type": "code",
537 | "execution_count": null,
538 | "metadata": {
539 | "collapsed": false
540 | },
541 | "outputs": [],
542 | "source": [
543 | "truth_dict[obj_name]"
544 | ]
545 | },
546 | {
547 | "cell_type": "code",
548 | "execution_count": null,
549 | "metadata": {
550 | "collapsed": true
551 | },
552 | "outputs": [],
553 | "source": []
554 | }
555 | ],
556 | "metadata": {
557 | "kernelspec": {
558 | "display_name": "Python 2",
559 | "language": "python",
560 | "name": "python2"
561 | },
562 | "language_info": {
563 | "codemirror_mode": {
564 | "name": "ipython",
565 | "version": 2
566 | },
567 | "file_extension": ".py",
568 | "mimetype": "text/x-python",
569 | "name": "python",
570 | "nbconvert_exporter": "python",
571 | "pygments_lexer": "ipython2",
572 | "version": "2.7.12"
573 | }
574 | },
575 | "nbformat": 4,
576 | "nbformat_minor": 0
577 | }
578 |
--------------------------------------------------------------------------------
/CatSim/generateAgnCatalog_150309.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "First note: more basic tutorials exist in\n",
8 | "\n",
9 | " sims_catUtils/examples/tutorials/\n",
10 | "\n",
11 | "in the form of .py scripts and iPython notebooks. Those tutorials focus more on how the code works and less on realistic astronomical applications.\n",
12 | "\n",
13 | "For an overview of the philosophy behind the CatSim code, see\n",
14 | "\n",
15 | " CatSimTutorial_SimulationsAHM_1503.ipynb\n",
16 | "\n",
17 | "in this directory"
18 | ]
19 | },
20 | {
21 | "cell_type": "markdown",
22 | "metadata": {},
23 | "source": [
24 | "Second note: to run this iPython notebook, you should make sure that you have cloned and set up 'master' for the following packages (which you can clone using git clone https://github.com/lsst/packageName.git):\n",
25 | "\n",
26 | "* sims_catalogs_generation\n",
27 | "\n",
28 | "* sims_catalogs_measures\n",
29 | "\n",
30 | "* sims_coordUtils\n",
31 | "\n",
32 | "* sims_photUtils\n",
33 | "\n",
34 | "* sims_catUtils\n",
35 | "\n",
36 | "You may also need to clone sims_utils and sims_data, depending on how old your stack installation is.\n",
37 | "\n",
38 | "Okay. Let's get started:\n",
39 | "\n",
40 | "There are three classes that are important to generating a catalog.\n",
41 | "\n",
42 | "* `ObservationMetaData` defines the pointing of the telescope (or, in more code-oriented language, it defines the spatial boundaries of the catalog and, optionally, the MJD for which it is simulated)\n",
43 | "\n",
44 | "\n",
45 | "* `CatalogDBObject` provides the interface to the database and writes the SQL query\n",
46 | "\n",
47 | "\n",
48 | "* `InstanceCatalog` actually writes and stores the catalog. `InstanceCatalog` tells `CatalogDBObject` what columns to query from the database. `InstanceCatalog` also calculates any dependent columns that need to be calculated from the raw columns provided by the database.\n",
49 | "\n",
50 | "Let's start by looking at `ObservationMetaData`"
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "execution_count": null,
56 | "metadata": {
57 | "collapsed": false
58 | },
59 | "outputs": [],
60 | "source": [
61 | "from lsst.sims.utils import ObservationMetaData\n",
62 | "\n",
63 | "print help(ObservationMetaData)"
64 | ]
65 | },
66 | {
67 | "cell_type": "markdown",
68 | "metadata": {},
69 | "source": [
70 | "The only parameters you need to pass to `ObservationMetaData` are `pointingRA`, `pointingDec`, `boundType`, and `boundLength`. If you care about astrometry or variability, you will also need to pass in `mjd`. You probably should not pass in `phoSimMetaData` by hand; that is only used when generating catalogs to be input to PhoSim and is generated by our automatic `ObservationMetaData`-generating scripts when generating `ObservationMetaData` instances out of OpSim runs."
71 | ]
72 | },
73 | {
74 | "cell_type": "code",
75 | "execution_count": null,
76 | "metadata": {
77 | "collapsed": false
78 | },
79 | "outputs": [],
80 | "source": [
81 | "#create a circular field of view centered on RA = 25 degrees, Dec = 30 degrees\n",
82 | "#with a radius of 0.1 degrees\n",
83 | "circleObsMetadata = ObservationMetaData(pointingRA=25.0, pointingDec=30.0,\n",
84 | " boundType='circle', boundLength=0.1)"
85 | ]
86 | },
87 | {
88 | "cell_type": "code",
89 | "execution_count": null,
90 | "metadata": {
91 | "collapsed": false
92 | },
93 | "outputs": [],
94 | "source": [
95 | "#create a square field of view centered on RA = 25 degrees, Dec = 30 degrees\n",
96 | "#with a side-length of 0.2 degrees (in this case boundLength is half the length of a side)\n",
97 | "squareObsMetadata = ObservationMetaData(pointingRA=25.0, pointingDec=30.0,\n",
98 | " boundType='box', boundLength=0.1)"
99 | ]
100 | },
101 | {
102 | "cell_type": "code",
103 | "execution_count": null,
104 | "metadata": {
105 | "collapsed": false
106 | },
107 | "outputs": [],
108 | "source": [
109 | "#create a rectangular field of view centered on RA = 25 degrees, Dec = 30 degrees\n",
110 | "#with an RA side length of 0.2 degrees and a Dec side length of 0.1 degrees\n",
111 | "rectangleObsMetadata = ObservationMetaData(pointingRA=25.0, pointingDec=30.0,\n",
112 | " boundType='box', boundLength=(0.1, 0.05))"
113 | ]
114 | },
115 | {
116 | "cell_type": "markdown",
117 | "metadata": {},
118 | "source": [
119 | "The `CatalogDBObject` is just a wrapper to the database. It is used to translate the columns of the database from their native syntax into a syntax that CatSim expects. The parent class is defined in\n",
120 | "\n",
121 | " sims_catalogs_generation/python/lsst/sims/catalogs/generation/db/dbConnection.py\n",
122 | "\n",
123 | "but the daughter classes that actually do the wrapping of astronomically interesting database tables are defined in\n",
124 | "\n",
125 | " sims_catUtils/python/lsst/sims/catUtils/baseCatalogModels\n",
126 | "\n",
127 | "Each daughter class of `CatalogDBObject` wraps a specific table in the fatboy database. Thus, there is one class for galaxy bulges, one class for galaxy disks, one class for galaxy agn, one class for RRLyrae, one class that unifies main sequence, RRLyrae, white dwarfs, and blue horizontal branch stars, etc. Probably the best way to determine what CatalogDBObject classes are available is to look at the database schema page on confluence\n",
128 | "\n",
129 | "https://confluence.lsstcorp.org/display/SIM/Database+Schema\n",
130 | "\n",
131 | "For the sake of argument, let's say we wanted to make a catalog of AGN. At present, the only way to find AGN is to search through all galaxies and only accept those with AGN. The `CatalogDBObject` class that queries all galaxies on the sky is `GalaxyTileObj`"
132 | ]
133 | },
134 | {
135 | "cell_type": "code",
136 | "execution_count": null,
137 | "metadata": {
138 | "collapsed": false
139 | },
140 | "outputs": [],
141 | "source": [
142 | "from lsst.sims.catUtils.baseCatalogModels import GalaxyTileObj\n",
143 | "\n",
144 | "galaxyDB = GalaxyTileObj()"
145 | ]
146 | },
147 | {
148 | "cell_type": "markdown",
149 | "metadata": {},
150 | "source": [
151 | "To learn the columns provided by this CatalogDBObject daughter class, we can"
152 | ]
153 | },
154 | {
155 | "cell_type": "code",
156 | "execution_count": null,
157 | "metadata": {
158 | "collapsed": false
159 | },
160 | "outputs": [],
161 | "source": [
162 | "galaxyDB.show_mapped_columns()"
163 | ]
164 | },
165 | {
166 | "cell_type": "markdown",
167 | "metadata": {},
168 | "source": [
169 | "Presumably, we do not want all of this information in our catalog, which is why we use the `InstanceCatalog` (or a daughter class thereof) to filter it before writing to disk.\n",
170 | "\n",
171 | "The base class `InstanceCatalog` is defined in\n",
172 | "\n",
173 | " sims_catalogs_measures/python/lsst/sims/catalogs/measures/instance/InstanceCatalog.py\n",
174 | "\n",
175 | "but again, we are actually interested in daughter classes. There are some example daughter classes defined in\n",
176 | "\n",
177 | " sims_catUtils/python/lsst/sims/catUtils/exampleCatalogDefinitions/\n",
178 | "\n",
179 | "but often, you will want to write your own daughter class that creates your own custom catalog. Fortunately, that is very easy. Here is an example"
180 | ]
181 | },
182 | {
183 | "cell_type": "code",
184 | "execution_count": null,
185 | "metadata": {
186 | "collapsed": false
187 | },
188 | "outputs": [],
189 | "source": [
190 | "import numpy\n",
191 | "from lsst.sims.catalogs.measures.instance import InstanceCatalog\n",
192 | "\n",
193 | "class basicAgnCatalog(InstanceCatalog):\n",
194 | " \n",
195 | " #list defining the columns we want output to our catalog\n",
196 | " column_outputs = ['raJ2000', 'decJ2000',\n",
197 | " 'sedFilenameAgn', 'magNormAgn']\n",
198 | " \n",
199 | " #dict performing any unit transformations upon output\n",
200 | " #(note that angles are stored in radians by default;\n",
201 | " #must explicitly be transformed to degrees on output)\n",
202 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}\n",
203 | " \n",
204 | " #This lists all of the columns whose values cannot be Nan, None, NULL etc.\n",
205 | " #In this case, we are filtering out all of the galaxies without AGN by\n",
206 | " #ignoring galaxies that do not have an SED assigned for an AGN\n",
207 | " cannot_be_null = ['sedFilenameAgn']"
208 | ]
209 | },
210 | {
211 | "cell_type": "markdown",
212 | "metadata": {},
213 | "source": [
214 | "Note: Both `InstanceCatalog` and `CatalogDBObject` have meta-classes that keep registries of their defined daughter classes. If you try to define the same daughter class twice, it will raise an exception. Thus, if you run the cell above (or any cell which defines an `InstanceCatalog` daughter) twice, you will get an exception. If you need to re-run the cell above, you will have to restart this notebook's kernel."
215 | ]
216 | },
217 | {
218 | "cell_type": "markdown",
219 | "metadata": {},
220 | "source": [
221 | "When you instantiate an `InstanceCatalog` you must pass it a `CatalogDBObject`, and you can optionally pass it an `ObservationMetaData`. Let's write out catalogs for the three example `ObservationMetaData` instances defined above."
222 | ]
223 | },
224 | {
225 | "cell_type": "code",
226 | "execution_count": null,
227 | "metadata": {
228 | "collapsed": false
229 | },
230 | "outputs": [],
231 | "source": [
232 | "agnCircle = basicAgnCatalog(galaxyDB, obs_metadata=circleObsMetadata)\n",
233 | "agnCircle.write_catalog('agn_circle.txt')\n",
234 | "\n",
235 | "agnSquare = basicAgnCatalog(galaxyDB, obs_metadata=squareObsMetadata)\n",
236 | "agnSquare.write_catalog('agn_square.txt')\n",
237 | "\n",
238 | "agnRectangle = basicAgnCatalog(galaxyDB, obs_metadata=rectangleObsMetadata)\n",
239 | "agnRectangle.write_catalog('agn_rectangle.txt')"
240 | ]
241 | },
242 | {
243 | "cell_type": "markdown",
244 | "metadata": {},
245 | "source": [
246 | "Just to show that ObservationMetaData is working the way we promised...."
247 | ]
248 | },
249 | {
250 | "cell_type": "code",
251 | "execution_count": null,
252 | "metadata": {
253 | "collapsed": false
254 | },
255 | "outputs": [],
256 | "source": [
257 | "import matplotlib\n",
258 | "dtype = numpy.dtype([('raJ2000',float),('decJ2000',float),('sedFilenameAgn',(str,40)),('magNormAgn',float)])\n",
259 | "\n",
260 | "circleData = numpy.loadtxt('agn_circle.txt', delimiter=',', dtype=dtype)\n",
261 | "matplotlib.pyplot.scatter(circleData['raJ2000'],circleData['decJ2000'])\n",
262 | "matplotlib.pyplot.show()"
263 | ]
264 | },
265 | {
266 | "cell_type": "code",
267 | "execution_count": null,
268 | "metadata": {
269 | "collapsed": false
270 | },
271 | "outputs": [],
272 | "source": [
273 | "squareData = numpy.loadtxt('agn_square.txt', delimiter=',', dtype=dtype)\n",
274 | "matplotlib.pyplot.scatter(squareData['raJ2000'], squareData['decJ2000'])\n",
275 | "matplotlib.pyplot.show()"
276 | ]
277 | },
278 | {
279 | "cell_type": "code",
280 | "execution_count": null,
281 | "metadata": {
282 | "collapsed": false
283 | },
284 | "outputs": [],
285 | "source": [
286 | "rectangleData = numpy.loadtxt('agn_rectangle.txt', delimiter=',', dtype=dtype)\n",
287 | "matplotlib.pyplot.scatter(rectangleData['raJ2000'], rectangleData['decJ2000'])\n",
288 | "matplotlib.pyplot.show()"
289 | ]
290 | },
291 | {
292 | "cell_type": "markdown",
293 | "metadata": {},
294 | "source": [
295 | "The examples above create catalogs using only the data that is already stored in the database. What if you want to calculate some new quantity to include in your catalog? Since we're talking about AGN, let's use the example of trying to include variability.\n",
296 | "\n",
297 | "`InstanceCatalog` daughter classes can calculate quantities not stored in the database using getter methods. Literally, if you want to output the new quantity `myDerivedColumn` in your catalog, you just need to include a method\n",
298 | "\n",
299 | " def get_myDerivedColumn(self):\n",
300 | " #some math\n",
301 | " return numpy.array([myCalculationValues])\n",
302 | "\n",
303 | "in your `InstanceCatalog` daughter class. The return must be a numpy array containing the values of `myDerivedColumn` for each object in the catalog in the order that they appear in the catalog (which the `InstanceCatalog` will provide for you automatically). First, a cartoon example:"
304 | ]
305 | },
306 | {
307 | "cell_type": "code",
308 | "execution_count": null,
309 | "metadata": {
310 | "collapsed": false
311 | },
312 | "outputs": [],
313 | "source": [
314 | "class myCartoonCatalog(InstanceCatalog):\n",
315 | " column_outputs = ['raJ2000', 'decJ2000', 'lsst_u', 'shifted_u_magnitude']\n",
316 | "\n",
317 | " def get_shifted_u_magnitude(self):\n",
318 | " u = self.column_by_name('lsst_u') #this gets the lsst_u value for every object in the\n",
319 | " #catalog as a numpy array\n",
320 | " return u + 4.0\n"
321 | ]
322 | },
323 | {
324 | "cell_type": "code",
325 | "execution_count": null,
326 | "metadata": {
327 | "collapsed": false
328 | },
329 | "outputs": [],
330 | "source": [
331 | "cartoonAgn = myCartoonCatalog(galaxyDB, obs_metadata=circleObsMetadata)\n",
332 | "cartoonAgn.write_catalog('cartoon_agn.txt')"
333 | ]
334 | },
335 | {
336 | "cell_type": "markdown",
337 | "metadata": {},
338 | "source": [
339 | "If you now look at cartoon_agn.txt you will see that it contains the lsst_u data, but also the shifted_u_magnitude, which is just lsst_u plus 4.0. It can be as simple as that.\n",
340 | "\n",
341 | "Because adding columns to the catalog is so simple, CatSim does so using mixin classes. Mixin classes are classes that exist solely to provide methods to be inherited by other classes. If you define a mixin class with getter methods and have your `InstanceCatalog` daughter class inherit from that mixin, your `InstanceCatalog` daughter class will have those getter methods and be able to use them when writing out your catalog. Probably the best place to look for the total list of pre-defined mixins and the getter methods they provide is here\n",
342 | "\n",
343 | "https://confluence.lsstcorp.org/display/SIM/Getters+provided+by+Catalog+Simulations+mixins"
344 | ]
345 | },
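   | {
   | "cell_type": "code",
   | "execution_count": null,
   | "metadata": {
   | "collapsed": false
   | },
   | "outputs": [],
   | "source": [
   | "#A cartoon sketch (not one of the pre-defined CatSim mixins) of the mixin pattern\n",
   | "#described above: the mixin provides the getter, and any InstanceCatalog daughter\n",
   | "#class that inherits from it can list the derived column in column_outputs.\n",
   | "class myCartoonMixin(object):\n",
   | "    def get_boosted_u_magnitude(self):\n",
   | "        return self.column_by_name('lsst_u') + 4.0\n",
   | "\n",
   | "class myCartoonMixinCatalog(InstanceCatalog, myCartoonMixin):\n",
   | "    column_outputs = ['raJ2000', 'decJ2000', 'lsst_u', 'boosted_u_magnitude']\n",
   | "\n",
   | "mixinAgn = myCartoonMixinCatalog(galaxyDB, obs_metadata=circleObsMetadata)\n",
   | "mixinAgn.write_catalog('cartoon_mixin_agn.txt')"
   | ]
   | },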
346 | {
347 | "cell_type": "markdown",
348 | "metadata": {},
349 | "source": [
350 | "Variability in CatSim is handled by a combination of the Photometry mixins defined in\n",
351 | "\n",
352 | " $SIMS_CATUTILS_DIR/python/lsst/sims/catUtils/mixins/Photometry.py\n",
353 | "\n",
354 | "and the Variability mixins defined in\n",
355 | "\n",
356 | " $SIMS_CATUTILS_DIR/python/lsst/sims/catUtils/mixins/Variability.py\n",
357 | "\n",
358 | "Specifically, the Photometry mixins provide functionality that allows an `InstanceCatalog` to calculate a baseline magnitude (e.g. `uAgn`). The Variability mixin allows an `InstanceCatalog` to calculate a variable change in that magnitude (e.g. `delta_uAgn`). The Photometry mixin is smart enough to know that, if `delta_uAgn` is defined for the `InstanceCatalog` (i.e. if the `InstanceCatalog` daughter class inherits from the Variability mixin that defines `delta_uAgn`), it should add `delta_uAgn` to `uAgn` before returning `uAgn`.\n",
359 | "\n",
360 | "Below, we will demonstrate this by writing out two catalogs: one with variability included, one without."
361 | ]
362 | },
363 | {
364 | "cell_type": "code",
365 | "execution_count": null,
366 | "metadata": {
367 | "collapsed": false
368 | },
369 | "outputs": [],
370 | "source": [
371 | "#re-import these packages in case we need to restart the kernel\n",
372 | "import numpy\n",
373 | "from lsst.sims.utils import ObservationMetaData\n",
374 | "from lsst.sims.catUtils.baseCatalogModels import GalaxyTileObj\n",
375 | "from lsst.sims.catalogs.measures.instance import InstanceCatalog\n",
376 | "\n",
377 | "from lsst.sims.catUtils.mixins import PhotometryGalaxies, VariabilityGalaxies\n",
378 | "\n",
379 | "class variableAgnCatalog(InstanceCatalog, PhotometryGalaxies, VariabilityGalaxies):\n",
380 | " \n",
381 | " cannot_be_null = ['uAgn'] #again, we only want AGN\n",
382 | " \n",
383 | " #note that we are using [u,g,r,i,z,y]Agn as the baseline magnitudes\n",
384 | " #rather than lsst_[u,g,r,i,z,y]. The VariabilityGalaxies mixin operates\n",
385 | " #by calculating the baseline magnitude from the object's SED, rather\n",
386 | " #than reading in the value from the database. These should give the\n",
387 | " #same answer, but they do not have to (if, for example, we changed \n",
388 | " #reddening models between now and when the database was created).\n",
389 | " #[u,g,r,i,z,y]Agn are provided by the PhotometryGalaxies mixin defined\n",
390 | " #in sims_photUtils/../Photometry.py, which is why we have to inherit from\n",
391 | " #that class as well\n",
392 | " column_outputs = ['galid', 'raJ2000', 'decJ2000',\n",
393 | " 'uAgn', 'rAgn', 'zAgn',\n",
394 | " 'delta_uAgn', 'delta_rAgn', 'delta_zAgn']\n",
395 | " \n",
396 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}"
397 | ]
398 | },
399 | {
400 | "cell_type": "markdown",
401 | "metadata": {},
402 | "source": [
403 | "If we want variability to be meaningful, we will need an ObservationMetaData with an MJD attached to it.\n",
404 | "\n",
405 | "Note: This is going to be a lot slower than what we've been doing above, so we are going to reduce the field of view size to keep this demo reasonable "
406 | ]
407 | },
408 | {
409 | "cell_type": "code",
410 | "execution_count": null,
411 | "metadata": {
412 | "collapsed": false
413 | },
414 | "outputs": [],
415 | "source": [
416 | "variableObsMetadata = ObservationMetaData(pointingRA=25.0, pointingDec=30.0,\n",
417 | " boundType='circle', boundLength=0.05,\n",
418 | " mjd=57086)\n",
419 | "\n",
420 | "galaxyDB = GalaxyTileObj()\n",
421 | "variableAgn = variableAgnCatalog(galaxyDB, obs_metadata=variableObsMetadata)\n",
422 | "variableAgn.write_catalog('variable_agn.txt')"
423 | ]
424 | },
425 | {
426 | "cell_type": "code",
427 | "execution_count": null,
428 | "metadata": {
429 | "collapsed": false
430 | },
431 | "outputs": [],
432 | "source": [
433 | "class baselineAgnCatalog(InstanceCatalog, PhotometryGalaxies):\n",
434 | " \n",
435 | " cannot_be_null = ['uAgn'] #again, we only want AGN\n",
436 | " \n",
437 | " #note that we are using [u,g,r,i,z,y]Agn as the baseline magnitudes\n",
438 | " #rather than lsst_[u,g,r,i,z,y]. The VariabilityGalaxies mixin operates\n",
439 | " #by calculating the baseline magnitude from the object's SED, rather\n",
440 | " #than reading in the value from the database. These should give the\n",
441 | " #same answer, but they do not have to (if, for example, we changed \n",
442 | " #reddening models between now and when the database was created).\n",
443 | " #[u,g,r,i,z,y]Agn are provided by the PhotometryGalaxies mixin defined\n",
444 | " #in sims_photUtils/../Photometry.py, which is why we have to inherit from\n",
445 | " #that class as well\n",
446 | " column_outputs = ['galid', 'raJ2000', 'decJ2000',\n",
447 | " 'uAgn', 'rAgn', 'zAgn']\n",
448 | " \n",
449 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}"
450 | ]
451 | },
452 | {
453 | "cell_type": "code",
454 | "execution_count": null,
455 | "metadata": {
456 | "collapsed": false
457 | },
458 | "outputs": [],
459 | "source": [
460 | "baselineAgn = baselineAgnCatalog(galaxyDB, obs_metadata=variableObsMetadata)\n",
461 | "baselineAgn.write_catalog('baseline_agn.txt')"
462 | ]
463 | },
464 | {
465 | "cell_type": "markdown",
466 | "metadata": {},
467 | "source": [
468 | "We can show that the two catalogs contain different magnitudes (by the amount specified by the `delta_[u,r,z]Agn` columns) by reading in the catalogs we just wrote and comparing the magnitude columns for each AGN."
469 | ]
470 | },
471 | {
472 | "cell_type": "code",
473 | "execution_count": null,
474 | "metadata": {
475 | "collapsed": false
476 | },
477 | "outputs": [],
478 | "source": [
479 | "variableDtype = numpy.dtype([('galid',(str,100)), ('ra', numpy.float), ('dec', numpy.float),\n",
480 | " ('u', numpy.float), ('r', numpy.float), ('z', numpy.float),\n",
481 | " ('delta_u', numpy.float), ('delta_r',numpy.float),\n",
482 | " ('delta_z',numpy.float)])\n",
483 | "\n",
484 | "variableData = numpy.genfromtxt('variable_agn.txt', dtype=variableDtype, delimiter=', ')\n",
485 | "\n",
486 | "baselineDtype = numpy.dtype([('galid',(str,100)), ('ra', numpy.float), ('dec', numpy.float),\n",
487 | " ('u', numpy.float), ('r', numpy.float), ('z', numpy.float)])\n",
488 | "\n",
489 | "baselineData = numpy.genfromtxt('baseline_agn.txt', dtype=baselineDtype, delimiter=', ')\n",
490 | "\n",
491 | "maxError = -1.0\n",
492 | "for base in baselineData:\n",
493 | " var = variableData[numpy.where(variableData['galid']==base['galid'])[0][0]]\n",
494 | " error = numpy.abs(var['u']-base['u']-var['delta_u'])\n",
495 | "\n",
496 | " if error>maxError:\n",
497 | " maxError=error\n",
498 | "\n",
499 | " error = numpy.abs(var['r']-base['r']-var['delta_r'])\n",
500 | " if error>maxError:\n",
501 | " maxError=error\n",
502 | " \n",
503 | " error = numpy.abs(var['z']-base['z']-var['delta_z'])\n",
504 | " if error>maxError:\n",
505 | " maxError=error\n",
506 | "\n",
507 | "print 'maxError: ',maxError"
508 | ]
509 | },
510 | {
511 | "cell_type": "markdown",
512 | "metadata": {},
513 | "source": [
514 | "Why so slow?\n",
515 | "\n",
516 | "Everything we have done up to this point has just been querying the database and spitting out the results of that query. Applying variability requires the `InstanceCatalog` to read in the SED of each AGN, integrate it over the LSST bandpasses, and then apply a variability model. Even that shouldn't take too long. The most time-consuming part of that process is reading in the AGN SED, and all of our model AGNs have the same SED. The problem is that the getter for AGN magnitudes also calculates bulge and disk magnitudes (see `sims_catUtils/python/lsst/sims/catUtils/mixins/PhotometryMixin.py`). Even if we don't want bulge and disk magnitudes, we still have to calculate them, which involves reading in many more unique SEDs. In the future, we hope to get around this problem by storing our SEDs as decompositions into PCA spectra. Then we would only have to read in the few tens of PCA spectra once and could calculate each object's spectrum as a linear combination of them.\n",
517 | "\n",
518 | "There is a work-around to this problem, though. Since the problem is with reading in multiple disk and bulge SEDs which we actually do not want, we can hack the system by assigning every disk and bulge the same empty SED. The way an `InstanceCatalog` resolves requests for column values is that, in this order, it:\n",
519 | "\n",
520 | "1) checks for a getter corresponding to that column\n",
521 | "2) checks for that column in its CatalogDBObject\n",
522 | "3) checks for a default value of that column\n",
523 | "\n",
524 | "If we write a getter that assigns 'None' as the disk and bulge SEDs, that getter will override the information stored in the database, and we should speed up our query, since the `InstanceCatalog` will not waste time calculating disk and bulge magnitudes we don't want."
525 | ]
526 | },
527 | {
528 | "cell_type": "code",
529 | "execution_count": null,
530 | "metadata": {
531 | "collapsed": false
532 | },
533 | "outputs": [],
534 | "source": [
535 | "#re-import these packages in case we need to restart the kernel\n",
536 | "import numpy\n",
537 | "from lsst.sims.utils import ObservationMetaData\n",
538 | "from lsst.sims.catUtils.baseCatalogModels import GalaxyTileObj\n",
539 | "from lsst.sims.catalogs.measures.instance import InstanceCatalog\n",
540 | "\n",
541 | "from lsst.sims.catUtils.mixins import PhotometryGalaxies, VariabilityGalaxies\n",
542 | "\n",
543 | "class variableAgnCatalogCheat(InstanceCatalog, PhotometryGalaxies, VariabilityGalaxies):\n",
544 | " \n",
545 | " cannot_be_null = ['uAgn'] #again, we only want AGN\n",
546 | " \n",
547 | " #note that we are using [u,g,r,i,z,y]Agn as the baseline magnitudes\n",
548 | " #rather than lsst_[u,g,r,i,z,y]. The VariabilityGalaxies mixin operates\n",
549 | " #by calculating the baseline magnitude from the object's SED, rather\n",
550 | " #than reading in the value from the database. These should give the\n",
551 | " #same answer, but they do not have to (if, for example, we changed \n",
552 | " #reddening models between now and when the database was created).\n",
553 | " #[u,g,r,i,z,y]Agn are provided by the PhotometryGalaxies mixin defined\n",
554 | " #in sims_photUtils/../Photometry.py, which is why we have to inherit from\n",
555 | " #that class as well\n",
556 | " column_outputs = ['galid', 'raJ2000', 'decJ2000',\n",
557 | " 'uAgn', 'rAgn', 'zAgn',\n",
558 | " 'delta_uAgn', 'delta_rAgn', 'delta_zAgn']\n",
559 | " \n",
560 | " transformations = {'raJ2000':numpy.degrees, 'decJ2000':numpy.degrees}\n",
561 | " \n",
562 | " def get_sedFilenameBulge(self):\n",
563 | " ra = self.column_by_name('raJ2000') #to figure out how many objects are in the catalog\n",
564 | " nameList = []\n",
565 | " for rr in ra:\n",
566 | " nameList.append('None')\n",
567 | " return numpy.array(nameList)\n",
568 | " \n",
569 | " def get_sedFilenameDisk(self):\n",
570 | " return self.column_by_name('sedFilenameBulge')"
571 | ]
572 | },
573 | {
574 | "cell_type": "code",
575 | "execution_count": null,
576 | "metadata": {
577 | "collapsed": false
578 | },
579 | "outputs": [],
580 | "source": [
581 | "galaxyDB = GalaxyTileObj()\n",
582 | "\n",
583 | "variableObsMetadata = ObservationMetaData(pointingRA=25.0, pointingDec=30.0,\n",
584 | " boundType='circle', boundLength=0.05,\n",
585 | " mjd=57086)\n",
586 | "\n",
587 | "variableAgn = variableAgnCatalogCheat(galaxyDB, obs_metadata=variableObsMetadata)\n",
588 | "variableAgn.write_catalog('variable_agn_cheat.txt')"
589 | ]
590 | },
591 | {
592 | "cell_type": "markdown",
593 | "metadata": {},
594 | "source": [
595 | "So far, we have been using ObservationMetaData instantiations created out of thin air. It is, however, possible to create ObservationMetaData based on actual OpSim pointings. This is done using the ObservationMetaDataGenerator.\n",
596 | "\n",
597 | "The ObservationMetaDataGenerator is a class defined in\n",
598 | "\n",
599 | " sims_catUtils/python/lsst/sims/catUtils/utils/ObservationMetaDataGenerator.py\n",
600 | "\n",
601 | "You give it the name of an OpSim output database, and it allows you to query that database for pointings that fit criteria you provide. It then turns those pointings into ObservationMetaData instantiations."
602 | ]
603 | },
604 | {
605 | "cell_type": "code",
606 | "execution_count": null,
607 | "metadata": {
608 | "collapsed": false
609 | },
610 | "outputs": [],
611 | "source": [
612 | "from lsst.sims.catUtils.utils import ObservationMetaDataGenerator\n",
613 | "\n",
614 | "help(ObservationMetaDataGenerator.__init__)"
615 | ]
616 | },
617 | {
618 | "cell_type": "code",
619 | "execution_count": null,
620 | "metadata": {
621 | "collapsed": false
622 | },
623 | "outputs": [],
624 | "source": [
625 | "help(ObservationMetaDataGenerator.getObservationMetaData)"
626 | ]
627 | },
628 | {
629 | "cell_type": "code",
630 | "execution_count": null,
631 | "metadata": {
632 | "collapsed": false
633 | },
634 | "outputs": [],
635 | "source": [
636 | "import eups\n",
637 | "import os\n",
638 | "opsimdb = os.path.join(eups.productDir('sims_data'),'OpSimData','opsimblitz1_1133_sqlite.db')\n",
639 | "\n",
640 | "gen = ObservationMetaDataGenerator(database=opsimdb, driver='sqlite')"
641 | ]
642 | },
643 | {
644 | "cell_type": "markdown",
645 | "metadata": {},
646 | "source": [
647 | "For example, to get a list of ObservationMetaData instantiations pointing at 20 < RA < 30, -65 < Dec < -55, and 1.4 < airmass < 2.1"
648 | ]
649 | },
650 | {
651 | "cell_type": "code",
652 | "execution_count": null,
653 | "metadata": {
654 | "collapsed": false
655 | },
656 | "outputs": [],
657 | "source": [
658 | "obsMDresults = gen.getObservationMetaData(boundType='circle', boundLength=0.05,\n",
659 | " fieldRA=(20.0,30.0), fieldDec=(-65.0, -55.0), airmass = (1.4, 2.1))"
660 | ]
661 | },
662 | {
663 | "cell_type": "code",
664 | "execution_count": null,
665 | "metadata": {
666 | "collapsed": false
667 | },
668 | "outputs": [],
669 | "source": [
670 | "for o in obsMDresults:\n",
671 | " print o.pointingRA, o.pointingDec, \\\n",
672 | " o.phoSimMetaData['airmass'][0], o.mjd.TAI"
673 | ]
674 | },
675 | {
676 | "cell_type": "markdown",
677 | "metadata": {},
678 | "source": [
679 | "These ObservationMetaData can be fed into InstanceCatalogs"
680 | ]
681 | },
682 | {
683 | "cell_type": "markdown",
684 | "metadata": {},
685 | "source": [
686 | " 24 August, 2015: There is a known bug with the cell below; it will be fixed in the next sims release "
687 | ]
688 | },
689 | {
690 | "cell_type": "code",
691 | "execution_count": null,
692 | "metadata": {
693 | "collapsed": false
694 | },
695 | "outputs": [],
696 | "source": [
697 | "variableAgn = variableAgnCatalogCheat(galaxyDB, obs_metadata=obsMDresults[0])\n",
698 | "variableAgn.write_catalog('variable_agn_real_obs_metadata.txt')"
699 | ]
700 | },
701 | {
702 | "cell_type": "code",
703 | "execution_count": null,
704 | "metadata": {
705 | "collapsed": true
706 | },
707 | "outputs": [],
708 | "source": []
709 | }
710 | ],
711 | "metadata": {
712 | "kernelspec": {
713 | "display_name": "Python 2",
714 | "language": "python",
715 | "name": "python2"
716 | },
717 | "language_info": {
718 | "codemirror_mode": {
719 | "name": "ipython",
720 | "version": 2
721 | },
722 | "file_extension": ".py",
723 | "mimetype": "text/x-python",
724 | "name": "python",
725 | "nbconvert_exporter": "python",
726 | "pygments_lexer": "ipython2",
727 | "version": "2.7.11"
728 | }
729 | },
730 | "nbformat": 4,
731 | "nbformat_minor": 0
732 | }
733 |
--------------------------------------------------------------------------------
/DM/.gitignore:
--------------------------------------------------------------------------------
1 | #just an empty file so this directory shows up
2 |
--------------------------------------------------------------------------------
/DM/PHOSIM_DM_startup.md:
--------------------------------------------------------------------------------
1 | # PHOSIM quick start
2 | [Here](https://confluence.lsstcorp.org/display/PHOSIM/Using+PhoSim) is the most comprehensive phosim documentation I know of.
3 |
4 | ## Get phosim
5 | This is hosted on bitbucket: [here are tarballs](https://bitbucket.org/phosim/phosim_release/downloads). You can, of course, clone it and checkout a tag. This is what I typically do.
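   | 
   | For example (assuming the clone URL matches the downloads page above, and `<some_tag>` is whatever release you want):
   | ```
   | $> git clone https://bitbucket.org/phosim/phosim_release.git
   | $> cd phosim_release
   | $> git checkout <some_tag>
   | ```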
6 |
7 | ## Build phosim
8 | I usually build against a DM stack. The stack provides cfitsio and fftw (and Eigen, although this doesn't seem to actually be needed).
9 |
10 | * Setup your stack (I'm assuming you also have sims installed)
11 | * Find the relevant paths: e.g. `$> echo $CFITSIO_DIR`
12 | * Go into your phosim distribution.
13 | * There is a configure script in phosim
14 | * `$> ./configure`
15 | * This will ask for one of three options. I use option 'c'. That's the one that lets you specify the locations to the libraries.
16 | * Answer the questions. If it's asking for a library, it will be /path/to/library/lib and if it's asking about headers, it will be /path/to/library/include.
17 | * You should then be able to say `$> make`
18 |
19 | ## Use phosim
20 | You can use phosim right out of the box. E.g. `./phosim examples/star`. This is because phosim comes with a few example SEDs.
21 |
22 | In order to really use phosim with CatSim, you'll want to have the SEDs CatSim refers to. SEDs from CatSim come in the sims_sed_library package.
23 | * Go to the phosim data directory: `$> cd data/SEDs`
24 | * Create links to the distributed SED library. In Bash, you can do something like: ```$> for i in `ls -d $SIMS_SED_LIBRARY_DIR/*SED`; do ln -s $i; done```
25 |
26 | Now you are in a position to use the CatSim package to make catalogs for you to run through phosim.
27 |
28 | ## Survival guide
29 | ### By default *everything* is turned on.
30 | This means brighter/fatter, tree rings, annealing errors, clouds, variable water vapor, etc., so you will almost always want a 'physics command override' (PCO) file. This file can do a lot of things and is the main interface to the runtime configuration. There are even some I/O debugging options you can turn on with this file. Following is a minimal example PCO file that does nothing more than turn on the option to output a debugging file reporting the total number of photons simulated for each object and each object's centroid (this is per chip).
31 | ```
32 | centroidfile 1
33 | ```
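   | 
   | To use a PCO file, pass it to phosim with the `-c` flag (the file names here are just placeholders):
   | ```
   | $> ./phosim my_instance_catalog.txt -c my_commands.txt
   | ```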
34 |
35 | ### Simulating the background is very slow.
36 | Because of the way phosim is architected, some per-visit parameters are not set in the instance file. This means that by default, phosim will choose a dark sky background for you. Because it just chooses from a Gaussian distribution with sigma=0.9, the sky brightness can be surprisingly large. You can set this yourself by adding the `zenith_v` parameter, which is the V-band sky brightness at zenith in mag/arcsec**2; it can also be used to effectively turn off the background. A file with a reasonable background would look like the following:
37 | ```
38 | centroidfile 1
39 | zenith_v 22.8
40 | ```
41 |
42 | ### Clouds are everywhere
43 | Cloud opacity (not reflection) is turned on by default. Like the sky brightness, this is not something you can set from the instance catalog. You can modify each of the cloud layers independently, but more frequently, I turn them off completely. Phosim has some convenience commands that help turn off everything associated with an effect.
44 | ```
45 | centroidfile 1
46 | zenith_v 22.8
47 | clearclouds
48 | ```
49 |
50 | ### Check your spelling
51 | Phosim generally ignores unknown commands. This is true in both the instance and PCO files. The result is that if you misspell a command, it will be silently ignored, and you will only notice after the simulation has run through.
52 |
53 | ### Scaling up
54 | The `-s` command line argument allows you to specify the chip to simulate, in the format `R??_S??` (see the sketch at the end of this section). This is useful for splitting jobs among cores and for doing test runs. I also find it useful when my input catalog has incomplete coverage on some chips.
55 |
56 | Phosim uses lots of intermediate files. These can collide if you are running multiple instances on the same physical system. There is a `-w` argument that should allow multiple jobs to run in parallel on the same installation, but I have not played with it much. Instead, I copy the installation and run one instance per copy.
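57 |
58 | As a concrete sketch, a single-chip run with a physics command override file might look like the line below. The catalog name, PCO file name, and work/output directories are placeholders; `-s` and `-w` are described above, and `-c`/`-o` are the physics-command-file and output-directory arguments as I understand them from the phosim documentation linked at the top.
59 | ```
60 | ./phosim my_instance_catalog.txt -c my_pco.cmd -s R22_S11 -w /scratch/phosim_work -o /scratch/phosim_out
61 | ```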
57 |
58 | ### Eimages
59 | The so-called e-images are output by phosim by default. They are a really nice starting place if you don't want to go through the hassle of simulating calibration products (darks, flats). They are essentially trivially ISR'd images: they have all the astrophysical, atmospheric, and optical effects, and some electronic effects (brighter/fatter and tree rings), but not all of them; e.g. no flat-fielding should be necessary (I don't think).
60 |
--------------------------------------------------------------------------------
/MAF/.gitignore:
--------------------------------------------------------------------------------
1 | #just an empty file so this directory shows up
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # LSST-Tutorials
2 | Tutorials presented at the Monday group meetings of UW LSST
3 |
--------------------------------------------------------------------------------
/SE/Calculating SNR.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:09e88e48663397a7231c760defa6a9f7dca7507c3d9fec94d53b10a1a46b0f14"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "This ipython notebook demonstrates how to calculate photometric (optimal psf-weighted) SNR for sources observed with LSST. \n",
16 | "It uses the LSST [throughputs](https://github.com/lsst/throughputs) curves, together with the LSST [sims_photUtils](https://stash.lsstcorp.org/projects/SIM/repos/sims_photutils/browse) package, to calculate the SNR for any spectrum. \n",
17 | "\n",
18 | "With an eye toward turning this into a more generalized SNR calculator, an overview of the process (starting from the spectrum) is:\n",
19 | "* Set the desired magnitude (in a standardized throughput curve) for the source.\n",
20 | "* Generate the throughput curves in all filters, for the airmass desired.\n",
21 | "* Find the sky SED (for now we just use dark, zenith sky).\n",
22 | "* Calculate the SNR. \n",
23 | "\n",
24 | "---\n",
25 | "\n",
26 | "This notebook assumes that the `throughputs` and `sims_photUtils` packages have been installed and setup. We can then import the necessary packages:"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "collapsed": false,
32 | "input": [
33 | "import os\n",
34 | "%matplotlib inline\n",
35 | "import matplotlib.pyplot as plt\n",
36 | "from lsst.sims.photUtils import Bandpass\n",
37 | "from lsst.sims.photUtils import Sed"
38 | ],
39 | "language": "python",
40 | "metadata": {},
41 | "outputs": []
42 | },
43 | {
44 | "cell_type": "markdown",
45 | "metadata": {},
46 | "source": [
47 | "---\n",
48 | "Read in the spectra. \n",
49 | "\n",
50 | "I put together a few SEDs that cover a fairly wide range: elliptical and spiral galaxies, blue and red stars, and a quasar. We'll read them all into `Sed` objects, and set the quasar to have a redshift of 3.5. These SEDS are available as a [tar ball](http://www.astro.washington.edu/users/lynnej/sample_seds.tar.gz). \n",
51 | "Wherever you place this directory, set an environment variable 'SAMPLE_SEDS_DIR' pointing to that location (i.e. `setenv SAMPLE_SEDS_DIR /Users/lynnej/seds/sample_seds`)."
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "collapsed": false,
57 | "input": [
58 | "sedDir = os.getenv('SAMPLE_SEDS_DIR')\n",
59 | "filenames = !ls $sedDir\n",
60 | "print filenames"
61 | ],
62 | "language": "python",
63 | "metadata": {},
64 | "outputs": []
65 | },
66 | {
67 | "cell_type": "code",
68 | "collapsed": false,
69 | "input": [
70 | "seds = {}\n",
71 | "z=3.5\n",
72 | "for s in filenames:\n",
73 | " seds[s] = Sed()\n",
74 | " seds[s].readSED_flambda(os.path.join(sedDir, s))\n",
75 | "if 'quasar.dat' in seds:\n",
76 | " seds['quasar.dat'].redshiftSED(z)"
77 | ],
78 | "language": "python",
79 | "metadata": {},
80 | "outputs": []
81 | },
82 | {
83 | "cell_type": "markdown",
84 | "metadata": {},
85 | "source": [
86 | "---\n",
87 | "Read the standard throughput curves.\n",
88 | "\n",
89 | "First, get the names of the directories containing the baseline throughput curves and a set of standard atmosphere curves at a range of airmass (also from the throughputs package). Maybe take a peek at the contents of these directories. "
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "collapsed": false,
95 | "input": [
96 | "throughputsDir = os.getenv('LSST_THROUGHPUTS_BASELINE')\n",
97 | "atmosDir = os.path.join(os.getenv('THROUGHPUTS_DIR'), 'atmos')\n",
98 | "#!ls -l $throughputsDir\n",
99 | "#!ls -l $atmosDir\n",
100 | "filterlist = ('u', 'g', 'r', 'i', 'z', 'y')\n",
101 | "filtercolors = {'u':'b', 'g':'c', 'r':'g', 'i':'y', 'z':'r', 'y':'m'}"
102 | ],
103 | "language": "python",
104 | "metadata": {},
105 | "outputs": []
106 | },
107 | {
108 | "cell_type": "markdown",
109 | "metadata": {},
110 | "source": [
111 | "Read in the set of throughput curves we'll use as our \"standard\"."
112 | ]
113 | },
114 | {
115 | "cell_type": "code",
116 | "collapsed": false,
117 | "input": [
118 | "lsst_std = {}\n",
119 | "for f in filterlist:\n",
120 | " lsst_std[f] = Bandpass()\n",
121 | " lsst_std[f].readThroughput(os.path.join(throughputsDir, 'total_'+f+'.dat'))"
122 | ],
123 | "language": "python",
124 | "metadata": {},
125 | "outputs": []
126 | },
127 | {
128 | "cell_type": "markdown",
129 | "metadata": {},
130 | "source": [
131 | "---\n",
132 | "Set the desired magnitude.\n",
133 | "\n",
134 | "Use these standard throughput curves to set the desired magnitude for each SED in the desired reference bandpass. This is the step where the user decides the magnitude of the source they want to use to calculate SNR."
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "collapsed": false,
140 | "input": [
141 | "stdFilter = 'r'\n",
142 | "stdMag = 24.0\n",
143 | "for s in seds:\n",
144 | " fluxNorm = seds[s].calcFluxNorm(stdMag, lsst_std[stdFilter])\n",
145 | " seds[s].multiplyFluxNorm(fluxNorm)"
146 | ],
147 | "language": "python",
148 | "metadata": {},
149 | "outputs": []
150 | },
151 | {
152 | "cell_type": "markdown",
153 | "metadata": {},
154 | "source": [
155 | "Pretty plot #1: plot the SEDs."
156 | ]
157 | },
158 | {
159 | "cell_type": "code",
160 | "collapsed": false,
161 | "input": [
162 | "for s in seds:\n",
163 | " plt.plot(seds[s].wavelen, seds[s].flambda*seds[s].wavelen, label='%s' %(s))\n",
164 | "plt.xlabel('Wavelength (nm)')\n",
165 | "plt.ylabel('$\\lambda F_\\lambda$')\n",
166 | "plt.xlim(300, 1100)\n",
167 | "plt.legend(loc=(0.98, 0.2), fontsize='smaller', numpoints=1, fancybox=True)"
168 | ],
169 | "language": "python",
170 | "metadata": {},
171 | "outputs": []
172 | },
173 | {
174 | "cell_type": "markdown",
175 | "metadata": {},
176 | "source": [
177 | "---\n",
178 | "Read in the (non-standard) throughput components\n",
179 | "\n",
180 | "Read in the various components that we will need, in order to generate the non-standard throughput curves at arbitrary airmass. \n",
181 | "First, the system hardware throughput components, as we will need the hardware alone to calculate SNR anyway. (see [note (1)](#hardware_sky) below for more information). "
182 | ]
183 | },
184 | {
185 | "cell_type": "code",
186 | "collapsed": false,
187 | "input": [
188 | "lsst_system = {}\n",
189 | "for f in filterlist:\n",
190 | " lsst_system[f] = Bandpass()\n",
191 | " lsst_system[f].readThroughputList(['detector.dat', 'lens1.dat', 'lens2.dat', 'lens3.dat', \n",
192 | " 'm1.dat', 'm2.dat', 'm3.dat', 'filter_'+f+'.dat'], \n",
193 | " rootDir=throughputsDir)"
194 | ],
195 | "language": "python",
196 | "metadata": {},
197 | "outputs": []
198 | },
199 | {
200 | "cell_type": "markdown",
201 | "metadata": {},
202 | "source": [
203 | "Then read in the atmosphere, so we can multiply it into the system hardware throughputs. This lets us generate the total throughput curves, using a wider variety of atmosphere curves at varying airmass. (The standard total throughput curves use X=1.2, for now we'll use X=1.0 here)."
204 | ]
205 | },
206 | {
207 | "cell_type": "code",
208 | "collapsed": false,
209 | "input": [
210 | "atmosphere = Bandpass()\n",
211 | "X = 1.0\n",
212 | "atmosphere.readThroughput(os.path.join(atmosDir, 'atmos_%d.dat' %(X*10)))"
213 | ],
214 | "language": "python",
215 | "metadata": {},
216 | "outputs": []
217 | },
218 | {
219 | "cell_type": "markdown",
220 | "metadata": {},
221 | "source": [
222 | "Calculate the total (non-standard) throughput curves, including the atmosphere. "
223 | ]
224 | },
225 | {
226 | "cell_type": "code",
227 | "collapsed": false,
228 | "input": [
229 | "lsst_total = {}\n",
230 | "for f in filterlist:\n",
231 | " wavelen, sb = lsst_system[f].multiplyThroughputs(atmosphere.wavelen, atmosphere.sb)\n",
232 | " lsst_total[f] = Bandpass(wavelen=wavelen, sb=sb)"
233 | ],
234 | "language": "python",
235 | "metadata": {},
236 | "outputs": []
237 | },
238 | {
239 | "cell_type": "markdown",
240 | "metadata": {},
241 | "source": [
242 | "Pretty plot #2, plot the throughput curves + atmosphere. "
243 | ]
244 | },
245 | {
246 | "cell_type": "code",
247 | "collapsed": false,
248 | "input": [
249 | "for f in filterlist:\n",
250 | " plt.plot(lsst_total[f].wavelen, lsst_total[f].sb, linestyle='-', color=filtercolors[f], label='%s' %(f))\n",
251 | " plt.plot(lsst_system[f].wavelen, lsst_system[f].sb, linestyle=':', color=filtercolors[f])\n",
252 | "plt.plot(atmosphere.wavelen, atmosphere.sb, 'k:', label='X =%.1f atmos' %(X))\n",
253 | "plt.legend(loc=(0.85, 0.5), fontsize='smaller', fancybox=True, numpoints=1)\n",
254 | "plt.xlabel('Wavelength (nm)')\n",
255 | "plt.ylabel('Sb (0-1)')\n",
256 | "plt.title('System throughput')"
257 | ],
258 | "language": "python",
259 | "metadata": {},
260 | "outputs": []
261 | },
262 | {
263 | "cell_type": "markdown",
264 | "metadata": {},
265 | "source": [
266 | "---\n",
267 | "Calculate SNR"
268 | ]
269 | },
270 | {
271 | "cell_type": "markdown",
272 | "metadata": {},
273 | "source": [
274 | "The SNR of a source depends on the sky background, as well as properties of the telescope and camera. We can use the default values for LSST telescope and camera properties automatically. At the moment, the only sky spectrum we have easily available is the dark sky zenith spectrum, from the [$LSST_THROUGHPUTS_BASELINE/darksky.dat](https://github.com/lsst/throughputs/blob/master/baseline/darksky.dat) file. This spectra is calibrated to have the appropriate magnitudes in the LSST bandpasses under dark sky, zenith conditions."
275 | ]
276 | },
277 | {
278 | "cell_type": "code",
279 | "collapsed": false,
280 | "input": [
281 | "darksky = Sed()\n",
282 | "darksky.readSED_flambda(os.path.join(throughputsDir, 'darksky.dat'))"
283 | ],
284 | "language": "python",
285 | "metadata": {},
286 | "outputs": []
287 | },
288 | {
289 | "cell_type": "markdown",
290 | "metadata": {},
291 | "source": [
292 | "Calculate the SNR, assuming optimal measurement over the PSF. This means we can choose the PSF as well.
\n",
293 | "`Sed` method `calcSNR_psf` accepts an instantiation of the `PhotometricParameters` class, which carries values for readnoise, platescale, gain, number of exposures, etc, set to defaults appropriate for most LSST purposes (and assuming 30 second visits, consisting of two 15 second exposures). "
294 | ]
295 | },
296 | {
297 | "cell_type": "code",
298 | "collapsed": false,
299 | "input": [
300 | "from lsst.sims.photUtils import PhotometricParameters\n",
301 | "photParams = PhotometricParameters()\n",
302 | "seeing = 0.7\n",
303 | "snr = {}\n",
304 | "for s in seds:\n",
305 | " snr[s] = {}\n",
306 | " for f in filterlist:\n",
307 | " snr[s][f] = seds[s].calcSNR_psf(lsst_total[f], darksky, lsst_system[f], photParams, seeing=seeing, verbose=False)"
308 | ],
309 | "language": "python",
310 | "metadata": {},
311 | "outputs": []
312 | },
313 | {
314 | "cell_type": "markdown",
315 | "metadata": {},
316 | "source": [
317 | "Some kind of ugly code to make sort of pretty printout of the SNR results."
318 | ]
319 | },
320 | {
321 | "cell_type": "code",
322 | "collapsed": false,
323 | "input": [
324 | "def _printSNR(snr):\n",
325 | " writestring = 'SED'\n",
326 | " for i in range(4):\n",
327 | " writestring += '\\t'\n",
328 | " for f in filterlist:\n",
329 | " writestring += ' %s ' %f\n",
330 | " print writestring\n",
331 | " for s in seds:\n",
332 | " writestring = '%s\\t' %(s)\n",
333 | " if len(writestring) < 20:\n",
334 | " writestring += '\\t'\n",
335 | " if len(writestring) < 20:\n",
336 | " writestring += '\\t'\n",
337 | " for f in filterlist:\n",
338 | " writestring += ' %.2f ' %(snr[s][f])\n",
339 | " print writestring"
340 | ],
341 | "language": "python",
342 | "metadata": {},
343 | "outputs": []
344 | },
345 | {
346 | "cell_type": "code",
347 | "collapsed": false,
348 | "input": [
349 | "_printSNR(snr)"
350 | ],
351 | "language": "python",
352 | "metadata": {},
353 | "outputs": []
354 | },
355 | {
356 | "cell_type": "markdown",
357 | "metadata": {},
358 | "source": [
359 | "Note that the SNR in the r band is very similar for all of these objects. This makes sense, as we normalized them all to have similar r band standard magnitudes, and the X=1.0 atmosphere is not hugely different from the X=1.2 standard throughput curves. However, their SNR in other bandpasses can be quite different depending on the SED of the source. "
360 | ]
361 | },
362 | {
363 | "cell_type": "markdown",
364 | "metadata": {},
365 | "source": [
366 | "---\n",
367 | "And it's a bit of a hack, but we could make the same calculation for different sky background values, by scaling the dark sky SED to have different expected magnitudes (i.e. adjust the darksky SED to create different sky brightness). With the upcoming sky brightness code Peter is working on, we could just substitute that in here instead.\n",
368 | "\n",
369 | "(1) Notice that I use lsst_system throughputs here, not lsst_total. This is because the sky magnitudes must be calculated using the *system* only, not including the atmosphere. The sky brightness is generated at various points throughout the atmosphere, so should not be propagated through the entire atmosphere when calculating the transmission to the focal plane (unlike sources above the atmosphere). Without doing proper radiative transfer to determine the atmosphere and sky brightness to use, the next best thing is to just use the system minus the atmosphere, and atmosphere-corrected/telescope pupil skybrightness measurements (which are what are usually reported). This is also why we needed to pass the system only throughput curves to calcSNR_psf. "
370 | ]
371 | },
372 | {
373 | "cell_type": "code",
374 | "collapsed": false,
375 | "input": [
376 | "for f in filterlist:\n",
377 | " print f, darksky.calcMag(lsst_system[f])"
378 | ],
379 | "language": "python",
380 | "metadata": {},
381 | "outputs": []
382 | },
383 | {
384 | "cell_type": "code",
385 | "collapsed": false,
386 | "input": [
387 | "newSkyMag = 20.5\n",
388 | "newSkyFilter = 'r'\n",
389 | "fluxNorm = darksky.calcFluxNorm(newSkyMag, lsst_system[newSkyFilter])\n",
390 | "darksky.multiplyFluxNorm(fluxNorm)\n",
391 | "for f in filterlist:\n",
392 | " print f, darksky.calcMag(lsst_system[f])"
393 | ],
394 | "language": "python",
395 | "metadata": {},
396 | "outputs": []
397 | },
398 | {
399 | "cell_type": "code",
400 | "collapsed": false,
401 | "input": [
402 | "seeing = 0.7\n",
403 | "snr = {}\n",
404 | "for s in seds:\n",
405 | " snr[s] = {}\n",
406 | " for f in filterlist:\n",
407 | " snr[s][f] = seds[s].calcSNR_psf(lsst_total[f], darksky, lsst_system[f], photParams, seeing=seeing)\n",
408 | "_printSNR(snr)"
409 | ],
410 | "language": "python",
411 | "metadata": {},
412 | "outputs": []
413 | },
414 | {
415 | "cell_type": "code",
416 | "collapsed": false,
417 | "input": [
418 | "# Vary the seeing (with this new sky background)\n",
419 | "seeing = 1.2\n",
420 | "snr = {}\n",
421 | "for s in seds:\n",
422 | " snr[s] = {}\n",
423 | " for f in filterlist:\n",
424 | " snr[s][f] = seds[s].calcSNR_psf(lsst_total[f], darksky, lsst_system[f], photParams, seeing=seeing)\n",
425 | "_printSNR(snr)"
426 | ],
427 | "language": "python",
428 | "metadata": {},
429 | "outputs": []
430 | },
431 | {
432 | "cell_type": "markdown",
433 | "metadata": {},
434 | "source": [
435 | "---\n",
436 | "So let's recap .. "
437 | ]
438 | },
439 | {
440 | "cell_type": "code",
441 | "collapsed": false,
442 | "input": [
443 | "import os\n",
444 | "%matplotlib inline\n",
445 | "import matplotlib.pyplot as plt\n",
446 | "from lsst.sims.photUtils import Bandpass\n",
447 | "from lsst.sims.photUtils import Sed\n",
448 | "\n",
449 | "# Read the SEDS.\n",
450 | "sedDir = os.getenv('SAMPLE_SEDS_DIR')\n",
451 | "filenames = !ls $sedDir\n",
452 | "seds = {}\n",
453 | "z=3.5\n",
454 | "for s in filenames:\n",
455 | " seds[s] = Sed()\n",
456 | " seds[s].readSED_flambda(os.path.join(sedDir, s))\n",
457 | "if 'quasar.dat' in seds:\n",
458 | " seds['quasar.dat'].redshiftSED(z)\n",
459 | "\n",
460 | "# Read the standard throughputs.\n",
461 | "throughputsDir = os.getenv('LSST_THROUGHPUTS_BASELINE')\n",
462 | "filterlist = ('u', 'g', 'r', 'i', 'z', 'y')\n",
463 | "lsst_std = {}\n",
464 | "for f in filterlist:\n",
465 | " lsst_std[f] = Bandpass()\n",
466 | " lsst_std[f].readThroughput(os.path.join(throughputsDir, 'total_'+f+'.dat'))\n",
467 | "\n",
468 | "# Read the base for the non-standard throughputs.\n",
469 | "atmosDir = os.path.join(os.getenv('THROUGHPUTS_DIR'), 'atmos')\n",
470 | "lsst_system = {}\n",
471 | "for f in filterlist:\n",
472 | " lsst_system[f] = Bandpass()\n",
473 | " lsst_system[f].readThroughputList(['detector.dat', 'lens1.dat', 'lens2.dat', 'lens3.dat', \n",
474 | " 'm1.dat', 'm2.dat', 'm3.dat', 'filter_'+f+'.dat'], \n",
475 | " rootDir=throughputsDir)\n",
476 | "atmosphere = Bandpass()\n",
477 | "\n",
478 | "# Read the dark sky SED.\n",
479 | "darksky = Sed()\n",
480 | "darksky.readSED_flambda(os.path.join(throughputsDir, 'darksky.dat'))"
481 | ],
482 | "language": "python",
483 | "metadata": {},
484 | "outputs": []
485 | },
486 | {
487 | "cell_type": "code",
488 | "collapsed": false,
489 | "input": [
490 | "# Set your source magnitude in some standard filter.\n",
491 | "stdFilter = 'i'\n",
492 | "stdMag = 23.5\n",
493 | "for s in seds:\n",
494 | " fluxNorm = seds[s].calcFluxNorm(stdMag, lsst_std[stdFilter])\n",
495 | " seds[s].multiplyFluxNorm(fluxNorm)\n",
496 | "# Set your desired airmass and build the non-standard complete throughputs.\n",
497 | "X = 1.0\n",
498 | "atmosphere.readThroughput(os.path.join(atmosDir, 'atmos_%d.dat' %(X*10)))\n",
499 | "lsst_total = {}\n",
500 | "for f in filterlist:\n",
501 | " wavelen, sb = lsst_system[f].multiplyThroughputs(atmosphere.wavelen, atmosphere.sb)\n",
502 | " lsst_total[f] = Bandpass(wavelen=wavelen, sb=sb)\n",
503 | "# Set the sky magnitude in some filter.\n",
504 | "newSkyMag = 20.5\n",
505 | "newSkyFilter = 'r'\n",
506 | "fluxNorm = darksky.calcFluxNorm(newSkyMag, lsst_system[newSkyFilter])\n",
507 | "newsky = Sed(wavelen=darksky.wavelen, flambda=darksky.flambda)\n",
508 | "newsky.multiplyFluxNorm(fluxNorm)\n",
509 | "# Set the PSF. \n",
510 | "seeing = 1.2\n",
511 | "snr = {}\n",
512 | "for s in seds:\n",
513 | " snr[s] = {}\n",
514 | " for f in filterlist:\n",
515 | " snr[s][f] = seds[s].calcSNR_psf(lsst_total[f], newsky, lsst_system[f], photParams, seeing=seeing)\n",
516 | "_printSNR(snr)"
517 | ],
518 | "language": "python",
519 | "metadata": {},
520 | "outputs": []
521 | },
522 | {
523 | "cell_type": "code",
524 | "collapsed": false,
525 | "input": [],
526 | "language": "python",
527 | "metadata": {},
528 | "outputs": []
529 | }
530 | ],
531 | "metadata": {}
532 | }
533 | ]
534 | }
--------------------------------------------------------------------------------