
We present a method for scheduling observations in small field-of-view transient targeted surveys. The method is based on maximizing the probability of detection of transient events of a given type and age since occurrence; it requires knowledge of the time since the last observation for every observed field, the expected light curve of the event, and the expected rate of events in the fields where the search is performed. In order to test this scheduling strategy we use a modified version of the genetic scheduler developed for the telescope control system RTS2. In particular, we present example schedules designed for a future 50 cm telescope that will expand the capabilities of the CHASE survey, which aims to detect young supernova events in nearby galaxies. We also include a brief description of the telescope and the status of the project, which is expected to enter a commissioning phase in 2010.

With a new generation of observatories dedicated to studying the time domain in astronomy [

These observatories will include not only large field-of-view, large aperture telescopes, which will scan the sky in a relatively orderly fashion, but also networks of small field-of-view, small aperture robotic telescopes that will scan smaller areas of the sky in a less predictable way.

The smaller robotic telescopes are ideal for studying very short-lived transients, such as gamma-ray bursts (GRBs), and for detailed follow-up studies of longer-lived Galactic (e.g., cataclysmic variables, planetary systems) and extragalactic (e.g., supernovae) transient events. Moreover, they constitute a relatively inexpensive tool for obtaining cadences of the order of days in relatively small areas of the sky that are of special interest, for example, nearby galaxies.

Here, we present a scheduling strategy that maximizes the probability of finding specific types of transient phenomena, or the expected number of events, at different times since occurrence. In Section

This discussion will be limited to well-known types of events in targets with known distances. We assume that the light curve of every transient event is composed of a monotonically increasing early component followed by a monotonically decreasing late component. We will show how to compute detection probabilities for individual difference observations, as well as for sequences of observations of predefined targets. With this information, we will discuss how to build observational plans that maximize the detection of events with given characteristics.
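
As a concrete illustration, the two ages that characterize such a light curve — when the event first becomes detectable and when it fades below the detection limit — can be extracted with a simple scan. The following is a minimal Python sketch; the function and parameter names are ours, not from the paper's implementation:

```python
def detectability_window(lightcurve, limit, t_max, step=0.01):
    """Scan a light curve flux(t) that rises monotonically and then
    declines monotonically, and return (t_d, t_f): the age at which the
    event first rises above the detection limit and the age at which it
    fades below it again. Returns (None, None) if the event never
    becomes detectable within [0, t_max]."""
    t_d, t_f = None, None
    above = False
    t = 0.0
    while t <= t_max:
        flux = lightcurve(t)
        if not above and flux >= limit:
            t_d, above = t, True        # event becomes detectable
        elif above and flux < limit:
            t_f = t                     # event fades below the limit
            break
        t += step
    if above and t_f is None:
        t_f = t_max                     # still detectable at the end of the scan
    return t_d, t_f
```

These two ages play the roles of the detectability onset and visibility end times used in the probability calculations below.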

The probability of having exactly

Let us assume that the events remain detectable for a time

If each event remains visible for

Let us also assume that the event was not seen in the first observation, performed at time

Let us now assume that the event can only be detected a time

The event will be detectable younger than

Thus, the probability of no events occurring in this time interval and no detections being made,

With this information, the probability of detecting one or more events in the second observation will be simply
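
Under the Poisson assumptions above, this probability can be sketched in a few lines of Python. This is an illustrative implementation with our own names; `t_detect` and `t_visible` are the ages at which an event becomes detectable and fades below the detection limit, respectively:

```python
import math

def prob_detection(rate, cadence, t_detect, t_visible):
    """Probability of detecting at least one new event in the next
    observation of a single target (illustrative sketch).

    rate      : expected events per unit time in the target field
    cadence   : time elapsed since the previous (negative) observation
    t_detect  : age at which an event becomes detectable
    t_visible : age at which an event fades below the detection limit

    Events are assumed Poissonian. An event is detectable now if its
    age lies in [t_detect, t_visible]; events old enough to have been
    detectable at the previous observation are excluded, so the
    effective window is min(cadence, t_visible - t_detect)."""
    window = min(cadence, t_visible - t_detect)
    window = max(window, 0.0)
    p_none = math.exp(-rate * window)   # Poisson probability of zero detectable events
    return 1.0 - p_none
```

Note that increasing the cadence beyond `t_visible - t_detect` no longer increases the detection probability, which anticipates the discussion of optimal cadences below.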

The different variables used in this calculation.

Example of how the detection probabilities are calculated.

Using the formula above, we could try maximizing the probability of detection. For a fixed target, this can only be done by decreasing the cadence,

It is easy to see that, if

If larger cadences were chosen, the probability of detecting events younger than

Choosing

In general,

However, it is not always easy to repeat the observations with a fixed cadence. Bad weather, the change of position of the targets throughout the year, or the appearance of other objects of interest, among many reasons, may cause the cadence between observations to vary.

An alternative strategy is to let the cadence adapt individually in a sequence of observations in order to maximize the detection probabilities.

Now, we compute the probability of not detecting any new events in a sequence of observations,

We note that for no events to be detected, each individual observation must result in negative detections, that is, we have
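
If the individual observations are treated as independent — an approximation, since consecutive detectability windows may overlap — the product structure of the non-detection probability is straightforward to sketch:

```python
def prob_no_detection(p_list):
    """Probability that a sequence of observations yields no detection:
    the product of the individual non-detection probabilities 1 - p_i,
    assuming the observations are independent (an approximation)."""
    p_none = 1.0
    for p in p_list:
        p_none *= 1.0 - p
    return p_none

def prob_any_detection(p_list):
    """Probability of at least one detection over the whole sequence."""
    return 1.0 - prob_no_detection(p_list)
```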

It is possible that the number of targets available for detecting new events with a given age is too small, that is, assuming a fixed exposure time and cadence for all observations, that the number of visible targets where

For very short-lived transient surveys this is not a problem, even with relatively small cadences

In relatively long-lived transient surveys, that is, time-scales of days or longer, we would not want to repeat targets in a given night. This is because when

In general, the number of targets for a given cadence should be of the order of the fraction of time that we want to spend in that sample per night,

Hence, a possible strategy would be to order targets by the time that it takes for the events of interest to be detectable,

For a desired detection age (

| Age | Reference cadence | Sample size | Approx. detection rate |
|-----|-------------------|-------------|------------------------|

As discussed above, one can let the cadence vary from observation to observation and from object to object. For an ideal schedule, we would like to select the optimal combination of cadences, one that can adapt to unexpected changes in the observational plan. For this, we use the probability of detection, or the expected number of events of a sequence of observations, as the objective function.

We have used the genetic algorithm implemented in RTS2 [

The distance between targets is also taken into account indirectly. If it is too big, the number of visited targets per night or the number of terms in (

In these calculations, the time between targets is computed as the maximum of the slew time and the readout time, which effectively defines a disk around each target within which the time penalty is constant. Reaching the edge of this disk takes exactly the readout time, assuming that the CCD can read out while the telescope slews along the most efficient trajectory. RTS2 regularly accomplishes this, since it reads out the CCD and moves to the new position simultaneously.

For instance, a readout time of about 2 sec and a slew speed of 5 deg sec
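
With the numbers quoted above, the per-target time penalty can be modeled as follows. This is a sketch; the default parameter values are only the examples from the text:

```python
def time_between_targets(angular_distance_deg,
                         slew_speed_deg_per_sec=5.0,
                         readout_time_sec=2.0):
    """Time cost of moving to the next target (illustrative sketch).
    Slewing and CCD readout happen in parallel, so the cost is the
    maximum of the two, not their sum. Within a disk of radius
    slew_speed * readout_time around the current target the penalty
    is constant and equal to the readout time."""
    slew_time = angular_distance_deg / slew_speed_deg_per_sec
    return max(slew_time, readout_time_sec)
```

With these example values, any target within 10 degrees costs only the 2 sec readout time, while a target 20 degrees away costs 4 sec of slewing.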

The details of the genetic optimizer, based on the NSGAII algorithm [

The Pareto front is the locus of solutions of a multiobjective optimization problem where no objective can be improved without degrading another. For example, in an optimization problem with two objective functions, for every value of one objective there is an optimal value of the other; the Pareto front can therefore consist of infinitely many solutions.
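
For illustration, extracting the Pareto front from a finite population of candidate plans amounts to filtering out dominated solutions. This is a minimal sketch of the dominance test, not the NSGA-II bookkeeping used inside RTS2:

```python
def pareto_front(solutions):
    """Return the non-dominated subset of a list of candidate solutions.
    Each solution is a tuple of objective values to be maximized (e.g.
    probability of pre-maximum detection, probability of detection
    within three days of explosion). Solution b dominates a if b is at
    least as good in every objective and strictly better in at least one."""
    front = []
    for a in solutions:
        dominated = any(
            all(bi >= ai for ai, bi in zip(a, b)) and
            any(bi > ai for ai, bi in zip(a, b))
            for b in solutions if b is not a
        )
        if not dominated:
            front.append(a)
    return front
```

This brute-force filter is quadratic in the population size; NSGA-II uses a faster non-dominated sorting scheme, but the resulting front is the same.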

In the previous sections we did not include the calculation of the time for an event to become detectable,

Thus, the problem is reduced to computing the flux above which the object can be detected. To do this, we solve the signal to noise equation for an arbitrary value above which we define an object to be detected, for example,

Solving the previous quadratic equation for

We can write a similar equation for the photons coming from the sky in every pixel of the CCD:

Thus, if we compute

Thus, for a given signal to noise ratio (
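
For a standard CCD noise model, the quadratic solution described above takes the following form. This is a sketch assuming the usual source, sky, dark, and read-noise terms; the exact noise model in the paper may differ:

```python
import math

def limiting_counts(snr, sky_per_pix, dark_per_pix, read_noise, npix):
    """Minimum source counts S (electrons) for detection at the given
    signal-to-noise ratio, from the CCD equation
        SNR = S / sqrt(S + npix * (sky + dark + RN^2)),
    solved as a quadratic in S. Taking the positive root gives the
    limiting counts above which an object is considered detected."""
    noise = npix * (sky_per_pix + dark_per_pix + read_noise ** 2)
    return 0.5 * (snr ** 2 + math.sqrt(snr ** 4 + 4.0 * snr ** 2 * noise))
```

Converting the limiting counts to a limiting flux (via the exposure time, telescope collecting area, and system throughput) then gives the age at which an event of known light curve becomes detectable.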

It is important to note that the detection of objects is sometimes performed using individual pixels, in which case we can set

In Figures

Evolution of the probability of detecting new supernovae with the number of generations. In this example, every generation consists of a population of 1,000 observational plans, where each plan contains hundreds of 60 sec exposures of different targets. For the simulation, we have used the gold sample of galaxies of the CHASE survey [

Locus of Pareto-optimal solutions, or Pareto front, using two objective functions: the probability of finding SNe before maximum (abscissa) and the probability of finding SNe no later than three days after the explosion (ordinate). In this example we have evolved 10,000 generations of 1,000 observational plans each, similar to the simulation shown in Figure

Figure

Figure

Interestingly, we have also used the genetic scheduler already implemented in RTS2 to find the schedule that maximizes the average height above the horizon for our list of targets, or that minimizes the typical distance between targets. In both cases, the detection probability of the resulting schedule is smaller by more than a factor of two compared with our method, which suggests that our strategy is significantly better for finding transient objects.

Thus, the implemented scheduling strategy based on maximizing the probability of finding new transient events is able to obtain significantly higher detection probabilities than alternative methods. We were able to build observational plans for every night to maximize the probability of detecting particular events, or similarly, the expected number of detections. These plans were based on predefined samples of targets that have characteristic cadence and exposure times, and that can easily adapt to unforeseen changes in the scheduled observations.

In order to compute the observational plans with the highest detection probabilities, we used the genetic algorithm implemented in the telescope control system RTS2, where a multiobjective algorithm selects the optimal sequence of observations for our purposes.

We expect to be able to extend this work to scheduling of coordinated networks of robotic telescopes looking for specific types of transient events, or looking for many different phenomena if multiobjective optimization is used. We also expect to release the implementation in a future version of RTS2 (

An important question is whether this method can recompute the optimal observational plan when unexpected changes in the sequence of observations occur. With the current implementation of the code there is no simple way to achieve this on a single computer, since finding the optimal observational plan, or the set of Pareto-optimal plans, normally takes many hours on a single PC. However, with faster computers, by precalculating detection probabilities for every target at every time of the night, and given that genetic algorithms are relatively easy to parallelize, we expect this to become feasible in the near future.

Alternatively, one could abandon precomputed observational plans and instead, every time the telescope finishes an integration, compute the detection probability for every available target and choose the one with the highest value, accounting for slew and readout time by subtracting the expected cost of slewing, expressed as detection probability per unit time, over the corresponding slew time.
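
Such a greedy selection could be sketched as follows. All names here are illustrative and do not come from RTS2; `prob_loss_rate` stands for the expected detection probability the survey forfeits per unit of time spent slewing:

```python
def next_target_greedy(targets, now, slew_time, prob_loss_rate):
    """Greedy alternative to a precomputed plan (illustrative sketch).
    After each integration, score every visible target by its detection
    probability if observed after slewing there, minus the detection
    probability per unit time lost while slewing.

    targets        : dict mapping target id -> callable p(t), the
                     detection probability if observed at time t
    slew_time      : callable giving the time needed to reach a target
    prob_loss_rate : detection probability forfeited per unit time
    """
    best_id, best_score = None, float("-inf")
    for tid, prob in targets.items():
        dt = slew_time(tid)
        score = prob(now + dt) - prob_loss_rate * dt
        if score > best_score:
            best_id, best_score = tid, score
    return best_id
```

Unlike the genetic optimizer, this strategy is myopic — it never plans more than one observation ahead — but it reacts instantly to weather interruptions or target-of-opportunity insertions.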

Finally, it should be noted that this method is not exclusive to supernova searches but applies to any transient with a well-characterized light curve and well-understood target fields.

The CHASE survey [

CHASE survey results. (a) The distribution of supernovae with declination in bins of equal solid angle. Red bins correspond to all nearby SNe found in 2008, and green bins to SNe discovered by CHASE in 2008. (b) The distribution of discovery ages of Type Ia SNe in CHASE; about half of the SNe were discovered before maximum. (c) The distribution of discovery ages of Type II SNe in CHASE; the median discovery age was about 5 days after explosion. (G. Pignata, private communication).

In order to expand the capabilities of CHASE and to have a better control over the scheduling of the observations, we are in the process of purchasing and installing a 50 cm robotic telescope that will join the other PROMPT telescopes for the SN survey and follow up.

The telescope will be a fully automated 50 cm system composed of an optical tube, a CCD camera with a set of filters, a mount, a meteorological station, a dome, and computers for control and data analysis. It will be located at CTIO and remotely controlled from Cerro Calán (Santiago, Chile). It will observe hundreds of targets every night, with the aim of doubling the observing capabilities of the CHASE survey and of testing new observing strategies with new associated scientific goals.

The optical tube of the telescope will be a 50 cm aperture Ritchey-Chrétien design with a focal ratio of 12, in an open-truss carbon fiber tube purchased from the Italian company Astrotech. The camera will be a 2k × 2k pixel Finger Lakes ProLine camera with a back-illuminated, UV-enhanced Fairchild 3041 CCD with 95% peak quantum efficiency. The pixel size will be

Approximate transmission curves for the filters implemented in our future 50 cm telescope. "Open" corresponds to no filters and includes the effects of atmospheric extinction, mirror reflectivity, and CCD quantum efficiency. The supernova search will likely be performed in "Open" mode or with a clear filter, but for the follow-up program we will use the filters shown here. It is possible that in the future we will include additional filters.

The camera was chosen to avoid residual images of bright targets, which currently dominate our SN candidate lists with the PROMPT telescopes; to obtain a relatively large field of view, which allows us to image enough reference stars for accurate image alignment and subtraction; and to obtain the best available quantum efficiency, which is a cost-effective way of collecting more photons per target.

The mount will be the Astro-Physics 3600GTO “El Capitán” model, which is a German equatorial mount with sub-arcmin pointing errors, and a slew speed of about 5

The scheduling of the observations will be done with the strategy presented in this work, and we expect to start collaborations with other groups using this scheduler in an integrated fashion. For more information please contact the authors.

The authors acknowledge an anonymous referee whose help and guidance led to significant improvements of the manuscript. F. F. acknowledges partial support from the GEMINI-CONICYT FUND. G. P. acknowledges partial support from the Millennium Center for Supernova Science through Grant P06-045-F funded by "Programa Bicentenario de Ciencia y Tecnología de CONICYT" and "Programa Iniciativa Científica Milenio de MIDEPLAN".