task #15637: Match RA and Dec catalog to X and Y catalog to find WCS

Submitted by:  Mohammad Akhlaghi <makhlaghi>
Submitted on:  Tue 12 May 2020 01:15:22 AM UTC  
 
Should Start On:  Mon 11 May 2020 11:00:00 PM UTC
Should be Finished on:  Mon 11 May 2020 11:00:00 PM UTC
Category:  Match
Priority:  5 - Normal
Item Group:  Enhancement
Status:  In Progress
Privacy:  Public
Percent Complete:  50%
Assigned to:  Sachin Kumar Singh <sks_15>
Open/Closed:  Open
Effort:  0.00


Mon 20 Jul 2020 01:41:46 PM UTC, comment #7: 

While preparing the rough outline mentioned before, we found that HEALPix is only necessary to ensure a homogeneous sampling of quads across the field. That is why we decided to ignore the HEALPix library for now and avoid an extra dependency.

In most scenarios it is indeed not needed and we can simply build a grid ourselves. But one scenario where HEALPix will be necessary just occurred to me: when the desired field includes a celestial pole. In that case, a simple gridding of the RA and Dec ranges will not properly sample the input coordinates.

So without HEALPix, we will only have problems when the field includes a celestial pole. For now (in the development phase) this isn't a problem: we can simply use a Cartesian grid and progress with the main work. Once all the other steps are complete, we can add HEALPix as an optional feature (instead of the simple Cartesian grid) to ensure homogeneity.

In fact this will be a good feature for Gnuastro: if someone doesn't want to do astrometry near the celestial poles, they won't need to install HEALPix as a dependency of Gnuastro :-). But if HEALPix is present, it will be used (once that support is implemented later).

Sachin, to allow easy optional usage of HEALPix later, take the following steps and define two functions very similar to HEALPix's API:

  • One function to define a "grid" structure over the coordinate range (just find the minimum and maximum in each dimension, and store them with the number of grid boxes in each dimension).
  • One function that takes the "grid" structure and the coordinates of a point and returns the index of the grid element that the coordinate falls in. By "index of grid element", I mean that if we count the grid tiles from the minimum to the maximum in both dimensions, this is the index of the tile that the given coordinate falls into.

In this way, we can later easily call either our own simple Cartesian gridding functions or HEALPix (if the user has it).
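
For the record, here is a minimal sketch (in C) of the two functions discussed above. All names (the `grid' structure, `grid_make', `grid_index') are purely illustrative, not an existing Gnuastro API:

#include <stddef.h>

/* Hypothetical Cartesian "grid" over the coordinate range (only for
   illustration, not an existing Gnuastro type). */
struct grid
{
  size_t size1, size2;            /* Number of tiles along each dimension. */
  double min1, min2;              /* Minimum coordinate in each dimension. */
  double step1, step2;            /* Width of one tile in each dimension.  */
};

/* Define the grid: find the minimum and maximum of the input
   coordinates in each dimension and store them along with the
   requested number of tiles in each dimension. */
void
grid_make(struct grid *g, const double *c1, const double *c2,
          size_t num, size_t size1, size_t size2)
{
  size_t i;
  double min1=c1[0], max1=c1[0], min2=c2[0], max2=c2[0];
  for(i=1; i<num; ++i)
    {
      if(c1[i]<min1) min1=c1[i];   if(c1[i]>max1) max1=c1[i];
      if(c2[i]<min2) min2=c2[i];   if(c2[i]>max2) max2=c2[i];
    }
  g->size1=size1;                g->size2=size2;
  g->min1=min1;                  g->min2=min2;
  g->step1=(max1-min1)/size1;    g->step2=(max2-min2)/size2;
}

/* Return the 1D index of the tile that the point (p1,p2) falls in,
   counting tiles from the minimum in both dimensions. */
size_t
grid_index(const struct grid *g, double p1, double p2)
{
  size_t i1 = (size_t)( (p1 - g->min1) / g->step1 );
  size_t i2 = (size_t)( (p2 - g->min2) / g->step2 );
  if(i1 >= g->size1) i1 = g->size1 - 1;   /* Points on the maximum edge. */
  if(i2 >= g->size2) i2 = g->size2 - 1;
  return i2 * g->size1 + i1;
}

A HEALPix backend would only need to provide the same two entry points (one to define the tessellation, one to map a coordinate to a tile index), so the rest of the code wouldn't care which backend is used.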

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Sun 19 Jul 2020 10:20:47 PM UTC, comment #6: 

During our discussion today, we came up with a rough outline of the steps that I am attaching as a simple plain-text file here for the record.

(file #49513)

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Wed 17 Jun 2020 03:56:49 AM UTC, comment #5: 

Wonderful review! Thanks Sachin ;-).

About object detection, indeed, that is correct! Gnuastro is founded on the Unix philosophy; I especially like Doug McIlroy's original four principles. An astrometry program shouldn't have to worry about how to do detection; that is another program's job ;-)!

The input to our program should just be an X-Y-magnitude table and a reference RA-Dec-magnitude table, and its output should be the WCSLIB-created WCS keywords (with distortions); that is all ;-).

The second point is also interesting! I'd love to learn more about it (if only I had time!). But hopefully later as my schedule opens up a little, I'll dig into your implementation and learn more ;-).

On the third point, today we have ESA's Gaia survey, which is state of the art and much more accurate than the catalogs used in that 2009 paper ;-). It has become the de facto standard for astrometry! That is where I got the reference RA-Dec catalog that you can use below.

Regarding generating a HEALPix grid, fortunately there is standard software for it: HEALPix, which also has a C library. We can safely add it as a new optional dependency of Gnuastro: if users don't have it, this feature simply won't be built. So go ahead and learn/use it ;-).

I agree, verification is a little more subjective and we can worry about it once we have an initial fit; it's too early to spend time on it now ;-).

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Tue 16 Jun 2020 10:56:19 PM UTC, comment #4: 

I read the paper introducing astrometry.net and also went quickly through its repository on GitHub. One thing that really bugged me was the documentation: it is incomplete, and whatever is written is more or less subtle. Anyway, the paper introduced the idea fairly well, though it was more theoretically oriented than software-oriented. Here are the basic steps it employs for matching and WCS calculation:

  • First, all the objects (stars or galaxies) in the images are detected. They use a method based on flux differences and peak/threshold filtering, but NoiseChisel provides an easy and efficient way to do the same and is already at our disposal. So object detection can easily be done :-).
  • Next, they use quads (sets of 4 stars) to make unique geometric hash codes, which are quite fast and efficient for storage and neighbour matching. The hash code is invariant to translation, rotation and scaling of the star positions, so it can be computed using only the relative positions of the four stars in any conformal coordinate system (see the sketch after this list).
  • The index/reference catalogue is pre-computed for fast retrieval. For reference catalogues, they use all-sky (or near all-sky) surveys (like USNO-B1, the infrared 2MASS catalogue, and the ultraviolet catalogue from GALEX). They pre-process these catalogues by making a HEALPix grid and choosing the brightest stars in each HEALPix tile to build a large number of geometric hashes. These hashes are then stored in a k-d tree along with the star positions (for verification). The k-d tree provides fast retrieval of the reference hashes in the neighbourhood of a query hash.
  • Finally, a verification step is done, which uses a Bayesian decision model to calculate a threshold for accepting or rejecting a proposed match.
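
As a rough illustration of the hash code in the second step above, here is a minimal sketch in C. It follows the paper's convention (as I understand it) of placing the two most widely separated stars A and B at (0,0) and (1,1) of a local frame and using the coordinates of the other two stars C and D in that frame as the 4-D hash; the function name is only illustrative:

/* Compute the 4-D hash (xC, yC, xD, yD) of a quad from the x,y (or
   projected RA,Dec) positions of its four stars, ordered so that A=0
   and B=1 are the two most widely separated.  A is mapped to (0,0) and
   B to (1,1); the coordinates of C and D in that frame are the hash.
   The result doesn't change if all four positions are translated,
   rotated or scaled together. */
void
quad_hash(const double x[4], const double y[4], double hash[4])
{
  int i;
  double dx = x[1]-x[0],  dy = y[1]-y[0];     /* Vector from A to B.      */
  double scale = dx*dx + dy*dy;               /* |AB|^2, for normalizing. */

  for(i=0; i<2; ++i)                          /* C is i=0, D is i=1.      */
    {
      double px = x[2+i]-x[0],  py = y[2+i]-y[0];
      double u = ( px*dx + py*dy) / scale;    /* Along AB.                */
      double v = ( py*dx - px*dy) / scale;    /* Perpendicular to AB.     */
      hash[2*i  ] = u - v;
      hash[2*i+1] = u + v;
    }
}

The paper's extra rules (for example requiring C and D to fall inside the circle with AB as its diameter, and ordering the stars to break symmetries) are omitted here.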

So we have already detected the objects and made catalogues from them. But we still need to make a HEALPix grid over the reference catalogue and then create a k-d tree to store the pre-computed hashes (we also need a structure to hold each hash code itself). Then we'll need to match the hashes built from our X-Y catalogues against those of the reference (RA-Dec) catalogue. Maybe verification can be done later, when all these are done. What is your suggestion?
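
For reference, here is a minimal sketch in C of the kind of k-d tree lookup described above, over 4-D hash codes. It is only to make the data flow concrete (build the tree once from the reference hashes, then query it with every hash from an X-Y catalogue); none of these names are an existing Gnuastro or astrometry.net API:

#include <stdlib.h>

#define HASH_DIM 4

/* One node of the tree: a 4-D hash code and the ID of the quad that
   produced it (so that, after a match, the star positions of that
   quad can be used for verification). */
struct kdnode
{
  double hash[HASH_DIM];
  size_t quadid;
  struct kdnode *left, *right;
};

/* Axis used by the qsort comparator while building one level. */
static int build_axis;

static int
node_cmp(const void *a, const void *b)
{
  const struct kdnode *na=a, *nb=b;
  if(na->hash[build_axis] < nb->hash[build_axis]) return -1;
  if(na->hash[build_axis] > nb->hash[build_axis]) return  1;
  return 0;
}

/* Recursively build the tree by splitting at the median of the
   current axis (the axis cycles with depth). */
struct kdnode *
kdtree_build(struct kdnode *nodes, size_t num, int depth)
{
  size_t mid;
  if(num==0) return NULL;
  build_axis = depth % HASH_DIM;
  qsort(nodes, num, sizeof *nodes, node_cmp);
  mid = num/2;
  nodes[mid].left  = kdtree_build(nodes,       mid,       depth+1);
  nodes[mid].right = kdtree_build(nodes+mid+1, num-mid-1, depth+1);
  return nodes+mid;
}

/* Squared Euclidean distance between two hash codes. */
static double
hash_dist2(const double *a, const double *b)
{
  int i; double d, sum=0.0;
  for(i=0; i<HASH_DIM; ++i) { d=a[i]-b[i]; sum+=d*d; }
  return sum;
}

/* Nearest-neighbour query: descend towards the query point, then only
   visit the far branch when the splitting plane is closer than the
   best match found so far. */
void
kdtree_nearest(struct kdnode *node, const double *query, int depth,
               struct kdnode **best, double *bestd2)
{
  int axis;
  double d2, diff;
  if(node==NULL) return;
  axis = depth % HASH_DIM;
  d2   = hash_dist2(node->hash, query);
  diff = query[axis] - node->hash[axis];
  if(d2 < *bestd2) { *bestd2=d2; *best=node; }
  kdtree_nearest(diff<0 ? node->left  : node->right, query, depth+1, best, bestd2);
  if(diff*diff < *bestd2)
    kdtree_nearest(diff<0 ? node->right : node->left, query, depth+1, best, bestd2);
}

A query starts with `best=NULL' and `bestd2=DBL_MAX' (from float.h) and walks down from the root returned by `kdtree_build'. Gnuastro may of course end up with its own more complete k-d tree; this is only to fix ideas.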

Sachin Kumar Singh <sks_15>
Project Member, in charge of this item.
Sun 17 May 2020 02:44:37 AM UTC, comment #3: 

WCSLIB's disp2x() and disx2p() may be very useful for this task.

In particular, they will help after we have found the basic WCS parameters (CRVALs, CRPIXs, CDELTs and PCs) and want to fine-tune/minimize the distortion. GSL has some good general minimization/fitting functions we can use ;-).
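
As a hedged sketch of how GSL's multidimensional minimizers could be wired up for this fine-tuning step: the `model_project()' below is only a toy linear stand-in for the real projection (which would go through WCSLIB's wcss2p() and the distortion functions above), and the parameter layout is arbitrary:

#include <gsl/gsl_errno.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_multimin.h>

/* Matched pairs: measured pixel positions (x,y) of the clumps and the
   reference (ra,dec) they were matched to. */
struct matched
{
  size_t num;
  const double *x, *y, *ra, *dec;
};

/* Toy stand-in for the real projection of (ra,dec) to pixels with the
   trial parameters 'p'.  A real implementation would go through WCSLIB
   (wcss2p(), plus disp2x()/disx2p() for the distortions).  Here:
     p[0],p[1]: trial CRVAL1/CRVAL2;  p[2],p[3]: trial CRPIX1/CRPIX2;
     p[4]..p[7]: a linear (CD-like) matrix. */
static void
model_project(const gsl_vector *p, double ra, double dec,
              double *xout, double *yout)
{
  double dra  = ra  - gsl_vector_get(p, 0);
  double ddec = dec - gsl_vector_get(p, 1);
  *xout = gsl_vector_get(p, 2)
          + gsl_vector_get(p, 4)*dra + gsl_vector_get(p, 5)*ddec;
  *yout = gsl_vector_get(p, 3)
          + gsl_vector_get(p, 6)*dra + gsl_vector_get(p, 7)*ddec;
}

/* Cost function for GSL: sum of squared pixel residuals over all the
   matched pairs. */
static double
cost(const gsl_vector *p, void *params)
{
  size_t i;
  struct matched *m = params;
  double xm, ym, dx, dy, sum=0.0;
  for(i=0; i<m->num; ++i)
    {
      model_project(p, m->ra[i], m->dec[i], &xm, &ym);
      dx = xm - m->x[i];   dy = ym - m->y[i];
      sum += dx*dx + dy*dy;
    }
  return sum;
}

/* Minimize the residuals with the derivative-free Nelder-Mead simplex
   minimizer; 'p' holds the initial guess on input and the best-fit
   parameters on output, 'npar' is the number of free parameters. */
void
fit_wcs(struct matched *m, gsl_vector *p, size_t npar)
{
  int status;
  size_t iter=0;
  gsl_multimin_function f = { .f=cost, .n=npar, .params=m };
  gsl_vector *step = gsl_vector_alloc(npar);
  gsl_multimin_fminimizer *s =
    gsl_multimin_fminimizer_alloc(gsl_multimin_fminimizer_nmsimplex2, npar);

  gsl_vector_set_all(step, 1e-3);              /* Initial step sizes.  */
  gsl_multimin_fminimizer_set(s, &f, p, step);

  do
    {
      status = gsl_multimin_fminimizer_iterate(s);
      if(status) break;
      status = gsl_multimin_test_size(gsl_multimin_fminimizer_size(s), 1e-8);
    }
  while(status==GSL_CONTINUE && ++iter<10000);

  gsl_vector_memcpy(p, s->x);                  /* Best-fit parameters. */
  gsl_multimin_fminimizer_free(s);
  gsl_vector_free(step);
}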

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Tue 12 May 2020 09:32:12 AM UTC, comment #2: 

I forgot to mention that the data below were taken by the Iran National Observatory (INO) Lens Array (INOLA). In particular, I am very grateful to Hamed Altafi, who took the images and shared them for us to test/play with. The results will be applicable to any instrument.

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Tue 12 May 2020 02:21:44 AM UTC, comment #1: 

To help in completing this task, I just uploaded some data to play with. A link to each uploaded file is available at the bottom of this comment.

The main datasets are 8 short exposure (5 sec) images of Castor, but each exposure is offset compared to the others (this offset in individual exposures is called dithering in astronomy). Please open these images and actually see how Castor's position in the image changes in each of them.

A catalog of sources (actually "clumps" in Segment) is generated for each image with NoiseChisel+Segment+MakeCatalog using the script below (which is also available under that directory). These catalogs are the actual inputs to this task and have a `-x-y.fits' suffix; each contains roughly 1500 clumps. Finally, there is also a reference catalog with the RA and Dec of 1179744 sources near Castor from ESA's Gaia survey.

Once this task is complete, we should be able to have an accurate WCS (including distortions) for each image.

Script to generate a catalog of sources in each image and their X and Y positions.

# Base name of input files.
input="1881731715 3163025249 1887227377 3746244531 2923938133 4209692437 2966790193 595260509"

# Generate the catalog for each image.
for i in $input; do

    # First run NoiseChisel to separate signal from the background.
    astnoisechisel $i.fits --tilesize=50,50 --interpnumngb=21 \
                   --output=$i-nc.fits

    # Run Segment to get a label for each clump, don't bother with
    # detecting objects, they aren't relevant here.
    astsegment $i-nc.fits --onlyclumps --output=$i-seg.fits

    # Generate a catalog with the ID, X, Y, and magnitude (assuming a
    # zeropoint of 0).
    astmkcatalog $i-seg.fits --hdu=CLUMPS --ids --x --y --magnitude \
                 --output=$i-x-y.fits

    # Clean up.
    rm $i-nc.fits $i-seg.fits
done

Reference Catalog with RA and Dec of many sources:
http://akhlaghi.org/data/astrometry/gaia-dr2-near-castor.fits

List of FITS images:
http://akhlaghi.org/data/astrometry/1881731715.fits
http://akhlaghi.org/data/astrometry/1887227377.fits
http://akhlaghi.org/data/astrometry/2923938133.fits
http://akhlaghi.org/data/astrometry/2966790193.fits
http://akhlaghi.org/data/astrometry/3163025249.fits
http://akhlaghi.org/data/astrometry/3746244531.fits
http://akhlaghi.org/data/astrometry/4209692437.fits
http://akhlaghi.org/data/astrometry/595260509.fits

List of X-Y catalogs:
http://akhlaghi.org/data/astrometry/1881731715-x-y.fits
http://akhlaghi.org/data/astrometry/1887227377-x-y.fits
http://akhlaghi.org/data/astrometry/2923938133-x-y.fits
http://akhlaghi.org/data/astrometry/2966790193-x-y.fits
http://akhlaghi.org/data/astrometry/3163025249-x-y.fits
http://akhlaghi.org/data/astrometry/3746244531-x-y.fits
http://akhlaghi.org/data/astrometry/4209692437-x-y.fits
http://akhlaghi.org/data/astrometry/595260509-x-y.fits

Script to generate catalog:
http://akhlaghi.org/data/astrometry/generate-catalog.sh

Mohammad Akhlaghi <makhlaghi>
Project Administrator
Tue 12 May 2020 01:15:22 AM UTC, original submission:  

Let's assume that `reference-ra-dec.fits' is a single catalog that contains the RA and Dec of many sources from a reference database, for example the Gaia Archive.

We also have multiple single-exposure images named `img1.fits', `img2.fits', `img3.fits' and so on. We assume that these images partially cover the area of the reference catalog.

We then run NoiseChisel, Segment and MakeCatalog on each image to generate a catalog of its clumps (which can be used to accurately define the center of each source in the image), where the clump centers are in image coordinates: X and Y. Let's assume these catalogs are named `img1-x-y.fits', `img2-x-y.fits' and `img3-x-y.fits'.

This is the problem: we want to find the WCS of each image by matching the individual X-Y catalogs to the RA-Dec catalog. With that WCS, we will be able to align the images to a single pixel grid (task #15636) and do science with them.

The proposed interface and usage is like this:

astmatch --wcs-reference=reference-ra-dec.fits --wcscol=RA,DEC img*-x-y.fits --ccol1=X,Y

Normally, Match takes two catalogs (to find the rows that match on certain coordinates). But when the `--wcs-reference' option is given, it can take any number of input catalogs. `--wcs-reference' itself takes a single catalog as its value and uses the columns specified with the `--wcscol' option.

Any number of X-Y catalogs are accepted; Match will simultaneously match them with the RA-Dec catalog and produce a separate FITS file for each input catalog (maybe called `img1-x-y-wcs.fits'). The FITS file won't have any data; it will just be a header with the WCS written inside it.
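
To make the `header-only FITS' output concrete, here is a minimal, hedged sketch in C using WCSLIB's wcshdo() and CFITSIO; the function name is illustrative and a real implementation inside Gnuastro would go through its own FITS wrappers:

#include <stdlib.h>
#include <string.h>
#include <fitsio.h>              /* CFITSIO.                          */
#include <wcslib/wcshdr.h>       /* WCSLIB's wcshdo(); the include    */
                                 /* path may differ on some systems.  */

/* Illustrative helper: write a data-less FITS file whose header only
   contains the WCS keywords of an already-initialized 'wcsprm'. */
int
write_wcs_only_fits(struct wcsprm *wcs, const char *filename)
{
  fitsfile *fptr=NULL;
  long naxes[1]={0};
  char *header=NULL, card[81];
  int i, status=0, nkeyrec=0;

  /* Convert the wcsprm structure into 80-character header cards. */
  if( wcshdo(WCSHDO_all, wcs, &nkeyrec, &header) ) return 1;

  /* Create the file with a zero-dimensional (data-less) primary HDU. */
  fits_create_file(&fptr, filename, &status);
  fits_create_img(fptr, SHORT_IMG, 0, naxes, &status);

  /* Write each card into the header. */
  for(i=0; i<nkeyrec; ++i)
    {
      memcpy(card, header+i*80, 80);  card[80]='\0';
      fits_write_record(fptr, card, &status);
    }

  /* Clean up (newer WCSLIB releases also provide wcsdealloc()). */
  fits_close_file(fptr, &status);
  free(header);
  return status;
}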

I am proposing to take multiple X-Y catalogs to allow more accurate estimation of the (optical/spherical) distortions when the inputs are exposures taken with a single imager. In this scenario, each image will be dithered/offset compared to the others, but they all share the same distortions, so the more raw exposures we have, the better we can estimate the distortion coefficients to write into all the output files.

In the matching, beyond using the raw RA and Dec, we can also use the magnitudes/brightnesses of each source and include that as a dimension to fit/minimize.

Also, since some low-level structure like a k-d tree may be necessary to optimally parse the reference catalog, we can add a feature to compute such optimized internal structures in one run and reuse them later. Having this structure in a file that can be loaded directly will probably greatly speed up processing in many scenarios.

Mohammad Akhlaghi <makhlaghi>
Project Administrator

 

Attached Files
file #49513:  rough-outline.txt added by makhlaghi (3KiB - text/plain)

 

Depends on the following items: None found

Items that depend on this one: None found

 



Follow 3 latest changes.

Date        Changed by  Updated Field     Previous Value => Replaced by
2020-07-19  makhlaghi   Attached File     - => Added rough-outline.txt, #49513
2020-07-19  makhlaghi   Percent Complete  0% => 50%
2020-05-12  makhlaghi   Carbon-Copy       - => Added pedramardakani@protonmail.com
