Merged
Commits
20 commits
3cad466
Now consistent colors used in histograms and plots for shifts found i…
VChristiaens May 30, 2025
4e97109
Propagation of optional arguments of pca_annular in get_mu_and_sigma
VChristiaens May 30, 2025
20984ac
Now error is raised if ncomp set to 0 in pca function
VChristiaens Jun 23, 2025
c220147
Propagated imlib and interpolation arguments to cube_recenter_satspot…
VChristiaens Jun 23, 2025
84092db
Added a debug mode for cube_fix_badpix_clump
VChristiaens Jun 23, 2025
99b234b
Typo fix in tuto 02
VChristiaens Jul 1, 2025
8144ad3
Updated timeout execution for VIP notebooks during docs compilation
VChristiaens Aug 13, 2025
cd93e8f
Updated test for speckle_noise_uncertainty for speedup
VChristiaens Aug 13, 2025
c5c95a7
More lenient check of ncomp in PCA-ASDI: if an np.int no bug is raise…
VChristiaens Aug 22, 2025
39ace7a
Added an output for PCA-ASDI in single-step with full_output=True: th…
VChristiaens Aug 22, 2025
43c1067
PEP8 formatting
VChristiaens Aug 25, 2025
85529ab
Added new PCA-RDI options for spectral cubes: ARSDI and RSDI in eithe…
VChristiaens Aug 25, 2025
a39b164
Added new PCA-RDI options for spectral cubes: ARSDI and RSDI in eithe…
VChristiaens Aug 25, 2025
e47f99a
Bug fix related to latest version of np.ma.mask, which by default now…
VChristiaens Sep 16, 2025
f972f91
Added max_frames_pca parameter to full frame pca function
VChristiaens Sep 16, 2025
73e01fd
Added PCA-SARDI mode to pca_annular function
VChristiaens Sep 16, 2025
a448f65
Deprecation fix for SNR map multiprocessing calculation
VChristiaens Oct 13, 2025
c94cf91
Switch to use coverage instead of pytest for the CI test suite
VChristiaens Oct 13, 2025
6d87ff7
adapted pre-commit hook to avoid re-ordering of imports
VChristiaens Oct 13, 2025
5350b9d
PEP8 formatting through hooks
VChristiaens Oct 13, 2025
46 changes: 23 additions & 23 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -38,18 +38,18 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install --editable . --group dev
# Uncomment below when ready to deal with a lot of PEP8 formatting changes
# - name: Verify files with pre-commit
# run: |
# # Setup pre-commit hooks
# pre-commit clean
# pre-commit install --hook-type pre-merge-commit
# pre-commit install --hook-type pre-push
# pre-commit install --hook-type post-rewrite
# pre-commit install-hooks
# pre-commit install
# # Run pre-commit hooks
# pre-commit run --files src/**/*.py
# Uncomment below when ready to deal with a lot of PEP8 formatting changes
- name: Verify files with pre-commit
run: |
# Setup pre-commit hooks
pre-commit clean
pre-commit install --hook-type pre-merge-commit
pre-commit install --hook-type pre-push
pre-commit install --hook-type post-rewrite
pre-commit install-hooks
pre-commit install
# Run pre-commit hooks
pre-commit run --files src/**/*.py
- name: Test with pytest
run: |
coverage run --source src -m pytest tests/pre_3_10 --splits 3 --group ${{ matrix.group }}
@@ -75,17 +75,17 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install --editable . --group dev
# - name: Verify files with pre-commit
# run: |
# # Setup pre-commit hooks
# pre-commit clean
# pre-commit install --hook-type pre-merge-commit
# pre-commit install --hook-type pre-push
# pre-commit install --hook-type post-rewrite
# pre-commit install-hooks
# pre-commit install
# # Run pre-commit hooks
# pre-commit run --files src/vip_hci/objects/*.py
- name: Verify files with pre-commit
run: |
# Setup pre-commit hooks
pre-commit clean
pre-commit install --hook-type pre-merge-commit
pre-commit install --hook-type pre-push
pre-commit install --hook-type post-rewrite
pre-commit install-hooks
pre-commit install
# Run pre-commit hooks
pre-commit run --files src/vip_hci/objects/*.py
- name: Test with pytest
run: |
coverage run --source=src/vip_hci/objects/ -m pytest tests/post_3_10
8 changes: 4 additions & 4 deletions .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
- id: end-of-file-fixer
- id: check-yaml

- repo: https://github.com/asottile/reorder_python_imports
rev: v3.9.0
hooks:
- id: reorder-python-imports
# - repo: https://github.com/asottile/reorder_python_imports
# rev: v3.9.0
# hooks:
# - id: reorder-python-imports
3 changes: 3 additions & 0 deletions docs/source/conf.py
@@ -53,6 +53,9 @@

suppress_warnings = ["mystnb.unknown_mime_type"]

# default timeout is 30s which is too short to compile most VIP notebooks
nb_execution_timeout = 99999

# Add any paths that contain templates here, relative to this directory.
templates_path = []#['./_templates']

2 changes: 1 addition & 1 deletion docs/source/faq.rst
@@ -96,4 +96,4 @@ your ``~/.bash_profile``:

.. rubric:: Where is the ``specfit`` module?

From versions 1.0.0 to 1.0.3, ``specfit`` was a module of VIP offering atmosphere retrieval and spectral characterisation of directly imaged companions. Given the divergence with the original purpose of VIP, starting from version 1.1.0, it has been renamed, expanded, moved to a separate `GitHub repository <https://github.com/VChristiaens/special>`_ and converted into its own `package <https://pypi.org/project/special/>`_ (called ``special``).
From versions 1.0.0 to 1.0.3, ``specfit`` was a module of VIP offering atmosphere retrieval and spectral characterisation of directly imaged companions. Given the divergence with the original purpose of VIP, starting from version 1.1.0, it has been renamed, expanded, moved to a separate `GitHub repository <https://github.com/VChristiaens/special>`_ and converted into its own `package <https://pypi.org/project/special/>`_ (called ``special``).
2 changes: 1 addition & 1 deletion docs/source/tutorials/02_preproc.ipynb
@@ -1176,7 +1176,7 @@
"metadata": {},
"source": [
"```{note}\n",
"An alternative to the quick NaN replacement above, is to (i) create a binary bad pixel map which includes the NaN values (using e.g. `np.isnan`), (ii) add these bad pixels to the bad pixel map inferred through one or more methods detailed in the next section, and (iii) finally proceed with a single correction of all NaN and bad pixels together with the `cube_fix_badpix_interp` routine.\n",
"A better alternative to the quick NaN replacement above, is to (i) create a binary bad pixel map which includes the NaN values (using e.g. `np.isnan`), (ii) add these bad pixels to the bad pixel map inferred through one or more methods detailed in the next section, and (iii) finally proceed with a single correction of all NaN and bad pixels together with the `cube_fix_badpix_interp` routine.\n",
"```"
]
},
21 changes: 13 additions & 8 deletions src/vip_hci/config/utils_param.py
@@ -8,7 +8,9 @@
KWARGS_EXCEPTIONS = ["param"]


def filter_duplicate_keys(filter_item: any, ref_item: any, filter_in: bool = True):
def filter_duplicate_keys(filter_item: any,
ref_item: any,
filter_in: bool = True):
"""
Filter in or out keys of an item based on a reference item.

@@ -30,14 +32,16 @@ def filter_duplicate_keys(filter_item: any, ref_item: any, filter_in: bool = Tru
elif isinstance(filter_item, object):
filter_dict = vars(filter_item).copy()
else:
raise TypeError("The item to be filtered is neither a dictionnary or an object")
msg = "The item to be filtered is neither a dictionnary or an object"
raise TypeError(msg)

if isinstance(ref_item, dict):
ref_dict = ref_item.copy()
elif isinstance(ref_item, object):
ref_dict = vars(ref_item).copy()
else:
raise TypeError("The reference item is neither a dictionnary or an object")
msg = "The reference item is neither a dictionnary or an object"
raise TypeError(msg)

# Find keys that must serve as a filter for `filter_item`
common_keys = set(filter_dict.keys()) & set(ref_dict.keys())
@@ -64,9 +68,9 @@ def setup_parameters(
"""
Help creating a dictionnary of parameters for a given function.

Look for the exact list of parameters needed for the ``fkt`` function and takes
only the attributes needed from the ``params_obj``. More parameters can be
included with the ``**add_pararms`` dictionnary.
Look for the exact list of parameters needed for the ``fkt`` function and
take only the attributes needed from the ``params_obj``. More parameters can
be included with the ``**add_pararms`` dictionnary.

Parameters
----------
@@ -83,7 +87,7 @@ def setup_parameters(
Returns
-------
params_setup : dictionnary or list
The dictionnary comprised of parameters needed for the function, selected
The dictionnary comprised of parameters needed for the function selected
amongst attributes of PostProc objects and additionnal parameters. Can
be a list if asked for (used in specific cases such as when calling
functions through ``vip_hci.config.utils_conf.pool_map``, see an example
@@ -93,7 +97,8 @@ def setup_parameters(
wanted_params = OrderedDict(signature(fkt).parameters)
# Remove dupe keys in params_obj from add_params
if add_params is not None:
obj_params = filter_duplicate_keys(filter_item=params_obj, ref_item=add_params)
obj_params = filter_duplicate_keys(filter_item=params_obj,
ref_item=add_params)
all_params = {**obj_params, **add_params}
else:
all_params = vars(params_obj)
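The `utils_param.py` hunks above reflow a helper that matches a dictionary of candidate parameters against a function's actual signature. The core idea can be sketched in a few lines (hypothetical names `select_kwargs` and `demo`, not the VIP implementation):

```python
from inspect import signature


def select_kwargs(fkt, params: dict) -> dict:
    """Keep only the entries of `params` that name a parameter of `fkt`."""
    wanted = signature(fkt).parameters
    return {key: val for key, val in params.items() if key in wanted}


def demo(cube, fwhm, ncomp=1):
    return ncomp


params = {"cube": None, "fwhm": 4.0, "ncomp": 5, "unused_key": "dropped"}
filtered = select_kwargs(demo, params)
# 'unused_key' does not appear in demo's signature, so it is filtered out
assert sorted(filtered) == ["cube", "fwhm", "ncomp"]
```

The filtered dictionary can then be passed safely as `demo(**filtered)` without tripping an unexpected-keyword error.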
89 changes: 56 additions & 33 deletions src/vip_hci/fm/negfc_fmerit.py
@@ -816,47 +816,71 @@ def get_mu_and_sigma(cube, angs, ncomp, annulus_width, aperture_radius, fwhm,
**algo_opt_copy,
)

elif algo == pca_annular:
elif algo == pca_annular or algo == nmf_annular:
tol = algo_opt_copy.pop("tol", 1e-1)
min_frames_lib = algo_opt_copy.pop("min_frames_lib", 2)
max_frames_lib = algo_opt_copy.pop("max_frames_lib", 200)
nproc = algo_opt_copy.pop("nproc", 1)
radius_int = max(1, int(np.floor(r_guess - annulus_width / 2)))
radius_int = algo_opt_copy.pop("radius_int", radius_int)
asize = algo_opt_copy.pop("asize", annulus_width)
delta_rot = algo_opt_copy.pop("delta_rot", delta_rot)
_ = algo_opt_copy.pop("verbose", verbose)
# crop cube to just be larger than annulus => FASTER PCA
crop_sz = int(2 * np.ceil(radius_int + annulus_width + 1))
crop_sz = int(2 * np.ceil(radius_int + asize + 1))
if not crop_sz % 2:
crop_sz += 1
if crop_sz < array.shape[1] and crop_sz < array.shape[2]:
pad = int((array.shape[1] - crop_sz) / 2)
crop_cube = cube_crop_frames(array, crop_sz, verbose=False)
if crop_sz < cube.shape[-2] and crop_sz < cube.shape[-1]:
pad = int((cube.shape[-2] - crop_sz) / 2)
crop_cube = cube_crop_frames(cube, crop_sz, verbose=False)
else:
crop_cube = cube
pad = 0
crop_cube = array

pca_res_tmp = pca_annular(
cube=crop_cube,
angle_list=angs,
radius_int=radius_int,
fwhm=fwhm,
asize=annulus_width,
delta_rot=delta_rot,
ncomp=ncomp,
svd_mode=svd_mode,
scaling=scaling,
imlib=imlib,
interpolation=interpolation,
collapse=collapse,
tol=tol,
nproc=nproc,
min_frames_lib=min_frames_lib,
max_frames_lib=max_frames_lib,
full_output=False,
verbose=False,
weights=weights,
**algo_opt_copy,
)
if algo == pca_annular:
res_tmp = algo(
cube=crop_cube,
angle_list=angs,
cube_ref=cube_ref,
radius_int=radius_int,
fwhm=fwhm,
asize=asize,
delta_rot=delta_rot,
ncomp=ncomp,
svd_mode=svd_mode,
scaling=scaling,
imlib=imlib,
interpolation=interpolation,
collapse=collapse,
weights=weights,
tol=tol,
min_frames_lib=min_frames_lib,
max_frames_lib=max_frames_lib,
full_output=False,
verbose=False,
**algo_opt_copy,
)
else:
res_tmp = algo(
cube=crop_cube,
angle_list=angs,
cube_ref=cube_ref,
radius_int=radius_int,
fwhm=fwhm,
asize=annulus_width,
delta_rot=delta_rot,
ncomp=ncomp,
scaling=scaling,
imlib=imlib,
interpolation=interpolation,
collapse=collapse,
weights=weights,
min_frames_lib=min_frames_lib,
max_frames_lib=max_frames_lib,
full_output=False,
verbose=False,
**algo_opt_copy,
)
# pad again now
pca_res = np.pad(pca_res_tmp, pad, mode="constant",
constant_values=0)
pca_res = np.pad(res_tmp, pad, mode="constant", constant_values=0)

if f_guess is not None and psfn is not None:
pca_res_tinv = pca_annular(
@@ -873,7 +897,6 @@ def get_mu_and_sigma(cube, angs, ncomp, annulus_width, aperture_radius, fwhm,
interpolation=interpolation,
collapse=collapse,
tol=tol,
nproc=nproc,
min_frames_lib=min_frames_lib,
max_frames_lib=max_frames_lib,
full_output=False,
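The hunk above crops the cube to just enclose the annulus of interest before running the annular algorithm (a smaller frame means a faster decomposition), then zero-pads the residual frame back to the original grid. The geometry can be checked in isolation; this is a sketch under the assumption of odd, square frames, as in the usual VIP convention:

```python
import numpy as np

ny = nx = 101          # original (odd) frame size
radius_int, asize = 10, 8

# smallest size that encloses the annulus, as in the diff above
crop_sz = int(2 * np.ceil(radius_int + asize + 1))
if not crop_sz % 2:
    crop_sz += 1       # force odd size to keep the frame centred on one pixel

pad = int((ny - crop_sz) / 2)
frame = np.ones((crop_sz, crop_sz))   # stand-in for the processed crop

# zero-pad the cropped result back onto the original grid
restored = np.pad(frame, pad, mode="constant", constant_values=0)
assert restored.shape == (ny, nx)
```

Since both sizes are odd, the difference is even and the padding is symmetric, so the restored frame keeps the same centre pixel as the input cube.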
2 changes: 1 addition & 1 deletion src/vip_hci/invprob/__init__.py
@@ -1,6 +1,6 @@
"""
Subpackage ``invprob`` aims to contain post-processing algorithms based on an
inverse problem approach, such as ANDROMEDA [MUG09]_ / [CAN15]_, Foward Model
inverse problem approach, such as ANDROMEDA [MUG09]_ / [CAN15]_, Forward Model
Matched Filter [RUF17]_ / [DAH21a]_ or PACO [FLA18]_.
"""

2 changes: 1 addition & 1 deletion src/vip_hci/invprob/utils_andro.py
@@ -331,4 +331,4 @@ def subpixel_shift(image, xshift, yshift):
image_ft = np.fft.fft2(image) # no np.fft.fftshift applied!
shifted_image = np.fft.ifft2(image_ft * fact).real

return shifted_image
return shifted_image
2 changes: 1 addition & 1 deletion src/vip_hci/metrics/completeness.py
@@ -166,7 +166,7 @@ def _estimate_snr_fc(
mask = get_annulus_segments(frame_fin, (fwhm_med / 2) + 2, width, mode="mask")[
0
]
bmask = np.ma.make_mask(mask)
bmask = np.ma.make_mask(mask, shrink=False)
yy, xx = np.where(bmask)

if approximated:
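The `shrink=False` additions guard against a `np.ma.make_mask` behaviour: with the default `shrink=True`, an all-False input collapses to the scalar sentinel `np.ma.nomask` instead of a boolean array, so a downstream `np.where(mask)` no longer yields per-pixel indices. A small illustration:

```python
import numpy as np

empty = np.zeros((4, 4))           # e.g. an annulus mask selecting no pixels

shrunk = np.ma.make_mask(empty)                 # default shrink=True
full = np.ma.make_mask(empty, shrink=False)     # always a boolean array

assert shrunk is np.ma.nomask      # collapsed to a scalar sentinel
assert full.shape == (4, 4)        # keeps the array shape
assert full.dtype == bool
```

With `shrink=False`, the mask keeps its array shape regardless of content, so `np.where(bmask)` behaves uniformly.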
23 changes: 13 additions & 10 deletions src/vip_hci/metrics/snr_source.py
@@ -85,7 +85,7 @@ def snrmap(array, fwhm, approximated=False, plot=False, known_sources=None,
snrmap_array = np.zeros_like(array)
width = min(sizey, sizex) / 2 - 1.5*fwhm
mask = get_annulus_segments(array, fwhm, width, mode="mask")[0]
mask = np.ma.make_mask(mask)
mask = np.ma.make_mask(mask, shrink=False)
# by making a bool mask *after* applying the mask to the array, we also mask
# out zero values from the array. This logic cannot be simplified by using
# mode="ind"!
@@ -105,25 +105,28 @@
width = min(sizey, sizex) / 2 - 1.5 * fwhm
mask = get_annulus_segments(array, (fwhm / 2) + 1, width - 1,
mode="mask")[0]
mask = np.ma.make_mask(mask)
mask = np.ma.make_mask(mask, shrink=False)
yy, xx = np.where(mask)
coords = [(int(x), int(y)) for (x, y) in zip(xx, yy)]
res = pool_map(nproc, _snr_approx, array, iterable(coords), fwhm,
cy, cx)
res = np.array(res, dtype=object)
yy = res[:, 0]
xx = res[:, 1]
snr_value = res[:, 2]
#res = np.array(res, dtype=object)
yy = np.array([res[i][0] for i in range(len(res))])
xx = np.array([res[i][1] for i in range(len(res))])
snr_value = np.array([res[i][2] for i in range(len(res))])
snrmap_array[yy.astype(int), xx.astype(int)] = snr_value

# computing s/n map with Mawet+14 definition
else:
res = pool_map(nproc, snr, array, iterable(coords), fwhm, True,
array2, use2alone, exclude_negative_lobes)
res = np.array(res, dtype=object)
yy = res[:, 0]
xx = res[:, 1]
snr_value = res[:, -1]
#res = np.array(res, dtype=object)
yy = np.array([res[i][0] for i in range(len(res))])
xx = np.array([res[i][1] for i in range(len(res))])
snr_value = np.array([res[i][-1] for i in range(len(res))])
#yy = res[:][0]
#xx = res[:][1]
#snr_value = res[:][-1]
snrmap_array[yy.astype('int'), xx.astype('int')] = snr_value

# masking known sources
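The replaced lines stop round-tripping the `pool_map` results through `np.array(res, dtype=object)` followed by column slicing, and instead pull each field out with list comprehensions, which stays valid however NumPy decides to nest the object array. A minimal sketch of the new extraction (simulated result tuples, not actual `snr` output):

```python
import numpy as np

# simulated pool_map output: one (y, x, value) tuple per processed coordinate
res = [(12, 30, 5.1), (13, 31, 4.7), (14, 32, 6.2)]

# column extraction via list comprehensions, as in the diff above
yy = np.array([r[0] for r in res])
xx = np.array([r[1] for r in res])
snr_value = np.array([r[-1] for r in res])

snrmap_array = np.zeros((64, 64))
snrmap_array[yy.astype(int), xx.astype(int)] = snr_value
```

Each tuple field lands in its own 1-D array, and fancy indexing writes all values into the map in one assignment.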