
add pixel size validators for satellite imagery semantic segmentation #101

QazyBi opened this issue Jan 24, 2023 · 1 comment


QazyBi commented Jan 24, 2023

It is likely better (a hypothesis, to be verified soon) to train models on images that share the same pixel size. To that end, we could add dataset validators that run before training, so that people get insights about their data.

Some things to include (a rough sketch of the pixel-scale and CRS checks follows this list):

  • show the distribution of object sizes (to catch outliers)
  • show the distribution of pixel scales (to catch outliers)
  • show the CRS distribution among images
  • check whether the image divisor is too large (save sample images after augmentations)
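A minimal sketch of what the pixel-scale and CRS checks could look like, assuming the inputs are GeoTIFFs readable with rasterio; the function name and the data/images path are illustrative assumptions, not existing project code:

    from collections import Counter
    from pathlib import Path

    import rasterio


    def summarize_raster_metadata(image_paths):
        """Collect per-image pixel scales and CRS so outliers can be flagged before training."""
        pixel_scales = []          # (x_resolution, y_resolution) per image
        crs_counter = Counter()    # number of images per CRS

        for path in image_paths:
            with rasterio.open(path) as src:
                pixel_scales.append(src.res)        # pixel size (ground sampling distance) per axis
                crs_counter[str(src.crs)] += 1      # e.g. "EPSG:32637"

        return {"pixel_scales": pixel_scales, "crs_distribution": dict(crs_counter)}


    report = summarize_raster_metadata(sorted(Path("data/images").glob("*.tif")))
    print(report["crs_distribution"])

Plotting histograms of pixel_scales (and, with masks, of object sizes) would then make the outliers visible before any training starts.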

QazyBi commented Feb 8, 2023

One more validator could be the following:

import pytorch_lightning as pl

class DataAbnormalityDetector(pl.Callback):
    def __init__(self, metrics_deviation_max_value: float = 0.5, action: str = 'stop_run'): ...
  • calculate metrics on all training images
  • if any image's score deviates from the average by more than self.metrics_deviation_max_value, then stop the run or notify

possible actions (a hedged sketch of the callback follows this list):

  • stop_run
  • log error, log warning
  • wait for n epochs and, if the deviation persists, log or stop the run
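A hedged sketch of such a callback, assuming the LightningModule accumulates per-image metric values in a per_image_scores attribute during the epoch; that attribute and the action names are assumptions for illustration, not an existing API:

    import logging

    import torch
    import pytorch_lightning as pl

    logger = logging.getLogger(__name__)


    class DataAbnormalityDetector(pl.Callback):
        def __init__(self, metrics_deviation_max_value: float = 0.5, action: str = "stop_run"):
            self.metrics_deviation_max_value = metrics_deviation_max_value
            self.action = action

        def on_train_epoch_end(self, trainer, pl_module):
            # per_image_scores is assumed to be a sequence of per-image metric values
            # collected by the LightningModule over the epoch.
            scores = getattr(pl_module, "per_image_scores", None)
            if scores is None or len(scores) == 0:
                return
            scores = torch.as_tensor(scores, dtype=torch.float32)
            deviation = (scores - scores.mean()).abs()
            if bool((deviation > self.metrics_deviation_max_value).any()):
                if self.action == "stop_run":
                    trainer.should_stop = True   # ask Lightning to stop the run gracefully
                elif self.action == "log_error":
                    logger.error("per-image metric deviation exceeds the allowed maximum")
                else:
                    logger.warning("per-image metric deviation exceeds the allowed maximum")

It would be registered as Trainer(callbacks=[DataAbnormalityDetector(action="log_warning")]); the "wait for n epochs" variant would need an extra epoch counter on top of this.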

QazyBi added the enhancement label on Feb 8, 2023