Merge pull request #145 from opendatacube/deafrica-s2-collection-index
new build for deafrica s2 collection upgrade
jmettes authored Jul 26, 2024
2 parents 9959197 + e06cd77 commit 160d5e9
Showing 3 changed files with 53 additions and 54 deletions.
93 changes: 42 additions & 51 deletions index/constraints.txt
100644 → 100755
@@ -1,5 +1,5 @@
#
# This file is autogenerated by pip-compile with Python 3.10
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --output-file=constraints.txt --strip-extras requirements.txt
@@ -13,7 +13,7 @@ affine==2.4.0
# datacube
# eodatasets3
# rasterio
aiobotocore==2.12.3
aiobotocore==2.13.1
# via
# -r requirements.txt
# odc-cloud
@@ -23,8 +23,6 @@ aioitertools==0.11.0
# via aiobotocore
aiosignal==1.3.1
# via aiohttp
async-timeout==4.0.3
# via aiohttp
attrs==23.2.0
# via
# aiohttp
@@ -35,21 +33,21 @@ attrs==23.2.0
# jsonschema
# rasterio
# referencing
awscli==1.32.69
awscli==1.33.13
# via aiobotocore
azure-core==1.30.1
azure-core==1.30.2
# via azure-storage-blob
azure-storage-blob==12.19.1
azure-storage-blob==12.21.0
# via -r requirements.txt
boltons==24.0.0
# via eodatasets3
boto3==1.34.69
boto3==1.34.131
# via
# aiobotocore
# datacube
# eodatasets3
# odc-cloud
botocore==1.34.69
botocore==1.34.131
# via
# aiobotocore
# awscli
@@ -58,13 +56,13 @@ botocore==1.34.69
# eodatasets3
# odc-cloud
# s3transfer
bottleneck==1.3.8
bottleneck==1.4.0
# via datacube
cachetools==5.3.3
cachetools==5.4.0
# via datacube
cattrs==23.2.3
# via eodatasets3
certifi==2024.2.2
certifi==2024.7.4
# via
# fiona
# netcdf4
@@ -73,7 +71,7 @@ certifi==2024.2.2
# requests
cffi==1.16.0
# via cryptography
cftime==1.6.3
cftime==1.6.4
# via netcdf4
charset-normalizer==3.3.2
# via requests
@@ -106,15 +104,15 @@ cloudpickle==3.0.0
# dask
# datacube
# distributed
colorama==0.4.4
colorama==0.4.6
# via awscli
cryptography==42.0.5
cryptography==43.0.0
# via azure-storage-blob
dask==2024.4.2
dask==2024.7.1
# via
# datacube
# distributed
datacube==1.8.18
datacube==1.8.19
# via
# -r requirements.txt
# eodatasets3
@@ -123,27 +121,25 @@ datadog==0.49.1
# via odc-apps-dc-tools
defusedxml==0.7.1
# via eodatasets3
deprecat==2.1.1
deprecat==2.1.3
# via datacube
distributed==2024.4.2
distributed==2024.7.1
# via datacube
docutils==0.16
# via awscli
eodatasets3==0.30.5
eodatasets3==0.30.6
# via odc-apps-dc-tools
exceptiongroup==1.2.1
# via cattrs
fiona==1.9.6
# via eodatasets3
frozenlist==1.4.1
# via
# aiohttp
# aiosignal
fsspec==2024.3.1
fsspec==2024.6.1
# via
# dask
# odc-apps-dc-tools
geoalchemy2==0.14.7
geoalchemy2==0.15.2
# via datacube
greenlet==3.0.3
# via sqlalchemy
@@ -153,19 +149,17 @@ idna==3.7
# via
# requests
# yarl
importlib-metadata==7.1.0
# via dask
importlib-resources==6.4.0
# via odc-apps-dc-tools
isodate==0.6.1
# via azure-storage-blob
jinja2==3.1.3
jinja2==3.1.4
# via distributed
jmespath==1.0.1
# via
# boto3
# botocore
jsonschema==4.21.1
jsonschema==4.23.0
# via
# datacube
# eodatasets3
@@ -186,9 +180,9 @@ multidict==6.0.5
# via
# aiohttp
# yarl
netcdf4==1.6.5
netcdf4==1.7.1.post1
# via datacube
numpy==1.26.4
numpy==2.0.1
# via
# bottleneck
# cftime
@@ -205,7 +199,7 @@ numpy==1.26.4
# xarray
odc-apps-cloud==0.2.3
# via -r requirements.txt
odc-apps-dc-tools==0.2.17
odc-apps-dc-tools==0.2.18
# via -r requirements.txt
odc-cloud==0.2.5
# via
@@ -215,7 +209,7 @@ odc-io==0.2.2
# via
# odc-apps-cloud
# odc-apps-dc-tools
packaging==24.0
packaging==24.1
# via
# dask
# datacube
@@ -226,9 +220,9 @@ pandas==2.2.2
# via
# datacube
# xarray
partd==1.4.1
partd==1.4.2
# via dask
psutil==5.9.8
psutil==6.0.0
# via distributed
psycopg2==2.9.9
# via datacube
@@ -242,13 +236,13 @@ pyproj==3.6.1
# via
# datacube
# eodatasets3
pystac==1.10.0
pystac==1.10.1
# via
# eodatasets3
# odc-apps-dc-tools
# pystac-client
# rio-stac
pystac-client==0.7.7
pystac-client==0.8.3
# via odc-apps-dc-tools
python-dateutil==2.9.0.post0
# via
@@ -257,7 +251,7 @@ python-dateutil==2.9.0.post0
# pandas
# pystac
# pystac-client
python-rapidjson==1.16
python-rapidjson==1.18
# via eodatasets3
pytz==2024.1
# via pandas
@@ -274,19 +268,19 @@ rasterio==1.3.10
# datacube
# eodatasets3
# rio-stac
referencing==0.34.0
referencing==0.35.1
# via
# jsonschema
# jsonschema-specifications
requests==2.31.0
requests==2.32.3
# via
# azure-core
# datadog
# pystac-client
# urlpath
rio-stac==0.9.0
# via odc-apps-dc-tools
rpds-py==0.18.0
rpds-py==0.19.1
# via
# jsonschema
# referencing
@@ -298,13 +292,13 @@ ruamel-yaml==0.18.6
# eodatasets3
ruamel-yaml-clib==0.2.8
# via ruamel-yaml
s3transfer==0.10.1
s3transfer==0.10.2
# via
# awscli
# boto3
scipy==1.13.0
scipy==1.14.0
# via eodatasets3
shapely==2.0.4
shapely==2.0.5
# via
# datacube
# eodatasets3
@@ -322,7 +316,7 @@ sqlalchemy==1.4.52
# via
# datacube
# geoalchemy2
structlog==24.1.0
structlog==24.4.0
# via eodatasets3
tblib==3.0.0
# via distributed
@@ -333,16 +327,15 @@ toolz==0.12.1
# distributed
# odc-apps-dc-tools
# partd
tornado==6.4
tornado==6.4.1
# via distributed
typing-extensions==4.11.0
typing-extensions==4.12.2
# via
# azure-core
# azure-storage-blob
# cattrs
tzdata==2024.1
# via pandas
urllib3==2.2.1
urllib3==2.2.2
# via
# botocore
# distributed
@@ -353,16 +346,14 @@ wrapt==1.16.0
# via
# aiobotocore
# deprecat
xarray==2024.3.0
xarray==2024.6.0
# via
# datacube
# eodatasets3
yarl==1.9.4
# via aiohttp
zict==3.0.0
# via distributed
zipp==3.18.1
# via importlib-metadata

# The following packages are considered to be unsafe in a requirements file:
# setuptools
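
For context, a pip-compile constraints file like this one is normally consumed alongside the requirements file rather than installed directly; a minimal sketch of that usage (an assumption about how the image build uses these pins, not something shown in this diff):

```
pip install -r requirements.txt -c constraints.txt
```

Here `requirements.txt` stays the list of direct dependencies, while `constraints.txt` pins the full resolved tree that pip-compile produced.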
12 changes: 10 additions & 2 deletions index/readme.md
@@ -9,12 +9,20 @@ The Dockerfile is accessible from: https://github.com/opendatacube/datacube-dock

## How to create a new image with latest odc-tools

To update this image, make any changes you need to `reuirements.txt`, then run:
To update this image, make any changes you need to `requirements.txt`, then run:

```
pip-compile --upgrade --output-file constraints.txt requirements.txt
pip-compile --upgrade --output-file constraints.txt --strip-extras requirements.txt
```

To minimise version changes, update using the existing image (run from index folder):

```
docker run -v $(pwd):/datacube-index/ -w /datacube-index -it opendatacube/datacube-index bash -c "python3 -m pip install pip-tools && pip-compile --upgrade --output-file constraints.txt --strip-extras requirements.txt"
```
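
Running pip-compile inside the published image reuses the Python version and already-installed pins baked into it, so only packages with genuinely newer releases move. A sketch of the follow-up steps, assuming the versioning convention seen in this commit (review the regenerated pins, then bump `index/version.txt`):

```
git diff index/constraints.txt   # review which pins actually changed
echo "0.3.4" > index/version.txt # bump the image version for the new build
```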



# Included commands

## Most commonly used
2 changes: 1 addition & 1 deletion index/version.txt
@@ -1 +1 @@
0.3.3
0.3.4
