
Commit

Deploying to gh-pages from @ 91dfbe5 🚀
kindly committed Jun 18, 2024
1 parent 042930b commit 54767f6
Showing 6 changed files with 450 additions and 374 deletions.
8 changes: 8 additions & 0 deletions _sources/changelog.md.txt
@@ -4,6 +4,14 @@ All notable changes to this project will be documented in this file.

and this project adheres to [Semantic Versioning](http://semver.org/).

## [0.19.17] - 2024-06-18

### New
- `--truncate` option for postgres

### Fixed
- timezone date types now accepted in postgres

## [0.19.15] - 2024-05-09

### Fixed
22 changes: 22 additions & 0 deletions _sources/options.md.txt
@@ -60,6 +60,8 @@ Options:
tables to fit data
--drop When loading to postgres or sqlite, drop table
if already exists.
--truncate When loading to postgres or sqlite, truncate table
if already exists.
--id-prefix TEXT Prefix for all `_link` id fields
--stats Produce stats about the data in the
datapackage.json file
@@ -475,6 +477,26 @@ import flatterer
flatterer.flatten('inputfile.json', 'output_dir', postgres='postgres://user:pass@host/dbname', drop=True)
```

## Truncate Tables

**Warning: this could mean you lose data**

For postgres and sqlite: truncate the existing table if it exists. This is useful if you want to load the data into a database with a pre-defined schema.

### CLI Usage

```bash
flatterer --postgres='postgres://user:pass@host/dbname' --sqlite-path=sqlite.db INPUT_FILE OUTPUT_DIRECTORY --truncate
```

### Python Usage

```python
import flatterer

flatterer.flatten('inputfile.json', 'output_dir', postgres='postgres://user:pass@host/dbname', truncate=True)
```
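The practical difference from `--drop`: truncating empties the table but keeps its pre-defined schema, while dropping removes the table definition entirely. A minimal sketch of that distinction using Python's built-in `sqlite3` module (the table name and columns here are illustrative, not flatterer's API; sqlite has no `TRUNCATE` statement, so an unqualified `DELETE` gives the equivalent behaviour):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE example (id INTEGER, name TEXT)")
conn.execute("INSERT INTO example VALUES (1, 'a'), (2, 'b')")

# "Truncate": remove all rows, but the table definition survives
conn.execute("DELETE FROM example")

row_count = conn.execute("SELECT COUNT(*) FROM example").fetchone()[0]
table_exists = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type='table' AND name='example'"
).fetchone()[0]
print(row_count, table_exists)  # 0 rows left, table still defined
```

With `--drop`, by contrast, a subsequent load recreates the table, so any manually applied schema changes are lost.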

## Fields File

Path to fields CSV file. The fields file can be used for:
