from setuptools import setup

long_description = """
A simple Python package that wraps existing model fine-tuning and generation scripts for OpenAI GPT-2 text generation model (specifically the "small", 124M hyperparameter version). Additionally, this package allows easier generation of text, generating to a file for easy curation, allowing for prefixes to force the text to start with a given phrase.
## Usage
An example for downloading the model to the local system, fineturning it on a dataset. and generating some text.
Warning: the pretrained model, and thus any finetuned model, is 500 MB!
```python
import gpt_2_simple as gpt2
gpt2.download_gpt2() # model is saved into current directory under /models/124M/
sess = gpt2.start_tf_sess()
gpt2.finetune(sess, 'shakespeare.txt', steps=1000) # steps is max number of training steps
gpt2.generate(sess)
```

The generated model checkpoints are by default in `/checkpoint/run1`. If you want to load a model from that folder and generate text from it:

```python
import gpt_2_simple as gpt2
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess)
gpt2.generate(sess)
```

As with textgenrnn, you can generate and save text for later use (e.g. an API or a bot) by using the `return_as_list` parameter.

```python
single_text = gpt2.generate(sess, return_as_list=True)[0]
print(single_text)
```

You can pass a `run_name` parameter to `finetune` and `load_gpt2` if you want to store/load multiple models in a `checkpoint` folder.

NB: *Restart the Python session first* if you want to finetune on another dataset or load another model.
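
For example (an illustrative sketch — the run name and dataset are placeholders, and a model must already be downloaded):

```python
gpt2.finetune(sess, 'shakespeare.txt', steps=1000, run_name='run2')

# later, in a fresh Python session:
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run2')
gpt2.generate(sess, run_name='run2')
```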
"""
setup(
    name="gpt_2_simple",
    packages=["gpt_2_simple"],  # this must be the same as the name above
    version="0.8.1",
    description="Python package to easily retrain OpenAI's GPT-2 "
    "text-generating model on new texts.",
    long_description=long_description,
    long_description_content_type="text/markdown",
    author="Max Woolf",
    url="https://github.com/minimaxir/gpt-2-simple",
    keywords=["deep learning", "tensorflow", "text generation"],
    classifiers=[],
    license="MIT",
    entry_points={
        "console_scripts": ["gpt_2_simple=gpt_2_simple.gpt_2:cmd"],
    },
    python_requires=">=3.6",
    include_package_data=True,
    install_requires=[
        "tensorflow>=2.5.1",
        "regex",
        "requests",
        "tqdm",
        "numpy",
        "toposort",
    ],
)