works with latest flax version 0.5.2, updated requirements.txt

Brett Kuprel 2022-06-29 11:46:19 -04:00
parent c4f613c89f
commit 764b5bbc0e
3 changed files with 5 additions and 5 deletions

README.md

@@ -2,7 +2,7 @@
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/kuprel/min-dalle/blob/main/min_dalle.ipynb)
-This is a minimal implementation of [DALL·E Mini](https://github.com/borisdayma/dalle-mini). It has been stripped to the bare essentials necessary for doing inference, and converted to PyTorch. The only third party dependencies are numpy, torch, and flax (and optionally wandb to download the models). DALL·E Mega inference with pytorch takes about 10 seconds in colab.
+This is a minimal implementation of [DALL·E Mini](https://github.com/borisdayma/dalle-mini). It has been stripped to the bare essentials necessary for doing inference, and converted to PyTorch. The only third party dependencies are numpy, torch, and flax (and optionally wandb to download the models). DALL·E Mega inference with PyTorch takes about 10 seconds in Colab.
 ### Setup
@@ -18,7 +18,7 @@ Use the python script `image_from_text.py` to generate images from the command line
 ### Examples
 ```
-python3 image_from_text.py --text='artificial intelligence' --torch
+python image_from_text.py --text='artificial intelligence' --torch
 ```
 ![Alien](examples/artificial_intelligence.png)

image_from_text.py

@@ -12,8 +12,8 @@ parser.set_defaults(mega=False)
 parser.add_argument('--torch', action='store_true')
 parser.add_argument('--no-torch', dest='torch', action='store_false')
 parser.set_defaults(torch=False)
-parser.add_argument('--text', type=str)
-parser.add_argument('--seed', type=int, default=0)
+parser.add_argument('--text', type=str, default='alien life')
+parser.add_argument('--seed', type=int, default=7)
 parser.add_argument('--image_path', type=str, default='generated')
 parser.add_argument('--sample_token_count', type=int, default=256) # for debugging
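With defaults on `--text` and `--seed` as on the `+` side of this hunk, the script can be invoked with only a backend flag; a hypothetical usage sketch relying on those defaults (not part of this commit):

```
python image_from_text.py --torch
```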

requirements.txt

@@ -1,3 +1,3 @@
 torch
-flax==0.4.2
+flax==0.5.2
 wandb
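To pick up the new flax pin after pulling this commit, the requirements can be reinstalled; a typical command, assuming pip in the project's environment (not part of this commit):

```
pip install -r requirements.txt
```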