# Lama-cleaner: Image inpainting tool powered by [LaMa](https://github.com/saic-mdal/lama)
This project is mainly intended for self-hosting the LaMa model; some interaction improvements may be added later.
![example](./assets/lama-cleaner-example.gif)
- [x] High resolution support
- [x] Multi-stroke support. Press and hold the `cmd/ctrl` key to enable multi-stroke mode.
- [x] Zoom & Pan
- [ ] Keep image EXIF data
## Quick Start
- Install requirements: `pip3 install -r requirements.txt`
- Start server: `python3 main.py --device=cuda --port=8080` (use `--device=cpu` if no CUDA GPU is available), then open [http://localhost:8080](http://localhost:8080)
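Not sure whether to pass `--device=cuda` or `--device=cpu`? A quick sketch for checking whether PyTorch (which the LaMa model runs on) can see a GPU, assuming the dependencies from `requirements.txt` are installed:

```
python3 - <<'EOF'
# Print whether PyTorch detects a usable CUDA GPU; fall back to a hint if
# PyTorch itself is not installed yet.
try:
    import torch
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("PyTorch is not installed yet; run: pip3 install -r requirements.txt")
EOF
```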
## Development
This section is only needed if you plan to modify the frontend and recompile it yourself.
### Frontend
The frontend code is modified from [cleanup.pictures](https://github.com/initml/cleanup.pictures). You can try their great online service [here](https://cleanup.pictures/).
- Install dependencies: `cd lama_cleaner/app/ && yarn`
- Start development server: `yarn dev`
- Build: `yarn build`
## Docker
Run within a Docker container. Set `CACHE_DIR` to the path where the models are stored.
Optionally add a `-d` option to the `docker run` commands below to run the container as a daemon.
### Build Docker image
```
docker build -f Dockerfile -t lamacleaner .
```
### Run Docker (cpu)
```
docker run -p 8080:8080 -e CACHE_DIR=/app/models -v $(pwd)/models:/app/models -v $(pwd):/app --rm lamacleaner python3 main.py --device=cpu --port=8080
```
### Run Docker (gpu)
```
docker run --gpus all -p 8080:8080 -e CACHE_DIR=/app/models -v $(pwd)/models:/app/models -v $(pwd):/app --rm lamacleaner python3 main.py --device=cuda --port=8080
```
Then open [http://localhost:8080](http://localhost:8080) in your browser.
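If you prefer Docker Compose, the CPU run command above translates to roughly the following sketch. This is a hypothetical `docker-compose.yml`, not part of this repository; it simply mirrors the flags and volumes used in the `docker run` commands above.

```
version: "3"
services:
  lamacleaner:
    image: lamacleaner
    # Mirrors the CPU run command; switch to --device=cuda for GPU use.
    command: python3 main.py --device=cpu --port=8080
    environment:
      - CACHE_DIR=/app/models
    ports:
      - "8080:8080"
    volumes:
      - ./models:/app/models
      - .:/app
```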