Integration with PyTorch 2.0 (GPU version)
Hi there,
I'm trying to deploy AISdb and PyTorch with CUDA support. Right now they live in separate Docker containers, but my application needs both packages importable from the same Python environment: I invoke them through the Python kernel of a single shared environment, so having the two containers communicate over the local Docker network isn't enough. How could I build this from a fresh setup, while keeping the web service and Postgres working as well?
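For context, a single-image setup along these lines is roughly what I have in mind — one container with both AISdb and PyTorch+CUDA installed, plus Postgres as a sibling service. This is only a sketch; the image names, versions, ports, and environment variables below are hypothetical, not something I've gotten working:

```yaml
# Hypothetical docker-compose.yml (untested sketch)
services:
  app:
    # Built from a CUDA base image with both packages in one environment, e.g.:
    #   FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04
    #   RUN pip install aisdb torch
    build: .
    depends_on:
      - db
    environment:
      PGHOST: db            # Postgres reachable over the compose network
      PGUSER: postgres
      PGPASSWORD: example
    ports:
      - "8000:8000"         # hypothetical web-service port
    # GPU access requires the NVIDIA Container Toolkit on the host
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
```

Is something like this the right direction, or is there a recommended way to get AISdb and the CUDA-enabled PyTorch into one image?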
Cheers, Gabriel