For JPEGs, no, they will not get smaller. They may even get a smidge bigger if you zip them, though usually not enough to make a practical difference.
Zip does generic lossless compression, meaning the archive can be extracted back into a bit-perfect copy of the original. Very simplified, it works by finding repeating patterns, replacing each long pattern with a short key, and storing an index that maps the keys back to the original patterns on extraction.
JPEGs use lossy compression, meaning some detail is thrown away and can never be reproduced. JPEG is highly optimized to only drop details that don't matter much for human perception of the image.
Since a JPEG is already compressed, there are essentially no repeating patterns (duplicate information) left for the zip algorithm to find; the data looks close to random noise to it.
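You can see this effect with a quick sketch using Python's built-in zlib (the same deflate algorithm zip uses). Note this uses random bytes as a stand-in for JPEG data, since already-compressed data looks random to a generic compressor; it's an illustration, not an actual JPEG file.

```python
import os
import zlib

# Repetitive data, like an uncompressed bitmap: full of patterns to exploit.
repetitive = b"RGBRGBRGB" * 10_000

# High-entropy data standing in for already-compressed JPEG bytes.
# Deflate finds nothing to replace, so it can only add its own overhead.
random_like = os.urandom(90_000)

for name, data in [("repetitive", repetitive), ("random-like", random_like)]:
    compressed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} -> {len(compressed)} bytes "
          f"({len(compressed) / len(data):.1%})")
```

The repetitive input shrinks to a tiny fraction of its size, while the random-like input stays at roughly 100% (typically a few bytes larger, because of the compressor's own framing overhead) — which is the "maybe even a smidge bigger" from above.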
The hardest issues would probably be financing and motivation.
GPUs are expensive, and so is electricity. All the current major LLMs are huge loss leaders for giant players with deep pockets. A distributed AI service would be run by smaller players without the financing or the motivation to pay all that cost upfront.
There is Folding@home, where you donate time on your hardware for scientific calculations, but that's quite different from donating time on your hardware to some random unknown stranger to generate AI cat images or summarise a news article.
Lemmy, Mastodon, etc. have a comparatively modest monetary (and energy/environmental) cost, and the benefit is building communities and bringing people together. For distributed AI the cost (monetary and energy/environmental) is higher, and the benefit is limited.