Next Frame prediction
I used a royalty-free underwater video of fish to test the next frame prediction model on Colab. This output is from epoch 120, and it looks fairly realistic even in some of the later frames. I believe this is because I didn't add any digital zoom to the input (I only added it to the final video), so the stationary footage stayed pretty natural looking. This is good to know going forward, and I definitely want to train an NFP model on footage with more movement in the future.
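Next frame prediction training data is just pairs of consecutive frames: the model sees frame t and learns to predict frame t + 1. A minimal sketch of building those pairs from a frame sequence (the array shapes and function name here are illustrative assumptions, not the exact Colab notebook I used):

```python
import numpy as np

def make_nfp_pairs(frames: np.ndarray):
    """Turn a frame sequence into (input, target) pairs,
    where each target is simply the next frame."""
    # frames: (num_frames, height, width, channels)
    inputs = frames[:-1]   # frame t
    targets = frames[1:]   # frame t + 1
    return inputs, targets

# Toy example: 10 tiny 32x32 RGB "frames"
frames = np.random.rand(10, 32, 32, 3).astype(np.float32)
x, y = make_nfp_pairs(frames)
print(x.shape, y.shape)  # (9, 32, 32, 3) (9, 32, 32, 3)
```

Because the pairs overlap (the target of one pair is the input of the next), a stationary video like the fish clip gives the model many near-identical pairs, which is likely why the predictions stayed so stable.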
latent walk of stylegan2 model trained on cyanotype dataset
I created a dataset of cyanotypes using the instagram-scraper tool. I narrowed the images down to just the prints, some with borders and some without; I did this intentionally so that brushstrokes and the edges of the cyanotypes would appear in the frames. I learned a lot from training this model and feel it's a great starting point for a larger project. I will either build a dataset entirely from my own cyanotypes (if I can produce a large number of small-scale prints) or narrow my dataset even further. A couple of colors besides blue and white made their way into the model; they likely would have been trained out had I kept training, but I wanted to stay within the free session I had on RunwayML at the time. Because only two colors are mainly involved, and all true cyanotypes are prints on watercolor paper, the model ends up colorful and textured. I especially like the sun prints of leaves and similar subjects, so I will probably make a stricter dataset of only sun prints next time.
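The latent walk itself just interpolates between random points in the model's latent space and renders a frame at each step. A minimal NumPy sketch of that interpolation, assuming StyleGAN2's 512-dimensional latent vectors (the real generator that turns each vector into an image isn't shown here):

```python
import numpy as np

def latent_walk(z_start, z_end, num_steps):
    """Linearly interpolate between two latent vectors.
    Each intermediate vector would be fed to the generator
    to render one frame of the walk video."""
    ts = np.linspace(0.0, 1.0, num_steps)
    return np.stack([(1 - t) * z_start + t * z_end for t in ts])

# StyleGAN2 samples latents from a 512-d normal distribution
rng = np.random.default_rng(0)
z_a = rng.standard_normal(512)
z_b = rng.standard_normal(512)
walk = latent_walk(z_a, z_b, num_steps=60)  # 60 frames between two points
print(walk.shape)  # (60, 512)
```

Chaining several of these segments through a sequence of random latents, then rendering every vector, gives the smooth morphing effect seen in the video.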
example of images in dataset

Randomly generated images from the model

Used Gigapixel AI to up-res some outputs

Side-by-side comparison

Up-res’d output

More output images
