Review: Generative Hut A.R.E. (book)
Back in May of this year, I ordered my copy of “Generative Hut A.R.E.”, billed by its creators at generativehut.com as a “portable gallery”. I had almost forgotten about it when it finally showed up last week; by that point, it felt like getting a free book!
The book is nicely produced, with heavy matte cover stock and matte interior pages that help eliminate glare when scanning the A.R. targets found in the book. To aid target discovery and keep the A.R. content aligned with the printed images, a clever spine design lets you fold the pages nearly flat.
According to the book, “Generative art is created by the artist in collaboration with an autonomous system, such as computer code or a drawing machine. The process takes place at a point where computer science, art, and design overlap. The artist hands over control, creating a space that allows each piece to land somewhere between order and chaos”. It sounds like my own robot paintings would qualify!
The Augmented Reality gets started right away, with the cover. On the back cover is a QR code to get you started by downloading the Aria app. The project is target-based, and the first target is right on the front cover. When you scan this image using the app, the Aria logo appears to let you know that the target has been recognized. After a brief delay during which the A.R. is loaded, the experience appears, and you see a three-dimensional computer animation of the same item depicted on the printed cover.
This same formula is repeated inside the book, with an impressive fifty-eight animated experiences by thirty-one artists, some of which span two pages, which is where the innovative spine design really shines. These are impressive (and fun) computer animations, and the book does an excellent job of presenting them in an accessible format. Additionally, there are a number of non-animated images by these same artists, for a total of eighty pages plus the cover.
Where this falls somewhat flat (literally) as an A.R. experience is that these are essentially two-dimensional videos of often three-dimensional phenomena, simply aligned with the pages of the book, as if you were watching videos presented in book form. Other than the alignment with the page, which works well because these targets are quite sticky, there is no sense of three-dimensional space here. David Szakaly’s piece on page 24, one of the most clearly 3D items in the book, is probably the best example of this missed opportunity.
One advantage of using 2D video is that there can be perfect alignment between the printed image and the animated video, no matter the viewing angle; it is as if the video is playing right there on the page of the book. In many cases both the static artwork and the animated video are strictly 2D, so this approach works just fine. For the 3D objects, of which there are many, the downside is that you can only see the object from the camera angle used when the animation was rendered. I would really love to be able to look at these objects from different angles. Alas, all of the A.R. in this book is based on imagery, not models. This appears to be a limitation of the chosen platform, which supports only 2D video attached to targets.
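For readers curious what that distinction looks like in practice, here is a minimal, purely hypothetical sketch using Apple’s ARKit and SceneKit. It is not the Aria platform’s actual implementation (which I have not seen), and the asset names are placeholders; it simply contrasts attaching a flat video plane to a recognized page target, roughly what this book’s experiences do, with anchoring a true 3D model that a reader could walk around and view from any angle.

```swift
import UIKit
import ARKit
import SceneKit
import AVFoundation

// Hypothetical illustration only: not the Aria app's implementation,
// just a generic ARKit sketch contrasting the two approaches discussed above.
class BookPageViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Track the printed pages as image targets (bundled in an asset catalog group).
        let configuration = ARImageTrackingConfiguration()
        if let targets = ARReferenceImage.referenceImages(inGroupNamed: "BookPages", bundle: nil) {
            configuration.trackingImages = targets
        }
        sceneView.session.run(configuration)
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize

        // Approach 1 (what the book appears to do): play a flat video exactly
        // over the recognized page. Alignment is perfect, but the content is
        // still a fixed-camera 2D rendering.
        let plane = SCNPlane(width: size.width, height: size.height)
        let videoURL = Bundle.main.url(forResource: "page24", withExtension: "mp4")! // placeholder asset
        let player = AVPlayer(url: videoURL)
        plane.firstMaterial?.diffuse.contents = player
        player.play()
        let videoNode = SCNNode(geometry: plane)
        videoNode.eulerAngles.x = -.pi / 2   // lay the plane flat on the page
        node.addChildNode(videoNode)

        // Approach 2 (the missed opportunity): anchor an actual 3D model to the
        // page, so the viewer could look at it from any angle.
        // if let modelScene = SCNScene(named: "sculpture.usdz") {   // placeholder asset
        //     for child in modelScene.rootNode.childNodes {
        //         node.addChildNode(child)
        //     }
        // }
    }
}
```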
I also feel compelled to quarrel (somewhat) with the claim that this is “the first exhibition of its kind, a new exhibition concept”, having produced CoCA’s “pop-up ARt” book and exhibition in 2016, five years before “Generative Hut A.R.E.” went to print, and still supported by our ARt portal app. Our book remains available at the CoCA Bookstore.
Nevertheless, this book is well worth the cover price as a fascinating foray into the world of Generative Art, and you can purchase it here from Vetro Editions. At WORKSHOP 3D we are very supportive of any and all attempts to blend art, printed publications, and augmented reality, and I am very happy to have this volume in my library!