(2019)

SOFTWARE
MISUSE

See also: Research & Experiments

Ongoing, self-directed artistic research into how tools can be used in ways they were not intended. These experiments explore micro-animations created outside of industry-standard software, methods, and environments designed for motion graphics. The experiments fall into the following categories:

(A) Software-based: The research began with observations of subtle, user-generated motions and the different feedback they produced across software. Software used: Keynote, Processing, Adobe Illustrator, macOS Preview.
(B) Lens-based: This set of experiments explores type animation effects in raw, recorded video footage, without software processing (beyond the camera application or its settings).
(C) Software & Lens-based: These experiments grew out of observations made while combining external tools with different software. Some of them generated audio output.
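Several of the software-based experiments below use slit-scanning. As a minimal sketch of that technique (a hypothetical illustration in Python with NumPy, not a reconstruction of any listed experiment): each frame of footage contributes one vertical column of pixels to the output image, so time is laid out along the x-axis.

```python
# Slit-scan sketch (hypothetical illustration): sample one fixed column
# (the "slit") from every frame and stack the columns side by side.
import numpy as np

def slit_scan(frames, slit_x=None):
    """Stack one column from each frame into a new image.

    frames: iterable of HxW (or HxWxC) arrays.
    slit_x: column to sample; defaults to the horizontal center.
    """
    columns = []
    for frame in frames:
        x = frame.shape[1] // 2 if slit_x is None else slit_x
        columns.append(frame[:, x])
    return np.stack(columns, axis=1)  # shape: H x num_frames (x C)

# Synthetic footage: a bright vertical bar sweeping left to right,
# one pixel per frame, across a 24x32 black image.
frames = []
for t in range(32):
    f = np.zeros((24, 32), dtype=np.uint8)
    f[:, t] = 255
    frames.append(f)

scan = slit_scan(frames, slit_x=16)
print(scan.shape)  # (24, 32): height preserved, one column per frame
```

In the output image, the bar appears only in column 16, because the sweeping bar crosses the fixed slit exactly once, at frame 16.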

Index

(A) Software-based

(A01) Selection tool animation: Line resize [video].
(A02-1) Selection tool animation: Rotate no.1.
(A02-2) Selection tool animation: Rotate no.2.
(A03-1) Selection tool animation: Fade no.1.
(A03-2) Selection tool animation: Fade no.2.
(A04) Slit scan vs. spectroscope [video].
(A05) Adobe Illustrator selection tool animation: Resize [video].
(A06) macOS Preview timeline animation.
(A07) Instagram auto and manual scrub as banner animation [video].
(A08) Auto and manual slit-scan [video].
(A09) Timeline preview text reveal.
(A10) Manual jitter type animation.
(A11) Manual ‘LED light’ color change animation [video].
(B) Lens-based

(B01) Letter ‘T’ stretch animation [video].

(C) Software & Lens-based

(C01) Mimicking an operating system’s behavior [video].
(C02) Instagram Stories as an alternative rapid-prototyping tool for iPhone vector mockups [video].
(C03) Instagram photo uploader: thumbnails as buttons for making makeshift animations [video].
(C04-1) Real-time gesture-based text animation with Google Translate: No.1.
(C04-2) Real-time gesture-based text animation with Google Translate: No.2.
(C04-3) Real-time gesture-based text animation with Google Translate: No.3.
(C04-4) Real-time gesture-based text animation with Google Translate: No.4.
(C04-5) Real-time gesture-based text animation with Google Translate: No.5.
(C04-6) Real-time gesture-based text animation with Google Translate: No.6.
(C05-1) Experiments with audio “animations” from raw, recorded video footage, software-generated sounds, and combinations of both; each experiment uses a single piece of video footage. Rendering delay experiment 1/4 [video].
(C05-2) Rendering delay experiment 2/4 [video].
(C05-3) Rendering delay experiment 3/4 [video].
(C05-4) Rendering delay experiment 4/4 [video].

(C06) Experiments with audio “animations” from raw, recorded video footage, software-generated sounds, and combinations of both; each experiment combines multiple pieces of video footage [video].
©2024 Supisara Burapachaisri