Projects


Bio


Amay Kataria (b. 1990) is a global artist who works at the intersection of technology and modern art media for self-expression. Born and brought up in Delhi, the capital city of India, he developed artistic tendencies at the age of 14, when he began learning instruments like the guitar and harmonica. After placing first in 2007 in Future Cities India 2020, a competition to design futuristic road architecture for the 2010 Commonwealth Games in India, his team of four students represented India in Washington, D.C., where they presented their designs. During that trip, he visited U.S. universities and decided to pursue his college education abroad. At the age of 18, he moved to Virginia Tech to study Computer Engineering. Alongside engineering, he took classes in philosophy, music appreciation, interdisciplinary product design, robotics, hardware-software systems, and embedded systems. After graduating from Virginia Tech in 2012, Amay joined the Windows team at Microsoft as a Software Engineer in Seattle, WA.

He explored his passion for the arts through music production, creative writing workshops, and painting and drawing lessons at Seattle Community College and the Gage Academy of Art. In 2014, he moved to the Skype4Life project at Microsoft under the mentorship of Eric Traut. During this time, he collaborated with designers and product managers and led the development of several high-impact features for the brand-new Skype app. He also participated in creative hackathons to work with modern art media like augmented reality, artificial intelligence, and sound, expanding his creative repertoire. Ultimately, while attending Burning Man in Nevada in 2015, with his artistic tendencies at their peak, Amay decided to pursue graduate education in fine arts. He began assembling a portfolio for an MFA and left his job at Microsoft in the summer of 2017 to start school that fall. He is currently pursuing an MFA in Art & Technology Studies at SAIC and is expected to graduate in 2019.

Skype


While working at Microsoft from 2013 to 2017, I spent 2.5 years in the Skype organization after working on Windows. I joined the Skype4Life development team led by Eric Traut, where I worked on a myriad of projects ranging from user interface development and scalable backend services to AI-driven bots and infrastructure/platform work. Our team had two primary goals. The first was to build ReactXP, a new platform on top of Facebook's React and React Native, which lets developers write a highly performant application that runs seamlessly on the web and on native platforms (iOS and Android) from a single codebase. The second was to rebrand Skype and make it the first large-scale application written from scratch with ReactXP. During this time, I contributed to the CoreUI, Messaging, and Fundamentals squads by architecting and developing features that are used by millions of Skype users today. Earlier in 2017, our team open-sourced ReactXP, which has built a strong community of developers around it. We subsequently released the newly developed Skype application on the app stores in June 2017. My experience at Skype exposed me to industry practices and seasoned software engineers, who taught me the importance of collaboration in delivering a high-impact product like Skype. Here's a short video in my own words about my experience at Skype and the possibilities beyond.


EarthLens


EarthLens was developed with a team of data scientists, software developers, and product managers at Microsoft's //oneweek hackathon in 2016. It is an augmented reality application for Microsoft's HoloLens that helps scientists, students, and enthusiasts immersively explore and predict the health of our planet.
I led the software development of this project in Unity and collaborated with two other developers to deliver an MVP from scratch. The scenario we prototyped was a table-top experience of the Great Barrier Reef, where a user interactively experienced how the health of the corals has changed over past years. Using machine learning models trained on 50 years of reef health data, data science interns derived prediction equations for the future calcification and density of the corals. I worked on the visual depiction of the two primary scenes and their transitions, HoloLens gestures, and integrating the contributions of the other developers, who built the machine learning models for the coral scene.
To make the experience more immersive, I added spatial sound effects for gestures and surround-sound commentary on how to use the application.


Magic Mic


After meeting Stephen and Tyler at a music-focused Startup Weekend event in Seattle in 2015, we decided to hack together a Pure Data patch that creates stutter mic effects in 48 hours. The Pd patch ran on a laptop hooked to a mic through a sound controller. I contributed the MIDI component of the patch, which communicated with a TouchOSC XY pad on a mobile device for portable interactivity. As soon as the user touched the pad and spoke into the mic, the system activated and recorded a sample of the user's voice. Once a sample was stored, the user could modify it in real time by scrolling along the X or Y axis on their phone: the X axis controlled the playback speed and the Y axis controlled the length of the sample. Magic Mic is for people doing stand-up or hosting events who want to create stutter effects like “Che-che-che-check it out.” It was voted the crowd's most entertaining project, and we were invited to build it as a plugin for a local startup's Android music application.
Later, I enhanced the patch to incrementally record WAV files, which turned it into a handy sampler for collecting, saving, and modifying ambient sounds with just my laptop. I then imported the samples into Ableton and created experimental tracks with them.
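The mapping itself lived inside the Pure Data patch; purely as an illustration of that logic (not the patch), the C++ sketch below shows how normalized XY values from a TouchOSC-style pad could be mapped to playback speed and sample length. The parameter ranges are assumptions.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical mapping from a normalized XY pad (0.0-1.0 per axis) to
// playback parameters. The ranges below are illustrative assumptions,
// not the values used in the original Pure Data patch.
struct PlaybackParams {
    float speed;       // playback speed multiplier
    float lengthSecs;  // length of the looped sample slice, in seconds
};

PlaybackParams mapPadToParams(float x, float y, float sampleSecs) {
    x = std::clamp(x, 0.0f, 1.0f);
    y = std::clamp(y, 0.0f, 1.0f);
    PlaybackParams p;
    p.speed = 0.25f + x * (4.0f - 0.25f);             // X axis: 0.25x to 4x speed
    p.lengthSecs = 0.05f + y * (sampleSecs - 0.05f);  // Y axis: 50 ms up to the full sample
    return p;
}

int main() {
    PlaybackParams p = mapPadToParams(0.5f, 0.75f, 2.0f);
    std::printf("speed: %.2fx, length: %.2fs\n", p.speed, p.lengthSecs);
    return 0;
}
```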


Virtues


This work is inspired by Aristotle's Nicomachean Ethics, in which Aristotle explores at length what makes human beings happy and the reasons behind their well-being. According to him, good and successful people possess distinct virtues, which one should learn to identify in oneself and honor in others. He identified 12 virtues: courage, temperance, liberality, magnificence, magnanimity, proper ambition/pride, patience/good temper, truthfulness, wittiness, friendliness, modesty, and righteous indignation.
This work revisits each virtue and expresses it with sound and visuals. The visual for each track is generated with an interactive tool I developed in Processing, and the outputs are post-processed in GIMP. Each audio track binds together a virtue, its emotional presence, the visual art made for it, and the pair of vices between which the virtue lies.
Aristotle proposed that mastering even a subset of these virtues can go a long way toward ensuring human well-being and happiness.

A tool developed in Processing enables users to interactively create abstract art. Users can choose among four shapes (circle, square, rectangle, and line) and change the scale, stroke, rotation, opacity, and render-rate parameters of these shapes by referring to the tool's keyboard interface document. Compositions created with it were post-processed in GIMP to be finalized.
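The tool itself is written in Processing; as a rough C++ illustration of the same parameter-driven drawing idea (not the original code, and with hypothetical key bindings), an openFrameworks-style sketch might look like this:

```cpp
// Illustrative openFrameworks-style sketch (not the original Processing tool):
// number keys switch the active shape; other keys nudge scale, rotation, and opacity.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    int   shape    = 0;       // 0: circle, 1: square, 2: rectangle, 3: line
    float scale    = 100.0f;  // size of the shape in pixels
    float rotation = 0.0f;    // rotation in degrees
    int   opacity  = 200;     // alpha, 0-255

    void draw() override {
        ofPushMatrix();
        ofTranslate(ofGetWidth() / 2.0f, ofGetHeight() / 2.0f);
        ofRotateDeg(rotation);
        ofSetColor(255, 255, 255, opacity);
        switch (shape) {
            case 0:  ofDrawCircle(0, 0, scale); break;
            case 1:  ofDrawRectangle(-scale / 2, -scale / 2, scale, scale); break;
            case 2:  ofDrawRectangle(-scale, -scale / 2, scale * 2, scale); break;
            default: ofDrawLine(-scale, 0, scale, 0); break;
        }
        ofPopMatrix();
    }

    void keyPressed(int key) override {
        // Hypothetical bindings; the original tool documents its own keyboard interface.
        if (key >= '1' && key <= '4')  shape = key - '1';
        else if (key == '+')           scale += 10;
        else if (key == '-')           scale -= 10;
        else if (key == 'r')           rotation += 15;
        else if (key == 'o')           opacity = (opacity + 25) % 256;
    }
};

int main() {
    ofSetupOpenGL(800, 800, OF_WINDOW);
    ofRunApp(new ofApp());
    return 0;
}
```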


Recursive Patterns


This project is an investigation of recursion, a popular computer science technique for devising algorithms. I developed a system that renders a circular pattern at each depth and branches off from every circle to create a similar pattern, reproducing a mandala-like figure. Each of these sacred-geometry-like patterns is identified by the input angle given to the system, which determines the number of circles at each depth. Parameters like depth and angle determine the shape of the mandala. The patterns were also laser-cut on MDF (wood). Here is the code for this project.
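The project's own code is linked above; as a minimal sketch of the underlying recursion (assuming openFrameworks for drawing, with illustrative radii and depth limits), each level draws a ring of circles every `angleStep` degrees and recurses from each circle at half the radius:

```cpp
#include "ofMain.h"

// Sketch of the recursive idea only, not the linked project code.
// At each depth, circles are placed around a ring every `angleStep` degrees,
// and the function recurses from each circle with a smaller radius.
void drawMandala(float x, float y, float radius, float angleStep, int depth) {
    if (depth == 0 || radius < 2.0f) return;
    for (float a = 0.0f; a < 360.0f; a += angleStep) {
        float cx = x + radius * cos(ofDegToRad(a));
        float cy = y + radius * sin(ofDegToRad(a));
        ofNoFill();
        ofDrawCircle(cx, cy, radius * 0.5f);
        drawMandala(cx, cy, radius * 0.5f, angleStep, depth - 1);  // branch off from this circle
    }
}

class ofApp : public ofBaseApp {
public:
    void draw() override {
        ofSetColor(255);
        // An angleStep of 40 or 45 degrees yields 9 or 8 circles per ring.
        drawMandala(ofGetWidth() / 2.0f, ofGetHeight() / 2.0f, 200.0f, 45.0f, 3);
    }
};

int main() {
    ofSetupOpenGL(800, 800, OF_WINDOW);
    ofRunApp(new ofApp());
    return 0;
}
```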

Patterns generated with input angles of 40 and 45. The 45 and 40 pieces were exhibited at SAIC.


Github Workshop


During Fall 2017 at SAIC, I conducted a workshop on how to use GitHub for versioning creative coding projects. I used Daniel Shiffman's Git and GitHub for Poets videos as a reference to prepare the workshop. We started by covering the difference between Git and GitHub and walked through the web interface to create repositories, make commits, open pull requests, and observe user activity. Then we dove into the terminal and covered Git commands and workflows for organizing and versioning projects.


Sonic Negotiations


How does it feel to be in control? Sonic Negotiations is a dialog between participants and the sound around them. It questions what it means to be in control and how one shares that control with other humans in an interactive sonic environment. Each participant is presented with a visual cue on the screen about their presence, along with connections to the other participants in the room. Every participant entering the room enables a unique sound effect, such as reverb, delay, pitch shifting, or distortion, on the audio track. Moving participants are detected in the room using computer vision, and the distance between them maps to the intensity of these effects. The audio track for this piece is an excerpt from Alan Watts' philosophical discourse "What Do You Desire?", which argues that the primordial thing one desires, if money were no object, is to be in control. This work aims to place participants in a state of negotiating control with one another. Below, a series of images shows how the visual cues change with the number of participants in the room, along with video documentation of the system at the artist's studio at SAIC. The system was developed using openFrameworks, and here's the code for this project.
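The project's code is linked above; as a minimal sketch of the distance-to-intensity mapping (assuming participant centroids come from a computer vision tracker, and with a placeholder OSC address and port), an openFrameworks program might do something like this:

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Minimal sketch of the mapping only (not the project's actual code).
// Given participant positions from a computer-vision tracker, the average
// pairwise distance is normalized and sent as an effect-intensity parameter.
class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;
    std::vector<glm::vec2> participants;  // centroids supplied by the tracker (assumed)

    void setup() override {
        sender.setup("127.0.0.1", 9000);  // host and port are placeholders
    }

    void update() override {
        if (participants.size() < 2) return;
        float total = 0.0f;
        int pairs = 0;
        for (size_t i = 0; i < participants.size(); ++i)
            for (size_t j = i + 1; j < participants.size(); ++j) {
                total += glm::distance(participants[i], participants[j]);
                ++pairs;
            }
        float avg = total / pairs;
        // Map average distance (in pixels) to a 0-1 intensity; the range is an assumption.
        float intensity = ofMap(avg, 0.0f, ofGetWidth(), 0.0f, 1.0f, true);

        ofxOscMessage m;
        m.setAddress("/effect/intensity");  // hypothetical OSC address
        m.addFloatArg(intensity);
        sender.sendMessage(m, false);
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
    return 0;
}
```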


Tree of Life


Tree of Life is an audio-visual performance of the unravelling of a tree over time. This work is inspired by philosophical parallels between the growth of a tree and the human body, where different stages of a tree's growth are represented by a visual and sonic landscape. In the initial stages of its growth, a tree receives nutrients like water and sunlight, just as a child grows and receives visual and auditory stimulation from the environment. This information shapes the child's brain and influences their growth. As the child becomes an adult, they meet various people and develop relationships. These relationships are like the branches of a tree that bear fruit and flowers with time, just as human relationships flourish and yield new connections. Eventually, the tree completes its life and becomes one with the ground from which it grew. Similarly, the human body completes its sojourn and perishes, as it is destined to.
This performance creates a unique soundscape by combining fabricated metal objects with technology, body gestures, and audio-reactive visuals to express this growth process from beginning to end. It explores a sonic virtuality that imagines an audio language for expressing the performer's stored experiences and memories of his upbringing through the growth of a tree.
With the metal dishes acting as external tongues, the body gestures play the role of the mind, which gives words to our expressions and uses the vocal organs to produce sound. Here, body gestures become the intellect, and the metallic objects become the vocal organs that express this tree of life.

Flow of thoughts that led to the ideation of Tree of Life. The goal was to create an audio-visual system for self-expression.

This project was developed with openFrameworks. Audio from the metallic dishes was captured with microphones and routed into Ableton. A custom TouchOSC interface sent OSC messages carrying audio and visual commands to an openFrameworks program running on a laptop. Audio commands were processed and sent to Ableton as MIDI messages to control the sound, while visual commands were forwarded as OSC messages to another openFrameworks program running on a machine across the room, which drove the visuals on the screen. Here's the code for this project.
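As an illustration of that routing only (not the linked code), an openFrameworks program using the ofxOsc and ofxMidi addons could listen for TouchOSC messages, translate audio commands into MIDI for Ableton, and relay visual commands to the second machine. The ports, OSC addresses, and MIDI CC numbers below are assumptions:

```cpp
#include "ofMain.h"
#include "ofxOsc.h"
#include "ofxMidi.h"

// Illustrative routing sketch only; ports, OSC addresses, and MIDI CC
// numbers are assumptions, not the project's actual values.
class ofApp : public ofBaseApp {
public:
    ofxOscReceiver touchOscIn;   // messages from the TouchOSC interface
    ofxOscSender   visualsOut;   // relay to the visuals machine across the room
    ofxMidiOut     midiOut;      // MIDI into Ableton

    void setup() override {
        touchOscIn.setup(8000);                  // port TouchOSC sends to (assumed)
        visualsOut.setup("192.168.0.20", 9000);  // placeholder address and port
        midiOut.openPort(0);                     // first available MIDI port
    }

    void update() override {
        while (touchOscIn.hasWaitingMessages()) {
            ofxOscMessage m;
            touchOscIn.getNextMessage(m);
            if (m.getAddress().rfind("/audio/", 0) == 0) {
                // Audio command: convert a 0-1 fader value into a MIDI CC for Ableton.
                int value = (int)ofMap(m.getArgAsFloat(0), 0.0f, 1.0f, 0, 127, true);
                midiOut.sendControlChange(1, 20, value);  // channel 1, CC 20 (assumed)
            } else if (m.getAddress().rfind("/visual/", 0) == 0) {
                // Visual command: forward untouched to the second machine.
                visualsOut.sendMessage(m, false);
            }
        }
    }
};

int main() {
    ofSetupOpenGL(800, 600, OF_WINDOW);
    ofRunApp(new ofApp());
    return 0;
}
```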

Custom TouchOSC interface running on an iPad for audio control (left) and video effects (right).


Parabolic Sound


Parabolic Sound is an interactive sound installation that lets users manipulate sound using parabolic metal dishes and a custom TouchOSC interface. It began as an investigation into converting metal objects into capacitive touch sensors using a simple RC-circuit principle and exploring new gestural relationships between human skin conductivity and sound. In this system, the user can cycle through audio samples and select audio effects like delay, distortion, and pitch modulation to be manipulated through interaction with the dish. Through the TouchOSC interface, they can also apply these effects to sine, square, or triangle waves and mix them with the output of the audio samples. Part of this system was used in the Tree of Life project to extract sounds from body gestures, and it was exhibited at the Creative Coding exhibition at SAIC in Fall 2017.
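As a sketch of the capacitive-sensing idea (assuming an Arduino with the CapacitiveSensor library; the pins, resistor value, and sample count are assumptions), a metal dish wired into an RC circuit can be read and streamed over serial like this:

```cpp
// Arduino-style sketch of the RC-circuit capacitive sensing idea only.
// Pin numbers, resistor value, and sample count are assumptions.
#include <CapacitiveSensor.h>

// A high-value resistor (e.g. ~1 MOhm) sits between the send pin (4) and the
// receive pin (2); the receive pin connects to the metal dish.
CapacitiveSensor dish = CapacitiveSensor(4, 2);

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Larger readings mean the skin is closer to, or touching, the dish.
  long reading = dish.capacitiveSensor(30);  // 30 samples per reading
  Serial.println(reading);                   // streamed onward to the sound patch
  delay(10);
}
```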


Creative Coding Exhibition


During Fall 2017 at SAIC, I got the opportunity to be Christopher Baker's Teaching Assistant (TA) for the Intro to Creative Coding class, which introduced students to openFrameworks and dove deep into it. At the end of the semester, a total of 33 students from both sections, including me, exhibited their work in the ATS Flex Space at SAIC. As a TA, I got to experience the creative capacity of the students, help ramp up their programming skills, and assist them in bringing their creative visions to life. Here are some highlights from the show.


Breath Foliage


Breath Foliage is an interactive artwork that idealizes the symbiotic relationship between humans and nature through breath. In this installation, participants exhale into a carbon dioxide station, which collects the breath and converts it into the branches of a tree. Through this, Breath Foliage questions the equilibrium between humans and nature, which is mediated by the quality of the air in our surroundings. In the current age of the Anthropocene, how long will this equilibrium hold before we hit the tipping point? By collecting and storing participants' breath in a growing foliage, this work holds each breath as a promise to reflect upon the quality of the air around us.

Breath data was collected using an MQ135 gas sensor connected to an Arduino, calibrated to be sensitive to the carbon dioxide in the room. The breath data was sent over OSC (Open Sound Control) to another machine running a Processing sketch, which was responsible for growing the tree.
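As a sketch of the sensing side only (the pin, calibration routine, and serial format are assumptions, and forwarding the readings to the Processing machine as OSC is handled separately), the Arduino could read the MQ135 like this:

```cpp
// Arduino-style sketch of the breath-sensing side only; pin, baseline,
// and message format are assumptions. A separate bridge forwards the
// values over OSC to the machine running the Processing sketch.
const int MQ135_PIN = A0;
int baseline = 0;

void setup() {
  Serial.begin(9600);
  // Rough calibration: sample the room's ambient level at startup.
  long sum = 0;
  for (int i = 0; i < 50; i++) {
    sum += analogRead(MQ135_PIN);
    delay(20);
  }
  baseline = sum / 50;
}

void loop() {
  int reading = analogRead(MQ135_PIN);
  int breath = max(0, reading - baseline);  // how far above ambient the breath pushes the sensor
  Serial.println(breath);                   // one value per line for the OSC bridge
  delay(50);
}
```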


Bad Breath


Bad Breath is a sonic recreation of machines breathing alongside humans in our ecosystem. Since the onset of the industrial revolution, these machines have built an omnipresent wall of sound around us. R. Murray Schafer described the sounds of air ducts, exhausts, and ventilation systems in modern-day architecture as the bad breath of machines, and this work is inspired by his writings.
Humans have a symbiotic relationship with machines, which represent urbanism, development, convenience, and necessity in our lives. But how does their physical presence affect the natural environment? Is there an equilibrium between humans, environment, and machines, or have we already tipped past it? What consequences await us? This 4-channel sound composition layers field recordings of vents, exhausts, and heavy machines made in the alleyways of Chicago and arranges them with the sound of breath to give them a lifelike character.
The form of this work is inspired by the shape of the giant chimneys of factories and industrial plants from which smoke is expelled. Speakers are installed underneath the metal pipes to use their resonance and create a hyper-local sonic environment. The work intends to confront the audience with the sonic essence of machines and prompt reflection on the state of the environment we live in today. It asks what contributions we must make to restore the equilibrium between humanity, its inventions, and nature. This installation was exhibited at Experimental Sound Studio in Chicago as part of Waveform and has been selected for the Level Up Grant by the Shapiro Research Center.


Generative Decay


Generative Decay surfaces the dichotomy between objective and subjective phenomena in this world. It's inspired by the theory that physical objects do not exist as things in themselves but only as perceptual phenomena, bundles of sensory data situated in time and space. Using computer vision and computational algorithms, the artist paints a mental image of a familiar physical object, like a flower vase, to reflect this duality in reality. This ethereal, constantly updating generative image represents the artist's mental perception of the physical object, continually shedding layers of appearance and unravelling the subjective nature of human consciousness.


Ontology of Cryptocollectibles


Ontology of Cryptocollectibles is about the digital and reproductive nature of assets like CryptoKitties and CryptoPunks. It questions the economic fuel and the psychology of ownership behind collecting these assets (non-fungible digital tokens) with cryptocurrency in our society. The work uses custom software and reproduction techniques to build another creative abstraction on top of these digital assets, creating a post-cryptocollectible artwork. The visual imagery reveals the reproductive ontology of these assets by slowly removing subsections of an asset layered over its own reproduction, creating a never-ending loop of revealings over time. Today, we are in the midst of a blockchain revolution, where provenance is proving to be pivotal for digital art. With digital platforms like CryptoKitties and rare assets like CryptoPunks, we have identified an age-old human fetish for collecting artifacts and claiming ownership through a digital token. This has created a wave of active development and innovation in the space of art and blockchain, which looks promising for visual artists in the modern age of the internet. All it could take to own a rare piece of art is a digital token, securely stored in an e-wallet. This work was developed and exhibited as part of the Ethereal Summit in New York on May 11-12, 2018, where I was invited as a visiting artist to explore the realm of art and blockchain.
