the two

is it possible for robots to fall in love?

This installation consists of two identical robotic arms with cameras connected to two identical computers. Each computer runs a custom deep neural network trained to recognize the other robot. Using their cameras, the robots attempt to find and track each other as they move independently. Playfully dancing, the robots are at times attracted to each other, while at other times they seem repelled by their mate. Tension increases as they almost touch, only to quickly pull away. If one of the robots does not see the other, it goes to a resting position briefly before it begins to look for its counterpart again. When a robot positively identifies its mate a given number of times, the network is re-trained based on the new data. Through this continual training and re-training, the robots conceivably increase their proficiency at recognizing and finding one another. In this way, as they lock onto each other’s loving gaze, the robots become more and more familiar with their mates.
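The retraining trigger described above can be sketched in outline. The threshold, state structure and function name here are hypothetical illustrations, not the actual control software:

```python
RETRAIN_AFTER = 50  # hypothetical number of positive IDs before retraining


def update_tracker(state, detected):
    """Count positive identifications of the mate and signal a retrain
    once the configured number has been reached."""
    if detected:
        state["positives"] += 1
    if state["positives"] >= RETRAIN_AFTER:
        state["positives"] = 0   # start counting toward the next retrain
        return "retrain"         # re-train the network on the new data
    return "track" if detected else "rest"
```

Each camera frame feeds one call; "rest" corresponds to the brief resting position when the counterpart is out of view.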


Mars wind collaboration with NASA JPL - 2024

tele-present wind (Mars wind version)

This installation is a collaboration with the NASA Jet Propulsion Laboratory. The piece consists of a series of 126 x/y tilting mechanical devices connected to tall dried grass stalks installed like a field in the gallery. The mechanisms will tilt, move and sway based on data collected from the wind sensor on the Perseverance Mars rover.

Dr. José A. Rodríguez-Manfredi, lead scientist on the Mars Environmental Dynamics Analyzer on Perseverance, assisted us in collecting the wind data for the project. That data is mapped to the movement of the mechanisms. Thus, the individual components of the installation here on earth will move in unison as they mimic the direction and intensity of the wind from another planet.

Azkuna Zentroa, Bilbao - 2018

tele-present wind

This installation consists of a series of 126 x/y tilting mechanical devices connected to thin dried plant stalks installed in a gallery and a dried plant stalk connected to an accelerometer installed outdoors. When the wind blows, it causes the stalk outside to sway. The accelerometer detects this movement, transmitting the motion to the grouping of devices in the gallery. Therefore the stalks in the gallery space move in real-time and in unison based on the movement of the wind outside.

From May to September 2018, a newly expanded version of tele-present wind was installed at Azkuna Zentroa, Bilbao, and the sensor was installed in an outdoor location adjacent to the Visualization and Digital Imaging Lab at the University of Minnesota. Thus the individual components of the installation in Spain moved in unison as they mimicked the direction and intensity of the wind halfway around the world. As it monitored and collected real-time data from this remote and distant location, the system relayed a physical representation of the dynamic and fluid environmental conditions.
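A minimal sketch of the sensor-to-mechanism mapping, assuming a two-axis accelerometer reporting in g; the clamp range and tilt limit are illustrative calibration values, not the installation's actual ones:

```python
def accel_to_tilt(ax, ay, max_g=1.0, max_tilt_deg=30.0):
    """Map two-axis accelerometer readings (in g) from the outdoor stalk
    to the x/y tilt angles broadcast to every device in the gallery."""
    def clamp(v):
        return max(-max_g, min(max_g, v))
    scale = max_tilt_deg / max_g
    # the same pair of angles drives all 126 mechanisms in unison
    return clamp(ax) * scale, clamp(ay) * scale
```

A gust that saturates the sensor simply pins the stalks at their maximum tilt until the reading falls again.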

 

Laboratoria Art & Science Space, Moscow - 2010

In June and July of 2010, this version of tele-present wind was installed at Laboratoria Art & Science Space in Moscow and the sensor was installed in an outdoor location adjacent to the Visualization and Digital Imaging Lab at the University of Minnesota.


cloud drawing device

This installation automatically creates a ball point pen rendering of the current cloud formations and sky conditions in the location of the gallery space. Using a camera mounted outdoors and pointed at the sky, the installation obtains an image. With custom software, the system converts this captured image into a linear vector file. A plotter mounted to the wall adjacent to the camera then creates a linear drawing of the image. Every 24 hours the system advances a paper scroll and creates a new drawing. The result is a record consisting of a series of linear ball point pen drawings showing the sky conditions on every day that the installation is running. Using a mechanical device, this installation creates one-of-a-kind, ever-changing renderings of the daily sky conditions of the outside space it occupies.

 


bamboo bot

This installation enables a live plant to position the robotic arm that it rides on top of. bamboo bot has a control system that reads and utilizes the electrical noises found in a live lucky bamboo plant. The system uses an open source micro-controller connected to the plant to read varying resistance signals through the plant’s stalk. Using custom software, these signals are mapped in real-time to the movements of the joints of the industrial robot holding the plant. Thus, the movements of the robot are determined based on input from the plant. The plant grows toward a light positioned above it, continually reshaping itself as the piece dances, maneuvers and responds in space.


outsourced narcissism

This installation consists of a computer, a robotic arm with a camera attached and a mirror mounted to the wall. The computer runs a custom-trained artificial intelligence object detection neural network at the same time as it controls the movements of the robot. The AI model is being trained to recognize the robot; thus the robot looks in the mirror with its camera and attempts to recognize itself. If it does, the computer draws an annotation box around the image of the robot with the label “me”. When the box is drawn, the robot moves in relation to where the box is located within its field of view. The box has a percentage of certainty that shows how sure the artificial intelligence model is that it sees itself. If the level of certainty is above 85%, the system will take and post a selfie on its personal Instagram account @outsourced_narcissism. The system saves the images it collects and automatically uses them to retrain the artificial intelligence model, conceivably making the robot more and more proficient at identifying and posting images. In this way, the robot’s increasing self-awareness enables it to continually look for, attempt to identify and post images of itself.
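The decision logic above can be sketched as follows. Only the 85% threshold comes from the piece itself; the function shape and label format are hypothetical:

```python
CONFIDENCE_THRESHOLD = 0.85  # selfies are posted above 85% certainty


def decide(detections):
    """Given (label, confidence) pairs from the object detector, decide
    whether the robot posts a selfie, keeps tracking, or keeps searching."""
    me_scores = [conf for label, conf in detections if label == "me"]
    if not me_scores:
        return "search"        # no self-recognition: keep looking
    if max(me_scores) >= CONFIDENCE_THRESHOLD:
        return "post_selfie"   # certain enough to post to Instagram
    return "track"             # move relative to the annotation box
```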

 


plant machete

This installation enables a live plant to control a machete. plant machete has a control system that reads and utilizes the electrical noises found in a live philodendron. The system uses an open source micro-controller connected to the plant to read varying resistance signals across the plant’s leaves. Using custom software, these signals are mapped in real-time to the movements of the joints of the industrial robot holding a machete. In this way, the movements of the machete are determined based on input from the plant. Essentially, the plant is the brain of the robot, controlling the machete and determining how it swings, jabs, slices and interacts in space.
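The real-time mapping from plant signals to robot joints might look like the sketch below, assuming 10-bit ADC readings from the micro-controller; the signal range and joint limits are assumptions, not the installation's actual calibration:

```python
def map_signals_to_joints(readings, limits):
    """Map raw variable-resistance readings (assumed 0-1023 from a
    micro-controller ADC) onto joint angles within each joint's range.
    One reading drives one joint; limits are (min_deg, max_deg) pairs."""
    joints = []
    for value, (lo, hi) in zip(readings, limits):
        t = max(0, min(1023, value)) / 1023.0   # normalize to 0..1
        joints.append(lo + t * (hi - lo))        # scale into joint range
    return joints
```

Each new sample re-poses the whole arm, so noise in the plant's electrical signals becomes the swinging and jabbing of the machete.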

 


wilderness 2021

Thirteen disposable plastic shopping bags playfully dance in the gallery space articulated by wave data collected on a voyage across the Pacific Ocean.

Data for this installation was collected during an Artist-at-Sea residency aboard the Schmidt Ocean Institute research vessel Falkor as it transited from Astoria, Oregon to Honolulu, Hawaii in July 2019. While in the middle of the enormous ocean, it at times felt as if this place was completely untouched by human activities. This delusion was destroyed by the occasional piece of human-made debris floating in this vast wilderness. Using an on-board accelerometer, every movement of the ship caused by wave action was collected during the entire journey. The Z movement data is directly mapped to the movements of the bags floating in the gallery space, creating the effect that the bags are suspended in waves of the Pacific.

wilderness 2020

Three disposable plastic shopping bags playfully dance in the gallery space articulated by wave data collected on a voyage across the Pacific Ocean.

 


the journey 

the journey is an installation that uses multi-beam sonar seafloor data collected during a transit of the Pacific Ocean in July of 2019. During this journey aboard Research Vessel Falkor, an approximately 10-mile-wide swath of seafloor was scanned using a sophisticated onboard multi-beam sonar depth sensor. The data gathered from the entire journey is converted into 3D model surfaces of the seafloor underneath the vessel as it transited the Pacific Ocean. For the installation, a portion of these 3D models will be carved into individual sections of clear acrylic and set end to end, recreating an approximately 27-foot installation in the gallery space. An RGB LED strip will be installed in the bottom of the acrylic sections and programmed to illuminate a section of the journey chasing across the gallery space. This will recreate a sense of movement illustrating the vessel’s journey across the Pacific Ocean as it scanned the seafloor.

 


the other side

This installation automatically creates a three-dimensional relief carving of the current cloud formations and ocean surface conditions on the opposite side of the earth from the location of the gallery space. Using satellite data from the NASA Earth Observing Information System and the GPS coordinates of the gallery, the installation obtains a current image of an approximately six-hundred-square-mile area on the opposite side of the earth from its location. Using custom software, the system converts this image into a relief model that is sent to an onsite CNC machine hanging upside down in the gallery space. The CNC machine carves the relief in pink foam, with its upside-down orientation directly echoing the cloud and ocean topography on the opposite side of the earth. Every day a new carving is created and displayed on the gallery walls adjacent to the installation. Viewers are encouraged to touch the foam carvings, giving them the ability to touch the opposite side of the earth during the exhibition.
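Finding the point opposite the gallery is a small, well-defined computation on its GPS coordinates. This sketch assumes decimal degrees with east-positive longitude:

```python
def antipode(lat, lon):
    """Return the point on the opposite side of the earth from the
    given coordinates (decimal degrees, east-positive longitude)."""
    anti_lat = -lat                               # mirror across the equator
    anti_lon = lon - 180 if lon > 0 else lon + 180  # flip hemispheres
    return anti_lat, anti_lon
```

For a gallery in Minnesota, for example, the satellite image would be requested from a patch of the southern Indian Ocean.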

 


plant bot 

plant bot is a time-based interactive art installation where the fates of a living plant and a computer are interdependent. Essentially, the plant attempts to train a computer using image recognition. Through this process the computer will learn to recognize when the plant needs water based on images it takes of the plant. If the plant appears healthy, the computer will maintain a regular watering regimen. If the plant does not appear healthy to the computer, it will attempt to aid the plant by adjusting to what it “thinks” the plant needs based on the images gathered. As the computer becomes more intelligent and hence more adept at caring for the plant, the plant will conceivably thrive and grow in proportion. If the computer is unsuccessful, conceivably the opposite will occur.

This installation consists of a live house plant, a computer and a robotic arm with a camera attached. The robotic arm with the camera observes the plant using an artificial intelligence object detection model running on the computer. It looks for happy or dead leaves as the plant rotates on a turntable. The robot moves relative to the size, location and type of leaf it sees (happy or dead). If 10 or more leaves are detected, this image of the plant and its happy/dead leaf annotation boxes are recorded by the computer. When 1000 of these images are collected, the computer uses the images and annotations to retrain the artificial intelligence model. At the same time, the system uses the ratio of happy to dead leaves to determine how much water to give the plant: the more dead leaves are detected relative to happy ones, the more the watering dose is adjusted in proportion.
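One plausible reading of the happy/dead watering rule, sketched with an illustrative base dose; the actual proportions used by the system are not specified in the text:

```python
def water_amount(happy, dead, base_ml=100.0):
    """Scale the watering dose by the ratio of happy leaves to total
    leaves detected. base_ml is an illustrative full dose."""
    total = happy + dead
    if total == 0:
        return base_ml   # no leaves detected: fall back to the full dose
    return base_ml * happy / total
```

Under this reading a plant seen as entirely healthy gets the full dose, and the dose shrinks as the share of dead leaves grows.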

 


SPACEJUNK

The 50 twigs in this installation point in unison in the direction of the oldest piece of spacejunk currently above the horizon in their location. The invisible network of forgotten debris being tracked consists of spent rocket bodies, parts from defunct satellites and wayward tools launched in missions as far back as 1958. Constantly circling the earth at over 17,000 miles per hour, these unseen distant relics enter our physical space approximately every 90 minutes. With decaying orbits, the debris rise and set in ever-changing arcs. As if compelled by phototropism, the twigs collectively strive in unison, bending toward these tiny, invisible, inert suns sweeping across the sky. When the debris being tracked drops below the horizon, the twigs all go to a downward-pointing position and await the rise of the next orbiting fragment.
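Stripped of the orbital mechanics, the selection rule reduces to choosing the oldest catalog entry currently above the horizon. This sketch assumes each entry's altitude in degrees has already been computed from the tracking data:

```python
def select_target(catalog):
    """Pick the oldest piece of debris currently above the horizon.
    catalog: list of (launch_year, altitude_deg) pairs.  Returns the
    chosen entry, or None, which sends the twigs to their rest position."""
    visible = [entry for entry in catalog if entry[1] > 0]  # above horizon
    if not visible:
        return None
    return min(visible, key=lambda entry: entry[0])  # oldest launch first
```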

SPACEJUNK -2015

The 5 twigs in this installation point in unison in the direction of the oldest piece of human-made space debris currently above the horizon. The debris being tracked are spent rocket bodies, parts from broken satellites and wayward tools launched in missions as far back as 1959. When the piece of debris being tracked drops below the installation’s horizon, the twigs go to a rested, downward-pointing position and await the next piece of debris to appear. The composition of the installation is continually changing as it tracks the oldest discarded objects orbiting the earth that enter its point of view.

plant drone 

A drone’s movements are determined by real-time variable resistance data collected from a live on-board plant. Data from each of the plant’s leaves determines the drone’s left-to-right, forward-to-reverse and altitude movements. Essentially, the plant is the pilot of the drone. An ultra-bright LED is mounted to the plant-piloted drone. Using a camera with an open shutter on the ground to document the flight path enables the plant pilot to create long-exposure drawings in the night sky.


46°41'58.365" lat. -91°59'49.0128" long. @ 30m 
refers to the source location where the water surface data was collected for this series. An autonomous aerial vehicle hovering 30 meters above Lake Superior captured still images of the water’s surface. For this series of five, the vehicle was deployed to the same location on different days and in different weather conditions. The collected images were converted into three-dimensional models using open source software. The models were then carved with a CNC router into a series of clear acrylic cylinders. This process captured the dynamic movements of the waves and ripples from a specific time and location and suspended this ever-changing water pattern into a static transparent form. 

Approximate dimensions: 152mm diameter x 150mm height

David Bowen is a fiscal year 2014 recipient of an Artist Initiative grant from the Minnesota State Arts Board. This activity is funded, in part, by the Minnesota arts and cultural heritage fund as appropriated by the Minnesota State Legislature with money from the vote of the people of Minnesota on November 4, 2008.

FLY CARVING DEVICE

In this installation, one hundred live houseflies control a 5 axis CNC router as it carves a block of foam.

The flies move and interact inside an acrylic sphere as a camera tracks their motion. This motion is processed with custom software and mapped to the code used to control the axes of the CNC machine. When a single fly is detected, the machine simply follows the movements of that fly. If several flies are in the field of view, the software moves the machine based on the activities of the collective. In this way, the flies are the brain of the CNC machine, determining where, when, how fast, and how deep to carve the foam.
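The single-fly versus collective behavior might be sketched as follows; reducing the group to its centroid is one plausible reading of "the activities of the collective", not the confirmed method:

```python
def collective_target(fly_positions):
    """Reduce tracked fly positions to one toolpath target.  A single
    fly is followed directly; several flies steer the machine toward
    the centroid of the collective."""
    if not fly_positions:
        return None   # no flies in view: hold position
    n = len(fly_positions)
    return (sum(x for x, _ in fly_positions) / n,
            sum(y for _, y in fly_positions) / n)
```

With one fly the centroid is simply that fly's position, so both behaviors described above fall out of the same rule.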

5twigs 

This installation consists of 5 found twigs that were three dimensionally scanned and then printed in translucent plastic. Each original twig was then mounted in opposition to its artificial counterpart.



waveline

This installation uses real-time video of the waves as they break on the shoreline in two different locations on opposite sides of the earth. The two videos of the waves are mapped to the 7800 RGB LEDs in each of the window spaces at the Minnesota Museum of American Art, St Paul. The LED panels on the left are showing real-time conditions at Jobos Beach, Puerto Rico in the northern hemisphere. And the LED panels on the right are showing real-time conditions at Padang Port, Bali, 12 time zones away and in the southern hemisphere. The resulting installation projects onto the city streets two low resolution streams of current and changing conditions where the oceans meet the shoreline on opposing sides of the globe.


Commissioned by the Minnesota Museum of American Art


flyAI

This installation creates a situation where the fate of a colony of living houseflies is determined by the accuracy of artificial intelligence software. The installation uses the TensorFlow machine learning image recognition library to classify images of live houseflies. As the flies fly and land in front of a camera, their image is captured. The captured image is classified by the image recognition software and a list of guessed items is ranked 1 through 5. Each of the items is assigned a percentage based on how likely the software thinks that the listed item is what it sees. If “fly” is ranked number 1 on the list, a pump delivers water and nutrients to the colony based on the percentage of the ranking. If “fly” is not ranked number 1, the pump does not deliver water and nutrients to the colony. The system is set up to run indefinitely with an indeterminate outcome.
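The pump rule reduces to a check on the classifier's top-ranked guess; the dose value in this sketch is an illustrative assumption:

```python
def pump_ml(ranked, full_dose_ml=10.0):
    """ranked: top-5 (label, probability) guesses from the classifier,
    best first.  Water and nutrients are delivered only when 'fly' is
    ranked number 1, scaled by the classifier's confidence."""
    if ranked and ranked[0][0] == "fly":
        return full_dose_ml * ranked[0][1]   # dose proportional to ranking
    return 0.0                               # misclassified: no delivery
```

The colony's survival thus hinges entirely on whether the model keeps seeing flies as flies.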

water surface 

water surface is an installation that connects the gallery space to current conditions on the surface of the oceans all over the world. As wave levels and structures fluctuate in globally distant locations, the installation reflects this activity by creating continually varying three-dimensional patterns in a matrix of LED lights.

To articulate the LED matrix, a total of six pieces of information pertaining to current wave conditions are used. Wave swell height, wave swell period and wave swell direction are overlaid with wind wave height, wind wave period and wind wave direction. The wind and swell waves often have varying height, period, intensity and direction and overlaying this intersecting data creates complex, ever changing three-dimensional patterns in light within the matrix. Current wave conditions at specific locations determine the intensity or subtlety in the composition of the sculpture creating a situation where the patterns are rarely identical.
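One way to picture the overlay is as a sum of two directional plane waves per LED cell, one for swell and one for wind waves. This is an illustrative simplification of the actual mapping, with made-up units:

```python
import math


def led_height(x, y, t, swell, wind):
    """Overlay swell and wind-wave components to drive one LED cell.
    Each component is (height, period_s, direction_rad)."""
    def wave(height, period, direction):
        # project the cell position onto the wave's travel direction
        phase = x * math.cos(direction) + y * math.sin(direction)
        return height * math.sin(2 * math.pi * (t / period) + phase)
    return wave(*swell) + wave(*wind)
```

Because the two components usually differ in height, period and direction, their sum drifts through interference patterns rather than repeating, which matches the rarely-identical compositions described above.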

The sculpture uses data from a selection of five different buoys from the NOAA network chosen for their remote and varied locations across the globe. Every hour the sculpture switches to a different buoy in a new location giving observers a glimpse at the current wave conditions from all over the world.

Using data to articulate the sculpture in this way enables the installation to produce simulations of real-time conditions in a variety of distant oceanic locations within the local environment.

landscape #1 

This work is a real-time video projection of point-cloud data streaming from a three-dimensional camera installed in an outdoor location. The video displayed consists of over 250,000 individual points, creating a 3D model of a living stand of trees. The color of the individual pixels shifts from blue to red to green in respect to their relative distance from the camera. This color shift illustrates the dynamic depth of the three-dimensional field giving the scene an artificial appearance and producing a formal contrast between the natural forms and the digital system collecting the data. As it monitors and collects real-time data from this location, the system relays an incomplete physical representation of a dynamic living landscape and fluid environmental conditions in digital form.

cloud piano
This installation plays the keys of a piano based on the movements and shapes of the clouds. A camera pointed at the sky captures video of the clouds. Custom software uses the video of the clouds in real-time to articulate a robotic device that presses the corresponding keys on the piano. The system is set in motion to function as if the clouds are pressing the keys on the piano as they move across the sky and change shape. The resulting sound is generated from the unique key patterns created by ethereal forms that build, sweep, fluctuate and dissipate in the sky.

This installation was commissioned by L’assaut de la Menuiserie, Saint-Etienne, France and completed with support from the Visualization and Digital Imaging Lab and Weber Music Hall, University of Minnesota.

fly revolver
Based on the activities of a collection of houseflies, this device controls a revolver. The flies live inside an acrylic sphere with a target backdrop. As the flies move and interact inside their home, they fly in front of and land on the target. These movements are collected via video and processed with custom software, and the output articulates a robotic device that aims the revolver in real-time based on the flies’ relative location on the target. When a single fly is detected, the revolver simply follows the movement of that fly. If several flies are in the field of view, the software moves the revolver based on the activities of the collective. If a fly is detected in the center of the target, the trigger of the revolver is pulled. In this way, the flies are essentially the brain of the device, controlling the revolver by determining where it is aimed and when it is fired.

biennale Interieur - kortrijk, Belgium - 2012

underwater 
is a large-scale suspended installation that is articulated using data from the surface of water. A repurposed Microsoft Kinect was used to collect three-dimensional data from wave action on the surface of Lake Superior. This data is used to articulate the mechanical installation consisting of 486 individual servomotors. The complex and subtle movements on the surface of the water are simulated within the installation by the servomotors moving according to the collected data. This version of the installation was commissioned for Future Primitives, a group exhibition at the 2012 Biennale Interieur, Kortrijk, Belgium.

 

FURTHER WORKS IN THE UNDERWATER SERIES

Minneapolis Institute of Arts - 2013

underwater 
In this large-scale version of the work the data collected by the repurposed Kinect is used to articulate the mechanical installation consisting of 729 individual servomotors. This version of the installation was featured in a one-person exhibition titled underwater at the Minneapolis Institute of Arts in 2013.

This installation was completed with support from: Biennale Interieur, Kortrijk, BE, The Arrowhead Regional Arts Council, McKnight Individual Artist Fellowship and The Minnesota Artist Exhibition Program, Minneapolis Institute of Arts.

 

lentos kunstmuseum - linz, austria - 2014

underwater
In this smaller-scale version of the piece the repurposed Kinect data is used to articulate the mechanical installation consisting of 81 individual servomotors. This version of underwater was featured in Pure Water, a group exhibition at the Lentos Kunstmuseum, Linz, Austria.

tele-present water
This installation draws information from the intensity and movement of the water in a remote location. Wave data is being collected and updated from National Oceanic and Atmospheric Administration data buoy station 51003. This station was originally moored 205 nautical miles southwest of Honolulu in the Pacific. It went adrift, and the last report from its moored position was around 04/25/2011. It is still transmitting valid observation data, but its exact location is unknown. The wave intensity and frequency collected from the buoy are scaled and transferred to the mechanical grid structure, resulting in a simulation of the physical effects caused by the movement of water from this distant unknown location. This work physically replicates a remote experience and makes observation of the activity of an isolated object, otherwise lost at sea, possible through direct communication.

 

image credit NOAA


 

WRO2011 NATIONAL MUSEUM- WROCLAW, POLAND - 2011

This installation draws information from the intensity and movement of the water in a remote location. Wave data is being collected in real-time from National Oceanic and Atmospheric Administration data buoy station 46246, 49.985 N 145.089 W (49°59'7" N 145°5'20" W) on the Pacific Ocean. The wave intensity and frequency is scaled and transferred to the mechanical grid structure resulting in a simulation of the physical effects caused by the movement of water from halfway around the world.

 

lake superior data version - 2011

This installation uses information from the intensity and movement of the water in a remote location. Wave g-force and acceleration data was collected during a voyage on Lake Superior. The wave data is scaled and transferred to the mechanical grid structure resulting in a simulation of the physical effects caused by the movement of water.

David Bowen is a fiscal year 2011 recipient of an Artist Initiative grant from the Minnesota State Arts Board. This activity is funded, in part, by the Minnesota arts and cultural heritage fund as appropriated by the Minnesota State Legislature with money from the vote of the people of Minnesota on November 4, 2008.


fly tweet
This device sends twitter messages based on the activities of a collection of houseflies. The flies live inside an acrylic sphere along with a computer keyboard. As the flies move and interact inside their home, they fly over the keys on the keyboard. These movements are collected in real-time via video. When a particular key is triggered by the flies, the key’s corresponding character is entered into a twitter text box. When 140 characters are reached or the flies trigger the “enter” key, the message containing the accumulated characters is tweeted. Thus live twitter messages are perpetually sent in real-time based on the simple movements of the community of houseflies. These constantly accumulating messages appear as records of random activity within the larger sphere of social media and networking.
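The message-accumulation rule can be sketched as a small state machine; the function shape here is hypothetical, but the 140-character limit is the one named in the piece:

```python
TWEET_LIMIT = 140  # twitter's character limit at the time of the piece


def feed_key(buffer, key):
    """Append one fly-triggered key to the draft message.  Returns
    (new_buffer, tweet), where tweet is the message to send, or None
    while characters are still accumulating."""
    if key == "enter":
        return "", buffer            # flies hit enter: send what we have
    buffer += key
    if len(buffer) >= TWEET_LIMIT:
        return "", buffer            # limit reached: send and reset
    return buffer, None
```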

growth modeling device
This system uses lasers to scan an onion plant from one of three angles. As the plant is scanned, a fused deposition modeler creates a plastic model in real time based on the information collected. The device repeats this process every twenty-four hours, scanning from a different angle. After a new model is produced, the system advances a conveyor approximately 17 inches so the cycle can repeat. The result is a series of plastic models illustrating the growth of the plant from three different angles.


fly blimps
This installation consists of a series of 3 to 5 autonomous helium filled blimps whose movements are controlled by small collectives of houseflies. The flies are essentially the brain of each of the devices, determining how they interact and respond to the space as well as the other devices. Up to 50 houseflies live within the chambers attached to each blimp unit. These chambers contain food, water and allow the light needed to keep the flies alive and flourishing. The chambers also contain sensors that detect the changing light patterns produced by the movements of the flies. In real-time, the sensors send this information to an on-board micro-controller. This controller activates the motors connected to the propellers that direct the devices based on the actions of the flies. The floating, wandering blimps are separate but intersecting community vehicles. The flies exist in their own self-contained and self-sustaining worlds, collectively creating an amplified and exaggerated expression of group behavior.

growth rendering device

This system provides light and food in the form of hydroponic solution for the plant. The plant reacts to the device by growing. The device in turn reacts to the plant by producing a rasterized inkjet drawing of the plant every twenty-four hours. After a new drawing is produced, the system scrolls the roll of paper approximately four inches so a new drawing can be produced during the next cycle. The system runs indefinitely and the final outcome is not predetermined.

cloud tweets
This installation sends twitter messages based on the movements and shapes of clouds. A video camera pointed at the sky captures the clouds. Custom software uses the real-time video of the clouds to articulate keys on a virtual keyboard. The system is set in motion to function as if the clouds are pressing the keys on the keyboard as they move across the sky and change shape. When 140 characters are reached or the clouds trigger the “enter” key, the message containing the accumulated characters is tweeted. The outcome is a series of continuous twitter messages representing unique keystroke patterns created by ethereal forms that build, sweep, fluctuate and dissipate in the sky.


remote sonar drawing device
This device was a multinational tele-presence robotic installation installed at Laboral Centro de Arte y Creación Industrial, Gijón-Asturias, Spain and the Visualization and Digital Imaging Lab, University of Minnesota. This installation consisted of a drawing arm and sonar sensor array installed in Minnesota and a drawing arm and sonar sensor array installed in Spain. The information gathered by the sensors was sent via the internet to the drawing arm in the opposite location. Therefore, the arm in Spain produced drawings based on the inputs it received from the sensor array in Minnesota and vice versa. The public was encouraged to participate at both locations, producing gestural drawings halfway around the world.

Laboral Centro de Arte y Creación Industrial, Gijón-Asturias, Spain

Visualization and Digital Imaging Lab, University of Minnesota

fly lights
This installation consists of a series of 6 devices, each with lights arranged in a ring around plastic spherical chambers containing various sized swarms of houseflies. Inside the chambers, along with the flies, are sensors that correspond to the direction of each of the spotlights. When the sensors detect the subtle movements of the flies, a micro-controller turns on a light in the respective direction in real-time. Thus the flies’ movements are amplified, throwing light throughout the space based on their movements. The collective result is a chaotic series of lights being projected into the space at various intervals and directions based on the subtle movements of the swarms.

wind drawing device
This device uses three leaves to collect wind. It then produces charcoal drawings based on the amount and intensity of the wind on a given day. wind drawing device was created in Balatonfured, Hungary. It produced 60 drawings during a three-week period in June and July 2006 in the Hungarian countryside.

swarm
is an autonomous roving device whose movements are determined by a collective of houseflies. Sensors detect the subtle activities of flies housed in a spherical chamber mounted to a roving device. The device’s direction and velocity are determined by the movements and density of the flies. The device can move about a space in any direction at any time based on the swarm’s input. The collective acts as the brain of the device, causing it to travel throughout the space interacting with objects and people based on the flies’ orientation and predilections.

Installation view at Exit Art, New York

infrared drawing device
uses four infrared sensors to detect people as they move in front of it. The sensors are programmed to move a drawing arm in real time, creating a charcoal drawing based on a participant's movements.

phototropic drawing device
is a small solar-powered robot that is attracted to the most intense light source. As the robot moves from light to light, a small piece of charcoal tracks its journey. Lights are connected to timers and arranged in various patterns, causing the robot to create different compositions.

10 light drawing

fly drawing device

This installation produces drawings based on the subtle movements of houseflies. When flies enter a small chamber, sensors detect their movements. A micro-controller articulates a drawing arm in real time based on the flies' movements. When a fly is no longer detected in the chamber, the paper scrolls over and the device waits until a new fly enters the chamber to begin another drawing.

50 drones
consists of 50 aluminum and pvc units connected to 10' tethers. Each unit moves independently, and together they displace and arrange one another in random and unpredictable patterns.

24 leaves
uses muscle wire to make each leaf subtly bend in unison when the viewer pushes a toggle switch.

72 stems
is an installation that responds to the airflow in a space and the presence of people by emitting chirping sounds. The device uses 72 dried Queen Anne's Lace stalks to detect subtle changes in airflow created by movement throughout a space. The stalks are connected to a device that is programmed to chirp when movement is detected. As the activity and movement in the space increase, the chirping becomes more intense and the cumulative sound begins to layer and become more complex. When the movement ceases, the stems slow down, eventually stop their chirping chorus, and await their next stimulation.

8 switches
is constructed from steel and consists of a series of switches arranged around a ring. When the viewer pushes a switch into the on position, the mechanism rotates a twig around the ring and turns the switch back off again. In this way the piece continuously changes its configuration through viewer interaction.

5 branches
This device uses a sonar ranger to produce an inkjet rendering of branches placed at varying distances in front of the device. The drawing shown was produced in 48 hours.

networked bamboo
This system provides food (hydroponic solution) to 7 bamboo stalks and physically arranges them in relation to the available light in a particular space. The composition of the piece is determined by the growth of the bamboo and how it and the device collectively respond to their surroundings.

remote infrared drawing device
consists of four individual drawing arms which are installed in a gallery space and are connected to four different infrared sensor arrays. The sensor arrays are mounted in different locations throughout a particular building. The information gathered from the sensor arrays, through people's interaction with them, is sent to each corresponding drawing arm. The drawing arms move in real time based on the information they gather.

This map shows the location of the drawing arms and sensor arrays as installed in the Regis Center for Art, Minneapolis

sonar drawing device
uses a sonar detector to take a distance reading of a space, the people and objects within it. The device renders a circular wax crayon drawing based on the information the sonar distance sensor gathers. Each drawing it renders is specific to a particular space and the activity that takes place within it.

material removal device
slowly lowers thirty small devices into a block of foam. The devices rapidly dance as they eat away at the foam. After several hours of activity a uniform concave void with a ring of particles emerges.

pink trees
An installation of cut styrofoam forms.


8 rings